commit 88d8d5555d (https://github.com/xemu-project/xemu.git)
Fix the helper_fpscr_clrbit() function so it correctly sets the FEX
and VX bits.

Determining the value for the Floating Point Status and Control
Register's (FPSCR) FEX bit is supposed to be done like this:

    FEX = (VX & VE) | (OX & OE) | (UX & UE) | (ZX & ZE) | (XX & XE)

It is described as "the logical OR of all the floating-point exception
bits masked by their respective enable bits". It was not implemented
correctly: the value of FEX would stay on even when all the other bits
were set to off.

The VX bit is described as "the logical OR of all of the invalid
operation exceptions". This bit was also not implemented correctly; it
too would stay on when all the other bits were set to off.

My main source of information is an IBM document called "PowerPC
Microprocessor Family: The Programming Environments for 32-Bit
Microprocessors". Page 62 is where the FPSCR information is located.
This is an older copy than the one I use, but it is still very useful:
https://www.pdfdrive.net/powerpc-microprocessor-family-the-programming-environments-for-32-e3087633.html

I use a G3 and a G5 iMac to compare bit values with QEMU. This patch
fixed all the problems I was having with these bits.

Signed-off-by: John Arbuckle <programmingkidx@gmail.com>
[dwg: Re-wrapped commit message]
Signed-off-by: David Gibson <david@gibson.dropbear.id.au>
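The fix amounts to recomputing both summary bits from the individual
sticky and enable bits whenever the FPSCR changes, instead of leaving
them latched. The sketch below is a self-contained C illustration of
that recomputation, not QEMU's actual helper_fpscr_clrbit() code: the
FPSCR_* shift counts follow the FPSCR layout described in the IBM
manual cited above (QEMU defines similarly named constants in
target/ppc/cpu.h), and fpscr_update_summaries() is a hypothetical name
used only for this example.

    #include <assert.h>
    #include <stdint.h>
    #include <stdio.h>

    /* FPSCR bit positions as shift counts from the LSB
     * (i.e. 31 minus the IBM big-endian bit number). */
    #define FPSCR_FEX    30  /* enabled exception summary */
    #define FPSCR_VX     29  /* invalid-operation exception summary */
    #define FPSCR_OX     28  /* overflow exception */
    #define FPSCR_UX     27  /* underflow exception */
    #define FPSCR_ZX     26  /* zero-divide exception */
    #define FPSCR_XX     25  /* inexact exception */
    #define FPSCR_VXSNAN 24  /* invalid op: signalling NaN */
    #define FPSCR_VXISI  23  /* invalid op: inf - inf */
    #define FPSCR_VXIDI  22  /* invalid op: inf / inf */
    #define FPSCR_VXZDZ  21  /* invalid op: 0 / 0 */
    #define FPSCR_VXIMZ  20  /* invalid op: inf * 0 */
    #define FPSCR_VXVC   19  /* invalid op: invalid compare */
    #define FPSCR_VXSOFT 10  /* invalid op: software request */
    #define FPSCR_VXSQRT  9  /* invalid op: invalid square root */
    #define FPSCR_VXCVI   8  /* invalid op: invalid int convert */
    #define FPSCR_VE      7  /* invalid-operation enable */
    #define FPSCR_OE      6  /* overflow enable */
    #define FPSCR_UE      5  /* underflow enable */
    #define FPSCR_ZE      4  /* zero-divide enable */
    #define FPSCR_XE      3  /* inexact enable */

    /* Recompute the VX and FEX summary bits from scratch. */
    uint32_t fpscr_update_summaries(uint32_t fpscr)
    {
        const uint32_t vx_mask =
            (1u << FPSCR_VXSNAN) | (1u << FPSCR_VXISI) |
            (1u << FPSCR_VXIDI)  | (1u << FPSCR_VXZDZ) |
            (1u << FPSCR_VXIMZ)  | (1u << FPSCR_VXVC)  |
            (1u << FPSCR_VXSOFT) | (1u << FPSCR_VXSQRT) |
            (1u << FPSCR_VXCVI);

        /* VX = logical OR of all invalid-operation exception bits. */
        uint32_t vx = (fpscr & vx_mask) != 0;
        fpscr &= ~(1u << FPSCR_VX);
        fpscr |= vx << FPSCR_VX;

        /* FEX = (VX & VE)|(OX & OE)|(UX & UE)|(ZX & ZE)|(XX & XE) */
        uint32_t fex =
            (vx                        & ((fpscr >> FPSCR_VE) & 1)) |
            (((fpscr >> FPSCR_OX) & 1) & ((fpscr >> FPSCR_OE) & 1)) |
            (((fpscr >> FPSCR_UX) & 1) & ((fpscr >> FPSCR_UE) & 1)) |
            (((fpscr >> FPSCR_ZX) & 1) & ((fpscr >> FPSCR_ZE) & 1)) |
            (((fpscr >> FPSCR_XX) & 1) & ((fpscr >> FPSCR_XE) & 1));
        fpscr &= ~(1u << FPSCR_FEX);
        fpscr |= fex << FPSCR_FEX;

        return fpscr;
    }

    int main(void)
    {
        /* Raise VXSNAN with VE enabled: VX and FEX must come on. */
        uint32_t f = (1u << FPSCR_VXSNAN) | (1u << FPSCR_VE);
        f = fpscr_update_summaries(f);
        assert(f & (1u << FPSCR_VX));
        assert(f & (1u << FPSCR_FEX));

        /* Clear the sticky bit again: both summaries must drop back
         * to 0, which is the case the buggy helper got wrong. */
        f &= ~(1u << FPSCR_VXSNAN);
        f = fpscr_update_summaries(f);
        assert(!(f & (1u << FPSCR_VX)));
        assert(!(f & (1u << FPSCR_FEX)));

        puts("FEX and VX summary bits behave as documented");
        return 0;
    }

The two assertions at the end of main() exercise exactly the reported
symptom: after every contributing bit is cleared, a correct
implementation must also clear the VX and FEX summaries.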