author | Paul Burton <paul.burton@mips.com> | 2018-08-18 01:36:24 +0300
committer | Paul Burton <paul.burton@mips.com> | 2018-08-18 02:40:11 +0300
commit | cfd54de3b0e4f82b3f2b80230726f74a6bec964f (patch)
tree | a39924a60e70e0a93f6eab4f1f9caf71f26b5593 /arch/mips
parent | 4bcb4ad6634ea1084faa630fe35d72f75b5e80fa (diff)
download | linux-cfd54de3b0e4f82b3f2b80230726f74a6bec964f.tar.xz
MIPS: Avoid move pseudo-instruction whilst using MIPS_ISA_LEVEL
MIPS_ISA_LEVEL is always defined as the 64-bit ISA that is a compatible
superset of the ISA that the kernel build is targeting, and is used to
allow us to emit instructions for which we may detect support at runtime.
When we use a .set MIPS_ISA_LEVEL directive & are building a 32-bit
kernel, we are therefore temporarily allowing the assembler to generate
MIPS64 instructions. Using the move pseudo-instruction whilst this is
the case is problematic because the assembler is likely to emit a daddu
instruction, which will generate a reserved instruction exception when
executed on a MIPS32 machine.
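As an illustration, here is a minimal sketch (not kernel code; the
function name is invented and mips64r2 merely stands in for whatever
MIPS_ISA_LEVEL expands to in a given configuration) of how the
pseudo-instruction can go wrong inside such a .set block:

/*
 * Hypothetical example, not taken from the kernel: a move
 * pseudo-instruction inside a ".set <64-bit ISA>" region may be
 * expanded using daddu even though the surrounding kernel is 32-bit,
 * and daddu raises a reserved instruction exception on MIPS32 CPUs.
 */
static inline int move_under_isa_override(int x)
{
	int res;

	__asm__ __volatile__(
	"	.set	push					\n"
	"	.set	mips64r2	# stand-in for MIPS_ISA_LEVEL	\n"
	"	move	%0, %1		# may assemble as daddu %0, %1, $0\n"
	"	.set	pop					\n"
	: "=r" (res)
	: "r" (x));

	return res;
}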
Unfortunately the combination of commit a0a5ac3ce8fe ("MIPS: Fix delay
slot bug in `atomic*_sub_if_positive' for R10000_LLSC_WAR") and commit
4936084c2ee2 ("MIPS: Cleanup R10000_LLSC_WAR logic in atomic.h") causes
us to do exactly this in atomic_sub_if_positive(), and the result is
MIPS64 daddu instructions in 32-bit kernels.
Fix this by using .set mips0 to restore the default ISA after the ll
instruction, and by using .set MIPS_ISA_LEVEL again prior to the sc. This
ensures that everything but the ll & sc is assembled using the default ISA
for the kernel build, and that the move pseudo-instruction is emitted as a
MIPS32 addu instruction.
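A rough sketch of the resulting pattern (assumed names, a plain beqz in
place of __scbeqz, a plain "m" constraint in place of
GCC_OFF_SMALL_ASM(), and mips64r2 standing in for MIPS_ISA_LEVEL; the
actual change is in the diff below):

/*
 * Simplified sketch of the fixed pattern, not the kernel macro itself:
 * only the ll & sc are assembled under the 64-bit ISA level, so the
 * move pseudo-instruction is expanded using a MIPS32 addu on 32-bit
 * builds.
 */
static inline int sub_if_positive_sketch(int i, int *counter)
{
	int result, temp;

	__asm__ __volatile__(
	"	.set	mips64r2	# stand-in for MIPS_ISA_LEVEL	\n"
	"1:	ll	%1, %2					\n"
	"	.set	mips0		# back to the default ISA	\n"
	"	subu	%0, %1, %3				\n"
	"	move	%1, %0		# now expands to addu		\n"
	"	bltz	%0, 1f					\n"
	"	.set	mips64r2	# 64-bit ISA again for the sc	\n"
	"	sc	%1, %2					\n"
	"	beqz	%1, 1b		# __scbeqz in the real code	\n"
	"	.set	mips0					\n"
	"1:							\n"
	: "=&r" (result), "=&r" (temp), "+m" (*counter)
	: "Ir" (i));

	return result;
}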
We appear to have another pre-existing instance of the same issue in our
atomic_fetch_*_relaxed() functions, so fix that up too by moving our
.set mips0 such that it occurs prior to use of the move
pseudo-instruction.
Signed-off-by: Paul Burton <paul.burton@mips.com>
Fixes: a0a5ac3ce8fe ("MIPS: Fix delay slot bug in `atomic*_sub_if_positive' for R10000_LLSC_WAR")
Fixes: 4936084c2ee2 ("MIPS: Cleanup R10000_LLSC_WAR logic in atomic.h")
Patchwork: https://patchwork.linux-mips.org/patch/20253/
Cc: James Hogan <jhogan@kernel.org>
Cc: Joshua Kinard <kumba@gentoo.org>
Cc: Ralf Baechle <ralf@linux-mips.org>
Cc: linux-mips@linux-mips.org
Diffstat (limited to 'arch/mips')
-rw-r--r-- | arch/mips/include/asm/atomic.h | 4
1 file changed, 3 insertions, 1 deletion
diff --git a/arch/mips/include/asm/atomic.h b/arch/mips/include/asm/atomic.h
index 3ccea238be2d..f8793f1f2670 100644
--- a/arch/mips/include/asm/atomic.h
+++ b/arch/mips/include/asm/atomic.h
@@ -122,8 +122,8 @@ static __inline__ int atomic_fetch_##op##_relaxed(int i, atomic_t * v)	\
 	"	" #asm_op " %0, %1, %3				\n"	\
 	"	sc	%0, %2					\n"	\
 	"\t" __scbeqz "	%0, 1b					\n"	\
-	"	move	%0, %1					\n"	\
 	"	.set	mips0					\n"	\
+	"	move	%0, %1					\n"	\
 	: "=&r" (result), "=&r" (temp),					\
 	  "+" GCC_OFF_SMALL_ASM() (v->counter)				\
 	: "Ir" (i));							\
@@ -190,9 +190,11 @@ static __inline__ int atomic_sub_if_positive(int i, atomic_t * v)
 	__asm__ __volatile__(
 	"	.set	"MIPS_ISA_LEVEL"			\n"
 	"1:	ll	%1, %2		# atomic_sub_if_positive\n"
+	"	.set	mips0					\n"
 	"	subu	%0, %1, %3				\n"
 	"	move	%1, %0					\n"
 	"	bltz	%0, 1f					\n"
+	"	.set	"MIPS_ISA_LEVEL"			\n"
 	"	sc	%1, %2					\n"
 	"\t" __scbeqz "	%1, 1b					\n"
 	"1:							\n"