author     Josh Poimboeuf <jpoimboe@redhat.com>    2016-05-16 23:16:18 +0300
committer  Herbert Xu <herbert@gondor.apana.org.au>    2016-05-17 09:26:52 +0300
commit     4a6b27b79da5ccc6b85dc05bbe6a091e58be896a (patch)
tree       3b6c2e16d410844d288165218e62c037cb0e8ca3 /arch/x86
parent     256b1cfb9a346bb4808cd27b7b8f9b120f96491e (diff)
download   linux-4a6b27b79da5ccc6b85dc05bbe6a091e58be896a.tar.xz
crypto: sha1-mb - make sha1_x8_avx2() conform to C function ABI
Megha Dey reported a kernel panic in crypto code. The problem is that sha1_x8_avx2() clobbers registers r12-r15 without saving and restoring them.

Before commit aec4d0e301f1 ("x86/asm/crypto: Simplify stack usage in sha-mb functions"), those registers were saved and restored by the callers of the function. I removed them with that commit because I didn't realize sha1_x8_avx2() clobbered them.

Fix the potential undefined behavior associated with clobbering the registers and make the behavior less surprising by changing the registers to be callee saved/restored to conform with the C function call ABI.

Also, rdx (aka RSP_SAVE) doesn't need to be saved: I verified that none of the callers rely on it being saved, and it's not a callee-saved register in the C ABI.

Fixes: aec4d0e301f1 ("x86/asm/crypto: Simplify stack usage in sha-mb functions")
Cc: stable@vger.kernel.org # 4.6
Reported-by: Megha Dey <megha.dey@linux.intel.com>
Signed-off-by: Josh Poimboeuf <jpoimboe@redhat.com>
Signed-off-by: Herbert Xu <herbert@gondor.apana.org.au>
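For context, the x86-64 System V ABI treats %rbx, %rbp, %rsp and %r12-%r15 as callee-saved: any function that touches them must restore them before returning, while %rax, %rcx, %rdx, %rsi, %rdi and %r8-%r11 are scratch registers the caller cannot expect to survive a call (which is why %rdx/RSP_SAVE needs no push). The following is a minimal, hypothetical sketch of that save/restore pattern; the symbol example_routine is illustrative and not part of the patch.

	.text
	.globl	example_routine
	.type	example_routine, @function
example_routine:
	# save the callee-saved registers this routine is about to clobber
	push	%r12
	push	%r13
	push	%r14
	push	%r15

	# ... the body may use %r12-%r15 freely here ...
	# scratch registers (%rax, %rcx, %rdx, %rsi, %rdi, %r8-%r11)
	# need no saving; the caller does not expect them to survive.

	# restore in reverse order before returning to C code
	pop	%r15
	pop	%r14
	pop	%r13
	pop	%r12
	ret
	.size	example_routine, .-example_routine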
Diffstat (limited to 'arch/x86')
-rw-r--r--  arch/x86/crypto/sha-mb/sha1_x8_avx2.S | 13
1 file changed, 11 insertions(+), 2 deletions(-)
diff --git a/arch/x86/crypto/sha-mb/sha1_x8_avx2.S b/arch/x86/crypto/sha-mb/sha1_x8_avx2.S
index 8e1b47792b31..c9dae1cd2919 100644
--- a/arch/x86/crypto/sha-mb/sha1_x8_avx2.S
+++ b/arch/x86/crypto/sha-mb/sha1_x8_avx2.S
@@ -296,7 +296,11 @@ W14 = TMP_
#
ENTRY(sha1_x8_avx2)
- push RSP_SAVE
+ # save callee-saved clobbered registers to comply with C function ABI
+ push %r12
+ push %r13
+ push %r14
+ push %r15
#save rsp
mov %rsp, RSP_SAVE
@@ -446,7 +450,12 @@ lloop:
## Postamble
mov RSP_SAVE, %rsp
- pop RSP_SAVE
+
+ # restore callee-saved clobbered registers
+ pop %r15
+ pop %r14
+ pop %r13
+ pop %r12
ret
ENDPROC(sha1_x8_avx2)
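To illustrate why the missing saves could panic the kernel, consider a hypothetical caller fragment of the kind a C compiler may emit (this is a sketch, not code taken from the sha-mb driver, and it ignores the function's real argument setup): the compiler parks a live pointer in a callee-saved register across the call and relies on it still being valid afterwards.

	mov	%rdi, %r12	# compiler keeps a pointer live in %r12 across the call
	call	sha1_x8_avx2	# before this fix: returns with %r12 clobbered
	mov	(%r12), %rax	# caller dereferences a now-corrupted pointer

With the push/pop sequence added by this patch, %r12-%r15 reach the caller unchanged and such compiler-generated code behaves as the ABI guarantees.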