author     Linus Torvalds <torvalds@linux-foundation.org>    2022-01-13 03:31:19 +0300
committer  Linus Torvalds <torvalds@linux-foundation.org>    2022-01-13 03:31:19 +0300
commit     64ad9461521b1a357846ef6cedc4bccd48a046e0
tree       f134404b6c6df89198a4a2f2be6fef21af133c73    /arch/x86/lib/checksum_32.S
parent     8e5b0adeea19309c8ce0e3c9119061554973efa9
parent     9cdbeec4096804083944d05da96bbaf59a1eb4f9
download   linux-64ad9461521b1a357846ef6cedc4bccd48a046e0.tar.xz
Merge tag 'x86_core_for_v5.17_rc1' of git://git.kernel.org/pub/scm/linux/kernel/git/tip/tip
Pull x86 core updates from Borislav Petkov:
- Get rid of all the .fixup sections, because they generate misleading/wrong
  stacktraces and confuse RELIABLE_STACKTRACE and LIVEPATCH: the backtrace
  misses the function which is being fixed up (a sketch of the conversion
  pattern follows this list).
- Add Straight Line Speculation mitigation support: a new compiler switch,
  -mharden-sls=, places an INT3 after a RET or an indirect branch in order
  to block speculation past them. Reportedly, CPUs do speculate behind such
  instructions (see the sketch after this list).
- The usual set of cleanups and improvements
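
The checksum_32.S diff at the bottom of this page is typical of the .fixup
conversions: the out-of-line stub that zeroed %eax and jumped back into the
function is replaced by an exception-table entry that encodes the recovery
itself. A minimal sketch of the pattern follows; the labels and the load are
hypothetical, not code from this tree:

        # Old style: the recovery code lives in a separate .fixup section,
        # so an unwind through the stub points outside the faulting function.
        1:      movl (%esi), %eax               # may fault on user memory
        .section .fixup, "ax"
        2:      xorl %eax, %eax                 # on fault: return 0
                jmp 3f
        .previous
                _ASM_EXTABLE_UA(1b, 2b)
        3:

        # New style: the extable entry itself says "user access, clear %eax"
        # and resumes directly at a label inside the function, so no per-site
        # .fixup stub is needed at all.
        1:      movl (%esi), %eax
                _ASM_EXTABLE_TYPE(1b, 3f, EX_TYPE_UACCESS | EX_FLAG_CLEAR_AX)
        3:

With the new style, the register clearing is performed by the generic
exception-table fixup code at fault time instead of by per-site stubs.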
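
As a rough illustration of the SLS hardening (hypothetical compiler output,
not taken from this tree), -mharden-sls= terminates straight-line code paths
after returns and indirect branches:

        # Hypothetical output with -mharden-sls=all: every RET and every
        # indirect branch is followed by an INT3.  The INT3 is never reached
        # architecturally, but it stops straight-line speculation past the
        # branch.
        ret
        int3
        jmp     *%rax
        int3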
* tag 'x86_core_for_v5.17_rc1' of git://git.kernel.org/pub/scm/linux/kernel/git/tip/tip: (32 commits)
x86/entry_32: Fix segment exceptions
objtool: Remove .fixup handling
x86: Remove .fixup section
x86/word-at-a-time: Remove .fixup usage
x86/usercopy: Remove .fixup usage
x86/usercopy_32: Simplify __copy_user_intel_nocache()
x86/sgx: Remove .fixup usage
x86/checksum_32: Remove .fixup usage
x86/vmx: Remove .fixup usage
x86/kvm: Remove .fixup usage
x86/segment: Remove .fixup usage
x86/fpu: Remove .fixup usage
x86/xen: Remove .fixup usage
x86/uaccess: Remove .fixup usage
x86/futex: Remove .fixup usage
x86/msr: Remove .fixup usage
x86/extable: Extend extable functionality
x86/entry_32: Remove .fixup usage
x86/entry_64: Remove .fixup usage
x86/copy_mc_64: Remove .fixup usage
...
Diffstat (limited to 'arch/x86/lib/checksum_32.S')
-rw-r--r--   arch/x86/lib/checksum_32.S   27
1 file changed, 7 insertions, 20 deletions
diff --git a/arch/x86/lib/checksum_32.S b/arch/x86/lib/checksum_32.S
index 4304320e51f4..23318c338db0 100644
--- a/arch/x86/lib/checksum_32.S
+++ b/arch/x86/lib/checksum_32.S
@@ -127,7 +127,7 @@ SYM_FUNC_START(csum_partial)
 8:
         popl %ebx
         popl %esi
-        ret
+        RET
 SYM_FUNC_END(csum_partial)
 
 #else
@@ -245,7 +245,7 @@ SYM_FUNC_START(csum_partial)
 90:
         popl %ebx
         popl %esi
-        ret
+        RET
 SYM_FUNC_END(csum_partial)
 #endif
@@ -260,9 +260,9 @@ unsigned int csum_partial_copy_generic (const char *src, char *dst,
  * Copy from ds while checksumming, otherwise like csum_partial
  */
 
-#define EXC(y...)                       \
-        9999: y;                        \
-        _ASM_EXTABLE_UA(9999b, 6001f)
+#define EXC(y...)                       \
+        9999: y;                        \
+        _ASM_EXTABLE_TYPE(9999b, 7f, EX_TYPE_UACCESS | EX_FLAG_CLEAR_AX)
 
 #ifndef CONFIG_X86_USE_PPRO_CHECKSUM
@@ -358,20 +358,11 @@ EXC( movb %cl, (%edi) )
         adcl $0, %eax
 7:
 
-# Exception handler:
-.section .fixup, "ax"
-
-6001:
-        xorl %eax, %eax
-        jmp 7b
-
-.previous
-
         popl %ebx
         popl %esi
         popl %edi
         popl %ecx                       # equivalent to addl $4,%esp
-        ret
+        RET
 SYM_FUNC_END(csum_partial_copy_generic)
 
 #else
@@ -439,15 +430,11 @@ EXC( movb %dl, (%edi) )
 6:      addl %edx, %eax
         adcl $0, %eax
 7:
-.section .fixup, "ax"
-6001:   xorl %eax, %eax
-        jmp 7b
-.previous
         popl %esi
         popl %edi
         popl %ebx
-        ret
+        RET
 SYM_FUNC_END(csum_partial_copy_generic)
 
 #undef ROUND