from cron@feddit.org to cybersecurity@infosec.pub on 07 Aug 21:31
https://feddit.org/post/16995868
In his groundbreaking new research, HTTP/1.1 Must Die: The Desync Endgame, Kettle challenges the security community to completely rethink its approach to request smuggling. He argues that, in practical terms, it’s nigh on impossible to consistently and reliably determine the boundaries between HTTP/1.1 requests, especially when implemented across the chains of interconnected systems that comprise modern web architectures. Mistakes such as parsing discrepancies are inevitable, and when using upstream HTTP/1.1, even the tiniest of bugs often have critical security impact, including complete site takeover.
This research demonstrates unequivocally that patching individual implementations will never be enough to eliminate the threat of request smuggling. Using upstream HTTP/2 offers a robust solution.
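The framing ambiguity Kettle is talking about can be sketched in a few lines of Python. This is an illustrative toy (the header values and the "SMUGGLED" payload are made up, not from the research): a classic CL.TE desync, where a front-end proxy trusts `Content-Length` while the back-end trusts `Transfer-Encoding: chunked`, so the two disagree on where the request ends.

```python
# Toy CL.TE desync: one raw HTTP/1.1 message, two parsers, two different
# opinions about where the body ends. Purely illustrative.
raw = (
    b"POST / HTTP/1.1\r\n"
    b"Host: example.com\r\n"
    b"Content-Length: 13\r\n"
    b"Transfer-Encoding: chunked\r\n"
    b"\r\n"
    b"0\r\n"
    b"\r\n"
    b"SMUGGLED"
)

def headers_and_body(msg: bytes):
    """Split a raw HTTP/1.1 message into a header dict and the body bytes."""
    head, _, body = msg.partition(b"\r\n\r\n")
    hdrs = {}
    for line in head.split(b"\r\n")[1:]:
        name, _, val = line.partition(b":")
        hdrs[name.strip().lower()] = val.strip()
    return hdrs, body

hdrs, body = headers_and_body(raw)

# Front-end view: trusts Content-Length, so the full 13-byte body belongs
# to this request and the message is completely consumed.
cl = int(hdrs[b"content-length"])
frontend_body = body[:cl]

# Back-end view: trusts Transfer-Encoding: chunked. The zero-length chunk
# ("0\r\n\r\n") terminates the message, so everything after it stays in the
# connection buffer and gets glued onto the front of the NEXT request that
# is sent down this shared connection.
chunk_end = body.index(b"0\r\n\r\n") + len(b"0\r\n\r\n")
leftover = body[chunk_end:]

print(frontend_body)  # front-end thinks the request ended here
print(leftover)       # back-end still holds this, poisoning the next request
```

RFC 9112 says `Transfer-Encoding` wins when both headers are present, but the whole point of the research is that real-world chains of proxies and servers don't all agree, and any disagreement is exploitable.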
I just read this article on a marketing blog from PortSwigger, the maker of the penetration-testing tool Burp Suite.
Can someone with more insight explain what we’re supposed to do? Completely disabling HTTP/1.1 is probably not doable for many organisations.
Sort of a self-answer, now that I've read more about this issue. The problem is not on the frontend (browser --> server) but with shared connections on the backend, e.g. when you have a reverse proxy in place. What matters is that the connection between the reverse proxy and the backend server should be HTTP/2, because HTTP/2's binary framing states each message's length explicitly, removing the ambiguity about where one request ends and the next begins.
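A minimal sketch of what that looks like in practice, assuming HAProxy as the reverse proxy (the backend name and address are placeholders): HAProxy can speak clear-text HTTP/2 (h2c) to an upstream via the `proto h2` option on the server line.

```
# Hypothetical HAProxy snippet: clients may still connect over HTTP/1.1,
# but the proxy-to-backend hop uses HTTP/2 framing instead of CL/TE headers.
backend app
    server app1 10.0.0.10:8080 proto h2
```

Other proxies have their own equivalents; the common thread is that it's the upstream hop, not the browser-facing one, that needs to stop using HTTP/1.1.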