Researchers have released a new fuzzing tool for detecting novel HTTP request smuggling techniques. The ‘T-Reqs’ tool was developed by a team from Northeastern University. In a white paper, the researchers detail how they uncovered a slew of new vulnerabilities using the fuzzer, which they say can also be used by other researchers and bug bounty hunters.
HTTP request smuggling exploits the way websites process sequences of HTTP requests received from users. Front-end servers such as load balancers and reverse proxies forward multiple HTTP requests to back-end servers one after another over the same network connection. If the front-end and back-end servers disagree about where one request ends and the next begins, an attacker can sneak a disguised request past the proxy. The consequences can be far-reaching, including account hijacking and web cache poisoning, among other things. Previous research on these vulnerabilities focused on discrepancies in how servers handle the Content-Length and Transfer-Encoding headers.
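To illustrate the classic Content-Length versus Transfer-Encoding discrepancy, here is a minimal sketch (not taken from the paper; the toy parsers below are simplified stand-ins for real server behavior). A front end that trusts Content-Length treats the whole payload as one request's body, while a back end that trusts Transfer-Encoding sees an empty chunked body and leaves the trailing bytes on the connection as the start of a second, smuggled request:

```python
# An ambiguous request carrying both headers. Content-Length (28) covers
# everything after the blank line; the chunked encoding ends immediately
# with a zero-size chunk.
raw = (
    b"POST / HTTP/1.1\r\n"
    b"Host: example.com\r\n"
    b"Content-Length: 28\r\n"
    b"Transfer-Encoding: chunked\r\n"
    b"\r\n"
    b"0\r\n"
    b"\r\n"
    b"GET /admin HTTP/1.1\r\n\r\n"
)

def body_per_content_length(request: bytes) -> bytes:
    """Toy parser that honours only Content-Length."""
    headers, _, rest = request.partition(b"\r\n\r\n")
    for line in headers.split(b"\r\n"):
        if line.lower().startswith(b"content-length:"):
            length = int(line.split(b":")[1])
            return rest[:length]
    return b""

def body_per_chunked(request: bytes) -> bytes:
    """Toy parser that honours only Transfer-Encoding: chunked."""
    _, _, rest = request.partition(b"\r\n\r\n")
    body = b""
    while True:
        size_line, _, rest = rest.partition(b"\r\n")
        size = int(size_line, 16)
        if size == 0:
            return body
        body += rest[:size]
        rest = rest[size + 2:]  # skip chunk data plus its trailing CRLF

# The Content-Length view consumes all 28 bytes, hidden GET included; the
# chunked view returns an empty body, so "GET /admin ..." would be parsed
# as the next request on the connection.
print(body_per_content_length(raw))
print(body_per_chunked(raw))
```

The two parsers disagree on where the request ends, which is precisely the ambiguity a smuggling attack abuses.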
T-Reqs is a grammar-based HTTP fuzzer that generates HTTP requests and applies mutations designed to trigger unusual server processing behavior. It sends the same mutated request to two target servers and compares how each handles it, looking for inconsistencies that could enable smuggling attacks. Further technical details on the vulnerabilities can be found in the white paper.
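The differential idea can be sketched in a few lines. Note that the seed grammar, mutation set, and toy “servers” below are illustrative assumptions for this article, not T-Reqs’ actual grammar or implementation: a request is broken into components, one component at a time is mutated, and the same mutant is fed to two parsers that stand in for two servers, recording any disagreement.

```python
# A request split into grammar components, so mutations can target each piece.
SEED = ["POST", " ", "/", " ", "HTTP/1.1", "\r\n",
        "Content-Length", ":", " ", "5", "\r\n\r\n", "hello"]

MUTATIONS = [
    lambda s: s + " ",                   # append a stray space
    lambda s: s.replace("\r\n", "\n"),   # turn CRLF into a bare LF
    lambda s: s.upper(),                 # change letter case
]

def strict_server(request: str) -> str:
    # Accepts only a proper CRLF CRLF header terminator.
    return "200" if "\r\n\r\n" in request else "400"

def lenient_server(request: str) -> str:
    # Also tolerates bare-LF line endings, as some real servers do.
    return "200" if "\n\n" in request.replace("\r\n", "\n") else "400"

discrepancies = []
for i in range(len(SEED)):
    for op in MUTATIONS:
        parts = list(SEED)
        parts[i] = op(parts[i])          # mutate exactly one component
        mutant = "".join(parts)
        if strict_server(mutant) != lenient_server(mutant):
            discrepancies.append(mutant)

# The two "servers" disagree only on the mutant whose header terminator
# became a bare "\n\n": exactly the kind of parsing inconsistency that
# can enable request smuggling.
print(len(discrepancies))  # → 1
```

In the real tool the targets are live server deployments rather than local parsers, but the principle is the same: any input that two HTTP implementations interpret differently is a candidate smuggling vector.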
The intriguing aspect of request smuggling is that it is a systems issue. Even if researchers devised a magical development technique and began producing flawless individual servers, they would still fail miserably in the face of request smuggling. Secure components do not necessarily make a secure system; security is an emergent property of the system as a whole.
Researchers haven’t always viewed security through this lens, but that is changing thanks to attacks such as request smuggling, cache poisoning, and cache deception, which have gained prominence in recent years. Taking a systems-centric approach is key to defeating the next wave of web attacks.