Rule 932200, now inspecting Referer headers, matches any query string that contains spaces #3180
Comments
I vote for simply removing `Referer`.
I'm currently using the following rule as an exclusion. It handles the above situation and another situation that my personal blog faced.
The exclusion rule regex can be simplified: I think the regex should be improved to leave out the alphanumeric character range.
I also think we should remove Referer again.
If you're going to remove it, then at least take into consideration what @theseion suggested. There are cases in which the Referer header can be exploited by attackers, and we need a separate rule targeting common bypass techniques.
I thought this over and I think removing it is not a solution at all. Rather, there should be a separate rule specifically targeting common bypasses.
Another really bad match is the Let's Encrypt User-Agent...
How did you get the Let's Encrypt FP? I've set Certbot to automatically renew certificates on the first of every month and have never faced any problems.
@Xhoenix I installed the newest v4 version on several servers and they all had the same problem... If this doesn't get fixed, we will definitely exclude this rule for User-Agent in our deployment.
There's already a separate issue for this: #3163.
@emphazer I think the problem occurs while installing new certificates, not while renewing them. Did the FP also show up when renewing certs?
@Xhoenix I didn't try it with new certificates yet. We run everything at PL2 in DetectionOnly mode with the newest dev/4 release, for testing purposes, on production machines.
@RedXanadu How about this? For a basic test:

echo "$testing/t so-me.php?isanothertest" | grep -oP "/[ a-zA-Z0-9._-]+"

Is there any reason this shouldn't be used? I'll be happy to open a PR if you want this. 🙂
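For reference (assuming `$testing` is unset and expands to nothing), that command prints `/t so-me.php`: the match stops at the `?`, so the query string is never included.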
Okay, the hyphen character (`-`)...
@Xhoenix Not sure if that would work, you'd have to test it. (I'm not working on a fix for this issue, just FYI.) The outcome from the meeting discussion was to try to fix the existing regular expression. If that doesn't work, then Referer inspection for this case will be moved to paranoia level 3, as we cannot afford these false positives at PL 2.
New rules 932205, 932206 to handle the Referer header explicitly.

The regular expression in 932200 leads to false positives against URLs with query strings (due to the `?`). 932205 uses an additional prefix in the regular expression that matches the first `?`, so that the following expressions will only match question marks that are part of the payload. 932206 uses an additional prefix to match only when the Referer value is not a URL (which is illegal). 932206 is thus equivalent to 932200 but is required to distinguish the case where the Referer header does actually contain a URL.

Fixes coreruleset#3180
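To see the prefix idea in action, here is a quick sketch using GNU grep's `-P` with simplified stand-in patterns (these are illustrative assumptions, not the actual regular expressions from the new rules):

```sh
# Simplified stand-in for the problematic core pattern:
# a path segment ending in one of * ? ` \ '
core="/[^/]+?[*?\`\\\\']"

legit='http://www.example.com/page?param=test+test'
evil='http://www.example.com/page?param=x;cat /etc?passwd'

# 932200-style: matched against the whole value, the first (legal) '?'
# of the query string is enough to match the pattern
echo "$legit" | grep -qP "$core" && echo '932200-style: false positive'

# 932205-style: a prefix first consumes everything up to and including
# the first '?', so only a second, payload '?' can match
echo "$legit" | grep -qP "^[^?]*\?.*$core" || echo '932205-style: no false positive'
echo "$evil"  | grep -qP "^[^?]*\?.*$core" && echo '932205-style: payload still caught'
```

The real rules are more involved; this only illustrates why consuming the first `?` removes the query-string false positive while still catching a second `?` inside a payload.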
Small typos in the rule 932200 regex. The updated regex can be viewed here: https://regex101.com/r/w3kLqB/1. Checking the other rules.
As far as I can tell, your version is identical to the one in this PR...? |
@Xhoenix There's a CRS convention that forward slashes are not escaped in regular expression patterns, in the interest of making patterns slightly easier to read. That might help explain why the patterns in the PR look the way they do 🙂

@theseion I've suggested (via a PR to your PR) also adding tests that check against the original rule that was reported to produce false positives. I've also asked a question in the PR review, as I don't follow what one part of the PR is doing.
@RedXanadu I can't find the PR that you say you've opened.
Found it. Since the PR has already been merged, I've created a new PR with your changes: #3309. |
Description
Rule 932200 was recently updated to add the following items for inspection:
REQUEST_HEADERS:Referer|REQUEST_HEADERS:User-Agent
Unfortunately, with the Referer header, this rule now causes false positives in the following scenario: the Referer URL's query string includes one or more plus characters (`+`) and/or the URL includes one or more space characters (`%20`).

For example, these headers both cause a false positive:
Referer: http://www.example.com/page?param=test+test
Referer: http://www.example.com/page%20test?param=test

This header does not cause a false positive, as there are no encoded space characters:

Referer: http://www.example.com/page?param=test_no_space_char
Cause
Rule 932200 has a regular expression that looks for one of a number of alternative patterns. This is the problematic one:
/[^/]+?[*?`\x5c']
That is: <forward slash><one or more non-forward-slash characters><one of * ? ` \ '>
-> this will match URLs that feature a query string, e.g. /some/dir/page?foo=bar
The presence of this chained rule:
SecRule MATCHED_VAR "@rx \s" "t:none,t:urlDecodeUni"
means that a space character must also be present.
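The two stages can be approximated with grep (a sketch: the simplified pattern below stands in for the rule's full expression, and the inputs are shown as `t:urlDecodeUni` would see them, i.e. with `+`/`%20` already decoded to spaces):

```sh
check() {
  # mirrors the parent pattern plus the chained \s requirement:
  # both must match for the rule to fire
  if echo "$1" | grep -qP "/[^/]+?[*?\`\\\\']" && echo "$1" | grep -qP '\s'; then
    echo "FP:  $1"
  else
    echo "ok:  $1"
  fi
}

check 'http://www.example.com/page?param=test test'          # '+' decoded
check 'http://www.example.com/page test?param=test'          # '%20' decoded
check 'http://www.example.com/page?param=test_no_space_char' # no space -> chain fails
```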
How to reproduce the misbehavior (-> curl call)
Testing the CRS Docker container:
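The exact command was lost from this report, but a curl call along these lines (reconstructed from the log entry below; `localhost` is assumed as the published container address) triggers the alert:

```sh
curl -H 'Referer: http://www.example.com/page?param=test+test' http://localhost/
```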
=>
[Thu Mar 30 11:13:44.997007 2023] [security2:error] [pid 40:tid 140616113563392] [client 172.19.0.1:54008] [client 172.19.0.1] ModSecurity: Warning. Pattern match "\\\\s" at MATCHED_VAR. [file "/etc/modsecurity.d/owasp-crs/rules/REQUEST-932-APPLICATION-ATTACK-RCE.conf"] [line "942"] [id "932200"] [msg "RCE Bypass Technique"] [data "Matched Data: /page? found within REQUEST_HEADERS:Referer: http://www.example.com/page?param=test test"] [severity "CRITICAL"] [ver "OWASP_CRS/4.0.0-rc1"] [tag "modsecurity"] [tag "application-multi"] [tag "language-multi"] [tag "platform-multi"] [tag "attack-rce"] [tag "paranoia-level/2"] [tag "OWASP_CRS"] [tag "capec/1000/152/248/88"] [tag "PCI/6.5.2"] [hostname "localhost"] [uri "/"] [unique_id "ZCVu6EXwaRSBQCgL9kHo9wAAAIw"], referer: http://www.example.com/page?param=test+test
Your Environment
Confirmation
[x] I have removed any personal data (email addresses, IP addresses,
passwords, domain names) from any logs posted.