Sec-CH-UA / Sec-CH-UA-Mobile request headers are excluded from validation #2027

Closed
somechris opened this issue Mar 6, 2021 · 5 comments · Fixed by #2028

Comments

@somechris

Description

The Sec-CH-UA and Sec-CH-UA-Mobile request headers are simply excluded from rule 920274 and not validated at all.

Sec-CH-UA-Mobile is a Structured Header boolean (its only valid serializations are ?0 and ?1) and can be validated by rule 920275.
Sec-CH-UA's exclusion is most likely collateral damage from excluding Sec-CH-UA-Mobile; Sec-CH-UA does not need to be excluded from rule 920274 at all.

For example:

curl --header 'Sec-CH-UA-Mobile: foo' https://$YOUR_SITE/

fails to trigger a rule on paranoia level 4.
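
For context, rule 920274's shape is roughly the following. This is a simplified sketch, not the verbatim CRS rule (the real exclusion and action lists in REQUEST-920-PROTOCOL-ENFORCEMENT.conf are longer). Because Sec-CH-UA and Sec-CH-UA-Mobile sit in the exclusion list, the foo value above is never inspected:

    # Simplified sketch of rule 920274 (PL4): every request header is checked
    # against a very strict byte range, except the explicitly excluded headers.
    SecRule REQUEST_HEADERS|!REQUEST_HEADERS:User-Agent|!REQUEST_HEADERS:Referer|!REQUEST_HEADERS:Sec-CH-UA|!REQUEST_HEADERS:Sec-CH-UA-Mobile \
        "@validateByteRange 32, 34, 38, 42-59, 61, 65-90, 95, 97-122" \
        "id:920274,phase:1,block,t:none,msg:'Invalid character in request headers (outside of very strict set)'"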

Your Environment

  • CRS version (e.g., v3.2.0): v3.4/dev at e2839fe
  • Paranoia level setting: 4
  • ModSecurity version (e.g., 2.9.3): 3.1.0
  • Web Server and version (e.g., apache 2.4.41): apache 2.4.38
  • Operating System and version: Debian Buster

Confirmation

[x] I have removed any personal data (email addresses, IP addresses,
passwords, domain names) from any logs posted.

@dune73
Member
dune73 commented Mar 15, 2021

Thank you for reporting this @somechris. This is an error in the rule set.

sec-ch-ua-mobile

I agree that this should not be ignored by 920274 and that it can easily be added to the allow-list in 920275.
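
A minimal sketch of what that change could look like, assuming 920275 allow-lists via a regex over such headers; the header selector and pattern here are assumptions, the authoritative change is in #2028:

    # Hypothetical sketch: let 920275 also validate sec-ch-ua-mobile,
    # accepting only the Structured Header boolean serializations ?0 and ?1.
    SecRule REQUEST_HEADERS:sec-fetch-user|REQUEST_HEADERS:sec-ch-ua-mobile \
        "!@rx ^\?[01]$" \
        "id:920275,phase:1,block,t:none"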

sec-ch-ua

Given that this header is a string or a list of strings, I see big potential for FPs from benign use of this header with 920274. So why do you think it should be removed from the ignore-list in 920274?

@csanders: Do you care to chime in here as well?

@somechris
Author

Given that this header is a string or a list of strings, I see big potential for FPs from benign use of this header with 920274. So why do you think it should be removed from the ignore-list in 920274?

Most headers in HTTP traffic are strings that could potentially hold values that cause FPs and trigger 920274. Yet only a few of them are ignored by 920274.
E.g.: the Accept header allows % in its definition[1]. Benign use could potentially trigger 920274. Yet Accept is covered by 920274, and I have yet to see an FP for it.

The thing is that in benign real-world traffic, the Accept header is free of %. This makes rule 920274 useful.
Seeing strange, unexpected characters in headers makes them stand out and look suspicious.
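
To make that concrete: a contrived but grammar-legal value such as

    curl --header 'Accept: te%xt/html' https://$YOUR_SITE/

would trigger 920274 at PL4 (the % byte, 37, is outside the strict set above), and that is exactly the useful behavior: legal-but-never-seen characters deserve an alert.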

The Sec-CH-UA header is just like the Accept header. It allows a special character (?) that could potentially trigger 920274 in benign traffic. But in real-world traffic it does not occur, at least on the servers I'm involved with. So a ? in Sec-CH-UA sticks out and makes the HTTP request look suspicious. I would like to be alerted about that, especially at paranoia level 4, which is where 920274 sits.

As benign, real-world Sec-CH-UA headers are currently free of ? and other characters that would trigger 920274, I'd argue it should be taken off the ignore-list.
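
For illustration, a typical benign value looks like this (the brand strings vary by browser build; this one follows the draft UA Client Hints shape):

    Sec-CH-UA: "Chromium";v="88", "Google Chrome";v="88"

Every byte in it (letters, digits, space, and the characters " ; = ,) already falls inside 920274's strict set, so benign traffic should pass even without the exclusion.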

If at some point benign, real-world traffic comes with Sec-CH-UA headers that trigger 920274, we should definitely exclude the header from 920274 again. Same as we should do with the Accept header, if at some point there is widespread use that causes FPs for 920274.

[1] The Accept header is defined in RFC 7231. Its value is built from media types, which are defined through token, which in turn is defined in RFC 7230 through tchar, which allows %.

@dune73
Member
dune73 commented Mar 17, 2021

Thank you for your extensive explanation @somechris. This makes a lot of sense.

This sounds like you are really running at PL4. Is that a substantial amount of traffic? If so, that's very interesting for us, since that is very rare (I only have customers at PL4 with relatively little traffic).

How do you assess ignoring the User-Agent in 920274 (and Referer, for that matter)?

@somechris
Author

Is that a substantial amount of traffic?

No, it's not. It's just some 5-10M reqs/month.
And the sites are custom-built from the ground up and mostly static, so they are a far cry from today's modern web applications.
And even for these basic sites, I need to adapt rules quite a bit to make them work at PL4 (which is expected; no complaint here).
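
As an aside, such PL4 adaptations are typically done with CRS rule exclusions; a hypothetical example, with a made-up header name:

    # Hypothetical tuning: stop 920274 from inspecting a custom header that
    # legitimately carries characters outside the strict byte set.
    SecRuleUpdateTargetById 920274 "!REQUEST_HEADERS:X-My-Custom-Token"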

While 5-10M reqs/month is not much, it still covers enough ground to see problematic browser updates in the logs early on.
If only one had the time to monitor log files more closely and act on them. 🤷
Luckily, users still bear with me and ping me when things break after they update their browser.

How do you assess ignoring the User-Agent in 920274 (and Referer, for that matter)?

I don't have current, robust numbers on User-Agent / Referer [1]. I'm not sure such numbers are really needed in this case, though.

From experience at $PREVIOUS_DAYJOBS, ignoring both User-Agent and Referer in 920274 makes a lot of sense.
One encounters everything in them. I'd definitely want to keep them ignored.

Leaving all other things aside, many proper, well-behaved bots come with User-Agents that would violate rule 920274's @validateByteRange. Just capturing live traffic on one of my sites immediately gave a hit with the SemrushBot. From the few requests that I saw, the bot seems to behave properly, and it comes with a URL to a very detailed page describing it. Looks like a proper, good bot to me. Yet its User-Agent contains SemrushBot/7~bl, and the ~ (126) in there would give an FP with 920274.

And for Referer, failing to ignore it in 920274 would backfire immediately as well. If you're on a page https://example.com/foo.html?bar and click an internal link like ./baz.html, the Referer would be https://example.com/foo.html?bar, and the contained ? (63) would trigger an FP with 920274 as well.

So failing to ignore either User-Agent or Referer would backfire and yield many FPs with rule 920274.
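
A quick shell check of the byte values (POSIX printf: a leading apostrophe makes %d print the character's code):

    $ printf '%d\n' "'~"
    126
    $ printf '%d\n' "'?"
    63

Neither 126 nor 63 is in the strict set sketched above (32, 34, 38, 42-59, 61, 65-90, 95, 97-122), so un-ignored User-Agent or Referer values like these would match 920274 immediately.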

[1] Both User-Agent and Referer are ignored in rule 920274, so I do not see them in ModSecurity log files (except for matches in rule 920300 and the like, but that's not representative). Also, I've configured the web server's log format to hold only the needed information, so both User-Agent and Referer get dropped from my web server logs.


@dune73
Member
dune73 commented Apr 2, 2021

Thank you very much for your extensive discussion of the various headers. This is very valuable for us. Closing the issue here (and merging #2028, which fixes this).
