This file does not get detected or replaced: https://www.nndc.bnl.gov/nudat2/jquery-3.4.1.min.js
Another weird thing is that it never gets cached (some JS files do, but not all), even when I add a caching rule using the Header Editor extension in Firefox.
The domain is detected as OK by your online tool.
uBlock Origin is not showing any hits on that URL.
Maybe you did not add a rule for this? But then why do some JS files not get cached?
My header editor rule is:
- name: fix bnl.gov some parts not cached
- rule type: Modify request header
- match type: URL prefix
- match rules: https://www.nndc.bnl.gov/
- execute type: normal
- header name: cache-control
- header value: max-age=31557600
I noticed the request header does get changed properly, but the file is still not cached.
I know my rule is correct, because I have tested it to fix caching on other sites, such as:
I know, this is a bit off topic. Could it be that the domain is OK but the URL is not? Maybe you should add a URL testing tool in addition to the domain test, because not all files on a domain necessarily have the same restrictions (CSP, crossorigin, service workers).
Filtering HTML did not help, but maybe a rule never existed for this file. Maybe you only made rules for CDNs?
That's fine, because the website owner hosts this resource himself. From a privacy point of view, it is only critical if the resource is hosted externally.
Example: The website
www.stackoverflow.com embeds an external resource from a CDN:
https://ajax.googleapis.com/ajax/libs/jquery/1.12.4/jquery.min.js. This resource will be replaced by LocalCDN. But if Stack Overflow hosts the resource itself, e.g. at
www.stackoverflow.com/jquery-1.12.4.min.js, then it will not be replaced by LocalCDN. Why should it be? The operator of the website already knows you.
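The decision described above can be sketched roughly like this (a simplified illustration, not LocalCDN's actual code; the function name `shouldReplace` and the prefix list are made up for this example — the real mappings live in /core/mappings.js):

```javascript
// Hypothetical sketch: only requests to known third-party CDN URLs
// are candidates for replacement; self-hosted files never are.
const cdnPrefixes = [
  'https://ajax.googleapis.com/ajax/libs/',
  'https://cdnjs.cloudflare.com/ajax/libs/',
];

function shouldReplace(requestUrl, pageOrigin) {
  const url = new URL(requestUrl);
  // Same-origin resources (e.g. www.nndc.bnl.gov serving its own
  // jquery-3.4.1.min.js) are skipped: the site operator already
  // sees your requests, so replacing them gains no privacy.
  if (url.origin === pageOrigin) {
    return false;
  }
  return cdnPrefixes.some((prefix) => requestUrl.startsWith(prefix));
}
```

So the Google-hosted jQuery on Stack Overflow would be a candidate, while the self-hosted copy on www.nndc.bnl.gov would not be.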
You can find all supported CDNs here: /core/mappings.js