Searching with a particular search engine through the URL bar used to take one touch gesture. Now you need to press "Search engine", scroll down to your selection (e.g. Wikipedia isn't visible by default), press it, and then select from a weird collection of recommended search alternatives.
And, as OP said, you don't even have the option to use keywords and be done with this nonsense.
Ooh, nearly forgot: you know how Firefox is supposed to be the ethical, privacy-respecting browser? They ship an on-by-default advertising data-collection SDK now!
Cases per capita is still very misleading without accounting for number of tests though, no?
Looking at some countries that offer more detailed data, you can sometimes see the cases double, while hospitalizations and deaths grow at a much lower rate (if at all).
The situation is similar to Google Chrome/Chromium: Most of it is open-source, but the release binaries contain some proprietary bits for telemetry, marketplace access, etc.
> Page caching is way too aggressive. It's not acceptable to show days old versions of a website in a new tab. Yet, this happens to me on pretty much all news websites and blogs that I visit. Including HN. I have to reload freshly opened news websites to get the current version. To me this feels like a bug and I hope this is not actually intended behavior (that would be seriously misguided).
Interestingly, most comments are talking about the cache/tech capabilities and not the privacy concerns.
I mean, the way it was presented, WebBundles allow the author of the content to arbitrarily hide and package content such that you cannot filter specific resources; either you view the page with ads and trackers, or you don't see anything at all.
WebBundles can't hide content any more than a regular server can, and blockers can block individual resources from a bundle like they can individual URLs from a server.
Remember that Chrome/Chromium hamstrung the ability to block at the loading stage; only Firefox, its derivatives, and hacked-up Chromium derivatives can still block prior to loading with complex rules.
All claims seem to be based on the same URL-mapping ability that's ascribed to WebBundles but, for some reason, not to servers or edge workers.
The infrastructure needed to randomize and remap URLs for bundles is basically the same as for endpoint URLs. You can already serve all requested content, including things a blocker might want to block, as first-party, meaningless URLs. https://my-site.com/1g276dn4kas for everything. Store the URL map in session data rather than in the bundle.
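To make the point concrete, here's a minimal sketch of that remapping scheme. Everything here (the function names, the token scheme) is hypothetical illustration, not an API from any real server: each real resource gets a random, meaningless first-party token, and the token-to-path map lives in server-side session state, so nothing in the delivered content reveals which URL is the tracker.

```python
import secrets

def build_url_map(resource_paths):
    """Assign each real resource path a random opaque token (per session)."""
    return {secrets.token_urlsafe(8): path for path in resource_paths}

def resolve(url_map, token):
    """Look up the real resource behind an opaque URL; None if unknown."""
    return url_map.get(token)

# All of these would be served as https://my-site.com/<token>;
# a blocker sees only indistinguishable first-party URLs.
resources = ["/js/app.js", "/js/tracker.js", "/css/ads.css"]
url_map = build_url_map(resources)  # stored in session data, not the bundle

for token, path in url_map.items():
    assert resolve(url_map, token) == path
```

Nothing about this requires WebBundles; any origin server or edge worker can do the same lookup on each request.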
I don't think the hard part in either case is mapping the URLs and serving the content; it's rewriting all the references to those URLs in the source files.
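That rewriting step could look something like this naive sketch (a plain string substitution over the served HTML; the `rewrite_references` helper and the token are made up for illustration, and a real implementation would need to handle dynamically constructed URLs in JavaScript, which is the genuinely hard part):

```python
def rewrite_references(source, url_map):
    """Replace every real resource path in `source` with its opaque token."""
    for token, path in url_map.items():
        source = source.replace(path, "/" + token)
    return source

html = '<script src="/js/tracker.js"></script>'
url_map = {"x9Qq3kPz": "/js/tracker.js"}  # token assigned server-side
print(rewrite_references(html, url_map))  # → '<script src="/x9Qq3kPz"></script>'
```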
No. The technical claims were wrong, but since repacking bundles has been shown to be cheap and easy, a bundle can quickly be modified to carry a malicious payload (for some definition of malicious).
Looks like they moved it into the Tools -> Web Developer menu. Ctrl+U still works too!