So an extension breaks a site, and the web developer is to blame for sloppy coding? How could they ever predict every extension, and every version of every extension, that visitors might have on their systems?
Take Google Analytics as an example: its standard snippet guarantees that `window._gaq` exists even if the external resource fails to load, because `_gaq` is initialized inline as an empty array. If an extension were to just detect and remove the whole script block, it'd kill the site.
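For context, the classic async snippet from Google's documentation looks roughly like this (the tracking ID is a placeholder); the first line is what guarantees `_gaq` exists even when `ga.js` never loads:

```js
// The inline queue: defined synchronously, so _gaq.push() is always safe,
// even if the external ga.js script is blocked or fails to load.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']); // placeholder tracking ID
_gaq.push(['_trackPageview']);

(function() {
  // Asynchronously load the real analytics script; until it arrives,
  // _gaq is just a plain array buffering the queued commands.
  var ga = document.createElement('script');
  ga.type = 'text/javascript';
  ga.async = true;
  ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
           '.google-analytics.com/ga.js';
  var s = document.getElementsByTagName('script')[0];
  s.parentNode.insertBefore(ga, s);
})();
```

Blocking just the external request leaves everything working; deleting the whole block means any later `_gaq.push(...)` elsewhere on the page throws a ReferenceError.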
This is an innocent and unlikely example, because no extension actually does this. But extensions do other bad things much like it, and this post points out exactly one of them: it simply looks for a specific DOM element, then deletes that element's parent from the page.
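For illustration, the behavior amounts to something like this; the selector here is hypothetical, not the extension's actual target:

```js
// Hypothetical sketch of the DOM surgery described above; the selector is made up.
var tracker = document.querySelector('.social-share-widget');
if (tracker && tracker.parentNode && tracker.parentNode.parentNode) {
  // Removing the tracker's parent also removes every sibling node,
  // so unrelated page content disappears along with the tracker.
  tracker.parentNode.parentNode.removeChild(tracker.parentNode);
}
```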
But yes, websites that have an abundance of trackers are crap. Remember that it may also not have been the developer's decision.
The only thing I can say is: degrade as gracefully as possible.
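In practice that means never calling a third-party global directly. A minimal sketch, assuming the old `_gaq` queue API, where `track` is a hypothetical helper:

```js
// Hypothetical helper: wraps analytics calls so the page survives
// the snippet being blocked, removed by an extension, or failing to load.
function track() {
  var args = Array.prototype.slice.call(arguments);
  try {
    if (window._gaq && typeof window._gaq.push === 'function') {
      window._gaq.push(args); // e.g. ['_trackEvent', 'video', 'play']
    }
  } catch (e) {
    // Analytics broke; the rest of the page keeps working.
  }
}

track('_trackEvent', 'video', 'play'); // a no-op when ga.js is absent
```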
I am completely unwilling to unblock a site's main items (NoScript user) unless there is a pressing need for their content, and there rarely is. (Also, his content works fine with NoScript on. I don't get why Ghostery would want to inject more content into a site; that does seem a bit ridiculous.)
However, I don't blame the person who makes the site; we just have different priorities.