I don't see much of a problem with loosely typed markup. Markup isn't supposed to be written only by highly skilled engineers.
An average Joe is supposed to feel great about writing something that renders the way he/she wants, without having to dig into deeper stuff like semantics, validity, or even cross-browser compatibility.
This is something that should be left to people who need to know it in their strata, isn't it?
> An average Joe is supposed to feel great about writing something that renders
Indeed HTML is great for that, but the problem is that you never "level up". Once your content renders, you are done. A lot of Joes might be interested in how things work behind the scenes, or in making things "correct" rather than "just working". It would be great, from a pedagogical point of view, to have the browser render Joe's content (for instant gratification) but also point out: "Hey, on line 32 you closed </p> before </i>. It should be the other way around, because of a rule called nesting; have a look at it." I think we are wasting a lot of man-years around the globe for the lack of such warnings.
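To make the idea concrete, here is a minimal sketch (in Python, using only the standard library) of the kind of friendly validator described above: it parses HTML, keeps a stack of open tags, and warns when a tag is closed out of order. The class name, message wording, and the short list of void elements are illustrative assumptions, not any real browser's behavior.

```python
# Hypothetical "friendly nesting checker": warns like the example in the
# comment above ("Hey, on line N you closed </x> before </y>...").
from html.parser import HTMLParser

# Void elements never take a closing tag, so we never push them.
VOID = {"br", "img", "hr", "input", "meta", "link"}

class NestingChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stack = []      # currently open tags, innermost last
        self.warnings = []   # human-readable messages

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()                      # properly nested close
        elif tag in self.stack:
            line = self.getpos()[0]               # line where </tag> appeared
            inner = self.stack[-1]
            self.warnings.append(
                f"Hey, on line {line} you closed </{tag}> before </{inner}>. "
                f"It should be the other way around (nesting)."
            )
            # Recover roughly the way browsers do: implicitly close the
            # inner tags, then the one actually being closed.
            while self.stack[-1] != tag:
                self.stack.pop()
            self.stack.pop()

def check(html):
    checker = NestingChecker()
    checker.feed(html)
    return checker.warnings

print(check("<p>some <i>text</p></i>"))
print(check("<p>some <i>text</i></p>"))  # well-formed: no warnings
```

The point is the error-recovery step: the content still "renders" (parsing continues), but the Joe writing it gets a teachable message instead of silence.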
In the education of many people, compiler errors and warnings had exactly this function: they let you do whatever you wanted (as long as it was decent), but they would also point out the basic mistakes ("Hey, on line 14 you print the variable prg_name, but that variable has not been initialized; beware").
The downside is that the average Joe decides this stuff is easy enough that he should do it professionally. And then:
- The market floods with "professionals" who antagonize proper web developers, since, to unknowledgeable clients, the results appear the same
- The web floods with sites that behave unpredictably in different browsers
- Proper developers carry the burden of dealing with the idiosyncrasies of various browsers
- Browser developers carry the burden of trying to guess their way through amazingly creative atrocities
Although, to be honest, things don't look as grim as they used to regarding the middle two.
I just wish some people would stick to HTML and stop brute-forcing JavaScript into working.
Except all of those things started happening around 1994. And that's why, 15+ years later, we have a spec that defines error conditions rather than just the 'proper' way.
(If you weren't online/sentient back then, sites commonly had these 'Best viewed in Netscape' badges on them.)