“You’re about to see the mother of all flamewars on internet groups where web developers hang out. This upcoming battle will be presided over by Dean Hachamovitch, the Microsoft veteran currently running the team that’s going to bring you the next version of Internet Explorer, 8.0. The IE 8 team is in the process of making a decision that lies perfectly, exactly, precisely on the fault line smack in the middle of two different ways of looking at the world. It’s the difference between conservatives and liberals, it’s the difference between ‘idealists’ and ‘realists’, it’s a huge global jihad dividing members of the same family, engineers against computer scientists, and Lexuses vs. olive trees. And there’s no solution. But it will be really, really entertaining to watch, because 99% of the participants in the flame wars are not going to understand what they’re talking about. It’s not just entertainment: it’s required reading for every developer who needs to design interoperable systems. The flame war will revolve around the issue of something called ‘web standards’.”
I just read this and I love it. It’s a great breakdown.
I’ve suggested this before, to an unpopular response[1], but I still think Microsoft will not ship IE8 defaulting to standards mode. Too much of the internet will break. They have a responsibility to clean up the internet properly, and for those involved in real-world online business, not sheltered by the tech-savvy, you can’t just rip it off like a band-aid. It’s going to take time and money.
[1] http://www.osnews.com/permalink?297391
Actually, it won’t break, if Microsoft keeps to its word. The only reason so many sites are so horribly broken in IE 8 now is because IE 8b1 is itself horribly broken.
If you go back and read the actual rationale for the original decision, you’ll see it’s not really about the public web at all: it’s about corporate environments (intranets and the like), where the software running on the workstations is already tightly controlled. (For “applications embedding the IE engine”, Microsoft came up with the solution for that one long before the IE engine was ever embeddable: each incompatible version of a control is supposed to have a new CLSID, so that old versions continue to work. If Microsoft is supplying the IE 7 engine anyway, they can just give that to applications which request the IE 7 COM object, as they have previously.)
All but a tiny minority of the public sites that would have broken as a result of the IE 8 standards-mode flip would be broken right now in IE 7: the suggested (and widely deployed) fix for those sites was an “if IE 7” conditional comment, which IE 8 itself will ignore (the correct behaviour).
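For readers who have never run into one, a conditional comment of the kind described above looks roughly like this (the stylesheet name is made up for illustration; only IE 7 acts on it, while IE 8 and every other browser treat the whole thing as an ordinary HTML comment):

    <!--[if IE 7]>
      <link rel="stylesheet" href="ie7-fixes.css" />
    <![endif]-->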
Of course, scripting-engine incompatibilities wrinkle this a little, as IE 7’s DOM is very similar to IE 6’s, but from what I can tell so far IE 8’s is still a fairly close relative. Expect to see updates to things like TinyMCE and other WYSIWYG editors, but not a lot else will need significant changes.
Easy way for them to solve things: change the user agent. Don’t call it MSIE anymore, call it MS Internet Explorer 8 or something else – and the JavaScript sniffers won’t be any the wiser. In addition, don’t accept ANY old CSS hacks, and force the browser to be treated as “another browser”. As long as they have MSIE in their user agent, JavaScript code checking navigator.userAgent will assume “hacky patchy” mode.
Optionally, look for hacks in the page and go into IE7 mode that way.
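To make that concrete, here is a minimal sketch of the kind of legacy sniffing code a renamed user agent would sidestep (the two function names are made up for illustration):

    // A typical legacy sniff: branch on the substring "MSIE" in the UA string.
    if (navigator.userAgent.indexOf("MSIE") !== -1) {
      applyIeWorkarounds();   // non-standard DOM calls, CSS patch-ups, etc.
    } else {
      applyStandardsPath();   // the code path Firefox, Opera and Safari get
    }

Drop the “MSIE” token from the UA string and code like this quietly takes the standards branch.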
Personally I side with the idealists. Drop the user agent and tell the world “IE8 _really_ is a new browser”. Opera renders stuff correctly, not because people *check* for Opera, but because it doesn’t say “I’m MSIE!”. In my experience, IE is the only browser that barfs on most standard compliant websites, not any of the others. If IE8 is so good, then let it assume a new identity as well.
Agreed 100%. Any hack that relies on bugs to “activate” won’t if the bugs aren’t present so that only leaves hacks that activate by detecting the user agent.
Of course, the fact that doing that is a huge mistake and has continually been shown to be one everywhere didn’t stop lousy web developers from copy & pasting JavaScript hacks from 2001… So the only way to do something as radical as implementing a real standards mode is by doing something else just as radical: a completely new user agent that reflects this. One that won’t be caught by old JavaScript – this isn’t “IE” anymore, so don’t enable any IE-only hacks and workarounds for non-standard behavior.
Why would this work? Easy: every example mentioned by Joel that looked “broken” in IE 8 (which is still a beta, btw) renders correctly in other standards-compliant browsers like Firefox, Safari & Opera. They only break in IE because they check the browser and assume: “Hey, it’s IE, let’s do things in a non-standard way to work around its non-standard behavior”.
Has anyone tried the beta and figured out if there is a way to change the user agent to do some testing?
Many have suggested this (including me), but if it were really that simple, wouldn’t Microsoft just do that? I would think that at least one person at Microsoft thought of doing this but the IE team found it doesn’t work for some reason.
I’m not a web dev (Allah be thanked), but I read earlier today in the reddit.com and JoelOnSoftware discussions that web devs don’t rely on the user agent to trigger CSS hacks:
http://reddit.com/r/programming/info/6cdsr/comments/c03gufa
http://discuss.joelonsoftware.com/default.asp?joel.3.606422.3
That’s something different. The user agent has to do with the JavaScript hacks, which are very often used in combination with CSS to shape a site layout to fit IE. So by changing the user agent, you’d fix the JavaScript problems.
The only thing IE has to do to fix the css hack problems is to stop reading the hacks, like all the other browsers out there. Very simple.
If the IE team thought of this and found it wasn’t a good solution, they would be surprisingly dumb. It is obvious that it would fix all their problems with regard to their browser working with as many sites as Firefox, Opera or Safari does.
Their “quirks” mode should be enabled, with the old user agent, only when the site specifically checked for it. You’d be able to figure out whether the website checked for an old version of MSIE by doing a simple substring search, and I’m sure the good devs at Microsoft would be able to do that.
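Roughly what that heuristic might look like, purely as a hypothetical sketch (no shipping browser does this; the function name and regular expression are invented):

    // Hypothetical: decide whether to serve a page the legacy engine based on
    // whether its source appears to sniff for an old IE version.
    function pageSniffsForOldIe(pageSource) {
      return /MSIE\s*[4-7]\b/.test(pageSource);
    }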
That huge-ass piece of analogy can NOT be read in 15 minutes. I’m putting it on my vacation to-do list right between the Gentoo Handbook and The Holy Bible.
I’m here to tell you, it can. And you should.
Covering two to three chapters a day, the Bible can be read in one year. Now, that might seem insane to people, but I’ve been finding it so easy to give 10 minutes to it a day that I wonder why on earth I put it off for so long.
As for the Gentoo handbook…
From the article:
If I were gay, I’d ask him to marry me. Seriously.
Quoted directly from Sam Ruby http://intertwingly.net/blog/2008/03/17/Martian-Mindsets
“* It is not true that 98% of the world runs IE today. Nor is it true that 98% of the world that runs IE runs IE7. Nor is it likely to be true that 98% of the world that today runs IE7 will install IE8. I personally even doubt that 98% of the world that today runs IE7 and installs IE8 will make that particular statement.
* Web pages like Google Maps work on other browsers. Not because of a mythical, platonic “standard” in scare quotes. But because of standards that are actually implemented compatibly. And because in standards mode, these other browsers don’t implement the non-standard IE only Javascript objects that Google Maps checks for.
* Joel’s argument works both ways. Why can’t Google Maps developers be pragmatic, touchy feely, warm and fuzzy engineering types? “Can’t we just default to IE7 mode? One line of code … Zip! Solved!”. The real question is what is the right default for the long term. There is no need to resort to name calling (Trotskyist, left wing, America’s Toughest Sheriff, pink pajamas). If the default bothers you, write the one line of code and move on.
* Defaulting to IE7 mode doesn’t work.
* If people want web browsers that work with actual web sites, they still have three choices.”
This whole article is just uppity. People are trying to solve difficult problems, and that is the long and short of it. The more you make web-standards into an armageddon debate, the more you leave behind the reality of sitting and trying to design these systems and standards accordingly. Get a grip, is all I can say.
edit — osnews ♥ unicode
So only users of Microsoft products deserve to have a working web, eh?
Actually, the consumer is. The same way that I’m an idiot when it comes to nuclear physics or string theory.
Joel’s right though: what matters is whether it works or not.
But…
On the other hand, if you apply the same logic to, say, car standards, it becomes “the consumer does not care if the car is safe as long as it gets where he is going”.
Can be a dangerous attitude.
blah blah blah blah…
I tried to read the entire article, I really did…
End result? I use Windows, and Vista (ugh), but use Firefox 3.0 and thought, “15 minutes of my life just went down the tubes…”
When should they draw the line? If MS does not go for standards mode, do designers have any incentive to change their websites?
Say when IE 9 comes out designers will say:
“oh crap I can’t make it go in standards mode because all those IE8,7,6,(5?! -probably not a joke) users will complain that my site won’t work”
and then MS has to say:
“we can’t go standards mode because nobody has designed sites to work with standards mode IE”
This ends up being a vicious circle that never gets broken. So I ask: when is there a good time to draw the line and dump all the cruft?
The trick is that, before, designers had no choice but to write for a broken IE.
Now, assuming a “fixed” IE, designers will be motivated to improve their sites to support the “new” IE, in IE’s new “standards mode”.
So designers will have a path out and away from IE reliance. Right now, they don’t have that option; IE 8 will provide it and give designers (and users) a path to upgrade through.
Right, but look at the trouble we had trying to support IE5 and then IE6 when it used an entirely different box model, and then eventually after the chaos, the right hacks were found and we managed to do it.
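For anyone who never had to fight it, the box model difference meant the same CSS produced different sizes. A minimal sketch (the class name is made up):

    /* Per the CSS specification (and modern browsers), the rendered box is
       200px content + 20px padding + 2px border = 222px wide.
       In IE 5's box model the declared 200px includes padding and border,
       so the content area shrinks to 178px. */
    .sidebar {
      width: 200px;
      padding: 10px;
      border: 1px solid #999;
    }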
Then IE7 came out, and we had the same problem, we had to find hacks and ways of coding to support IE5, 6 & 7.
The same will be true of IE8 – it may be Acid2-compliant, but that’s almost irrelevant; it’s still another huge set of changes that we have to somehow try to support with one set of code, whilst still working with IE6 & 7.
We are in no better situation until we can safely start a new project and say that IE8 is the absolute minimum requirement, and IE7 and below is not supported. That’s going to take a long time for some.
HTML 5 and XHTML 5
There you go!
The W3C and WHATWG should be getting HTML5 and such out right away. They fix many problems REAL web designers have and could tip the balance away from hacks like IE altogether. Because of the continued disappointment of IE, there are a lot of people ready to jump. Apple is helping with this by pimping Safari and the iPhone, which are firmly non-IE.
Frankly, nobody has mentioned the obvious: just use one of the really cool IE-unsupported DTDs. My personal favorite is XHTML 1.1 + SVG + MathML served properly as XML. Combined with some JavaScript, that combo is 75% of what Flash can do, and pure text. That should put IE in its place.
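The practical trick being described is less about the DTD than about the media type. A minimal sketch (page content is illustrative), assuming the server sends the page with a Content-Type of application/xhtml+xml, which IE 6/7 cannot render at all and will offer to download instead:

    <?xml version="1.0" encoding="UTF-8"?>
    <html xmlns="http://www.w3.org/1999/xhtml">
      <head><title>XHTML served as XML</title></head>
      <body>
        <p>
          Inline MathML:
          <math xmlns="http://www.w3.org/1998/Math/MathML">
            <mi>x</mi><mo>+</mo><mn>1</mn>
          </math>
        </p>
      </body>
    </html>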
I’m a Linux user and all but I have to admit I’d fully understand it now if MS chose the solution that requires one additional line (probably a comment so that other browsers just ignore it).
It’s not like writing one little line to indicate that you’re actually one of the very few web developers who care about web standards is gonna kill anybody.
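For concreteness, the “one additional line” under debate is a meta tag rather than a comment; as I understand Microsoft’s X-UA-Compatible proposal, it looks roughly like this:

    <!-- Opting a page into IE8's standards mode: -->
    <meta http-equiv="X-UA-Compatible" content="IE=8" />

    <!-- The reverse switch also exists, opting back into IE7-compatible
         rendering if IE8 ends up defaulting to standards mode: -->
    <meta http-equiv="X-UA-Compatible" content="IE=EmulateIE7" />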
They just need to set IE8 to give its user agent as “Mozilla/5.0 Gecko Firefox”. Then it won’t be subjected to all the non-standard IE hacks that break it so easily.
Seriously, it does sound as though if it didn’t declare itself as IE, it would work just fine.
I had that thought too, but are there really that many websites that care about standards? I still come across websites whose designers don’t seem to know diddly-squat about them.
I can’t help but compare this to the real life situation in which a person has screwed up his own life so badly that there is nothing left but to assume a new identity and avoid admitting who he really is.
Yeah, it’s called dying and being reincarnated
I too come to the conclusion that a new UserAgent is called for… the “Internet Explorer” brand is F’d anyway – Microsoft needs something new – call it “Tube Spelunker” or some crazy-ass-sh*t…
The world will rejoice – everyone will be happy. (or not..whatever)
Except that person is known by 85% of the population, and there really is /nowhere/ to hide.
I wonder if the IE team can fake their own deaths, and Microsoft could appropriately “bring in” a whole new team of ‘interoperability experts’.
The funny part is that all the websites that only work in IE and sniff for it would start telling you to upgrade to IE.
Those are the same websites that fail in Safari, Firefox, Opera, etc?
Considering I’ve been surfing with Firefox since before it got popular, I’m pretty certain the number of these sites has diminished greatly over the last couple of years. The sites that didn’t know about Firefox mostly worked anyway.
I’m thinking the “broken” sites are going to be those who assume a browser claiming to be IE is going to be broken like IE.
You’re still living in 1998. In the modern world no one, by which I mean a small minority, uses the user agent string to identify the browser.
For CSS hacks, which as far as I can tell are the #1 source of IE7/IE8 rendering problems, the user agent doesn’t matter. Things like * html and exploiting parser bugs (voice-family, anyone?) are the norm. Does IE8 get rid of the extra element above the root? If it does, then fine; otherwise it picks up IE7/6/5 hacks. Does IE8 have the same parser bugs? If it does, then it picks up the same hacks. Where conditional comments are concerned, the situation is little better; a lot of conditional comments I see do silly things like [if gt IE 6], instead of checking for each older version and supplying hacks specifically for it.
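For anyone who hasn’t seen them, a minimal sketch of the two hacks mentioned above (selectors and values are purely illustrative):

    /* The "* html" hack: only old IE matches a universal selector above
       <html>, because it puts an anonymous element above the root. */
    * html .sidebar {
      width: 180px;   /* served only to IE 6 and earlier */
    }

    /* The voice-family parser hack: IE 5.x stops parsing the rule at the
       escaped quote, so it keeps the first width and never sees the second. */
    .content {
      width: 400px;            /* IE 5.x keeps this */
      voice-family: "\"}\"";
      voice-family: inherit;
      width: 380px;            /* everyone else gets this */
    }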
For JavaScript, which is the other major bugaboo for IE8 users, user agent sniffing is close to dead. Almost everyone does object detection; even if it’s simply if(document.all) IE=true; else IE=false; some kind of object detection is the norm. Does IE8 still use the monstrosity known as document.all? If IE8 supplies an addEventListener that works, then most event-related hacks won’t catch it, except for the ones that check for attachEvent first (because you *know* IE will still support that).
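A minimal sketch of the object-detection style described above (the helper names are made up):

    // Object detection rather than UA sniffing: feature-test the DOM directly.
    var looksLikeOldIe = !!document.all;   // the crude test mentioned above

    // A hack that checks addEventListener first won't catch IE8 if IE8
    // supplies a working addEventListener:
    function bindStandardsFirst(el, handler) {
      if (el.addEventListener) el.addEventListener("click", handler, false);
      else if (el.attachEvent) el.attachEvent("onclick", handler);
    }

    // ...but one that checks attachEvent first still treats IE8 as old IE,
    // as long as attachEvent is kept around for compatibility:
    function bindIeFirst(el, handler) {
      if (el.attachEvent) el.attachEvent("onclick", handler);
      else el.addEventListener("click", handler, false);
    }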
So, while switching the UA string seems like a simple, effective solution, the reality is that it is not likely to help for more than a very small percentage of sites. Say, 5% or less.
“The plural of Lexus is Lexi.”
Now there’s a comment that makes sense.
The article is well written from a realistic point of view and provides lots of good points worth reading and thinking about. It is true that part of the heated talk about browsers and web standards is interesting only to developers and techno geeks… However, I would like to emphasize the nature of web standards as kind of fair play rules for the world wide web.
There are various sorts of web standards and technologies and very different scale problems related to them. I wouldn’t throw all those issues into the same bag and say that the problems are equal in importance. The importance of following web standards is not really so much about, say, the location of some graphical pixel on a web page – but about much bigger and also non-technical issues.
It is perfectly ok, and even useful, if browsers try to accept small coding and standards errors by rendering slightly non-standard content as well as they can, according to their own specific error-handling rules. It may also not be such a big deal if some minor web page detail, like a horizontal line, looks to be a few pixels higher or lower in various browsers, as long as you can still see and use the real content. It’s also no big deal if different browsers use browser-specific tricks to overcome those small rendering differences and try to make web pages look about the same in all browsers. All those things still aim at the same goal as web standards: a well-working web for everyone, whatever technology they may use now or in the future to view that content.
But it is an entirely different matter of scale if, say, someone tried to market – to everyone – some very proprietary, maybe also patented, web technology that they intentionally wanted to work well only in a certain browser and/or a certain OS, where that web technology and/or browser and/or OS might also cost a lot of money to use. Now imagine many competing entities trying to do the same thing in a situation where there were no open web standards at all. That is, of course, a very extreme example. But to a lesser degree such dangers exist if people (and browsers) don’t value the importance of fair-play rules like open web standards.
The whole reason why we have one public Internet for everyone instead of thousands of smaller incompatible commercial and other webs is because of commonly accepted and valued open web standards.
Spolsky is claiming on a theoretical basis that standards do not work. He claims that standards compliance is impossible when the standard is huge and ambiguous. I do not disagree with him there. But none of his analogies fit the facts. He says everyone is at fault because we all tried to interoperate and failed. But this is false, because Microsoft never tried to comply with the standard; they tried to extend it to lock out Netscape, which started the arms race toward proprietary extensions and vendor lock-in. They DIDN’T WANT interoperability, and they didn’t give a damn about standards. Microsoft, leveraging their monopoly power unfairly as proven in a court of law, made the broken and non-standards-compliant IE the overwhelmingly dominant browser. Fast-forward several years, and even with other browsers offering acceptable levels of standards compliance, there is no excuse for a developer to make IE-specific sites to the exclusion of others. Yet this is still done, because there are thousands upon thousands of developers and project managers and designers that don’t give a damn about standards.
The Vista analogy is equally bad. How many Windows programmers actually read the MSDN documentation thoroughly? Raymond Chen for years now has written exactly why your shit doesn’t work from one version of Windows to the next, and it’s usually because, yet again, you didn’t follow the API documentation, you relied on undocumented behavior. What kind of defense is that? Yet again, not even trying to follow the rules breaks things.
So where does this leave us? It is not everyone’s fault; it’s the fault of the people who explicitly DID NOT care about standards, namely MICROSOFT and BAD DEVELOPERS. Therefore Spolsky’s long, angry and well-reasoned case for pragmatism is based on false premises, and his “pragmatic” “solutions” (among others the ridiculous proposal that EVERYONE ELSE ON EARTH modify ALL their apps with a new meta tag) for IE8 break other browsers, which is good for Microsoft’s market share and bad for people who are actually trying to be interoperable and did nothing wrong. “We have an obligation to our customers to not break their sites” is not a neutral position, because it continues the policies that have been hurting competitors unfairly.
I think I have a fair solution: IE8 is made standards compliant by default; everyone modifies their sites to be standards compliant; and all billable hours of those changes and bugfixing for IE going back to 1997 be billed to Microsoft. After all, it’s primarily their fault, and it’s not like they don’t have the money. But I guess I’m an idealist, because that’s just not going to happen. Instead, everyone else will continue to pay. Yeah, I’m sure that seems like a pretty pragmatic solution to Microsoft. Let’s go with “everyone else pays.”
Yeah, everybody would like it if every single web developer all of a sudden started writing perfect code. It’s just not gonna happen.
The problem is that without the vast majority of mediocre web devs the internet wouldn’t have become such a success in the first place.
While it’s true that Microsoft acted unfairly against Netscape, it wasn’t Microsoft that started the incompatibilities and the standards war… it was Netscape. All browser makers have introduced non-standard elements at one time or another, because they needed them or saw a competitive advantage. Microsoft is mostly guilty of being *inept*, implementing existing standards in laughable ways. Sometimes they are guilty of being too fast, implementing standards before they are finalized. Sometimes they *invent* what later becomes standardized, but the standard specifies slightly different behavior.
Personally, I think Microsoft’s JScript showModalDialog() is a good idea, but no one else ever adopted it. In a way, non-standard additions like this should be seen as money-where-your-mouth-is proposals. If people like it, others will adopt it and it becomes a de facto standard.
The best standards are the ones which have two competing implementations *first* and specifications *second*. Consider XMLHttpRequest: it was invented by Microsoft, but now everyone uses it. It will, sooner or later, be in a standard that gets adopted by some standards body.
I think it’s a wonderful irony that Netscape started the whole “let’s extend HTML” disaster.
Haha, that moron is laying it out as though Microsoft has done nothing wrong, as if the reason for all this is that the W3C specs are extremely poor, and everywhere they “differ” from real browsers it’s because the specs are incomplete…
When it all began, Microsoft imposed IE, then imposed their bastard child as the standard, not caring a fig that the W3C said otherwise.
When Firefox beat them on their own field (and it did: when you find chat exchanges like “What is IE for?” “Downloading Firefox”, you know it has), the W3C could finally start talking to them eye to eye again.
NOW that they are bashed by anyone with a sense of decency and equality and, worse, have judges on three out of five continents on their tails, they decide that standards are good.
And the author sums it all up by telling us that what was created through a coup and years and years of ILLEGAL monopoly practice is better than legality, because developers don’t yet know the standards correctly?
I know I tend to have a rather law-school way of framing problems, but I think every discussion that involves standards needs to be brought back to WHY those standards were created and what was behind them.
IE is not a standard, it is an IMPOSITION.
Agreed 100%.
Some observations:
(a) the whole problem with IE8 and MS trying to conform more with web standards is uniquely due to Microsoft deliberately ignoring such interoperability standards in the past,
(b) the subject article is hopelessly mired in a Microsoft-centric view of things,
(c) any given “self-compliant” version of IE … IE 6, IE 7 or IE 8 … has less market share than Firefox (all versions of which are self-compliant, in that you can make a W3C-compliant page that will render properly on Firefox 1.x, Firefox 2.x and Firefox 3.x).
Firefox installs > IE 6 > IE 7 >> IE 8 (which is not released yet).
Personally, I say good. Everybody can start using a real browser in the interim while all those lame developers that coded to IE defects fix their crap code.
I stopped when I saw the Obama ’08 banner.
If you’re going to write opinion articles on technical subjects [in this case, web standards], pandering to any candidate gets me closing your site immediately.
You refuse to view someone’s opinion on a technical topic because you disagree with their political leanings? The banner you refer to is not specific to that article, it’s just one that appears on what is, after all, a personal site.
Childish.
I haven’t read it all (TL;DR), but I gather that his point of view is the pragmatic one: people want stuff that works and don’t care about the bodies in the closet.
However, standards and people who strongly defend them are important as well. As with many things, the best way to reach a middle ground is to let both camps each try to pull things as hard as possible towards their preferred direction.
In other terms, being pragmatic is nice and all, but you do need people pushing standards if you want things to ultimately (long term) converge toward being more standardized (as I hope even a pragmatist would agree that moving toward a more standardized and interoperable world is ultimately a worthy goal).
Besides, if you systematically deny the industry any opportunity to right itself (that is, get rid of non-standard legacy stuff) for the sake of avoiding some temporary inconvenience, you may suffer more in the long run (less choice, less innovation, etc.)
He speaks mostly to the idealist people, some of whom seem not to realize that just trumpeting standards is not enough, but I don’t think he is against their position. Ultimately it is the only one that leads to a solution. I get annoyed, sometimes, at my fellow just-follow-the-standards advocates who seem to miss the negatives of that position. It’s nice to see someone pointing them out.
While I’ve mentioned before Joel’s observation about the Chen and MSDN camps within Microsoft, and how the MSDN people think they can make the world flat again, Joel glosses over the fact that this is squarely Microsoft’s fault and that they will stew in their own juices – unless people start rewriting everything in Silverlight or something (another stupid MSDN fallacy).
At the moment we have web pages loosely written for more standards compliant browsers – Firefox and Gecko, and Safari and WebKit. Incidentally, those browser engines have managed to come to more compatible agreements on web standards by virtue of them being at least partly open source. Those web pages then all have to make sharp departures for IE, and that’s no coincidence. Amusingly, it means that for most non-IE browsers things will continue as before. It’s only sites that try to accommodate successive broken versions of IE that will suffer.
Joel mentions that web standards are largely a mess, and they are, but that’s no excuse for the very large hole IE has dug itself. Other browser engines have mitigated the pain, and increased communication, by relying on open source engines that allow people to see precisely what everyone else is doing and to change it. There’s also no incentive for anyone to be incompatible with anyone else. We also have Acid, which IE does terribly at, albeit it isn’t a meaningful test suite yet.
I was also piqued by Joel’s tone in the article. He still writes as if IE is the browser everyone wants to be compatible with, and while people have to pay attention to IE’s quirks, they don’t want to do it.
I was amused by this comment from Dean Hachamovitch:
Errrrrr, IE might have a ‘standards mode’, but no one else does. Everyone else has a ‘make it work’ mode, and that’s all they work in.
What’s actually happening is that, despite good market share, the IE engine is being marginalised by the history of its engineers coming up with ever more ways for developers to have to write ‘if browser >= IE6’ etc. etc., so that web pages are less compatible with competitors [in reality, Microsoft hoped that developers wouldn’t write those if statements at all, and other browsers would have to recreate the IE engine; they failed]. It’s pretty obvious why this was done. Unfortunately, it makes IE, and IE alone, a real pain to write web pages for, while for everything else things are comparatively easier. Technically, it just gets harder and harder to support the legacy stuff, but Microsoft have to, because that’s what they rely on to keep people, developers and users, on IE. Things have shifted from everything working with IE to IE working well with everything else.
IE is being pulled through the floor with the millstone of legacy cruft around its neck, and Dean Hachamovitch’s pretty desperate reasoning and standard ‘Microsoft drone’ response behind the whole sorry mess just shows what a problem it is for them.
Joel is right that chickens will come home to roost, but it won’t be a flamewar. It’ll just be everyone else grabbing some popcorn, pulling up a chair and watching IE sink into the abyss of its own stew. Unless it changes its browser identifier to Gecko or WebKit or something, and existing web pages will then still work. Instant legacy mode! Other browsers will still work for web developers and users, so it’s not a problem for anyone else but Microsoft.
http://www.mozilla.org/docs/web-developer/faq.html#layoutmode
Split hairs over semantics if you like, but every browser does have a “standards” mode and a “quirks” mode, whatever the names are and whatever the precedence or selection methods may be.
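The usual trigger for that split is the doctype; a minimal sketch:

    <!-- With a full doctype like this, Gecko, WebKit, Opera and IE all lay
         the page out in their "standards" (or "almost standards") mode. -->
    <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
        "http://www.w3.org/TR/html4/strict.dtd">

    <!-- Omit the doctype (or use certain old ones) and the same browsers
         fall back to "quirks" mode, emulating legacy layout behaviour. -->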
too much bla bla without any real progress on anything…just make the browser implement the standard 100% and let’s get on with our lives…
You didn’t read the article, did you.
You can’t just implement “the standard” because there’s no test suite for the standard.
The closest thing we have is probably the ACID tests.
Now if there was a real, comprehensive reference test, like what Java, OpenGL, and apparently DirectX have, then everyone could measure how standard compliant they are and would have a goal to work toward.
What incentive has Microsoft given in the past for this missing test suite?
None.
They wouldn’t benefit from co-leading a unified testing effort to validate and solidify the HTML/XML standards, when those are not formal standards at all but just W3C Recommended Specifications.
This was never a goal for Microsoft.
They own the lion’s share of the market. Until they lose that lion’s share they won’t be shouting for “Testing Standards.”
That doesn’t prevent the W3C (or anybody else) from implementing a browser engine that complies with the standard 100%, which web developers could test against so they can put an “else” in the “if” of browsers.
This should be a goal of the W3C and WHATWG for the new round of HTML5. There needs to be a way to say “this page is right and the implementation is wrong”, yet still allow some room for hacking the markup bits to do new cool things not “intended”. Acid2 is just a test of the breakage points; it only tells you if ERRORS are handled right, not NORMAL pages and layout. Other than testing lots of pages on lots of browsers, there’s no good reference other than opinion… of course, MS’s opinion is that they are always right (the engineers choose whichever is easier to implement right now) and everybody else suffers.
and one of the best defenses for not promoting web standards that I will probably ever read.
That said, I’m still an idealist, I guess. The bottom line is that MS is going to have to face some pain eventually, or things are just going to get worse and worse. Imagine if the current situation was to continue for another 20 years. If you look at it from that point of view, the question is really how long MS can afford to delay, because the longer it takes the worse it’s going to be when they finally do it. I just don’t think there’s any smooth transition to be taken.
Once upon a time, there was a 5th level patent troll that claimed to own the concept of embedding objects via tags in HTML. But the great Redmond dragon did not recognize his authority, so Eolas went forth to slay the beast. In the end, Microsoft lived, but Eolas earned enough exp to attain 14th level, and more than $500 million in gold coins.
Now objects embedded via a tag (like Flash) behave differently in IE than before. You now have to click once to ‘activate’ an embedded object before you can interact with it. To (re)implement the previous behavior, websites had to be rewritten to use JavaScript.
And not only did several websites actually do this rewrite, but several browsers (I’m told) also changed to adapt to how IE handles things, even though Eolas promised(?) not to go after them. So it would be a worthwhile endeavor to investigate how quickly and widely this rather substantial change in ‘(sub)standard’ was implemented by web developers and browsers.
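For context, the common workaround was to create the embedded object from an external script file, since (as I understand it) IE only applied the click-to-activate behaviour to objects written directly into the page’s markup. A minimal sketch with made-up file names:

    <!-- In the page: nothing but a placeholder and an external script. -->
    <div id="player"></div>
    <script type="text/javascript" src="embed-flash.js"></script>

    // embed-flash.js (an external file, so IE treats the object as already
    // "activated"):
    document.getElementById("player").innerHTML =
      '<object type="application/x-shockwave-flash" data="movie.swf"' +
      ' width="400" height="300"></object>';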