“As a team, we’ve spent the last year heads down working hard on IE8. Last week, we achieved an important milestone that should interest web developers. Internet Explorer 8 now renders the ‘Acid2 Face’ correctly in IE8 standards mode.” Insert freezing and hell joke.
If url = "http://www.webstandards.org/action/acid2/" Then
    Try
        Contents.RenderByStandards()
        Hell.Freeze()
    Catch
        Contents.ReplaceAll("<img src='acid2face.gif'>")
    End Try
End If
No, seriously, this is great news. MS is actually paying attention to web standards! Web devs cheer!
IE8 now renders the “Acid2 Face” correctly in IE8 standards mode.
What does "in IE8 standards mode" mean?
One person brought up an interesting point on the blog (near the bottom as of this writing) – if IE8 has two modes, then it’s useless.
I assume [hope] it’s a default mode for all sites to render correctly. Then for any site which relies on old broken IE behavior, THOSE specific sites are rendered in “compatibility” mode.
It’ll be enabled when the right conditions occur, much as “standards compliance mode” is activated now.
Google will tell you a lot more, but it’s normally a case of selecting a strict doctype for the document, assuming the document is conforming.
Actually, my guess is that “standards mode” simply refers to the “usual” mode for rendering web sites which have a correct DOCTYPE declaration etc… (as opposed to the so-called ‘quirks’ mode, see http://en.wikipedia.org/wiki/Quirks_mode )
This strategy of having two modes is pretty common, not just with IE but also with other browsers.
Best,
danB
Ah, of course. I knew that, just wasn’t thinking.
That would be even better!
Yep. Firefox 2.0 even has three modes: standards, almost-standards (a doctype is present but some quirks remain) and quirks.
I know this because I found a site that rendered fine for me, but the other guy claimed it was broken in Firefox. It was … in Firefox 1.5, because 1.5 really enforced the claimed doctype.
Ever since version 6, IE has had a "standards-compliant" mode and a "compatibility" mode equivalent to Firefox quirks mode. The quirks mode is basically laid out like IE4, and the standards-compliant mode is supposedly laid out according to web standards, though the actual compliance with those standards has been improving gradually. IE7's standards-compliant mode is actually pretty good: though not all the way there yet, IE7 can render most pages pretty close to the way they would render in Firefox and such.
You have to trigger standards compliant mode by including a proper doctype declaration. There are pages on MSDN that list the doctypes that will trigger it, a pretty long list that includes all modern HTML 4 or XHTML 1 standards.
MSDN changes their linking structure pretty often, but this page seems to work right now: http://msdn2.microsoft.com/en-us/library/ms535242(VS.85).aspx
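For illustration, here's a minimal made-up page (not taken from the MSDN list itself): starting the document with a strict doctype like the one below puts IE6/IE7, Firefox and Opera into standards mode, while leaving the doctype off entirely drops the very same page into quirks mode.

<!-- Hypothetical example page: the HTML 4.01 Strict DOCTYPE triggers standards mode -->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<html>
<head><title>Standards mode example</title></head>
<body><p>Rendered in standards mode.</p></body>
</html>

Delete the DOCTYPE line (or mangle it so the browser doesn't recognize it) and the same markup is rendered in quirks mode instead.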
Edit: Got the initial version wrong (is IE6, not 5).
Both IE AND Firefox have a quirks mode and a standards mode. If you use the correct DOCTYPE and have valid code, they are supposed to render the page according to W3C standards; otherwise they revert to a compatibility mode for older, non-valid HTML. There are lots of good reasons for this.
Every browser engine I know has at least two modes:
Standards compliance mode.
Quirks mode.
If an HTML file has a correct header, the browser usually tries to render the HTML file according to the specifications in the header (like HTML 4.0 Strict etc.). If the renderer detects lots of errors, it falls back into quirks mode. The quirks mode is the really hard part to program in a renderer. It tries to interpret common mistakes by web page authors the way they might have intended them and not how a standard describes it. The quirks mode is the mode older KHTML versions and iCab lack. That's also the mode Apple vastly improved in WebKit compared to older KHTML releases.
At least Firefox shows you which mode it’s using: Use Tools -> Page Info (“General” tab).
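If you'd rather check from a script than dig through menus, document.compatMode reports which mode the current page is in: "CSS1Compat" means standards (or almost-standards) mode and "BackCompat" means quirks mode. This works in IE6+, Firefox and Opera 9. A throwaway sketch:

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<html>
<head><title>Which rendering mode am I in?</title></head>
<body>
<script type="text/javascript">
// "CSS1Compat" = standards mode, "BackCompat" = quirks mode
document.write("Render mode: " + document.compatMode);
</script>
</body>
</html>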
I thought that would be similar to Firefox’s “Quirks Mode”. I could be wrong on this, but I thought Firefox had a Standards and Quirks Mode.
Quirks Mode would be applied to sites that do not supply the proper doctype. At least that was my understanding.
With HTML there are a couple of doctypes at the top of the page. There is strict, which is usually considered standards mode. Then there is transitional, or loose, which allows backwards compatibility. This is all dependent on the web developer, not the personal settings of the browser.
Firefox and Opera must be useless then. If you are browsing OSAlert in Firefox, right-click on an empty part of the page and select 'View Page Info'. Look for the render mode. You will see it says 'Quirks Mode'. I believe all browsers do this. I like developing in XHTML 1.0 Strict as browsers seem to have the least differences that way. All browsers need to have different modes. If they detect missing, bad or old document types then they default to 'quirks' mode. If the document is well formed and modern they use the standards compliance mode.
…select ‘View Page Info’. Look for render mode. You will see it says ‘Quirks Mode’.
Now press CTRL-U or click View -> Page Source.
You’ll see that OSAlert does not supply a DOCTYPE tag. That’s why Firefox displays it in quirks mode.
too little, too late, been there and done that
It’s never too late. Standards are still important.
indeed, and I should never make a joke about it, thanks for reminding me
I modded you up. Because I do agree that it’s way too late.
I hate this phenomenon where a company fails to deliver something, finally does so way beyond any reasonable timetable, and then people are even thankful for it.
Thankful for what? Not respecting standards for years?
Especially if this very same company not only disrespects existing standards (ISO26300 / ODF), but tries to elbow their own underspecified and overcomplicated file format into standards.
Exactly, it is Microsoft’s own fault we have to worry about millions of non-standards compliant web pages!! I remember when web-programming meant looking at the standards, and adhering to them properly.
If you had the wrong doc-type, you got the wrong result.
All the extra handling by the browsers means the internet will never become standards compliant, but will rather just work well enough (creating greater obstacles to overcome in order to effectively compete with MS (that is the idea anyway)).
I would rather have every web page use exacting standardized methods than have my browser fall back into a guessing game (while consuming more energy and RAM) just so pages designed for Internet Explorer (or maybe even just malformed HTML X.x) will look somewhat close to how they are meant to look.
Hrm, but no one is willing to go that far (it would make it seem like your browser is the one that sucks, when in reality it is the web pages which suck).
–The loon
Exactly, it is Microsoft’s own fault we have to worry about millions of non-standards compliant web pages!! I remember when web-programming meant looking at the standards, and adhering to them properly.
When did web programming mean looking at the standards? Microsoft's been around way too long for me to remember such times... And don't be so bitter, there's some delightful irony in this whole situation too! Think about it: for years Microsoft made sure that their browser would be The Browser to support, and so the pages designed for it would not display as well on other browsers. Then came all the alternative platforms and browsers, pushing for standards compliance, and now that Microsoft has finally given in to the demand they have to add quirks to their own browser to support the non-standards-compliant way their browsers used to behave. That must suck :3
I modded you up because you are right.
too little
Indeed. Passing acid2 is mostly marketing. What we need is a commitment from Microsoft to support standards.
too late
Yea, work on standards should have happened years ago.
Now they just need to release IE8 for all Microsoft OS …
Poor linuxians… what will they tell people now to convince them to switch to "Linux on the desktop"? Maybe something about "Frozen Bubble" being much more advanced than on Windows?
The fact that IE was not standards-compliant has never been an argument to choose Linux over Windows. It has been, however, an argument that Microsoft seeks to impose its own de facto standards over those of the W3C.
This is good news for everyone, as we all benefit from MS adhering to internationally recognized independent standards. Please don't try to start another flamewar over this.
Well, then see that comment please: http://osnews.com/permalink.php?news_id=19058&comment_id=292036
Also – if IE will pass acid test, what will Opera Software ASA do ? Sue Microsoft again, this time “for being a monopoly which complies to web standards” ?
Also – if IE will pass acid test, what will Opera Software ASA do ? Sue Microsoft again, this time “for being a monopoly which complies to web standards” ?
Heh, nice one. The truth is, this announcement kind of undermines one of the biggest points of Opera's claims against Microsoft. They _did_ claim Microsoft does not even try to be fully standards-compliant, yet this proves otherwise.
On a personal note: I don't even care that Microsoft ships IE with Windows. It's good that there's at least some browser available to those users who use Windows. Opera is a good browser as-is, so why the need to sue others anyway?
What do you mean ‘again’?
“compliant” of course.
That comment was not an argument but an answer to your own tendentious question. And you do realize, anyway, that konqueror is not linux-specific, right?
Regarding Opera, considering they’ve not sued MS… I guess they will keep asking the same, which now might mean making IE8 available for all current MS operating systems.
Please explain exactly how that post has anything to do with what I said. I really tried hard to find relevance between a) someone answering your question as to whether it was possible or not to remove Konqueror from a KDE system (it is), and b) the fact that IE non-compliance with regards to standards is or not an argument to choose Linux over Windows.
I’d say you were grasping at straws, but it’s way beyond that. Please stop wasting my time, and that of other OSAlert readers. Thanks.
You said that compliance with web standards is not an argument for Linux adoption. Well, it seems it is, at least for some.
No, that post does not support your claim.
It doesn't state that standards compliance is an argument to use Linux, just that Konqueror is a standards-compliant browser. It could be interpreted as an argument for using Konqueror, but that depends on the POV. Personally I just see it as one of several positive side effects of using Konqueror.
Web standard compliance has never been an argument for switching to Linux. It has been an argument for switching to other browsers, but never other OS’es.
Malware (incl. WGA), illegal restrictions (according to Danish law that is), immoral behaviour of Microsoft – and the wish to control your own system are arguments for switching to Linux and/or *BSD.
And nothing seems to have changed in that regard.
I didn’t get that at all from the post you linked to. Explain to me how you did, with specific reference to what that comment actually said and what you thought that meant.
As dylanmisterjones said, at most it’s an argument for using Konqueror, which runs on other OSes than Linux: the BSDs, Solaris, other UNIX systems, OS X (through Fink) and soon even Windows.
Again, don’t waste our time.
See that article. It's about Microsoft being sued by one of their not-so-able competitors.
There is no need at all to read and comment my thoughts if you do not like them.
First, you did not link to an article, you linked to a comment. As for Microsoft’s competitor, Opera, most of its business is on Windows – again, there’s nothing in there about choosing Linux over Windows.
Just admit you were wrong, at least that’ll help you regain *some* credibility.
I linked to the article. Check carefully. And again
Not in this particular thread you didn’t. I looked at our other exchange and all you linked to was an OSAlert comment. Not that I care, really; I don’t.
For the record, I think you got it backwards: while standards compliance of browsers has never been an argument in favor of Linux, the use of non-Windows operating systems *is* an argument in favor of having browsers that support standards – so that, say, people can all access the same sites even if they don’t happen to use IE.
See, you had a point there, it just meant the opposite of what you thought it did.
Call me ignorant but – what’s the point of the Acid2 test? Does it have any real world relevance or is it just a technical exercise?
I wonder what happened to the original idea of HTML – being a content markup language with the web browser being a user configurable reader. In the beginning, the web page would simply mark paragraphs, headers, lists etc as such and the web browser would display them using the fonts and colors the user specified. That way, everyone could read web pages using his favorite fonts using the colors that were best for his display and eyes.
Now everyone tries to nail everything on a web page to a precise pixel position, regardless of what screen resolution, web browser or system fonts a user has set. This is causing problems for users with eyes or monitors that are not as good as the designers’. Some people need large fonts and high contrast colors to read their pages. Why invent tons of new HTML features that take away the users’ freedom to pick their own fonts and colors?
It's a detailed (though not thorough) test case for HTML DOM and CSS rendering.
It doesn’t cover everything in the specs, but if a browser can render it correctly, it’s seen as better than not.
The point of the test is to see if they are compliant with W3C HTML and CSS 2.0 specifications.
As for your points about accessibility, there are other tools for that, such as screen magnification (hey, I use the Compiz Fusion screen zoom all the time, and I’m not even visually impaired…). As for legible fonts and good contrast for text on a web page, these should always be a priority anyway.
You know there are options in most browsers to over-ride fonts and colors of text.
The point is that sites are "designed", i.e. they are made to "look" a certain way. Much like a picture. You don't go to the museum with a bucket of finger paints to make every picture look the way you want it to… This is not a freedom-of-choice issue; the user has the freedom to choose not to ever return to that site or even look at it. As a designer it is "MY" choice how that site looks, and if "I" make bad choices and no one wants to look at it or it performs like crap on other people's machines, then that is on "ME".
See how that works? Your choice to or not to visit the site, My choice, how it looks because I made it.
I believe you are missing the point. I think the GP was referring to a time where HTML was just for content and people didn’t futz around with making it “look” a certain way. After all, all academic papers look the same. It’s about making it about the written content with HTML only being used to help mark out which part of the text files are what.
Now while I empathise with the GP’s position, I concede that it is your site and you can (sadly) do whatever you want with it and that’s your right. But please don’t misrepresent their point.
As an aside: I hate how when people want to “learn to make webpages” they mean all that glitzy shit and formatting and animations and flash and want to skip over semantic mark-up which (arguably) is what HTML is really about. And because of the over-abundance of crap and under-abundance of semantic information, scrubbing HTML for data is much harder than it should be.
To see how seriously I take this, just have a gander if you will at these notes I made myself just for class: http://tiny.cc/0AL0T . Have a gander at the HTML and think: would you rather write a Perl script to search that, or a Dreamweaver mess?
After all, all academic papers look the same
Totally untrue. They differ greatly from subject to subject, journal to journal and year to year. Two papers published in the same journal in the same year probably look similar (assuming it's one of the journals where the author cannot typeset their own paper), but beyond that there is nothing that says they have to even look vaguely the same. I also know plenty of academics who have gotten into massive fights over exactly how their paper should and shouldn't look when printed.
Oh, all that semantic stuff is bullshit given the technology. Unless you want your site to look like 1994, you have to mix presentation with content… at least if you can't use technologies like XSLT and XSL-FO. CSS alone is NOT powerful enough to carry the burden of being the presentation "layer".
Your website is fine… but it's very simple content that doesn't require much in the way of formatting (i.e. a toy website)… and it looks like 1994. What if I want to make a website that looks nice and is dynamic and has much more complicated info? Good luck doing that with purely semantic XHTML plus some CSS.
I’m making (yet another) internal web application at work, and although I strive to make the HTML very simple, it’s still not possible to actually produce semantic-only HTML. If I did, I’d have to start sacrificing functionality and the appearance would suffer as well.
I think the point of the OP is to get beyond looks and focus on content. I’ve used CSS and the colour actually has meaning: there is nothing just for show there.
Definitely don't care that it looks like 1994; it does what it's supposed to and has nothing on the page that doesn't have a purpose.
So I guess the original point is just a plea “can’t we just stop caring about how shit looks?”. You may not agree, and I wouldn’t force you to if I could, all I will do is ask.
And looking at the code, I see that you could put a little extra effort into the CSS, without adding markup, and make your site look beautiful.
And you, sir, represent the antithesis of the web. The entire premise behind it is to be inter-operable between different hardware architectures, operating systems and client output capabilities.
If you’re so concerned about how it looks –going back to the picture in a museum analogy–, just present a giant raster graphic. It’ll be inefficient and lack any semantic information, but it doesn’t seem like you care about any of that.
It’s not an issue of whether or not you have a right to design a site that is completely flashy, barely usable garbage, it’s a matter of whether or not such sites are crap.
Sites which utilize, for example, Flash-only animation, or have text over-running and overlapping borders and other text because I don't run at some ridiculous short-bus 1995-era resolution that can make use of tiny fonts (and therefore have to increase the minimum allowable font size in my browser settings), just *suck*. And this happens to a fair number of sites I visit – the sites are designed with inelastic widths and ridiculously minuscule text sizes (clearly designed with a "recommended resolution" – or worse yet, the resolution the designer happens by sheer chance to be running on their own workstation), so that they look like utter crap on my monitor.
Maybe some people like women with tons of makeup, music videos, advertising, and other such forms of flashy form-before-function “expression,” but I don’t. You have every right to design that kind of pretentious stuff, but we’re just discussing whether this is something good or not, and I say that it absolutely is not.
If with the flick of a switch right now I could basically revert the whole Web back to flexible HTML which displays properly on almost any browser or device but has no bling and requires few or no plugins, I’d do it in a heartbeat.
No offense and nothing personal.
On a semi-related note, I run this “Noscript” plug-in in Firefox. It’s fascinating. Basically I have it set to disable all scripts, and then I manually allow scripts to run as needed. On the sites I browse, on average one out of every four scripts that web pages try to run is actually required for the page to display properly. I’ve had a good time watching scripts *not* run as they were intended. So much web design displays nothing so much as the egos and excesses of web designers and the suits (who clearly don’t use the internet much) that they pander to.
I’m not opposed to the judicious use of CSS, javascript, and other such things, but I swear sometimes I think people use them just to use them.
Function before form. Substance before style. Information before flash. It’s not a matter of rights; it’s a matter of wankery vs. usability.
You know what looks great on my monitor? Wikipedia. More of that. Less allmusic.com.
I agree about the font thing. It can be done without losing your pretty website. With a thoughtful use of CSS and HTML, you can actually make a website that scales up and down with the font size. You might have to throw in a little JavaScript to get it to work, but that’s how the web is these days.
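For example (a made-up snippet, not from any particular site): size the layout in em units instead of pixels and the whole design grows and shrinks with the reader's font size, no JavaScript required.

<style type="text/css">
/* Hypothetical example: em-based widths scale with the user's chosen font size */
body    { font-size: 100%; }              /* respect the browser's default size */
#page   { width: 60em; margin: 0 auto; }  /* the column grows with the text     */
#page p { line-height: 1.4; }
</style>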
Oh I SO agree with you. I use noscript as well, and it’s absolutely astounding how many sites are unusable or even give you a blank screen when you browse without Javascript. Now, clearly there aren’t a huge number of people browsing without Javascript, but that is no excuse.
I’m a web developer and I develop all my sites to degrade so that you need neither Flash or Javascript to see all the content. I only use Flash and Javascript to add to the experience. Not only does this allow the sites to be viewable on the widest range of platforms, it’s helpful for search engine optimization and accessibility.
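Roughly, the pattern looks like this (the gallery page and the showInlineGallery function are invented for the example): the markup works as a plain link on its own, and JavaScript only layers extra behaviour on top when it happens to be available.

<!-- Works with no JavaScript at all: it's just a link -->
<a href="gallery.html" id="gallery-link">View the photo gallery</a>

<script type="text/javascript">
// Enhancement only: if scripting is available, open the gallery in-page instead.
// showInlineGallery() is a hypothetical function defined elsewhere on the site.
var link = document.getElementById("gallery-link");
if (link) {
  link.onclick = function () {
    showInlineGallery("gallery.html");
    return false; // cancel normal navigation only when the script actually ran
  };
}
</script>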
The Acid2 test is a test for CSS 2.0 rendering compatibility: if a browser does not render it correctly, it might render other websites utilizing CSS incorrectly and even render some of the content there unreadable. CSS can be used to make websites not only more attractive but also to _enhance_ readability. So a website that is clear and attractive in a CSS-compliant browser may be completely unusable in a non-CSS-compliant one. That's why it's good to pass tests such as Acid2: it ensures your browser does indeed render CSS content properly. I hope you realize that CSS is not about rendering fancy pictures, it's about enhancing the overall experience of a website.
Oh, and as a side note: how dull and boring would websites be if you only used HTML and nothing else?
Oh, and as a side note: how dull and boring would websites be if you only used HTML and nothing else?
Boring, maybe, but they would be clean, readable, accessible, universal, etc..
Boring, maybe, but they would be clean, readable, accessible, universal, etc..
I find OSAlert very much clean, readable and accessible even though it too uses CSS, don’t you?
Have you looked at that? It looks like an uncontrasted mess. Ugh.
While I agree that OSAlert v4 is a lot less readable than the old one, it is just the perfect example of how proper use of CSS can improve the readability of a website and not-so-good utilization of CSS can even completely ruin the site. With CSS one can make a lot cleaner sites than with plain HTML, but it's up to the authors to actually design their sites properly. And remember that the web is used for a lot of different things, things that can't be done with a static document, so CSS helps in organizing the elements on the site in a better way.
Boring, maybe, but they would be clean, readable, accessible, universal, etc.
I doubt it. CSS, AJAX etc. don't make it harder to write a clean, easy-to-read website. If you think back to the web in the HTML-only days, I'm sure you'll recall plenty of horrible unreadable, inaccessible and unusable sites; I know I do.
Anybody who wants to write a flashy but useless website that is all style over substance will do so no matter what tools you limit them to. People who want to make a clean, readable and accessible website will have no problem doing so while using all the latest in HTML, CSS and whatever.
Oh, and as a side note: how dull and boring would websites be if you only used HTML and nothing else?
About as boring as most video games today that rely too heavily on graphics/presentation and have nothing else of value to offer.
In the old days, having a website with great content used to be enough. Then, Generation iPod came along, and you know the rest.
In the old days, having a website with great content used to be enough. Then, Generation iPod came along, and you know the rest.
I hope you're not implying I'm one of the "Generation iPod"… since I'm way older than that, and have never touched or seen an iPod in action. But well, I didn't mean Flash and all such fancy stuff, I was only talking about CSS. Proper usage of CSS makes sites a whole lot more readable and easier to comprehend and browse than plain HTML. With HTML alone you're forced to use tables etc. if you've got lots of stuff to lay out on the page and you want it to be easily comprehensible. CSS allows you to easily lay things out clearly on the page, among other things. You do realize that it's usually used for completely different stuff than the "flashy" things? It's Flash that's used (abused?) for those.
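For what it's worth, here is the kind of thing I mean, as a rough sketch (the ids are invented): a simple two-column page laid out with floats instead of a table.

<style type="text/css">
/* Hypothetical two-column layout, no table markup involved */
#content { float: left;  width: 70%; }
#sidebar { float: right; width: 28%; }
#footer  { clear: both; }
</style>

<div id="content">Main article text…</div>
<div id="sidebar">Related links…</div>
<div id="footer">Footer</div>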
Great content is still enough, but the web is used for many different things today which maybe the Generation Typewriter isn’t realizing.
Preferring a certain type of video game or website is one thing, but limiting yourself to only one kind is still likely to be a loss for you.
Doesn’t matter if the layout is ‘boring’ as long as the content is interesting. Why bother with ‘exciting’ but non-standard stuff like CSS and XHTML when HTML can be rendered on anything from a Commodore 64 on up?
If you use a web as a showcase of your company you probably don’t want it to look ‘boring’ (more so if your company is into design).
“Doesn’t matter if the layout is ‘boring’ as long as the content is interesting. Why bother with ‘exciting’ but non-standard stuff like CSS and XHTML when HTML can be rendered on anything from a Commodore 64 on up?”
Okay, you lost me here. CSS and XHTML are standard, with the standards set forth by the W3C, the same body that standardizes HTML. How are CSS and XHTML non-standard? I am willing to bet the Commodore 64 cannot display all HTML correctly either, as HTML has changed much since that was a usable system.
There’s a world of difference between being a standard, and being marketed as a standard.
“There’s a world of difference between being a standard, and being marketed as a standard.”
I agree with that completely. CSS and XHTML are official standards for web development, as much as HTML. If they had been created by, say Microsoft or Mozilla, and people just started using them, that would make them a marketed standard. When the documents are published by the official standards body, and claimed as a standard by the W3C, which is responsible for web standards, then they are standards.
The body producing them is not what causes them to be ‘standards.’
“The body producing them is not what causes them to be ‘standards.'”
Correct. In this case however the body publishing them is also the body that says what the standards are. And the W3C has claimed CSS and XHTML as standards.
Uh … What are you talking about? Are you posting this by any chance on Netscape 3 running on Windows 3.1?
Actually, better CSS support will help push things back toward that original idea of HTML – being a content markup language. HTML has been abused into the role of page layout language (misuse of tables mostly) because there was nothing like CSS to allow HTML authors the page layout control they needed.
Supporting CSS means HTML authors can separate the style (CSS) from the content (HTML). That means that we can better achieve the original intention for HTML that you talk about. A web page that uses CSS for its styling allows you to apply YOUR OWN css to the page for easier reading. For an example of this in action, visit http://www.csszengarden.com
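To make that concrete, here's a tiny sketch (selectors and values invented for the example): the same semantic markup can be restyled by the author, or by the reader through a user stylesheet, without touching the HTML.

<!-- Semantic markup, no presentational attributes: -->
<h1>Article title</h1>
<p class="summary">A short summary of the article.</p>

<!-- The author's stylesheet… -->
<style type="text/css">
h1       { font: 1.6em Georgia, serif; color: #333; }
.summary { font-style: italic; }
</style>

<!-- …while a reader can override it from a user stylesheet (e.g. userContent.css in Firefox):
     h1, p { font-family: sans-serif !important; font-size: 120% !important; }
-->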
CSS is NOT enough to make a semantic web. It is too inflexible, and too focused on stylistic details rather than overall formatting. XSLT and XSL-FO are (together) one option for creating a semantic web. Everything could be in XML + DTDs and then converted to non-semantic HTML+CSS+JavaScript when viewed in a browser, while search engines would only see the XML. There could be other options. At the very least, CSS isn't enough. You often have to add extra markup to the HTML to get the appearance you want, because CSS has no way of adding additional content to the HTML. That's where all these nested and extraneous divs come from. People also end up using lists for menus, tables for layout, etc. CSS is not up to the job.
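A concrete (made-up) illustration of that extra-markup problem: to get something as simple as rounded corners with CSS 2 you typically end up wrapping the content in several meaningless divs, just so there are enough elements to hang the corner background images on.

<!-- None of these wrappers mean anything; they exist only as styling hooks -->
<div class="box">
  <div class="box-top"><div class="box-top-right"></div></div>
  <div class="box-body">
    <p>The actual content.</p>
  </div>
  <div class="box-bottom"><div class="box-bottom-right"></div></div>
</div>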
The point is to see how well the browser handles abuse. Apart from that I fail to see any relevance of the Acid2 test, but hey! It gives all of them a new term to use in marketing.
XYZ Browser v. XXIII – now Acid2 compatible with accelerators and reduced CO2 emissions (or whatever the marketing dudes come up with).
This is a CSS test, not a HTML test.
If a website is well-written, all the design rules are inside CSS, while the HTML itself only provides the topological information (this is a headline, this text is emphasized).
I.e., if you use a "typical" browser like Firefox or IE, you want the website to be shown according to the CSS rules, and you want that to be done right, the way it is meant to be.
If you don't want this, you can disable CSS or use a browser that doesn't understand it, like lynx, and have the website "your way".
The problem is, many web developers don't really get the separation of content+structure and design. That said, tests like Acid still make sense.
By the time this actually matters, my grandkids will be the ones enjoying coding for IE8. Until such time, all this means is that now we need to worry about IE6, IE7 and IE8 compatibility. IE6, why won't you die!!?!
If only Firefox passed the test
I am running 2.0.0.11 and it fails.
]{
Gecko 1.9 has passed Acid2 for ages. Firefox 3 (currently in beta) is based on Gecko 1.9.
I tried FF 3 beta 2 on my Linux laptop just today and it failed Acid2. It had some odd frame thing in the middle. Yep, there’s an OBJECT tag in the source with data set to the webstandards.org 404 Not Found page.
But FF3 loads the page into that OBJECT. Maybe it isn’t supposed to. What was that OBJECT tag supposed to test?
The Acid2 test is currently broken. A working mirror is here: http://www.hixie.ch/tests/evil/acid/002/
“Gecko 1.9 passes Acid2 since ages. Firefox 3 (currently in beta) is based on Gecko 1.9.”
That was my understanding, but from reading this story’s thread at http://programming.reddit.com/info/63gc9/comments/ , that isn’t the case.
This pic shows Firefox 3 Beta 2 not passing Acid2:
http://img502.imageshack.us/img502/5361/acid2od4.png
More surprising is that Safari 2 does pass Acid2, but Safari 3 took a step backwards and doesn’t pass:
http://img149.imageshack.us/img149/9407/picture8to6.jpg
Of course, neither of those failures is in the same league as IE7’s woeful “attempt” at Acid2:
http://upload.wikimedia.org/wikipedia/commons/d/d7/Ie7acid2.png
And of course, IE6 is even worse:
http://upload.wikimedia.org/wikipedia/commons/thumb/b/bd/Ieacid2.pn…
:p
Edit: Here’s the corresponding Channel 9 video. The beginning of the video is interesting, in that it shows how IE renders Acid2 as of August, then September, October, November, and finally passing Acid2 this December, so you can see the improvement over the last few months.
http://channel9.msdn.com/showpost.aspx?postid=367207
If you pay attention to your own screenshots, you'll notice that Safari 3 and Firefox 3 render the Acid2 test almost identically. Methinks the web page itself is broken in some way.
Why not do some thinking? Do you really think that Mozilla and Apple are so stupid that they'd miss this huge flaw? I guess you're just willing to slag the competition whenever you can, just so Microsoft doesn't look as bad as it really is. I guess their competitors have low market share because they're full of idiots, right?
Linux has its zealots and so does Apple. But Microsoft has the most delusional zealots of them all. Their history of hindering competition and illegal practices is undeniable, yet they still have their fans. It's amazing, really.
PS: I just checked Opera. Not surprisingly, it renders Acid2 the same way.
Indeed, the Acid2 page was broken (404)! Firefox beta, Safari, and Opera should now render the page correctly.
===
Finalzone:
Right now, Konqueror doesn’t render it correctly either.
Konqueror used to!
Try your browser on this page: http://www.hixie.ch/tests/evil/acid/002/
It should work fine on most browsers that have passed the test. It is definitely a problem with the http://www.webstandards.org page, as pointed out on:
https://bugzilla.mozilla.org/show_bug.cgi?id=289480#c176
Umm, it still does not render correctly in Safari 3.0.4 running on OS X 10.5.1. Not sure what was "broke", but I rather think it is the browsers that are "broke", if you can call it that based on the Acid2 test…
Well, in a sense the web page *is* broken; that’s the express purpose of the test. From the Wikipedia page:
“Because Acid2 tests how web browsers deal with faulty code, the test is intentionally not written to W3C CSS standard specifications. Thus it will fail W3C CSS validation. This is expected and was the intention of its designers.”
http://en.wikipedia.org/wiki/Acid2
The definitive web site for how Acid2 is rendered in major browsers in their history can be found here:
http://www.howtocreate.co.uk/acid/
Archiesteel: mbot is right. I confirmed the eyes rendering on Acid2 Test is definitely broken. I have tested with Firefox 3 Beta 1 & 2, Opera 9, Safari 3 and XO laptop browser activity. Firefox 3 beta 1 passed the test until now.
LOL
Calm down, friend. No need to get so defensive. I don’t particularly care about Acid2 nor do I pretend to be knowledgeable about it. I just reported what multiple posts at programming.reddit.com and slashdot are saying regarding Firefox 3 beta and Safari 3 failing Acid2. I guess all of those posters were idiots too.
Anyway, fine, the Acid2 web page is currently broken, I guess. But FWIW, it wouldn’t really surprise me if Firefox 3 beta 2 failed Acid2 even if beta 1 passed it, since it’s still beta. But as I said, I did find it surprising that Safari 3, a released product, would fail after Safari 2 passed.
Hey, maybe word got out that IE8 passed Acid2, so Acid2 was changed to rebreak IE8 but that broke everyone else in the process. LOL (In case you can’t tell, I’m just kidding. You seem to be wound up over this so I’m explicitly telling you that I’m kidding so you won’t flame me again. )
Meh, I pick on you because your posting history indicates that you defend Microsoft with absurd reasoning and claims.
You go to great lengths making excuses for Microsoft. Examples that I recall: you considered anti-competitive suits a reason for Windows' poor zip performance. That's borderline tin-foil. Then there was that other post attributing Vista's strong anti-piracy features as a reason for Vista's low uptake. If you've been paying attention at all, you'll know that it is easier to pirate Vista and maintain genuine validation than it was with XP.
Just stop and do some thinking and fact-checking before you slag the competition or make your pro-MS posts.
——-
Anyway, I hope IE8 is released for XP and pushed out like crazy like XP SP2. I’m tired of being served substandard support in Outlook Web Access through Firefox and Safari. Hopefully, passing acid 2 is a small portion of what they’ve done to improve standards support.
So you're "picking on me", meaning that you decided to make things personal? Whatever. (BTW, I don't recall saying that Vista anti-piracy measures resulted in slow uptake of Vista, but my memory isn't what it used to be. As for my not knowing how easy it might be to pirate Vista, I don't know and I don't care. I don't make it my business to know how to pirate things.)
According to this Slashdot post, the Acid2 test is either broken or has been changed, but currently Opera 9.24 for Linux still passes it, while other Acid2-compliant browsers fail.
http://it.slashdot.org/comments.pl?sid=394442&cid=21763122
Can anyone confirm that Opera 9.24 does currently pass Acid2 on Linux? Maybe the test was changed slightly, which broke a few browsers but not all of them, which could indicate a less-than-robust CSS implementation.
(Oh, and mbot, I’m just speculating. No need to go all DEFCON 1 to defend a particular browser.)
Can anyone confirm that Opera 9.24 does currently pass Acid2 on Linux?
No, Opera 9.24 doesn’t pass the test. It does however pass the test on the mirror, and so does konqueror.
Thanks for the info.
I just read a new thread at programming.reddit.com that the acid page is indeed broken, or changed, whatever.
http://programming.reddit.com/info/63h9m/comments/
But I also read there that apparently Firefox 3 Beta 2 still fails even the mirror:
http://img138.imageshack.us/img138/6642/2acid2az1.png
The error is different, but the smile incorrectly touches the border of the face, and the face border is screwed up around the mouth. Looks like the face is slightly “flat”, if you will.
Hmm, Firefox 3 Beta 2 passed mirrored Acid2 test on Windows XP. Perhaps the issue is specific to Windows Vista.
http://farm3.static.flickr.com/2316/2126422896_6837ebceb2.jpg?v=0
I have to say that I, a MacBook owner and user of both 10.5 and Vista/XP, find nothing so offensive in the comparisons to warrant your harsh reply. Oh wait, I see I am not the only one, since you have been modded down to -1. Anyway, as a Safari user I too find it odd that Safari 3 on Leopard does not pass the test either, since it was so widely publicized with version 2. And I don't see how the test is currently "broken".
Calm down, as you are the one that looks like the fanboi, not mollyc.
I could have sworn that Firefox 3's earlier betas had passed the test.
Safari 3's and Firefox 3's rendering results are very similar. I think they have implemented some HTML5 and CSS3 support which breaks the Acid2 test rendering.
This version of Firefox passes (unless they broke it):
http://developer.mozilla.org/devnews/index.php/2007/12/18/firefox-3…
http://ajaxian.com/archives/firefox-30-passes-acid-2-css-test
http://www.fsckin.com/2007/11/25/acid2-test-firefox-3-beta-1-vs-pho…
That version of Firefox is considerably closer to final release than IE 8 is.
You can try it out right now if you like.
http://www.mozilla.com/en-US/firefox/all-beta.html
I just downloaded the latest nightly build of Minefield just to test (I am a curious cat :3) and it does indeed fail Acid2 :O Wouldn't have believed it, but now IE is more compliant than Firefox
http://acid2.wikispaces.com/
I would love it if MS would take the time to port IE, or say its rendering guts, to C#. The only reason I see this would make any sense would be for non-Windows users to be able to use IE again. And no, I am not installing cxoffice or Wine to run IE. There has to be a better idea than this. Whatever happened to IE for UNIX? They used to make IE for HP-UX and a few other platforms.
It means that MS offers a certain capability for those that want it. What they may do is create a superset capability for IE and get the major content-generator tools to support it. The next step is for corporate America to embrace these tools (Bill Gates hugging), which means that other clients won't be able to "keep up". Those that do use HTML 4.01 Strict and CSS will probably have no trouble on IE. The answer might be for the US and EU to standardize on a set of document types and pass laws that force content generators not to superset or invalidate them.
In the recent Opera threads, I said about eight billion times that yes, IE < 6 was a blight on the web and made everyone's job a lot harder than it needed to be, but the new IE team made huge strides with 7, and IE8 was going to be even better. Very few people believed me.
The new IE team is actually listening to the community, both directly through their blog and by trawling the bigger web developer sites and forums. And we know this isn't just talk and empty promises because of IE7's progress.
I am, and probably always will be, a diehard FF fan because of its extensibility (Firebug is like God's gift to web developers). But FF is no longer the be-all and end-all of browsers. WebKit is shaping up to be a really great engine, IE is actually not stagnating anymore, and even though Opera is still pretty quirky, it beats the pants off Firefox when it comes to speed.
I agree with what you're saying about Firebug; these extensions and the ease with which they can be installed are the only thing keeping me using FF over Opera.
The rendering engine isn't a huge factor anymore.
Ditto. The only thing that Firefox has that trumps Opera, IMNSHO, is Adblock Plus. If Opera can manage to deploy an adblocker as efficient and easy to use as ABP, they will get a new convert rather quickly!
Now that they have done this they no longer need to develop IE8 further.
How about moveable toolbars. I dislike the current positions and so do my customers.
Thank Jesus! Finally some good news coming out of Microsoft. (X)HTML/CSS compatibility is my biggest complaint of all, and if they improve anything, anything at all, it’s a huge improvement.
Sorry if I'm repeating things, as I've not read all the comments, but the article makes a big point of how "Internet Explorer 8 is to support the right set of standards". The argument being that there has been a lot of work by web designers getting IE6 to render correctly, and they don't want to break that.
I'm no web developer, so can someone please tell me how true this is? Are there sites that would break? I always imagined sites were written to the standards, with an "if browser=MSIE" to correct any major issues.