“Joel Spolsky, in his article ‘Font smoothing, anti-aliasing, and sub-pixel rendering’, compares the Microsoft and Apple ways of text rendering and explains why Windows people don’t like Safari. Text in Safari looks too blurry, and that must be why. I want to go further and sum up my experience and observations about it. I’m not an expert in digital typography, but I ‘have something to say’. At least some ideas may be useful for the GNU/Linux community.”
I think this was heavily discussed in the Safari for Windows article, but I’ll repeat what I said there:
“OSX’s font rendering will pay off big time come high-DPI screens and resolution independence.
On a high-DPI screen, Mac (and Safari on Windows) users will have crisp perfectly rendered text that looks just like text on a page, with no blur because of the added resolution; and Vista will look like ass with the letters too thin and the shapes mangled to fit into a pixel grid.”
It also miffs me greatly that Microsoft copped out on full resolution independence in Vista. Microsoft over-promised on every single aspect of Vista. Windows users are going to be stuck with kludgey software and weird fonts at high DPI for another five years.
To those who don’t agree: try setting your DPI to 120 in XP or Vista and see how mangled and thin the fonts look on your desktop. In Vista, if you then disable “Use XP style fonts” in the DPI setting window, a lot of programs become severely blurry, and simple text in tray icon menus becomes almost unreadable. Vista completely craps out trying to balance resolution independence with backwards compatibility.
We were promised a vector UI for Vista; only Apple and KDE4 are actually delivering one. On any Mac you can hold Ctrl and scroll to zoom the whole screen; in Compiz and in Leopard, when you zoom, the enlarged widgets and fonts are rendered at the screen resolution rather than being a simple image zoom, giving incredibly crisp graphics and readable text. Vista? Well, you can guess.
“Try setting your DPI to 120 in XP or Vista and see how mangled and thin the fonts look on your desktop.”
That was exactly what I did the first day I loaded Vista. I don’t see any problems.
I agree with Kroc though: not having full resolution independence is a serious shortcoming. I just hope that all new Mac notebooks ship with high-res displays, i.e. WUXGA. If they do, that may just tempt me to go back to Mac.
“To those who don’t agree. Try setting your DPI to 120 in XP or Vista and see how mangled and thin the fonts look on your desktop.”
I worked with my Vista laptop at 120DPI for many months. Fonts looked great. With the higher DPI setting the fonts actually get bigger, you know, so misalignment with the pixel grid is less of an issue. ClearType renders fonts even better at 120DPI because, well, that’s closer to the actual DPI of my laptop’s display – go figure.
I must assume you have absolutely no idea what you are talking about.
“OSX’s font rendering will pay off big time come high-DPI screens and resolution independence.
On a high-DPI screen, Mac (and Safari on Windows) users will have crisp, perfectly rendered text that looks just like text on a page, with no blur because of the added resolution; and Vista will look like ass with the letters too thin and the shapes mangled to fit into a pixel grid.”
In other words: TODAY, Vista looks great and MacOS looks like ass, right?
We can talk about tomorrow when we get there.
Doooh… try running 1600×1200 on a 17″ CRT monitor. At such resolutions ClearType doesn’t stand a chance when you work professionally with DTP.
“Doooh… try running 1600×1200 on a 17″ CRT monitor. At such resolutions ClearType doesn’t stand a chance when you work professionally with DTP.”
1600×1200 on 17″ CRT? How about 4000×3000 on 12″ CRT??
Most people are not doing pro DTP work on their computers.
Irrelevant.
Looking forward to that – though I prefer larger monitors – imagine 6000*4500 on a 17″ CRT-monitor (or with that resolution also LCD). Now, that would be something.
The downside is the size of the CRT-monitor – and the heat generation. I’m looking forward to real high-res LCD-monitors – with proper colors.
“Looking forward to that – though I prefer larger monitors – imagine 6000*4500 on a 17″ CRT-monitor (or with that resolution also LCD). Now, that would be something.”
Yeah, but as I said, let’s talk about TODAY.
But that’s what he’s talking about! You are stuck in ‘today’ for 10 years! Technology today is moving REALLY fast, and there’s no technical reason why we couldn’t have extremely high resolution displays next year, for example. And that was also the thing he mentioned: Windows looks like crap on hi-res monitors, so there’s no motivation for display manufacturers to actually rev up the progress. Neat, huh?
“Technology today is moving REALLY fast and there’s no technical reason why we couldn’t have extremely high resolution displays next year for example.”
Next year? How about TODAY?
See, that is the problem. MacOS font rendering looks like crap, yet you keep talking about what might happen next year, etc.
Reality Distortion Field, obviously.
I think the problem is that the general body of people dumping on the font rendering in Safari on Windows are using crappy CRTs or low-end, low-specced panels. OSX’s font rendering looks just fine on my iMac, and after all, Apple writes for the computers they make.
Their method might not be the best but it is also a trade-off for speed, and it at least provides for scalability moving forward, unlike some competing technologies…
I tried it on 3 machines: Work (Dell LCD), Home (Expensive ViewSonic LCD) and my MacBook Pro running Windows. All looked the same to me — crappy.
Cost to develop, cost to deploy, and cost to manufacture — not to mention very high-resolution displays could readily outpace the ability of our video cards to support them.
Technology changes, but technology changes predictably and incrementally. Revolutionary ideas and discoveries do happen, of course, but they are only deployed — you guessed it — predictably and incrementally.
Actually, at my workplace I selected a laptop type (z60m) with a 1680×1050 resolution, or about 130dpi. I love that :-). Clear, crisp, high-fidelity text in KDE once I turn hinting off. Most of my colleagues who are saddled with my choice of laptop run it at 1024×768, because otherwise the fonts are too small.
Oh, good grief… I would love the text quality, but I dread having to make Krita perform with that number of pixels on-screen. (And I guess Gimp and Photoshop developers have similar nightmares.)
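For what it’s worth, the “about 130dpi” figure is easy to check. A quick sketch (the 15.4-inch panel diagonal is my assumption about this laptop model):

```python
import math

def dpi(width_px, height_px, diagonal_inches):
    """Physical pixel density from a screen resolution and its panel diagonal."""
    diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
    return diagonal_px / diagonal_inches

# 1680x1050 on an assumed 15.4" panel comes out just under 130dpi.
density = dpi(1680, 1050, 15.4)  # about 128.6
```

Nothing deep here, just the Pythagorean diagonal divided by the physical diagonal; it also shows why 1024×768 on the same panel drops to roughly 79dpi, making everything look bigger.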
Why would you even try? 1600×1200 is way too small to be useful, even with ClearType turned off. Now, 1600×1200 on a 19-inch CRT with ClearType looks great.
Of course. 1600×1200 on a 19″ CRT results in a lower DPI, so it is better O_o
And of course 10-point Helvetica (or Bitstream Humanist777) is too small. Sure. Doooh. It works great. 8-point font sizes are a different issue, but they’re usually only used in some Windows dialogs, and that’s not a problem. But anyway, I’m not using Windows for DTP. It’s only for gaming today – with occasional use of Visual Studio when “porting” Mono projects to Microsoft .NET.
Mac OS looks great today, thanks. I’ve been using it for two years. It takes a few days to adjust to the different style of font rendering, but once you get used to it there’s nothing wrong with it, just different.
“Mac OS looks great today, thanks. I’ve been using it for two years. It takes a few days to adjust to the different style of font rendering, but once you get used to it there’s nothing wrong with it, just different.”
“Just different” ????
LOL
No need to be politically correct here. Just say it: “It sucks.”
I totally agree, though in my case it’s FreeType with hinting turned off. 1280×768 on a laptop, and it reads very well.
I don’t know what your problem is. Even on my old iBook (no high-DPI screen) all fonts look just fine.
That’s a very short-sighted answer.
If you care only about today, you get hardware reviews with comments like ‘the DPI on the screen is too high, which makes reading text difficult’ – and this is true only because XP’s font scaling currently sucks big time.
I thought that Vista was fixing this (and it was IMHO the biggest feature of Vista) and I’m disappointed to learn from the article that this isn’t really true.
Kudos to Apple for trying to break the vicious circle; I hope they’ll use high-DPI screens on their laptops soon to make Windows users jealous (though it may be an issue for stupid web pages).
Note that the OLPC has a 200DPI (black and white) mode, so children in poor countries will have better font rendering than people in rich countries, thanks to Microsoft’s stupid font rendering holding back progress in screens.
“OSX’s font rendering will pay off big time come high-DPI screens and resolution independence.
On a high-DPI screen, Mac (and Safari on Windows) users will have crisp perfectly rendered text that looks just like text on a page, with no blur because of the added resolution; and Vista will look like ass with the letters too thin and the shapes mangled to fit into a pixel grid.”
Well, the way web design works today, by the time we have 600dpi screens a letter on a webpage will be about 3 angstroms high (I’m looking at you, OSNews – although OSNews’s layout at least doesn’t break when using a minimum font size to correct the problem. Braindead font sizes break a lot more pages for me nowadays than browser troubles ever did).
Similarly, a lot of application UIs in Windows break when you don’t use the default DPI. Of course the OS can compensate for this, but it seems that, for at least the next few years, a high-DPI monitor is more of a problem than a solution.
Can anyone tell me how Leopard will handle this problem?
Is the UI in Mac OS truly dpi/resolution independent?
Can *Safari* -on Windows or Mac- trick webpages into thinking that they’re ruining your eyes while dastardly showing you the page at readable sizes?
EDIT: I probably should add, that what I mean here is: Does Safari have a high-quality scale algorithm?
Opera can zoom webpages but while it fixes the font-size/layout problem, the quality of scaled images is so bad that it’s not usable as default setting.
Edited 2007-07-09 17:43
Do you turn every article where Apple is not praised into an OS X vs Vista debate?
The OS X font rendering can look perfect in a high DPI situation, but what about the rest of the users?
You can’t tell me that Apple’s target audience are people with high DPI screens.
“To put it another way, if Apple was Target, Microsoft would be Wal-Mart.”
… what? Is there some deep, metaphysical commentary about the states of the two companies in that quote from the article that I’m just not getting?
He might as well have said “To put it another way, if Apple was a business, Microsoft would be another business.”
To put it in British terms “If Apple was Sainsbury’s, Microsoft would be (I’m Chavin’ It) Asda.” It’s about ‘class’.
To put it in British terms “If Apple was Sainsbury’s, Microsoft would be (I’m Chavin’ It) Asda.” It’s about ‘class’.
So Linux is Waitrose then?
Linux is a haberdashery: lots and lots of choice, no idea what some of the stuff does, and everything is individually packaged and designed to be used somehow with everything else; you just have to know what to do with it all.
Oh, and there’s odd looking staff who always chirp “I’m Free!”
Edited 2007-07-09 17:44
Brilliant! ROTFL.
Target is widely considered a classier place than Wal-Mart. Not exactly upscale, but let’s just say that you’re (stereotypically) more likely to encounter people with all their teeth at Target than you are at Wal-Mart. While I don’t entirely agree with this assessment, I’m very surprised you’d never heard this, since it’s common knowledge.
I’ve often compared Microsoft and Apple in my head as Wal-Mart vs. Target.
Crappy healthcare is more likely
OK, that’s off-topic, but I would like to point out that if a given group of 100 subjects contains people with a quality A, and 95 of that group frequent place 1 while only 3 frequent place 2, then your chance of meeting people with quality A is quite obviously higher in place 1 than in place 2, don’t you agree? Oh, and it says nothing about “class”.
You assume everyone who reads this is American. I’ve never heard of anything but Wal-Mart, and that’s just because of the internet.
I use freetype in my os, and this was really great and to the point info. Thanks!
Hehe.
I would advise anyone, especially those who find font rendering on Linux lacking, to read the response article. He comes up with what seems like a great solution that would make Freetype beat MS and Apple, and rival Adobe.
Well, what was interesting was the information about using FreeType with unhinted text (that’ll be the special hinting information for small fonts – not hinting in general). It looks like the font rendering on my monitor (same approach), and like the font rendering from OS X.
… I kinda prefer the non-MS anti-aliasing, even on MS Windows. I say non-MS because I like the way Apple or some Linux distros do anti-aliasing (I use MS Windows and some Linux distros). On low-res displays it seems blurry, but on high-res it looks much nicer and more ‘correct’. I always thought that most fonts looked bad on MS Windows, even with AA on. Now I know why…
I feel the same way, but I’m environmentally challenged because I’ve been accustomed to postscript-fonts in OS/2 and Mac OS Classic and strict correctness in font rendering.
You’re right. It requires quite high resolution to work.
Thanks for posting these awesome links. And the comments really show it: people choose the most familiar thing. The fact is, though, that people have to be taught what’s actually right, and while in the technology world we have all the quantifiers for that, all too often the quality, well-researched material (the second link) explaining that stuff to people is mysteriously absent.
It’s not about looking like ASS or whatever, it’s about doing everything properly the first time so that you never ever have to return to that problem again. Sure, there’ll be tradeoffs but in the end everyone will benefit. Now Microsoft actually drags the industry backwards and it will take years to resolve all the legacy crap they chose to put into Windows…
PS. I doubt the font fuzziness really affects anyone’s productivity out there. It’s a matter of getting used to seeing it, but the fonts remain perfectly legible despite the claimed ‘blurriness’.
And by the way, anyone can repeat the experiment on their own computer. It’s really exciting and you get to see what he’s talking about in action.
Edited 2007-07-09 15:42
Here are some links that might be useful.
http://www.tldp.org/HOWTO/Font-HOWTO/index.html
http://gentoo-wiki.com/HOWTO_Xorg_and_Fonts
http://www.debianadmin.com/install-microsoft-corewindows-truetypeub…
Here is the Windows font howto:
1. Drag font to font folder.
Kinda straight forward.
You can do that too with Gnome.
1. Drag font to font folder (fonts:///)
Kinda straight forward.
EDIT: Remember – on *BSD/Linux the fonts will be locally installed if you do it as your normal user.
—
However, the approach only installs the font – it doesn’t allow for extra modifications.
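For the terminal-inclined, the same per-user install that dragging to fonts:/// performs can be sketched in the shell. This is a sketch using fontconfig’s conventional per-user directory; “MyFont.ttf” is a placeholder name, and paths can vary by distribution:

```shell
# Install a TrueType font for the current user only (no root needed).
FONT=MyFont.ttf
mkdir -p ~/.fonts                  # per-user font directory scanned by fontconfig
cp "$FONT" ~/.fonts/               # drop the font file in place
fc-cache -f ~/.fonts               # rebuild the fontconfig cache so apps see it
fc-list | grep -i "${FONT%.ttf}"   # verify fontconfig now knows the font
```

Running applications may need a restart before the new font shows up in their font pickers.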
Edited 2007-07-09 15:49
Myself, I usually just double-click on the font or right-click on the font and choose the appropriate option. I don’t bother with the fonts folder unless something is acting up (or I just want to go spelunking in the dark depths of my C drive).
Now that’s actually a funny thing to do – though one shouldn’t do it in summertime. Summer is meant for letting the eyes grow accustomed to daylight.
Kept low on the political bullshit, high on the information. Thanks for the good read.
Finally, someone here who read that lengthy article instead of just chiming in to tell everybody their personal taste.
Not as low on political BS as I would have liked:
But a very informative article from Maxim nonetheless.
I hope Stephan from Haiku is making notes!
“But a very informative article from Maxim nonetheless.”
Yes very good indeed.
“I hope Stephan from Haiku is making notes! “
I’m not sure whether he is, but I am. I’ve been keeping up with all this recent talk about font rendering and plan to do what I can to get nice font rendering on Haiku. Of course what we have now isn’t too bad.
– Ryan Leavengood
It’s been nearly 17 years since RISC OS first had anti-aliasing, and the alternatives are still not as good.
That’s the interesting bit, IMO.
As in, how do we know that the training is correct when it comes to selecting something? Especially when it comes to “taste”. Selecting “good taste” can’t be trained, but it can be indoctrinated. That is to say, “good taste” isn’t static: it changes from society to society, and when the “goodness” of it is questioned, people react, because they use it to define the “goodness” of their lives.
In the end, taste isn’t a science like math or physics, no matter how many times it’s pretended to be that way…
This is absolutely true. However, I think a good argument can be made based on the idea that computer displays ought to look most like what people are used to reading, which is to say, printed text. Looked at in this context, sharpness, proper spacing, and consistent letter thickness are all important, which means that none of the current solutions cut it, be they Apple, MS or FreeType (MS messes up the spacing, Apple the sharpness, and Freetype the letter thickness). This is exactly why the second article argues for a new system that provides a compromise between all possible factors. It is ultimately a matter of taste, but it’s based on criteria that we’ve been going by for hundreds of years.
Edited 2007-07-09 19:43
Weird – he was trying to say that the fonts looked weird on Linux, and that the w, v and y were dirty-looking; I didn’t see it. Then he showed me the little circle with the gamma correction and said that #2 looked better. Well, not on my monitor. I use Linux, and on my monitor I don’t see any dirt on my w’s, v’s or y’s, and the first one looks better. In the last one, with gamma 2.0, the curves look lighter than the regular strokes.
I can see the problems with w, v and y, and I had a similar impression before. I just thought they aren’t very easy to render.
I agree that gamma 1.0 looks way better on the curved rectangle.
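The gamma point in the article is worth making concrete. Antialiased edge pixels are coverage fractions, and if you write a 50% coverage value straight to a display with gamma 2.0 it emits only 25% luminance, which is one reason diagonal strokes (w, v, y) can look thin or dirty. A minimal sketch, with gamma 2.0 as an assumed display value (real displays are closer to 2.2):

```python
def displayed_luminance(framebuffer_value, gamma=2.0):
    """Luminance the display actually emits for a framebuffer value in [0, 1]."""
    return framebuffer_value ** gamma

def gamma_encode(coverage, gamma=2.0):
    """Pre-distort a glyph coverage value so it displays at its intended luminance."""
    return coverage ** (1.0 / gamma)

naive = displayed_luminance(0.5)                    # 0.25: edge pixel too dark
corrected = displayed_luminance(gamma_encode(0.5))  # 0.5: edge pixel as intended
```

The disagreement in the comments above makes sense in this light: whether gamma 1.0 or 2.0 “looks right” depends on the actual gamma of each reader’s monitor.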
I don’t see it either. To me the fonts on Linux have always looked better than ClearType on Windows XP.
I used to have a Matrox Parhelia that had built in glyph anti-aliasing. The fonts in Linux to me look like the Parhelia’s did. Cleartype has always appeared sort of reddish to me, which is annoying. I think the Matrox GlyphAA used just the grayscale method, which looked great to me. The only time I want color in my fonts is when I’m actually using a colored font…..
This may be a stupid question, but are you using an LCD? Because Cleartype, and subpixel rendering in general, is really designed for LCDs only.
Well, actually I am using a CRT, but I was just doing a comparison between how the fonts looked up close with no subpixel rendering and with. Zoomed in up close, you could see the colors in ClearType but not on the Matrox GlyphAA. On the CRT, ClearType does look blurry, but the Matrox implementation looked fantastic.
The fonts on Linux are very diverse. I really like the default look in Fedora 7 (blurry), yet Debian and Ubuntu have a very different look to their fonts.
Likewise, KDE3 seems to prefer big round fonts that look different than GNOME’s fonts.
It all depends on how Fontconfig is set up I guess.
Weird, because to me the second one (with gamma 2.0) looks WAY better. In the first one the issue is very subtle, but basically the w, v and y just look a bit thicker than the other letters. I’m using a 1024×768 Toshiba laptop LCD screen, btw.
I generally agree with the author.
However, with OS X 10.5 we should see true resolution independence, which would be nice on 1920×1200 displays.
On the whole, I will reserve my judgment until the day I see it (and hopefully can compare different OSes side by side).
For some strange reason, when I install the MS fonts on Linux, they look awful. Fonts for me on Linux otherwise look fine.
On my mac they are a bit blurry but not too bad.
When it comes to vision, what we like has more to do with what we are used to than we might realize. In 1896, George Stratton experimented with eyeglasses which completely inverted the image on his retinas. Initially, it was quite disorienting, as you would expect. But after an adaptation period of several days, it seemed quite normal to him. When he finally removed the glasses, the world seemed upside-down to him for a period.
By comparison, a little initially perceived blurriness around the edges of characters on a computer screen seems quite tame.
People like me, who are only a little near-sighted and don’t bother with glasses, get very good at interpreting slightly blurry characters, not just on a screen but on street signs, etc.
I do sometimes wear glasses for a period, and things seem sharp enough. But when I take them off, things seem quite blurry. But only for a while.
So users preferring the font rendering technology in the OS that they most use is likely more than just fanboyism, or even a case of “liking what they are used to” in the usual sense of the phrase. There is probably a strong case to be made for a physiological cause.
Edited 2007-07-09 17:56
In fact you can get used to everything, but at what (constant) price?
For me the most important question is: What is easiest to read for your brain?
I don’t want to waste “cycles” of my brain to read something in a certain font if it could be done a more economical way using another font or type of rendering the font.
I’m short-sighted as well, and I always had headaches. The moment I got my glasses the headaches went away.
My brain simply had to do less work. Less work to make pictures understandable, less work to make signs readable.
I think the same goes for fonts and rendering fonts differently as well. You get used to it but your brain has to do more or less work depending on the readability.
This is awesome info! How do I make my Ubuntu setup use that kind of font rendering? (I understand that this was an experiment and that there may not be a way to do it yet – but I’m still hoping there is!)
Has anyone filled out a feature request on the Ubuntu tracker yet? (To make this the default font rendering method, or at least an easy option – I disabled hinting, and while not hard to find, it wasn’t especially easy either – and I don’t think it achieved quite the results shown at the end of this article.)
There are some requests about either enabling the autohinter or fixing the GNOME font applet, but maybe you should make a new one. It would be much appreciated, I’m sure.
I know that Mark Shuttleworth seems very interested in fixing the font situation based on this blog post:
http://www.markshuttleworth.com/archives/119
“Fonti-ification
Anybody else frustrated with the state of fonts in Linux today?”
Like I’ve said, I always go with either the autohinter or no hinting (via the GNOME font config applet) because I think that blurry is so much easier on the eyes.
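For anyone who wants the same thing outside the GNOME applet, it can be expressed as a per-user fontconfig snippet. This is a sketch: the file location (~/.fonts.conf on fontconfig of that era, ~/.config/fontconfig/fonts.conf on later versions) and the exact defaults vary by distribution.

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <match target="font">
    <!-- Keep antialiasing on, but turn hinting off for the blurrier, -->
    <!-- more shape-faithful rendering discussed in the article. -->
    <edit name="antialias" mode="assign"><bool>true</bool></edit>
    <edit name="hinting" mode="assign"><bool>false</bool></edit>
  </match>
</fontconfig>
```

The GNOME applet writes an equivalent preference; the file form is just handy for non-GNOME desktops and for scripting a consistent setup across machines.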
This article condenses all my thoughts and considerations about font rendering and the ugliness of Linux font rendering. About time someone wrote this!
I hope FreeType developers and distro packagers (as well as anyone involved in creating a good desktop Linux experience) read this VERY carefully and put it into practice.
It takes less than 10 seconds to change the settings in GNOME to make the fonts look as good as described. It’s a matter of configuration on a per-user basis, nothing more.* And it can be done through the GUI. And then font rendering is great.
* With exception of the few distributions which are shipping mutilated versions of FreeType2.
Old news…
I saw the same thing with the Amiga in the late 80’s.
I thought that OCS/ECS looked so much better than VGA, because of what I saw.
My verdict then was: PCs for still pictures and Amigas for moving pictures.
Because I saw a much sharper contrast on the PC.
Amigas were like… everything was more rounded in the pixels or something.
Edited 2007-07-10 01:27
For me, one isn’t necessarily better than the other, because it is all subjective in the end – it’s up to the individual, what the individual expects and the hardware they have.
For me, I’ve got a laptop here running Windows Vista (1280×800) and certainly don’t see anything wrong – considering the end user doesn’t diddle around with DPI settings, someone whining about setting it to 120dpi is pretty much a non-issue.
As for my Mac experience: when I had an iMac G5 (20-inch, 1.8GHz), compared to Windows the fonts did appear to have a ‘fuzziness’, but that doesn’t necessarily mean it is inferior; I could still read the screen for long periods of time, and the accuracy was nothing to be sneezed at.
Now, in the case of Safari on Windows – that might cause some problems; when end users are used to one kind of rendering and experience something else, they might think there is something wrong with Safari – hence, I am confused as to why Apple didn’t integrate it properly to use Windows’ built-in technologies. It isn’t as though they could justify it by saying, “the target’s technology is inferior, therefore we need to use our own”.
Oh, and regarding Windows Vista: some fonts may appear to be mangled because the application is using the old bitmap-based GDI+ rather than the new Avalon rendering engine. It will take time; just as fonts looked crappy in QuickDraw-based applications, so will fonts look on GDI+ applications until vendors pull finger and actually start using Microsoft’s new resolution-independent APIs.
Edited 2007-07-10 01:39
Avalon’s not really taking off, Kaiwai. It really looks awesome to see the UI scale, but I don’t think Avalon’s going to go into the mainstream because it’s managed-only. It’ll be good for custom apps and LOB tools, but mainstream programs are not going to move over to it. That stuff will go unmanaged sooner or later (The next Windows or the one after it).
Pardon? There is nothing stopping someone from mixing their code, both managed and unmanaged.
The problem is that if they go Avalon, they break compatibility and thus lock themselves out of the Windows XP market – it would be a death sentence for a company to do something like that.
With that being said, contra to the ‘doom and gloom’, Windows Vista sales are going pretty well; products like Adobe’s use their own custom rendering engines. As for Microsoft, I’d assume they use the default widgets which sit on top of Avalon.
It all takes time – just as it took time on Mac OS X for applications to move to Quartz.
I don’t like Apple’s way, but the screenshot I made shows how bigger fonts look on Mac and on Windows.
The top picture is the ClearType one; just check the font edges (these have sub-pixel rendering, but there’s no anti-aliasing?).
http://i18.tinypic.com/66jl4jd.png
Good point. My apologies.
At higher resolutions on screen, ClearType STILL looks better – that whole 1600×1200-at-17″ argument being utter rubbish, as that’s STILL not enough resolution for the so-called ‘advantage’ of Apple’s rendering to matter… on top of which, who the **** is going to turn the resolution on a Mac up that high when you cannot resize ANY of the menu or title text elements? Let’s face it, OSX doesn’t let you resize jack. (Though allegedly Leopard was SUPPOSED to change that – and then we’ve heard nothing on that front for over a year.)
As someone who runs two 21″ CRTs at 1600×1200 and a 17″ at 1280×960 – and jumps the center 21″ to 2048×1536 when working in Max or Blender – ClearType looks gorgeous… while if I switch the KVM to my DellMac it looks like crap even at the same resolutions – and from a practical standpoint I can’t even think about running higher than 1280×960 on the 21″, because there’s no provision for making the menus a usable size, much less the text in dialogs.
But the core of the problem is none of that, IMHO – it’s that BOTH sides of the argument use the same renderers for print and for screen… and that’s BULL.
Why can’t we have the rendering method that looks good on low resolution devices like screen used for screen, and the method that looks good on print used for print? Here’s an idea, let the APPLICATION choose which to use! Typeface preservation means jack to someone EDITING or READING text on a screen – Screen legibility means shit to someone actually doing LAYOUT.
Of course, one also has to keep in mind that Windows users too lazy to bump the resolution a notch or three, or running their LCDs at less than native resolution because they haven’t discovered the ‘large fonts’ setting, scream and shout about ClearType being blurry – so the OSX rendering is going to be completely unacceptable to the folks who don’t even like font hinting in the first place.
… and frankly, if the KERNING and LEADING add up to the exact same spacing of characters and lines, it shouldn’t matter what the render technique for each letter is… This is where we call bullshit on the whole argument and say: use the per-character rendering that works best for each media type – just be sure to preserve the render-box width of entire words and the spacing BETWEEN words and BETWEEN lines. Do that, and you could render it on-screen with the wrong font and still have the layout preserved for print… WHICH IS SUPPOSED TO BE THE POINT… and where things like FreeType and the font kerner in OOo are complete failures, since both don’t even render the same word the same way twice in a row, with the ‘jumping letters i and l’ bei ng the most not icab le… a hefty part of why a lot of people who work with text for a living prefer Word over OOo, even if they aren’t aware of it.
Oh, and Target is nowhere NEAR elitist enough and nowhere NEAR as full of overpriced particleboard covered in cheap veneers, with reviewers ranting and raving over the quality while anyone who actually KNOWS hardware opens one up and is reminded of a Packard Bell… and Target allows too much choice as to what you can buy. Apple is IKEA, not Target.
Thanks to BrettW over on the parallels forums for this little gem:
For those not in the know …. “IKEA is a fully immersive, 3D environmental adventure that allows you to role-play the character of someone who gives a s**t about home furnishings. In traversing IKEA, you will experience a meticulously detailed alternate reality filled with garish colors, clear-lacquered birch veneer, and a host of NON-PLAYER CHARACTERS (NPCs) with the glazed looks of the recently anesthetized.”
Sounds like Apple to me.
Edited 2007-07-10 18:49
It has been posted here by several people that the Windows way looks worse the higher the resolution goes.
But as I understand it, the higher the resolution, the less I need all of this hinted-subpixel-kerning-antialiasing madness.
Or do you really believe that on a 1200dpi display you will notice any difference between text rendered with and without all these CPU-cycle wasters?
My 1200dpi laser printer, at least, has none of these features and none of these problems.
>> but as I understand it, the higher the
>> resolution, the less I need all of this
>> hinted-subpixel-kerning-antialiasing madness
BINGO!!! Give the man a cigar.