Phoronix compared the performance figures of Mac OS X 10.5.5 with those of Ubuntu 8.10. They conclude: “Apple’s Mac OS X 10.5.5 Leopard had strong performance leads over Canonical’s Ubuntu 8.10 Intrepid Ibex in the OpenGL performance with the integrated Intel graphics, disk benchmarking, and SQLite database in particular. Ubuntu on the other hand was leading in the compilation and BYTE Unix Benchmark. In the audio/video encoding and PHP XML tests the margins were smaller and no definitive leader had emerged. With the Java environment, Sunflow and Bork were faster in Mac OS X, but the Intrepid Ibex in SciMark 2 attacked the Leopard. These results though were all from an Apple Mac Mini.”
Why in the world would you do a benchmark comparison like this on a Mac Mini?
If you’re testing OpenGL performance, maybe – just maybe – you should test cards that are actually built for 3D performance.
I know from my experience that 3D OpenGL apps on Linux easily outperform their OS X counterparts. Talking about real-world results…
“I know from my experience that 3D OpenGL apps on Linux easily outperform their OS X counterparts. Talking about real-world results…”
Strange, I’ve never seen any 3D app running on Mac OS X outperforming OS X. I am sorry, but OS X has a much better OpenGL implementation than Linux, pure and simple.
Also, as a side remark: with the latest Xcode development environment (version 3.1), GCC 4.2 is provided alongside GCC 4.0, and from what I saw using it, it compiles significantly faster than GCC 4.0.
And I wonder if the small differences between Mac OS X and Ubuntu x86_64 in the encoding benchmarks aren’t simply due to the fact that LAME, Ogg encoding, and FFmpeg are not compiled in 64-bit mode on Mac OS X. On Ubuntu x86_64 they take advantage of the additional registers available in 64-bit mode, which are particularly beneficial for this sort of benchmark.
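That hypothesis is easy to check: on Linux the bitness of an encoder binary is just the ELF class byte in its header (`file $(which lame)` shows the same thing; on Mac OS X you would use `file` or `lipo -info` instead, since it uses Mach-O, not ELF). A minimal sketch, Linux/ELF only – the binary path in the example comment is illustrative:

```python
def elf_bitness(header: bytes) -> int:
    """Return 32 or 64 for an ELF binary, based on e_ident[EI_CLASS]."""
    if header[:4] != b"\x7fELF":
        raise ValueError("not an ELF binary")
    ei_class = header[4]  # 1 = ELFCLASS32, 2 = ELFCLASS64
    if ei_class == 1:
        return 32
    if ei_class == 2:
        return 64
    raise ValueError("unknown ELF class")

# Example usage (path is illustrative; only the first 5 bytes are needed):
# with open("/usr/bin/ffmpeg", "rb") as f:
#     print(elf_bitness(f.read(5)))
```

If the OS X builds turn out to be 32-bit while the Ubuntu ones are 64-bit, the comparison is measuring the build, not the OS.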
That may be true in certain cases, but I have yet to see anything really push OpenGL on a Mac. Even then, if you were to, say, run this benchmark comparison on a MacBook Pro SR or later, Linux would win hands down, because while the Nvidia drivers on Linux aren’t perfect, they are a hell of a lot better than the sluggish, buggy drivers included in OS X. If they used ATI hardware, that’s a totally different story, because ATI’s hardware has better drivers under OS X. Intel hardware just all-around sucks. I do agree with you that the OpenGL stack in OS X is better than Linux’s, but a lot of that has to do with X’s architecture and how it relates to the hardware, which is being worked on at the moment.
Try Maya, to give you one example. The Linux version will give you a much higher framerate than the Mac version. OpenGL in OSX on Nvidia cards is sluggish and slow, IMO.
//Strange, I’ve never seen any 3D app running on Mac OS X outperforming OS X,//
What?
Well, his statement is factually correct… even though it does not make much sense
He probably meant to say ‘Linux’ instead of ‘OS X’ (at the end of the sentence)
I don’t see why it would be such a bad idea to perform a benchmark like this on the Mac Mini.
After all, it’s supposed to be plenty fast enough to run both OSes on, so as long as you’re doing an apples to apples comparison, it can be on as powerful (or as crippled) hardware as one likes, right?
And as always, Apple has the advantage of a ‘home game’ (since they own the Mac platform, they can optimize for it)
Agreed on my part. The same hardware with different OSes would be a fair test. Did you use a timer to come to these results?
I mean, what’s the point of the comparison if you’re not doing it on the same hardware?!?
Are you serious? What’s the point of commenting on an article if you didn’t even read it?
We had used Apple’s BootCamp to install Ubuntu 8.10 “Intrepid Ibex” on the Mac Mini.
Two things left a major impression on me as I read the results. People usually complain about two things when it comes to OS X and performance. 1) It’s a microkernel, thus according to Linus it must suck. 2) HFS+ is a relic, slow, etc.
Now both of these may be true, but they aren’t reflected in actual tests. If microkernels were so bad, you would assume OS X would trail behind an OS like Linux. Yet it doesn’t. It keeps up and at times exceeds it. The SQLite benchmark just shows how large the performance delta can be. If HFS+ were so useless compared to “modern” filesystems, you’d expect it to be left in the dust. It isn’t, and in bonnie++ it actually exceeds ext3 by a sizeable margin.
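Worth noting why SQLite results can swing so widely between OS/filesystem combinations: timed-insert tests tend to be dominated by transaction and sync behavior rather than raw disk speed. A minimal sketch of that style of benchmark, using Python’s stdlib `sqlite3` – the table name, row count, and in-memory database are illustrative, not Phoronix’s actual harness:

```python
import sqlite3
import time

def timed_inserts(n: int = 10_000) -> float:
    """Time n single-row inserts, batched in one transaction.

    Real disk-bound runs differ mainly in how the OS/filesystem
    handles the commit's sync; in-memory is used here for portability.
    """
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (id INTEGER, payload TEXT)")
    start = time.perf_counter()
    with conn:  # one transaction; per-insert commits are what make sync-heavy setups crawl
        for i in range(n):
            conn.execute("INSERT INTO t VALUES (?, ?)", (i, "x" * 32))
    elapsed = time.perf_counter() - start
    conn.close()
    return elapsed
```

Whether each OS actually flushes to the platter on commit is exactly the kind of difference that can produce the large deltas seen in the article.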
As for OpenGL performance, I’ve always found Mac OS X to have excellent OpenGL support and performance. The only application I’ve used where it was slower on OS X than on Linux has been Matlab.
All in all, I found that article very educational.
People usually complain about two things when it comes to OS X and performance. 1) It’s a microkernel, thus according to Linus it must suck … those may be true, but they aren’t reflected in actual tests. If microkernels were so bad, you would assume that it would trail behind an OS like Linux. Yet, it doesn’t
OSX does not use a microkernel.
from http://www.usenix.org/publications/library/proceedings/bsdcon02/ful… : “At the heart of Darwin is its kernel, xnu. xnu is a monolithic kernel based on sources from the OSF/mk Mach Kernel, the BSD-Lite2 kernel source, as well as source that was developed at NeXT … xnu is not a traditional microkernel as its Mach heritage might imply … the kernel is in fact monolithic.” – Louis Gerbarg, Apple Computer
Whatever these benchmarks may say, they say nothing at all about the performance of microkernels vs. monolithic kernels.
Before this discussion degrades into yet another useless Vista-vs-Linux-vs-OS X-vs-BSD debate, I should point out that the Intel graphics drivers have been more or less rewritten in the past couple of months and have yet to reach stability. Furthermore, there’s a known performance regression that, AFAIK, currently blocks the release of Fedora 10.
Tracker bug: “intel driver is slow”
https://bugzilla.redhat.com/show_bug.cgi?id=469690
Both Fedora and Ubuntu are fast-moving targets; performance will degrade and improve wildly (!!) between releases, and as such, performance figures gained by comparing either one should be taken with a grain of salt. (E.g., my Fedora 9 is 10% slower in games than CentOS 5.2 running on the same machine – mostly due to a known regression in ALSA.)
Having said all that, I think we should all commend Phoronix for releasing a series of comprehensive performance reviews – not to mention for developing the first comprehensive OSS benchmarking suite.
– Gilboa
Time before deploying the Fanboi Automatic Response Team (F.A.R.T) – it’s a tie!
Well, the 3D benchmarks here don’t mean much, since they compare performance at about 10 fps. They only tell us that you can’t play those games – i.e., they don’t work properly on this machine with either OS.
Those benchmarks should have been discarded as invalid.
A 4–12 fps difference in a game that is crawling like hell could be due to a lot of things. But we certainly can’t say that Urban Terror, and thus OpenGL, is 400% faster on OS X than on Ubuntu. Their conclusions are totally flawed.
I don’t know which OS would win, but I suppose that on the hardware these games are meant to run on, the differences would still be between 5 and 15 fps, but on a base of 60 to 90 fps. That would give a 5–15% difference. That’s roughly the difference I see here (depending on the game) between Windows and Ubuntu on an 8800 GTS.
Also, I’m not sure it’s totally fair to disable Compiz. OK, it’s not on by default on Ubuntu, and it could worsen the benchmarks, but OS X has this feature on by default (Quartz) – or am I missing something?
And quoting the conclusion “Moving past the graphics benchmarks…”
Where are the 2d benchmarks?
2D should be faster too; OS X uses Quartz’s 3D acceleration for 2D graphics. Compositing has been standard in the OS for several years now, and it is at a better level of optimization than Compiz, despite offering fewer flashy effects.
You’re probably right about the level of optimization, but I want numbers! And saying “faster _too_” presumes the 3D benchmarks were valid.
Between Display PDF and Quartz’s 3D compositing for 2D graphics, I’m fairly certain it would trounce Linux there – especially since Compiz under most workloads doesn’t really speed things up but rather just adds eye candy, while Quartz’s 3D acceleration does. Of course, most systems have fairly fast 2D graphics, to the point of being visually indistinguishable. Sorry, no numbers.
The Mac Mini is a nice piece of engineering and is clearly tuned for OS X. The advantages are obvious on this machine, mostly in the graphics stack. The machine feels speedy for the hardware it offers. Still, if you use memory-hungry applications, it becomes sluggish.
It’s sad that the Mac Mini shows less impressive multi-threaded performance, especially since Linux has nice algorithms on that side.
Look here: http://www.anandtech.com/mac/showdoc.aspx?i=2520&p=8
I find OS X works nicely as long as I just run iTunes and keep a few things on Leopard’s Dock. But launching Firefox with several tabs makes the whole Leopard experience slow with 1 GB of RAM! It was slower than Vista at the same tasks! (Really – I compared a similar laptop, a Pentium M at 1.7 GHz with 1.5 GB of RAM, against the C2D at 1.83 GHz with 1 GB.)
This benchmark proves one thing: for single tasks and for the graphics stack, OS X stands up very well on this machine! For optimized 64-bit applications and heavy loads, Linux surely shines.
Yeah, because we know that benchmarks of a three-year-old version of OS X are relevant today… Apple develops OS X aggressively, and the scheduler in Leopard is a lot better than the one in Tiger.
An Audi RS4 vs a Ricer!
If by ricer you mean an Evo… then the ricer wins in the real world every time… it only loses on the quality of the interior fittings.
heh
Always like a good walking frame race.
Kind of reminiscent of Top Gear trying lap times in the Chevrolet Lacetti.
Would be nice to see them try it again on some neutral territory, and with beefier specs. Maybe a hackintosh Alienware box?