Mozilla has been listening to recent complaints about Firefox’s memory usage, and as a side effect of the MemShrink improvements the browser’s scalability has been tested to the extreme.
The results are shocking: with 150 tabs opened by the test script, Firefox Nightly takes 6 min 14 s on the test system, uses 2 GB of memory, and stays responsive. For the same test, Chrome takes 28 min 55 s and is unusable during loading. An optimized version of the script was made for Chrome in an attempt to work around its limitations; it improved the loading time to 27 min 58 s, while using 5 GB of memory.
Interesting benchmark. I’m glad that Firefox is really making a push to take back the various performance crowns, and I hope that we do see some good gains out of it. But I still wonder if these improvements make that much of a difference to the average user. I know that some people always seem to pop up in browser threads complaining that they have had 75 tabs open for 4 weeks and the performance is lousy, so that means browser X sucks, and for those users benchmarks like this are encouraging. But I know very, very few people who actually use a browser like this. Is there a difference in performance with a more reasonable number of tabs, like 20 or so?

And of course the big issue with Firefox for me has always been memory leaks. Firefox has always beaten the shit out of Chrome in straight-up opening tabs and checking memory usage. But in the real world you’re not just opening one set of tabs, you’re opening and closing them all day. And I have always found Firefox to use more and more memory as the day goes on, while Chrome keeps going back to its baseline memory usage. When Firefox fixes that memory problem, I want to read a blog post on that.
That was exactly my first thought, also. Fortunately, about a month earlier on the same blog, we have:
http://gregor-wagner.com/?p=27
The changes won’t be in FF6, but will be in FF7.
And that’s not even the end of it. Under the leadership of Nicholas Nethercote, Mozilla is making a strong push towards reducing Firefox’s memory usage as much as possible without sacrificing performance. Further memory reductions are already in the pipeline for post-FF7 releases. The best place to follow the effort is Nethercote’s weekly MemShrink progress reports on his blog: http://blog.mozilla.com/nnethercote/
The strange thing is that…well, people have been telling Mozilla devs for YEARS that Firefox is eating up way too much memory, and they’ve just denied the whole thing all this time. So, are they now admitting that they’ve been in denial, or are they claiming that Firefox has just now very recently started doing that?
As with everything, performance requires compromises.
A simple example: do you keep cached images uncompressed in memory so that you can load them very fast, or do you keep them compressed?
On a system with less memory, you probably want them compressed (or maybe completely discarded, reloading them when needed). On a system with lots of RAM you probably want them uncompressed, so that the loading time is virtually zero.
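The trade-off can be sketched in a few lines. This is a toy illustration, not Firefox’s actual image cache: zlib stands in for a real image codec, and the class and method names are made up.

```python
import zlib


class ImageCache:
    """Toy sketch of the compressed-vs-uncompressed cache trade-off.

    keep_decoded=True trades memory for speed (no decompression on access);
    keep_decoded=False trades speed for memory.
    """

    def __init__(self, keep_decoded):
        self.keep_decoded = keep_decoded
        self._store = {}

    def put(self, url, raw_bytes):
        if self.keep_decoded:
            self._store[url] = raw_bytes                 # fast access, big footprint
        else:
            self._store[url] = zlib.compress(raw_bytes)  # small footprint, slower access

    def get(self, url):
        data = self._store[url]
        return data if self.keep_decoded else zlib.decompress(data)

    def footprint(self):
        """Total bytes held by the cache."""
        return sum(len(v) for v in self._store.values())
```

For highly compressible data the compressed store’s footprint is a fraction of the uncompressed one, at the cost of a decompress on every access.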
Yea, staying on the sidelines makes it easy to comment… Hindsight is also 20/20…
Image decompression is rather fast, though – so it might make more sense to keep compressed versions cached, and decompress as needed (and possibly keep decompressed cached for a bit as well)?
How about this: start with the low-hanging fruit. When someone closes a tab, how about reclaiming the memory? Then you might actually find that many of the complaints regarding Firefox would evaporate. The problem has always been the inability to reclaim memory after a tab or window has been closed, but the Firefox developers keep denying what is really the problem in favour of spending time on trivialities.
Btw, I once again expect Firefox 7 to be a giant disappointment on Mac OS X, just as previous versions have been – once again, unless you’re a Windows user you’re pretty much shit out of luck if you expect something half decent on your platform of choice.
That’s exactly what the previously linked blog post was discussing. The issue was that memory got highly fragmented, so that even when a tab was closed and its memory freed, a lot of it stayed around because there were 1 or 2 chunks still in use by the browser UI in that region of memory.
FF7 addresses that issue by moving all the browser UI memory into its own chunks, separate from the page content, and they saw massive improvements.
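The fragmentation effect described above can be modelled in a few lines (a toy simulation, not Mozilla’s allocator; `Chunk` and `Heap` are invented names). A chunk can only be returned to the OS once every allocation inside it has been freed, so a single long-lived UI allocation per chunk pins the whole chunk even after all its page data is gone:

```python
class Chunk:
    """One OS-level allocation chunk; tracks which allocations are live."""

    def __init__(self):
        self.live = set()


class Heap:
    """Toy model of chunk-level fragmentation."""

    def __init__(self):
        self.chunks = []

    def alloc(self, tag, chunk=None):
        """Place allocation `tag` in `chunk`, or open a fresh chunk."""
        if chunk is None:
            chunk = Chunk()
            self.chunks.append(chunk)
        chunk.live.add(tag)
        return chunk

    def free(self, tag, chunk):
        chunk.live.discard(tag)
        if not chunk.live:
            self.chunks.remove(chunk)  # whole chunk goes back to the OS
```

With one UI allocation mixed into each chunk, freeing every page allocation releases nothing; segregating the UI into its own chunk lets the page chunks be released, which is roughly the shape of the FF7 change.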
Of course, that wasn’t the only problem, and they are still working through lots of new ones, but perhaps before you talk about how they should start going after low-hanging fruit you should do the research to find out that that’s exactly what they are doing.
Also, as far as the image decoding goes – I think the new plan is to leave images uncompressed for 10 seconds, after which the cache gets flushed. I think they have to be decompressed for things like scrolling, so even a fast decode is going to be slow if you have to do it 100 times a second.
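A time-based decoded-image cache of the kind described (the 10-second figure comes from the comment above; the class and names are just a sketch, not Firefox internals) might look like:

```python
import time


class DecodedImageCache:
    """Sketch of a time-based decoded-image cache.

    Decoded bitmaps are kept for `ttl` seconds after last use, so that
    scrolling doesn't re-decode on every frame; after that they are
    flushed to save memory.
    """

    def __init__(self, ttl=10.0, clock=time.monotonic):
        self.ttl = ttl
        self.clock = clock       # injectable clock, handy for testing
        self._decoded = {}       # url -> (bitmap, last_used)

    def get(self, url, decode):
        """Return the decoded bitmap, calling `decode` only on a miss."""
        now = self.clock()
        entry = self._decoded.get(url)
        if entry is not None and now - entry[1] <= self.ttl:
            bitmap = entry[0]    # still fresh: no decode needed
        else:
            bitmap = decode()    # expired or absent: (re)decode
        self._decoded[url] = (bitmap, now)
        return bitmap

    def flush_expired(self):
        now = self.clock()
        expired = [u for u, (_, t) in self._decoded.items() if now - t > self.ttl]
        for url in expired:
            del self._decoded[url]
```

Repeated access within the TTL hits the decoded copy; once the TTL passes, the bitmap is flushed and the next access pays the decode cost again.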
That is all very well and good, but it still doesn’t address the fact that non-Windows users are treated like second-class citizens: high CPU utilisation, lack of NPAPI Pepper extensions (which leads to craptastic Flash performance compared to the plugin running in Safari), responsiveness issues, lack of hardware acceleration, and the various Mac OS X-specific bugs with the only excuse being “well, it’s a problem with Mac OS X” (ignoring the fact that other browser developers don’t seem to have the same problems).
Like I’ve said, I really do want to see a viable alternative to Safari on Mac OS X, but the complete lack of any drive to make Firefox on Mac OS X a first-class citizen equal to Windows makes it a non-starter for me. Visualise this: I’m sitting here saying, “hey Firefox, I really want to use your browser but you’re not treating the platform I like using as an equal to Windows”, with the reply back from Firefox developers being “fuck off and leave us alone” – delightful!
Huh? Without more specifics I can’t reply to that. But most Firefox developers use non-Windows platforms; I’d say in decreasing order it’s Mac, then Linux, then Windows. So when they profile and optimize stuff, more often than not it’s on Mac or Linux.
I actually don’t know what these are, but the main guy working on NPAPI is full-time on Mac… so I don’t think it’s an afterthought. File a bug maybe?
Specifics?
On Mac there has been compositing acceleration since Firefox 4. Content acceleration is not yet available on Mac as there is no Direct2D equivalent there. Direct2D means that on Windows, the hard work is already done for us, that’s why Windows got content acceleration first.
On Linux, there’s been XRender content acceleration for a long time, but it’s not always great. Compositing acceleration is still disabled by default because, due to texture_from_pixmap weirdness, it’s harder there than on other platforms, but there’s a good chance it will finally be on by default in Firefox 9. Content acceleration via OpenGL is still not available, for the same reasons as on Mac.
Content acceleration on Mac and Linux is on the radar; follow the Azure project. Whenever it gets either an OpenGL or a Skia backend, that will give us that.
If that is the case then they’re doing a horrible job as programmers – the performance is horrible, there’s a lack of integration, the GUI is out of place, etc.
According to this: https://wiki.mozilla.org/NPAPI:Pepper
Which now points to: http://code.google.com/p/ppapi/
It allows the plugin greater access to things such as hardware acceleration – Flash on Snow Leopard and higher (which has the NPAPI Pepper extensions), for example, uses Core Animation to speed up performance and reduce CPU utilisation.
It runs heavy under load. It comes back to the fact that they use XUL, which could have been avoided had they had a small core and then built a native GUI on top of that, rather than trying to write one GUI and cater to the lowest common denominator.
QuartzGL has existed for a while and can be enabled on a per-application basis – there are ways to accelerate the content, but Mozilla developers so far have been unwilling to dedicate the same amount of resources to Mac OS X as they do to their Windows builds.
Btw, there is nothing special about Direct2D/DirectWrite – you could do the very same thing using OpenGL. It would require more code, but it is doable. That raises the greater question: why wasn’t there any move to create an API layer sitting on top of DirectX/OpenGL that delivered hardware support on all platforms at the same time?
Again, I look at the official Mozilla blog and I see nothing in the wake of the Lion release: no information on the role that OpenGL 3.2 will play in future development, no mention of the enhanced sandboxing technology included in Lion, no mention of Xcode 4.2 moving 100% to LLVM and the role that will play in future Firefox builds. So not only is there an issue with a product that is neglected, there is a complete lack of communication with the wider enthusiast community.
I like FF4 on Linux (Fedora 14 x64)… Am I an alien?
You must have the patience of a saint, because Firefox on Linux is so bad in my experience that it makes Firefox on Mac OS X look awesome. I remember when I was using Linux back in the day I tended towards Opera – site incompatibilities aside, it was the least painful of all the browsers.
Well, not much of a difference as compared to Windows 7 for me. The machine is relatively powerful though (Core i5 M430 + 4GB RAM), maybe I need slower hardware to feel the pain.
I’ve got a MacBook Pro (2011) with a quad-core 2.2 GHz i7, so the performance is OK on this machine and my iMac, but mine is at the high end of the spectrum, whereas running Firefox on my old 2.53 GHz Core 2 wasn’t an enjoyable experience compared to Safari. The simple fact of the matter is that if Safari, Chrome and Opera all run well on the same hardware and Firefox runs horribly, then it is only logical to conclude that the Firefox developers need to pull their finger out and do something about the issue.
Nah, you just don’t follow trends for the sake of following trends. I’d say that’s a rather good thing.
I’m the last person to claim that Firefox never had problems with its memory usage, having been afflicted by them for years myself (on OS X at least). I’m not blaming Firefox alone for that, since various tests showed that at least a vanilla 3.6 has superior memory usage characteristics, so Mozilla’s claim that it’s partly caused by extensions is certainly not completely unfounded. But I do blame Mozilla for pimping their extension ecosystem as one of Firefox’s major selling points for years without sufficiently educating users about its potential hazards (just check the in-browser extension manager. Warnings: none). At the same time they offered no easy-to-use tools for debugging and also didn’t want to take responsibility for extensions wreaking havoc. That’s a classic case of trying to have your cake and eat it too.
But I also have trouble with blanket statements such as that Firefox has been using too much memory “for years”. “For years” encompasses vastly different versions of Firefox, with completely new or rewritten subsystems and features, and each of these can be a source of regressions and improvements. Mozilla has also never been in denial that Firefox 4 regressed in memory usage (partly because of the new but not fully optimized JIT compiler) but decided to release it nonetheless, having already amassed a delay of several months.
I’m glad that they are making such a public push to improve their reputation here. They still have a sizable market share despite having come under heavy pressure from Chrome, they still employ talented developers, and they are the only organization fighting for an accessible Internet. I hope that the latest efforts are not a case of too little, too late. I’d certainly miss the old fox.
Firefox 4 and beyond changed the memory management used in 3.5/3.6, and it started to use a lot of memory again, hence the big push they’re doing now.
This push also goes further than when they analyzed and fixed memory issues in FF 3.5/3.6, so they’re going as deep as they can with it.
It seems it does pay off with more than just “memory savings” and I’m glad.
I’m kind of sad that Firefox 4 got released like that, but they just had to release it. The FF4 release was taking too much time.
I don’t think there were that many user-visible changes in FF5, which made people question the reason for the release. I think they could have waited, or maybe called it a 4.1 release or something.
Also, they want Firefox Mobile to work well, and obviously those devices have a lot less memory to work with. That could really help on the desktop too.
I do think it is the wrong test though.
A test where he would open 150 tabs, wait 10 minutes, close many, open some more, wait 10 minutes, and so on would be much better.
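A churn-style test like the one proposed could be driven by something like the following sketch. The `Browser` stub and all its methods are hypothetical; a real harness would script an actual browser, and the wait/memory-sampling step is only marked with a comment:

```python
import random


class Browser:
    """Hypothetical stand-in for a scripted browser session."""

    def __init__(self):
        self.tabs = set()
        self._next = 0

    def open_tab(self):
        self._next += 1
        self.tabs.add(self._next)
        return self._next

    def close_tab(self, tab):
        self.tabs.discard(tab)


def churn(browser, rounds=10, open_per_round=15, close_fraction=0.6, rng=None):
    """Open some tabs, close a fraction, repeat; return the peak tab count."""
    rng = rng or random.Random(0)
    peak = 0
    for _ in range(rounds):
        for _ in range(open_per_round):
            browser.open_tab()
        peak = max(peak, len(browser.tabs))
        # A real test would wait ~10 minutes here and sample memory usage.
        to_close = rng.sample(sorted(browser.tabs),
                              int(len(browser.tabs) * close_fraction))
        for tab in to_close:
            browser.close_tab(tab)
    return peak
```

Unlike a single 150-tab burst, this keeps the open/close machinery under constant load, which is much closer to how memory actually gets fragmented in day-long use.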
I’m on the daily builds (thus FF8), and I think having FF open all day long works better now. Also, the about:memory page now shows a lot more detail.
Anyway, I’m glad they are working on it now. I think the problem was that they couldn’t reproduce the issues very well.
They also have a reduce-UI-latency project – I can’t remember the name – which I think is maybe even more important.
At least they are now gathering real-world numbers from real users:
http://blog.mozilla.com/tglek/2011/05/13/firefox-telemetry/
Hopefully that gets them the information they need.
Which would be… charming, in a way: them not using their own baby thoroughly (or, alternatively, knowing full well what the “right” way to use it is).
What I’m hearing is that FF4/5 don’t *leak* memory as much as their predecessors – if you open and close tabs, memory use goes up and down as you’d expect. But they do seem to use a lot more of it than FF3 – it might not be leaking, but it’s still a problem on machines that don’t have much to spare (like my little netbook). Sounds like that’s what they’re addressing at the moment…
It’s always sucked at having many tabs open, so this is no surprise.
If you had 150 tabs open each one would only be a few pixels wide, making the browser completely unusable anyway, and Google has never shown any indication of caring about this.
That’s the primary reason I’m still using Firefox, actually.
So OK, FF is better than Chrome if you open 150 tabs simultaneously, but who does that??
Nobody: you open tabs, close some, open others, etc., and in the end you may have a big number of tabs open (though 150 seems excessive to me; 50 would be more interesting).
This would be a better test, and the results might be different: the architecture of Chrome may allow better resource management than thread-oriented FF.
Also, he should try his benchmark with different settings: it’s possible to configure Chrome to use only one process, and I wonder if the resource usage would be very different.
*Ahem* Not quite ‘nobody’. My wife had 173 tabs open in Firefox last evening. I don’t think she’s the only one in the world to open that many tabs.
I have 112 tabs open in Firefox right now!
On some days I have over 400 tabs open in about 20-ish browser windows, spread out over IE9, Firefox 6 beta, and Chrome 13 beta running at the same time under Windows.
Right now I am running Slackware; I have those 112 tabs in Firefox, plus about 30 rekonq tabs and about 80 tabs open in Opera.
I am just an OCD power user!
I cringe every time process separation is mentioned as a means of resource management. It is actually only useful for hiding inefficiencies, by hoping that the process will be shut down before its resource usage causes noticeable problems. I’d rather have the browser developers discover and fix memory leaks and optimize their data structures than rely on such heavy-handed “solutions”.
Oh really? So explain to me: how do you know which tab uses 99% of the CPU or too much memory in Firefox?
With Chrome it’s easy: it has an integrated “task” manager.
Look: the default “share nothing” model of processes is much better from a security and resource-management POV than the “share everything” of threads: threads are an optimisation, that’s all.
In Nightly (Firefox 8), in about:memory you can see how much memory each tab uses. In general it’s true that multiple processes make it easier for the user to know which tab is using what.
Firefox is eventually moving to a multi-process architecture a la Chrome (though that’s probably still 6 months away). So for sure Mozilla agrees that’s the better approach. But notice that not even Chrome can rely on that for security, since above a certain number of open tabs, multiple tabs end up sharing the same process. I also wouldn’t say that “threads are an optimization, that’s all”. Threads are light-weight processes, i.e. processes that share the same address space.
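The “same address space” point can be demonstrated directly (a POSIX-only toy, since it uses os.fork; the function names are mine): a thread’s write is visible to its parent because they share one address space, while a forked child only mutates its own copy of memory.

```python
import os
import threading


def thread_demo():
    value = [0]
    t = threading.Thread(target=lambda: value.__setitem__(0, 42))
    t.start()
    t.join()
    return value[0]                 # 42: threads share one address space


def fork_demo():
    value = [0]
    pid = os.fork()
    if pid == 0:                    # child: mutates only its private copy
        value[0] = 42
        os._exit(value[0] % 256)    # report what the child saw via exit status
    _, status = os.waitpid(pid, 0)
    child_saw = os.waitstatus_to_exitcode(status)
    return value[0], child_saw      # parent's copy unchanged; child saw 42
```

This isolation is also why a crash in one browser process doesn’t take the others down with it, at the cost of each process carrying its own copy of everything.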
Actually, it’s such statements that lead to incomprehension.
Multiprocess is not technically the better solution, but it’s the one that works best in practice, because it provides higher isolation.
That’s why Chrome works well. It’s actually a pretty crappy browser in SOME regards: it crashes a lot, it slows down quickly, etc.
Since it’s multiprocess, that doesn’t matter too much; everything is working properly again as soon as you close the tab.
Current Firefox cannot do that. It cannot afford anything going wrong, because resetting things back means restarting the complete browser.