If there’s one thing that will make even the most powerful computer feel like a 7-year-old rig, it’s Adobe Lightroom paired with RAW files from any high-megapixel camera.
In my case, I spent over a year of spare time editing 848GB worth of 11,000+ 42-megapixel RAW photos and 4K videos from my New Zealand trip and making these nine photosets. I quickly realized that my two-year-old iMac was not up to the challenge.
In 2015 I took a stab at solving my photo storage problem with a cloud-backed 12TB Synology NAS. That setup is still running great. Now I just need to keep up with the performance requirements of having the latest camera gear with absurd file sizes.
I decided it was time to upgrade to something a bit more powerful, so this time I built a PC and switched to Windows 10 for my heavy computing tasks. Yes, I switched to Windows.
I love articles like this, because there is no one true way to build a computer for any task, and everyone has their own opinions, ideas and preferences, ensuring that no self-built PC is quite like anyone else’s. Add in a healthy dose of urban legends and tradition, and you have a great cocktail for endless discussions that never go anywhere.
It’s clickbait without actually being clickbait.
See my recent posts about moving a printer from one USB port to another. Or there’s the sluggish start menu / taskbar. Or the fact that various windows constantly jump in your way and take focus away from whatever task you were doing. Or the horrendous Windows updates. It’s really the little things that add up.
I’m not saying it isn’t usable. But Windows is way more frustrating to use in general than macOS.
Also, WSL is a nice idea, but it has too many compatibility issues to be a true replacement for a real Unix terminal.
Oh, and then there’s the horrendous wide-gamut color profile support. If you are editing photos in sRGB, then I guess it’s fine, but good luck on a wider-gamut screen…
One solution is what I am doing with my Windows 10 laptop. I download files/programs with my Windows 7 machine and then transfer them to my Windows 10 laptop using USB sticks.
Because the system is *** NEVER *** attached to the internet, there are no automatic downloads going onto my machine. And Windows 10 is always trying to download something no matter what you do.
So once I got Windows 10 working the way I wanted, it does not change unless I make the changes myself.
I mean, you could just use another operating system, and not have to go through all that.
As simple as that. If you think doing all kinds of weird stuff just to be able to copy a couple of gigs from one computer to another is “OK”… then you _DESERVE_ to be a Windows user.
It’s 2018 people…
Could you elaborate on:
– Security updates to Windows?
– Updates to the installed applications?
Oh oh oh oh! Me! I know this one!
It thinks it’s a new printer and you have to install the drivers again, right?
That is actually dependent on the printer model. If the printer provides a USB serial number, then Windows will know it’s the same printer and will not create a new queue.
If the printer does not provide a serial number (it’s an optional part of the USB spec, manufacturers can choose), then Windows will assign one based on the port it’s physically connected to, which obviously changes when connected to a different port, making it look like a different printer of the same type.
For bigger items, like a printer, manufacturers should really set a serial number, but that can have unwanted side-effects too.
Enumeration of multiple identical devices over USB has problems on every OS, not just Windows; Linux and macOS certainly have trouble too, in different ways.
That’s all well and good regarding the serial number, but Linux and macOS on the same machine don’t have this problem with the same printer. I plug it in and it’s almost instantly ready – in any USB port – but not in Windows…
The reason for this is that USB device enumeration order at boot is not guaranteed. If devices lack serial numbers, the settings and device location are tracked by connection, so if you move ports, it’s a different connection and is treated as a different device.
Say you have two identical printers, one configured for photo printing, the other just black and white, but also shared to the network. If Windows didn’t do it this way, when you rebooted, the devices might be enumerated in the wrong order, so that now your printer configured for photo printing is shared to the network, and any software set to default to the photo printer will now be set to print to the black and white printer.
That would be undesirable and unreliable behavior, and a worse outcome than waiting a few seconds for a device to be reinstalled the first time you plug it into a given port.
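If you want to check whether your own printer actually exposes a serial number, here is a minimal sketch using pyusb (my own example, not from the article). It assumes pyusb and a libusb backend are installed and that you have permission to read the string descriptors:

    # List attached USB devices and whether each one reports a serial number.
    # A device with iSerialNumber == 0 has no serial descriptor, so the OS has
    # to fall back to identifying it by the port it is plugged into.
    import usb.core
    import usb.util

    for dev in usb.core.find(find_all=True):
        if dev.iSerialNumber:
            serial = usb.util.get_string(dev, dev.iSerialNumber)
        else:
            serial = None
        print("%04x:%04x serial=%s" % (dev.idVendor, dev.idProduct, serial))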
The only reason I am still on Windows, having left the Mac, is the Adobe suite. There is just no comparison with the competition, and I wish Adobe would just pull its head out of the sand and port it to Linux.
I wonder how hard it would be to do it in conjunction with other companies: Adobe talks to Amazon, Steam and Canonical or Red Hat. Presto, instant distro with support for great apps.
Adobe’s approach to Linux is generally broken (and stupid). With Flash, they would only support what was in the kernel, and not rely on useful libs that exist in most distros – what a bonkers way to interact with Linux.
When Valve ported Steam, they did it the right way – they started with a full distro (Debian I think, maybe Ubuntu) and got everything working there. This allows it to run on anything, because all another distro has to do is port the necessary libs, and it’ll start working.
I wouldn’t expect Adobe’s engineers to ever figure this out. They are too far up Microsoft’s arse to notice better ways of developing code (and their suite is shite on macOS – like, why am I paying them?)
Mac OS X is a beautiful OS to use, but Apple’s hardware is just awful nowadays. Until they wake up to the fact that their hardware sucks if you want to do anything demanding with it, I’ll have a Windows (7) machine for my heavy computing and gaming tasks, and a MacBook for web browsing and casual gaming.
Apple’s software sucks for gaming (really it’s their third-party driver policy, but they also refuse to modernize their OpenGL implementation). But their hardware is pretty good – most PC gamers actually use crappy, underpowered Windows laptops – THOSE are crappy hardware.
What Apple does lack, hardware-wise, is a nice middle-tier package with a beefy GPU (or the ability to upgrade the GPU). If you want a decent GPU, you have to spend big coin on the top-end hardware (and they aren’t even offering top gaming GPUs any more in their laptop lines). That’s not ideal for gamers – but what they do offer isn’t super terrible either, not compared with generic Windows PCs…
Do you have a source for this? If you mean casual gaming like strategy and MMO games, I can believe it. But quad core laptops are rare and much more expensive than an equivalent desktop machine, and AAA games are increasingly requiring more than two cores to even launch. Hell, Far Cry 4 won’t run on anything less than four cores without major hacks and the bugs those introduce, and it’s old by gaming standards.
Some background: I am a serious amateur photographer. Until last year I used Nikon DSLR kit; I had a 36-megapixel D800 and a ton of lenses. But I got fed up: all the Nikon stuff weighed so much and was so big that I mostly never took it out, certainly not for casual outings. Hiking with all that weight was horrible. Although the D800 took great photos, I just never really warmed to it. So last year I jumped ship, sold all the Nikon kit and bought a Fuji X-T2 and a set of Fuji glass. What a difference: it’s so much lighter and smaller that I actually use it all the time, the 24-megapixel images are really nice, and the retro camera design with all the physical dials is just a real joy to use. Photography is fun again.
I have used Photoshop and Lightroom ever since each was released. I love what the software is capable of, but Lightroom in particular feels very unoptimised. I run it on my nearly 10-year-old 2008 Mac Pro, upgraded with an SSD, a USB 3 card and a new graphics card, and it does surprisingly well. But even so, waiting for the interface to respond in the Develop module is tedious, and importing and building large previews just takes too long. Plus my machine is now too old for system updates, and there are irritants like the PCI USB 3 card not supporting booting from an external drive, etc.
I actually have a recently built dual-boot Hackintosh, but now I just use it as a fast Windows 10 machine because the hacked OS X side of things always felt too flaky and unreliable. So I have continued to use my Mac Pro for serious stuff like photo editing, because I could not bear to do anything of substance in Windows 10. I know this might rankle some people, but a simple yet important difference between my Windows and Mac experience is that all (and I mean all) Windows PCs I have ever used have crashed and frozen at various times, whereas on the Mac, system crashes are vanishingly rare and usually mean some sort of serious physical problem with the hardware (motherboard/hard drive failure, etc.). I also find that the whole design of Windows 10 (which is a great improvement over its predecessors) is still not as good as OS X: it’s still badly designed, inconsistent and often quite ugly.
Now, after saving for a while, I am just about to pull the trigger on a fully specced-out, top-of-the-range 27-inch 5K iMac. Very excited, if a little apprehensive: changing my PC after 10 years is a big deal, and it’s a lot of money. I am not expecting miracles from the new iMac, but I am expecting a significant speed-up, and the 5K screen will be very, very nice.
I just wish Adobe would do something about rewriting some of the core code in Lightroom Classic but I fear its attention may be elsewhere.
Go open source and use Gimp and Inkscape.
Ever considered a career in unfunny stand-up comedy?
Just do yourself a favor and trial Capture One Pro. My raw files are a mixture of Nikon NEF (D810/850) and Fuji RAF (X-Pro2). I run a maxed-out 5K iMac, and Lightroom still runs like garbage. It’s not the file sizes, but simply years of Adobe legacy cruft holding back the Win/Mac LR experience. Processing/rendering the same raw files with Lightroom on an acquaintance’s iPad Pro is just so much faster.
I get the strong feeling that Adobe software engineers are paid by lines of code…
I agree that the complete and utter lack of feedback from the Windows 10 start menu is inexcusable.
But everything else you describe sounds like a seriously compromised Windows installation.
Have you ever tried to use a wide-gamut screen on Windows? It’s pretty terrible. If there is a way to make it so the Windows UI’s reds don’t gouge your eyes out, I’d love to read about it…
Also, I only listed like 3 things. I didn’t even get into the weaknesses of Windows Explorer, Edge (oye) stealing my input focus, and other forms of total ineptitude. Or the fact that IE still exists and still opens up for some damn reason on occasion – and now loads not one slow webpage, but two by default!
And then there’s the slow-as-hell Task Manager, which, like everything in Windows, has years of design waffling behind it and goes just far enough to make its default appearance completely useless.
Oh, and it constantly shoves all the tablet stuff in my face – all the damn time. I’m on a desktop – why are you showing me an onscreen keyboard? And why is it covering the bottom half of my screen – which is usually where the stuff I want to click on is?!
I think people get used to the abuse, to the point they just don’t know they are being abused.
And the constant updates…
The case is lacking in airflow due to the large piece of glass on the side and the lack of fans on the front. I like the new trend of cases that are nice to look at; however, many trade functionality for aesthetics.
There is an actual benchmark here:
https://www.gamersnexus.net/hwreviews/2709-nzxt-s340-elite-review-te…
There seems to be about 7°C of extra heat compared to a better case (the Corsair 570X), which also has a large slab of glass on the side but uses three front fans to supply air to all the components.
Just by switching the case, the author could have avoided the need for liquid cooling (there are really good air coolers for the latest i7s) and reduced the noise level at his workplace.
While having a beautiful case to look at is nice, it is better to have a case with better airflow.
You aren’t going to be running a 5.2 GHz, 6-core overclock on air cooling.
And since it is water cooled, which is also quieter, he doesn’t need much airflow. The PSU fan is plenty.
While he probably wasted less money (compared to say, a maxed out iMac or entry level iMac Pro), it’s still a ton of money wasted.
Reason: Whatever machine you spec, Lightroom will still run like a dog.
TL;DR – F*ck you, Adobe!!
“Cases with no 3.5″ and 5.25″ drive bays!”
So exactly when was the last time this guy built a PC?
About 5 years ago I moved from a 20-megapixel camera to a 24-megapixel one, and I remember how suddenly my (Linux-based) editing process became painful.
Now I still use the almost-5-year-old camera (professionally) and am not compelled to upgrade for megapixels’ sake. Dynamic range or low-light capability would be reasons to upgrade; megapixel count is not.
Stopped reading after: “Disable User Account Control:”
You read like 60% of the article; at that point, why bother?
Agreed. People won’t run a Linux or Mac system with no password required for sudo, but they’ll cheerfully disable UAC.
And then they wonder why their systems are crap.
On my son’s account, I actually set it up to mirror modern systems like Linux and macOS. If you log on to an unprivileged Windows account and turn UAC up, then when you try to install something, it works remarkably similarly to how it works on macOS. It’ll prompt you for an admin password, then let you install or change whatever needed admin access. And it requires you to actually type in the password (or PIN), instead of just mindlessly clicking “accept” – which is the horrid default in Windows…
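For reference, the prompt behaviour lives in the UAC policy keys in the registry. Here is a minimal sketch (my own, not from the poster above) that flips administrators to a credentials prompt as well; it assumes you run it from an elevated Python prompt and that you are comfortable changing these policy values:

    # Make UAC ask even administrators for credentials instead of a simple
    # consent click. Requires elevation; the key path and value name are the
    # standard UAC policy settings, but treat this as an illustration.
    import winreg

    key_path = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path, 0, winreg.KEY_SET_VALUE) as key:
        # 1 = prompt for credentials on the secure desktop
        winreg.SetValueEx(key, "ConsentPromptBehaviorAdmin", 0, winreg.REG_DWORD, 1)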
This is how I work as well. It takes no time, and how often do people install stuff anyway?
“Lightroom sucks”.
Anyone tried a serious comparison with DarkTable recently?
I keep reading Linux and macOS users complaining about Windows 10 and am left confused. They use the words “slow”, “input lag”, “annoying UI stuff”. Windows 10 for me works, and works *really* well. Fast, stable, intuitive.
Then I remember the confused faces of my macOS friends when I complain about my experience using my MacBook Pro (I got one because I need to do iOS dev stuff). I use the same words: “slow”, “input lag”, “annoying UI stuff”. They hear me say these words and they are also left confused.
So, basically, what I grew up using feels natural and perfect to me and everything else feels not as good. Same goes for Linux and MacOS/OSX/macOS users.
I used Windows as a primary dev machine for 10 years before switching to macOS. I agree that the transition was painful.
So why did I switch? Because Windows has a ton of little annoyances which, once I was trained to deal with them, I happily did deal with, like updating every morning before doing anything else. It was like a fun game of whack-a-mole: update all the stuff, then get to work. Except it was a huge waste of time, and each of those annoying little tasks, popups and small interface idiosyncrasies, which you don’t really notice until you use something else for a while, eats up your brain sugar: your good decisions.
It’s amazing how much more mental effort I feel like I spend on my own problem domains, rather than on operating environment problems, just by using macOS instead. (I like Linux, but it felt like a lateral movement – just a different set of OS things to manage.)
To be fair, I am a web developer, and many web development tools simply work better on *nix machines.
/opinion of a former Windows user.
Sadly the availability of components is not that great these days, especially GPUs.
For some months now I’ve yet to see any high-end GPUs in stock, let alone at decent prices. Same for RAM.
It’s becoming easier and more affordable to buy a prebuilt machine, as sad as that may sound compared to a DIY build.
Whatever is good and beefy usually goes directly into mining farms and/or AI compute nodes in data centers, regardless of whether it’s marketed at the consumer range or not.
Are you living somewhere beyond the reaches of online shopping?
Harumph! My dual Xeon rig with 24 CPU cores and 86 GB of RAM running Linux laughs at your “new” Lightroom editing rig. My philosophy is go big or go home.
I really liked his in-depth look at the reasons for his choices and the detail on his build.