I volunteer as tech support for a small organization. For years we relied on Ubuntu on our desktops, but the users didn’t like it when Ubuntu switched to the Unity interface. This article describes our search for a replacement and why we settled on Xfce running atop Linux Mint.
The users’ desktop computing requirements are straightforward:
1. Low- to no-cost hardware and software
2. Easy to use
3. Stable and bug-free
4. Easy to set up
5. Easy to support
Cool visual effects, high-end graphics, the latest features, geeky
apps, and rolling updates aren’t important. Easy, simple, stable,
and cheap are what we’re after.
The Ubuntu Era
Starting in 2006, we used Ubuntu. It ran well on low-cost used
equipment. Even Pentium 4s give decent performance, and Ubuntu’s
roughly 5 GB footprint fits any old hard disk. Ubuntu has huge free
software libraries, and the initial install supplies the most popular apps.
Ubuntu’s big user community can answer any question. Its LTS (Long Term Support)
desktop releases get updates for three years, a period increased to
five years with version 12.04. Best of all, once the PCs were set
up, our users could employ Ubuntu on their own, without training,
and without seeking help. This is critical because they are not
computer sophisticates; some are only occasional computer users.
Ubuntu served us well for years. But Canonical’s switch to Unity
caused discord. The users saw no point in making their big desktop
displays imitate handhelds. Accustomed to fast roll-over menus, they
found typing queries into empty boxes like Dash and Head-Up Display
slow and awkward. One former Windows user typed “Task Manager” into
the HUD and got nothing in response, instead of the System Monitor
they were after. Unity expects you to know its keywords. For a while I
played around with changing Unity to be more like the old GNOME 2
interface. It was a fun project but probably wasn’t worth the
effort. (Read the results in the
OSAlert article How to Undo Unity.)
As a support tech, I had my own
complaints about Ubuntu. Canonical would routinely introduce
new features to the product without protecting existing users from
their impacts. To recount just two quick examples: Ubuntu upgraded
GRUB to GRUB 2, but failed to provide an easy way to edit the
start-up menu. Instead of editing the menu.lst text file, you had to change bash scripts.
Another example: suddenly you could no longer manage the display by
editing the xorg.conf file.
These changes would have been fine if the means to transition to
them had been provided, but none were. Ubuntu routinely upgrades
without insulating users from the disruption. Why put up with that? We started looking
for an Ubuntu replacement.
The Search Is On
Since our main issue was Unity, we started our search with
Ubuntu-based distros with different user interfaces. We tried
Xfce-based Xubuntu, but its performance disappointed. (Distro
reviewer Dedoimedo documented its shortcomings in his reviews of Xubuntu
9.10 and 11.04.
He finds that Xubuntu
12.04 has since fixed the issues and calls it “a most pleasant surprise.”)
We also tried the LXDE-based Lubuntu. I liked it and wrote it up in OSAlert, but organizational
timing (unrelated to Lubuntu) prevented us from switching to it. We
never considered Kubuntu, assuming that KDE might be a bit resource
heavy for our older equipment.
We didn’t consider Windows or Mac OS, due to their high costs and
licensing restrictions. Also, new Windows versions impose a learning
curve for little apparent benefit. My users who tried Windows 8
complained about it. As one summarized, “Why on earth do they keep
changing Windows?”
Please keep in mind, you who are reading this are expert computer
users; my clients are not. You and I look forward to new Windows
versions and new Linux distros as a chance to play and learn. But
what we consider interesting, my users see as a waste of their time.
They look at computers the way most of us look at driving a rental
car. You should be able to hop in and go. If you have to read
instructions or ask a lot of questions, something’s wrong.
After a brief hiatus, our distro search resumed in 2012. Based on a
glowing review
in the Register, we tried
Linux Mint. Mint retains many of Ubuntu’s advantages, including its
solid fundamentals and huge free software repositories. It has a big
user community and good support: Mint 13 LTS receives updates
through April 2017. In contrast to Ubuntu, Mint ships ready-to-run
straight out of the box, complete with codecs and multimedia
support. The project’s biggest attraction is that its developers
have a knack for identifying where Ubuntu falls short and providing
alternatives. Don’t like Unity? Mint’s got both 32- and 64-bit
versions fronted by:
- KDE
- Cinnamon
- MATE
- Xfce
Cinnamon
is forked from the GNOME 3 shell. It features Compiz-like desktop
effects including animations, transitions, compositing, and movable
panels, and is modifiable by themes, applets and extensions. You can
drag-and-drop with the menu and activate Expo Mode via a hot corner.
Cinnamon requires 3D acceleration and employs the Clutter graphics
toolkit for its slick features.
We tried MATE
because it’s based on GNOME 2. It sounded the closest to Ubuntu as
it was back when it used GNOME. MATE includes GNOME 2 applications that are
forked and renamed: Caja file manager (from Nautilus), Pluma text
editor (from Gedit), MATE terminal (from GNOME Terminal),
Marco window manager (from Metacity), and Eye of MATE image viewer
(from Eye of GNOME).
Xfce: Simple Hits the Spot
My users liked MATE, but then I downloaded Xfce and added it to our
base install. Bingo! Xfce was an instant hit. With its simple,
straightforward desktop, you can see why. How to use Xfce is
obvious, regardless of whether one comes from a Windows, Mac, or
Linux background. Even beginners can use it without help. Once and
for all, Xfce buries the old canard that Windows is easier to use
than Linux.
Xfce Menus: Simple, old-fashioned …
and exactly what many end users still prefer.
Xfce is easy to customize. I moved the top panel down to the bottom
of the screen with just a mouse click and a drag-and-drop. You can
quickly add, remove, and alter panels. And you can easily add quick
launch icons and applets to either panel(s) or the desktop. Xfce
runs light. Current computers handle any OS + UI combination with
ease, but we still have some old machines. Mint 13 with Xfce runs
fine on a Pentium 4 and rarely swaps to disk even with only
512 MB of memory. It really flies on a dual-core machine with a gig or
two.
You can add quick launch icons to the panel as easily as in Windows.
Xfce doesn’t try to jam an interface designed for touchscreens onto
your desktop. This Register
review
summarizes why our users like it: “…
Xfce isn’t planning to try “revolutionising” the desktop
experience… The focus is generally on improving existing
features…rather than trying to out whiz-bang the competitors…
If you’ve felt left behind by
GNOME’s attempt to redefine the desktop experience and just want a
desktop that works the way it always has, Xfce fits the bill.”
Xfce is missing a few things. It comes with an “App Search”
function, but I couldn’t find a “File Search” or “File Content
Search” tool. No problem, just download one with the Synaptic
Package Manager. The gnome-search-tool utility is spare and simple, or try
SearchMonkey or Catfish for more features. I also downloaded the
gnome-system-tools package to manage user IDs. You might need to
update the Xfce menu, as I found it placed one or two applications
in odd menu positions after I installed them. Alacarte does the job.
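If you prefer the command line to Synaptic, the same add-ons can be
pulled in with apt. A quick sketch (package names are the ones in the
Mint 13 repositories; verify them in yours):
$ sudo apt-get install catfish gnome-system-tools alacarte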
Finally, Xfce bundles lightweight apps. You may favor some
alternatives, which you can get through the repository.
Of course, Xfce’s biggest “shortcoming” is that it doesn’t have the
cool new interface of a Cinnamon or MATE. My users like it that way.
But others will prefer Mint’s more featureful, state-of-the-art
GUIs.
Mint’s Quirks
Mint also has its quirks. You must define a swap partition during
the install, even if your system has a ton of memory and might never
need it, or you could have install difficulties. If you really don’t
need disk swap space, use the kernel’s zRam
feature to define memory as swap. Or reset the swappiness control variable from
its default of 60 to a low value like 10. The lower the value the
less the system swaps. (You can eliminate swapping altogether by
setting swappiness to 0 but
then the system will crash if it needs to swap and can’t… a
problem when it Suspends or Hibernates, unless you’ve made advance
plans.) To view your swappiness
value, enter:
$ cat /proc/sys/vm/swappiness
To permanently change swappiness,
edit the file /etc/sysctl.conf
as root. Add or change the
line with variable vm.swappiness
to your desired value:
vm.swappiness=10
Then reboot for the change to take effect. (A simple logoff/login
will not effect the change as this is a system-level parameter.)
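That said, a reboot isn’t strictly required on most systems: the
sysctl utility can apply the value at once, assuming the standard
procps tools that Mint ships:
$ sudo sysctl vm.swappiness=10
$ sudo sysctl -p    # reload every setting in /etc/sysctl.conf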
Other issues? The Mint Update Manager has no version Upgrade button.
We only upgrade from one LTS release to another, so this is no
problem for us. For those who prefer frequent upgrades and install
intermediate releases, this is a feature that Ubuntu has and Mint
lacks.
The biggest issue with any Linux distro is whether it will work with
your hardware. Certain laptops and odd video cards are the usual
culprits. Our computers are all desktops, and out of twenty-odd
machines, the sole problem we encountered was with the Suspend
function on a couple early dual-core AMD boxes. We just turn them
off when not in use. I was especially pleased that Mint recognized
every one of our diverse WiFi cards — not always common components
in desktop computers.
Conclusion
By now I’m sure some readers are ready to flame me for promoting a
“boring interface” or for “resisting learning something new.” But
this isn’t about what you or I would run on our computers. We’re
excited about the new directions of Windows 8, Unity, and GNOME 3.
End users with desktops and laptops are not. They don’t want to
spend time learning new software unless it clearly benefits them. If
you’re not using a handheld, it’s not clear that these new
interfaces do.
Perhaps there will come a day when users expect their desktops to
mimic their handhelds. If so, that day has not yet come. Today, desktop and
laptop users find Xfce easier to use than either Unity or Windows 8.
Mint with Xfce makes a great platform for those who just want to use
computers without hassles.
– – – – – – – – – – – – – – – – – – – – – –
Howard Fosdick (President, FCI) is an independent consultant who supports databases and operating systems.
For changing swappiness on the fly as root:
echo "20" > /proc/sys/vm/swappiness
The vm.* sysctl variables are totally dynamic.
As user friendly as it gets.
Of course it’s not user friendly, it’s a kernel variable! Changing such low level settings shouldn’t be user friendly as they’re very dangerous if you get them wrong and should only be adjusted by people who know what they’re doing (or at the very least, understand the risks of tinkering with them).
In a sense it’s very User Friendly. It does exactly what the user wants in a concise and direct way with a minimum of fuss.
The problem is somewhere along the line “User Friendly” has also come to mean “designed so a mildly retarded field mouse could do it by randomly whacking the buttons enough times without ever knowing the manual existed”. The people responsible for this school of thought need to be taken out back and thrashed soundly till they see the error of their ways.
Hi,
I think that (in general) there’s a difference in expectations. To me, “user friendly” means that it’s designed to prevent the hassle of needing to know there’s a manual and finding or learning/remembering the relevant piece/s of the manual for the “user” (in this case, the system’s administrator).
You could even invent some sort of ratings scheme to measure “user friendly”. For example:
* start with 100 points
* if the user has to do anything at all (e.g. system doesn’t automatically find the optimum value for swappiness), subtract 50 points
* if the user needs to find/remember part of the manual (e.g. no help/information/advice built into the tool to change the setting), subtract 25 points
* if the user is able to set invalid settings (e.g. no range/value checking built into the tool to change settings, to prevent things like “vm.swappiness = yes” being entered), subtract 25 points
A score of 100 would be “as user friendly as possible”; while a score of 0 would be “the developers were too lazy to care about users”.
– Brendan
You can only cater to users’ aversion to educating themselves on how to properly run their software for so long before a system becomes useless for doing actual work.
Better to have sane defaults and clear documentation on how to change them than to have to unbork things when the system’s assumptions about how things should automagically be configured for our convenience turn out to be extremely inconvenient.
You really think the average user should be provided easy access to giving direct commands to the kernel? I think not. Just be thankful it’s even there in the first place for advanced users who actually know what the hell they’re doing. The vast majority of people don’t even know what “swappiness” is (and probably never will, let alone ever hear the word).
Absolutely. I think every option in the entire OS should be easy to access. If it’s something that might break the system, you give ’em a flashing WARNING dialog first… that should be sufficient. Of course, in a work environment, you should definitely lock that stuff down, though the command to do that is probably just as ass backwards as the one to change the kernel setting
It’s an admin setting. And it is as admin friendly as it gets, indeed.
If you are editing /etc/sysctl.conf, then there’s no need to reboot or muck around in the /proc filesystem. Just run the /etc/init.d/procps script manually, or via “service procps restart”.
That reads /etc/sysctl.conf and sets the values listed in there.
You may also want to install the sysfs package. That allows you to use /etc/sysfs.conf to set values under /sys the same way /etc/sysctl.conf lets you set values under /proc. Mainly useful for tweaking the storage stack.
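For illustration, /etc/sysfs.conf entries are attribute paths relative to /sys, each with a value. A sketch only; the line below is just an example and assumes a rotating disk at sda:
block/sda/queue/scheduler = deadline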
I actually like Unity for the most part. It’s the right combination of originality, while also borrowing from OS X and Windows where appropriate. What I don’t like are other things about Ubuntu, e.g. the Amazon searching by default, as well as the fact that Canonical seem to have an issue actually making a stable system most of the time. 13.04 is their first stable release for me in a long, long while, which I find amusing as 12.04 is the LTS release that crashed on me constantly.
I also like Ubuntu LTS, except for the update that makes certain broadcom drivers only work in IPv6 mode.
Now I have to use a cable to talk to my IPv4 router, while my other OSs are happily talking to it.
Since 1994, no matter how much GNU/Linux improves, tinkering is always required. Problem is I no longer have the same disposition as I used to have.
I’ve yet to find an OS that doesn’t require tinkering. OSes are such massively complicated beasts these days that it’s impossible to cater for all personal preferences and hardware ranges out of the box.
As the saying goes: you can please some of the people some of the time. But you cannot please everyone all of the time.
Too true, however most Linux distributions have problems with some of the most basic things. With OS X and Windows, say for wi-fi, it’s simple. Install the driver if necessary, connect. With Linux, you may have to tinker with the wi-fi drivers just to make them work and this is not user-friendly, nor should it be necessary. It’s even worse when an update breaks the working drivers you’ve already set up, and this is far more common in Linux land than in Windows or OS X. I wish it weren’t so, but it is and until this is resolved you will never be able to consider any distribution to be user-friendly. Power user-friendly, certainly, but never for your average user unless they’ve got a techie friend to maintain it for them.
I totally disagree. On my new windows 7 laptop I needed to install Firefox, video codecs, antivirus (MSE), a bittorrent client, anti-malware, tune-up software, a video encoder etc before I had a really usable machine.
In contrast Xubuntu 12.04 worked flawlessly on the same laptop without me needing to remove a pile of bloat and install a lot of other software. The Broadcom wireless worked perfectly out of the box.
The very fact you even have to install drivers in Windows is still unnecessary tinkering. You don’t need to do so with Linux – drivers are shipped with the ISOs.
But I do agree that when wifi doesn’t work out of the box in Linux, then it’s a complete pig to get it working. And sadly that can leave many users helpless and with a bitter taste in their mouth
Thankfully I think the instances where users have to do so are quite rare in comparison to the number of times wifi drivers work out of the box (or at least that’s been my anecdotal evidence – I’m willing to concede that I’ve been lucky in that regard). And the one time I did have to play around with ndiswrapper was because Asus annoyingly rebranded my laptop wireless chipset – so the hardware was reporting the wifi chip as being some bespoke thingymebob when in fact it was a bog standard Atheros chip. In those instances I don’t think Linux stands a chance (though I’m not trying to blame every driver problem on the OEMs nor dismiss that there’s room for improvement. Just commenting on my own experiences).
That depends on the distro. With bleeding edge distros like Arch, then that’s a real possibility (again, I’ve been quite lucky in that regard despite being an Arch user – but I’m not blind to the possibility). But with distros like Debian, CentOS and Suse, that shouldn’t be an issue.
What really annoys me is how frequently Ubuntu breaks. That’s completely unacceptable given the target audience and it’s market presence (and one of the reasons I don’t have high opinions of Canonical)
There are a few different things there: operational user friendliness, ease of install, and longevity. Some Linux distros are definitely user friendly on a day-to-day basis. But I do agree that they can be tougher to install. However that’s a tough one given that hardware is built for Windows and ships with Windows pre-installed. I don’t think Linux could ever compete until it’s shipped pre-installed like Windows is. And lastly, the longevity. While you do raise some excellent points about the issues of Linux, I don’t think Windows is any better (case in point: it’s standard practice for Windows users to do regular reformats and reinstalls).
I honestly do think the biggest issue facing Linux is the lack of support from OEMs. And that’s the same reason why Android (and to a lesser degree, ChromeOS, webOS, etc) have proven popular: because they’re shipped preinstalled, so the hardware support is a given. It’s also why OS X works so well despite UNIX traditionally being more finicky with hardware than Linux is.
And if they’re not, you are completely screwed unless you know how to compile source code. Are you really trying to equate installing drivers in Windows to the unnecessary, sheer pain in the ass, process of getting drivers installed in Linux when they’re not provided for you?
I’d already answered those points in the very next paragraph of my post!
No, if it’s not in the “ISO” you can install it a few ways that do not involve source code, which depend upon the distro.
With Red Hat/SUSE you can use an RPM to install a driver. With Debian/Ubuntu, a .deb. There are official repositories full of these RPMs/debs that are constantly being updated as bug fixes and new features are added.
For proprietary things like printer drivers or video drivers, manufacturers (like HP) can provide easy-to-install RPMs. It’s not difficult. Hasn’t been for a while…
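For example (the file names here are placeholders, not real driver packages):
$ sudo dpkg -i vendor-driver.deb     # Debian/Ubuntu/Mint
$ sudo rpm -ivh vendor-driver.rpm    # Red Hat/SUSE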
I haven’t had to install a driver with Windows unless I was swapping in a new graphics card.
Windows pretty much just grabs hold of the appropriate driver off the net now in seconds.
Unless that required driver is an ethernet driver, which is amazingly what I had to do when I reinstalled Windows 7 on this current laptop.
And, if it’s an nVidia networking chipset, odds are that Windows Update will install a broken update. nForce chipsets at work can’t be updated to the latest nVidia nForce chipset available in Windows Update for this reason.
Broken drivers on updates are not limited to Linux.
Yes, all these very specific edge cases do happen, but they are quite rare.
Look, software is always going to have bugs or points where it doesn’t work… the point is that it works correctly in the majority of circumstances.
I could point out where this and that isn’t perfect on every OS. It’s pointless and doesn’t really prove anything except that you are being pedantic.
http://www.osnews.com/permalink?565695
Sorry, but I think this is an exception to the rule.
And the rule is most people buy the computer with the OS preinstalled, which hides a lot of these problems. Hence why I only found out when I had to reinstall.
What are the real figures when we compare like with like?
Look, most network hardware is generic. Stop being a dick and pretending your exception to the rule is the rule.
Stop being a dick and comprehend that’s not the point being made.
Stop being a dick and maybe realize the problem is more common than you might imagine (which is logically different from saying my experience is a rule), so much so that laptop and desktop manufacturers mostly provide preinstalled and recovery partitions.
No, it isn’t as common as you think it is, and it’s trivial to find and install a driver anyway.
Most network hardware comes from a few manufacturers.
My assertion is that it’s more common than YOU think. I don’t know the numbers, but your guess is obviously too low because it seems you have trouble acknowledging that driver problems can easily be avoided in Windows with preinstallation and recovery partitions that come with computers these days.
Yeah, but that’s just a little bit difficult if your networking driver doesn’t come with it. I’m not disagreeing that Windows will usually pull down the correct drivers (though I’ve seen it misidentify the correct driver on a few rare occasions) but sometimes installing wi-fi drivers is still necessary. It’s kind of hard to grab a driver off the net if you can’t get connected to it.
That’s been my experience too, in fact Windows 7 even downloaded the graphics card driver when I started using a GeForce card. I still went to Nvidia’s site for the more current and complete driver as some games complained about the Microsoft provided one though.
The interesting bit happened when I decided that the Intel Sandy Bridge GPU built in to my machine was good enough for the simple games that I play, and I gave my GeForce card to a friend in dire need for his 3D modeling courses at school. My machine has built in VGA and DisplayPort outputs, and my LCD monitor has VGA and DVI inputs. In Windows, I got native resolution using VGA, but in any OS that uses Xorg I could only get 640×480, if anything. After days of tinkering to no avail I gave up and decided to pick up a DisplayPort-to-DVI adapter. That fixed the issue immediately, and even improved the picture quality on Windows (all OSes now saw my LCD as a “built-in” monitor and defaulted to its native 1600×1200 res).
All of that tinkering and research should not have been necessary. Intel is the most “open” of the GPU manufacturers when it comes to alternative OSes, and I would think that any GNU/Linux based OS would instantly support it. Supposedly a fix for this obscure issue is coming, but as most people don’t use VGA connectors these days I don’t see it happening in the long run. Sandy Bridge is “old” tech and the trend in the open source world nowadays is beating Microsoft and Apple to the finish line, not supporting old 2011 era tech like mine.
For some strange reason Windows Update (Windows XP) offers a GeForce driver even when an official Nvidia driver is installed.
Let Windows do its thing and you’ll end up with a 640×480 screen boasting 16 colors.
The only way to fix it is by installing the Nvidia driver.
Nope: don’t install the “Windows Update” regarding the nVidia driver, and tell it to never show this “update” again. This solves the issue.
Kochise
When trying to install Windows 7 on a number of brand new Asus laptops (different models, all using an Atheros GbE NIC) we had to manually install the drivers.
Granted, building the alx driver under Linux is more complicated than double-clicking on setup.exe under Windows (at least to a non-veteran user), but Windows is far from perfect nonetheless.
Oh, and even though the alx driver under Linux is still a beta driver (it requires an out-of-tree build), it seems to perform better at long range than the Windows 7 (and 8) driver. Go figure.
– Gilboa
You would still be in the same situation with Linux if the kernel didn’t have the network driver included.
Re-read the second paragraph in my post – I didn’t claim otherwise.
IMO -no- OS is really joe-six-pack-ready, unless it comes pre-installed by the computer/laptop/etc manufacturer (and even that comes at a bitter price – AKA bloatware)
– Gilboa
Unless you have bleeding edge hardware or a very unusual NIC, I suspect Windows 8 would work flawlessly.
This is mostly true of Linux these days as well. However it’s still much more difficult getting a driver working on Linux if it isn’t included in the kernel or your NIC isn’t supported, compared to Windows (download the driver, put it on a USB stick and double-click).
Most hardware on desktop PCs today is pretty generic and will work under almost any OS, unless that OS is particularly niche (BeOS, Icaros and the ilk).
Ah, but see Linux developers were able to foresee this issue ahead of time and implemented RFC-1149 as a means of obtaining the drivers.
Dunno, after installing Windows 8 on my new laptop, wi-fi worked out of the box. So did FN buttons. No drivers were required.
Mostly it seems dependent on whether the wireless device is natively supported by the Linux kernel or needs ndiswrapper to use Windows drivers.
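For anyone curious, the ndiswrapper route looks roughly like this (a sketch; bcmwl5.inf is a typical Broadcom Windows driver file, substitute whatever your card shipped with):
$ sudo ndiswrapper -i bcmwl5.inf    # wrap the Windows driver
$ sudo ndiswrapper -l               # confirm it installed
$ sudo modprobe ndiswrapper         # load the kernel module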
I also tried to replace my aging Ubuntu 9.09 development box with Mint 13 (and also found the Xfce edition to be the best fit for my Atom 330 board) but… while installing, partitioning the HD into a system part, a swap part and a data part, the data part was inaccessible after install:
“You don’t have the permission to access this part”
Come on guy, I’m the installer of this machine, I have root permission and I cannot access the data part of my local hard disk? And you call this “security”?
Wiped off Linux Mint, reinstalled Windows 2000 Pro SP4 with all the remaining patches, not a single problem ever since.
KISS!
Kochise
Actually it is security. We all know the chequered past Windows has had by letting everyone run everything as Administrator. And since we’re talking about Linux post-install, there’s absolutely no reason why you should be running as root any longer.
However by the sounds of it, the fact that you had installed and rebooted into Mint, and the fact that you only had one “data” partition, would mean that your user profiles were stored on that and loaded. Which means you do have permissions access to that disk. What it sounds like to me is that you were trying to access system areas of that partition which are secured against user access – and that’s an absolute must for security (in fact, Windows now does this as well!)
It’s interesting to hear that you’re still using Windows 2000 though. That’s a fantastic OS in my opinion (in fact it’s the only release of Windows that I’ve genuinely loved). But Windows 2000 isn’t secure by default (the default user is administrator, telnet is enabled by default, etc). Granted all these things are easily fixable, but my point is that you’re applying an old and insecure Windows paradigm (let’s be honest, Windows security has come a long way since Win2k) to Linux and then bitching when a different OS behaves differently. The fault here is entirely with you.
In fact this is one of the biggest causes of Linux FUD. For some reason, when Windows users switch to OS X, they expect OS X to behave differently and are ready to learn how to use their new OS. But when many Windows users install Linux for the 1st time, for some reason they expect Linux to behave like a drop in replacement for Windows – which is completely unreasonable and leads to many of the daft complaints like the aforementioned.
Wasn’t my first Linux install; I had Ubuntu (from 7 to 10) without any problem, and Fedora (whose Anaconda installer’s default behavior was to aggregate my ext3 partitions altogether, thus destroying my Ubuntu install and messing up my MBR).
Tried Linux Mint 13, was pleased by the system, but it was restrictive as hell:
The system partition (32 GB ext3) had the system installed on it; the swap partition (4 GB swap) and the data partition (200 GB FAT32) had nothing on them. The data partition was just purposed to store… guess what: data!
When I wanted to open/copy files on the DATA partition, Linux Mint shouted at me that I didn’t have enough privileges (as root!) to access it. And it’s an offline ARM cross-development PC.
I’m a geek, though, but I do not expect to have to fine-tune somewhere in the system the access to the freshly formatted DATA partition of my newly installed system.
Security? Paranoia!
The installation went pretty straightforwardly though, so imagine my frustration, and consider the newbie’s, when you cannot use your locked down -“for security reasons”- computer.
It’s Linux, the malware and security holes aren’t supposed to mirror Windows’! So what’s the point?
If I quit Windows to stop fighting malware and security holes, it’s not to find other flaws to chew on. If Linux cannot hide its “security” behind the hood and has to put your nose into configuration files to “feel the power of security by restrictive accesses”, then I’m gonna quit immediately.
This is madness!
Kochise
Is that the default auto partitioning? I’m more miffed why there’s a FAT32 partition. That’s just wrong. If it’s a Linux only set up, then it should be running ext3 or ext4. If it’s to be shared with Windows, then it should be ext3 (there are ext2&3 drivers for Windows) or NTFS. FAT32 should NEVER be used to store “data”. So if that’s a Mint default, I’m very disappointed.
As I’ve already pointed out, you wouldn’t have been root. Mint (like Ubuntu) doesn’t assign a password to root, so you cannot even log in as root. Thus you’d have been a regular user.
That’s what they all say until their computers are infected with all sorts of crap…
That doesn’t even make sense. You’re complaining about security features. ACLs and other access permissions are not malware.
How would you suggest we secure computers without user access controls? It’s my day job to implement security procedures, specialising in Linux and UNIX (I’m not making that up either!) and I can’t think of a better foundation to begin with. At some point in the stack, you’re going to need to know who’s using the computer and whether they’re allowed to access that subsystem. And whichever way you try to implement that, you ultimately end up with a list of users and permissions.
This is why your arguments about computer security really don’t make any sense. Granted, in this particular instance the workstation is intended to be kept offline. But since you’re the one arguing about noob-friendliness, it makes infinitely more sense to assume that all the Mint desktops are going to be connected to the internet than to have all the security turned off by default and expect those users to turn it on manually (but don’t take my word for it, let’s just look at Windows 95 through to Me and how well its security model worked).
It most certainly is not. Whatever happened there, I can’t easily believe it’s LMint’s fault. Been using it for 3 versions now at work in VBox, never seen anything like that happening.
That’s what I suspected. And to be honest, I wouldn’t have minded if he was honest about the fact that he was running non-standard config – as he could still have made a valid argument about usability. But to run a bespoke set up and then moan about how default Mint installs are broken is just deceptive.
Personal partition setup: I wanted the data partition to be FAT32 so I could store disk images of the Linux system partition on it.
If I had a problem I wanted to be able to recover my data partition with Windows. Ext drivers for Windows are not really reliable for writing; likewise, Linux NTFS drivers are unreliable for writing.
FAT32 is very reliable, yet lacks “security”, journaling and support for files over 4 GB. But for ARM cross development, you hardly need such file sizes.
Kochise
Did you partition that within the Mint installer or manually before installing Mint? If the latter, then Mint wouldn’t even be aware that you want that partition available, wouldn’t have added it to your fstab and thus you’d need root permissions to mount it. Windows wouldn’t be any different in that regard (ie you’d have to mount the drive via Windows Disk Manager).
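For what it’s worth, a single /etc/fstab line would have made that partition mount at boot with user ownership (a sketch; the device, mount point and uid are placeholders for your own values):
/dev/sda3  /data  vfat  defaults,uid=1000,gid=1000  0  0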
Where did you read that? I’ve done some massively heavy IO to NTFS drives from within Linux over the course of years (at least 6 years) and never once had a problem (we’re talking music production, file servers, and such like; so pretty heavy duty stuff). Some of the writes reached gigabytes in size. Others were hundreds of small files rapidly written. It worked flawlessly each and every time.
Even putting my personal experiences to one side, everything I’ve read to date would contradict your claims about the reliability of NTFS in Linux and ext within Windows. And while I can understand a little more reservation towards ext3 on Windows, ntfs-3g (the FUSE NTFS drivers that Linux uses for R/W access) is reliable. I’d honestly consider it more reliable than FAT32.
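For reference, mounting an NTFS volume read/write with ntfs-3g is a one-liner (the device and mount point here are placeholders):
$ sudo mount -t ntfs-3g /dev/sdb1 /mnt/windows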
FAT32 is reliable right up until you hit the slightest bump or power failure, then it’s game over for any data you were writing at that time. :p
Anyhow, didn’t you say you replaced Mint with Win2000? I didn’t think Win2000 was ever ported to ARM.
Keil and IAR ARM toolchains for Windows.
Power failure? I have a UPS guarding my setup.
Kochise
Nice
If you don’t mind me asking, what are you developing in ARM? (I dabble a little with the architecture myself – albeit just writing Go applications on my Raspberry Pi)
Linux NTFS drivers are very reliable for writing these days. At least in my vanilla usage of NTFS. There may be some more obscure Windows-initiated usage of NTFS which the Linux drivers might screw up, but for a traditional non-domain, no-group-policy, dual-boot situation it works very well.
See my answer in another comment above…
Malware? On Linux? The Babylon toolbar? McAfee anti-virus? …
Access permissions? So with Linux Mint, when I install the system, instead of locking things, I have to unlock them? How convenient.
Preventing the user from accessing the computer to keep him from making mistakes is surely quite strange behavior. An operating system turned into an access-denying system doesn’t make sense. I’m sure there are other ways to “protect” the system. Firewalls, etc, but not locking down the computer.
Sure, when you start having more than one registered user. But when there is only ONE f–king account, why the need to lock EVERYTHING when an access password would be enough?
Like I said, preventing the user from accessing the computer, then the internet, for the sake of “safety” is pure nonsense, especially on Linux. I don’t see what the threats to the system are. ActiveX? Sony’s root-kits? IE exploits? SWF trojans? Come on…
Kochise
Which comment? None of them address the ‘root’ point I made.
I’m sorry but I thought you said you’d used Linux before.
It’s not about protecting from user error. It’s about locking unauthorised processes down to minimize the damage they can perform. Firewalls are a whole other type of security system and would have zero benefit in that regard.
With the greatest of respect, I suggest you have a read up on security practices. It’s quite an in-depth subject and it appears you’re holding onto a number of misconceptions. As we’re now starting to talk in circles, it’s clear that you’re never going to trust me on this topic, so I’d recommend you do a little research to see that I’m really not making this stuff up.
Actually, exploits have been found in Linux builds of Flash in the past. Then there are Java 0-days. There have been instances where Canonical have inadvertently added trojans to their repos. And that’s before you even look at any of the networking software (p2p clients, etc).
Operating system daemon exploits.
Application-level exploits that expose the user’s $HOME to the outside world when p0wned.
Fake posts on public forums about how to install something, used as disguise to install worms.
People that just install whatever application they can get from the Internet without checking what it really does. Even if they are not root, at least $HOME is exposed.
Users are very creative, especially at home or when trying to get around IT security procedures.
I may be wrong, but the last time I checked, and it was a long time ago, you had to resort to some non-trivial steps to create and use a FAT32 partition larger than 32 GB. Perhaps this was the root of the problem?
It was just a limitation of some partitioning tools (most notably the one in Windows); FAT32 itself is easily capable of more than 32 GiB.
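Indeed, on Linux formatting a large partition as FAT32 is a single command (the device name is a placeholder):
$ sudo mkfs.vfat -F 32 /dev/sdb1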
Why is Windows 2000 considered good? It was terrible until Service Pack 3, hardly anything worked on it, and it was completely redundant after XP came out, which was miles better in comparison.
I love how short some people’s memories are :p
Given the alternatives on the desktop, 2000 was epic. WinMe was just a clusterfuck of fail, desktop Linux was still in its infancy, and BeOS -while awesome- was failing to gain traction. Then let’s look at what 2000 was superseding (the 9x era – which was never good, not even in its day; and NT4, which had so much potential but just failed to really deliver and wasn’t practical outside of corporate environments). Even outside of x86, most Apple users were still stuck on OS 9 (which was buggy as hell), with OS X due to hit the shelves a few months later.
Windows 2000 supported DirectX, OpenGL and ASIO, which made it practical for gaming and music production. It was the first and only time Microsoft had released a desktop edition of ‘Windows’ which wasn’t chock-full of bloat. It was lean, yet had all the features you’d need available to install on demand. It was clean, lacked unnecessary clutter and “bling” while still adding some missing usability features (eg hotkeys in notepad.exe). Even the bundled Windows Media Player followed the design guides of the time (in fact it spawned a clone after MS decided to “XPify” it: MPC, Media Player Classic)
Windows 2000 was also the most stable x86 desktop at that time. We all know how poor the 9x range was, but Me just took the piss. It was up and down more often than a whore’s drawers. In fact Win2k even outperformed Linux desktop environments in terms of bug-free stability.
Everything that made Windows 2000 great, Microsoft did a U-turn on with XP (dumbed-down control panel et al, ugly and highly unnecessary skins, etc). While XP has since evolved into much more, originally it was little more than an uglier and stupider version of Win2k (and those skins nearly doubled the system requirements – Win2k needing 128MB and XP 256MB). In fact the only improvement that XP offered at launch was a significantly improved boot time.
Basically people revere Windows 2000 because it was the start of NT on desktops and the first time most home users were offered a glimpse of a stable desktop OS for x86. Windows 2000 was basically the first time (and only time, in my opinion) Microsoft pulled ahead their competition. It was simply awesome. It’s just a pity MS decided all their users were idiots in every release of Windows since.
I like how the words “stable” and “bloat” get handed around when they mean absolutely nothing. Lots of people at the time said Windows 98 SE was stable and lean.
The 9x series wasn’t as bad as you make out. Yeah, Windows 2000 was more stable, but almost nothing worked with it. You talk about gaming; most games didn’t work… not the fault of the OS, but don’t talk about it like it was some magical OS.
Which is why I gave context and comparisons rather than simply applying vague adjectives. And I know full well that you’re now just deliberately “cherry-picking” words in an attempt to destroy the validity of my claims (ie death by a thousand papercuts). Can’t we have a mature discussion for once?
You complained about non-descriptive terms that don’t mean anything, then go on to claim “lots of people”, which is a worthless statement. Who are “people”? How many are “lots”? If you’re going to complain about “stable” and “bloat” then you can at least follow your own criticisms.
Anyway, I agree that Win98SE was more stable than 98 (first edition). But it certainly wasn’t stable by today’s standards. Windows 2000 is.
And I disagree about the lean comment as well. The 98 series integrated web crap into the desktop shell – badly. Granted a lot of that stuff was still present in Win2000, but it was refined quite a bit by then. Though the main reason it didn’t feel as intrusive in 2000 might be down to hardware advances – ie faster CPUs etc == more resources to cope with Active Desktop etc. However I do accept your point that the bloat/lean argument isn’t precise nor constructive. So I’ll happily withdraw that argument.
If you’d ever owned non-IBM PCs for comparison, you wouldn’t be making that comment. The 9x series was slow, buggy, and the mismatch of having DOS and Windows applications was just ugly. Switching their desktop line to NT was the best decision Microsoft have ever made in regards to Windows.
I used Windows 2000 since only a few months after launch and I really can’t think of any compatibility problems. If there were any, it would have been pretty inconsequential as Win2000 was my primary desktop and I never felt the need to boot into my 98SE dual boot (in fact I wiped that disk a few months later to mess about with Linux).
While I’m sure others might have had some issues (to be fair, any new release of Windows does bring new issues), stating “nothing worked” is massively overstated. “most had no issues” would be more accurate.
Windows 2000 was my primary gaming machine. My only gaming machine in fact (excluding the Dreamcast – but that came later). I never had a game that wouldn’t run in Win2k (bar one Genesis emulator IIRC, but that was quickly fixed). I’ve genuinely had more problems gaming with DOS drivers in Windows 95 than I’ve had with Windows 2000.
Are you sure you’ve even used Windows 2000, and aren’t just reciting some of the conservative press releases from Microsoft at the time of 2k’s launch? MS were tentative about the application support of Win2k: as it was such a departure, they didn’t want developers and users to assume everything was going to work out of the box. I remember scratching my head at that time because I had so few issues – I was wondering why Microsoft were being so paranoid (maybe they were trying to justify Windows Me sales?)
I’m not talking about it like it’s magical. I simply said it was my favourite release of Windows and gave reasons why; and because you bloody asked me to cite reasons too. So cut the sensationalist crap.
I like the way you always request proofs but only provide bold claims yourself. “Nothing worked with it”? If for you “everything” means “games”, then yes, hardly any DirectX game worked well on Windows 2000, since it was on another code base than 9x – and for the better, may I say.
Just a hint, but you probably haven’t figured it out already: Windows 2000 is very often tailored with the “Pro” suffix, which gives a clue about its targeted audience, which, as you may also have figured out already, is professionals, not gamers who ridiculously tweak their setups to gain a few FPS.
Pro means stability, and just like XP matured at SP3, Windows 2000 matured at SP4. And if you just need an OS that does the job without fanciness, Windows 2000 is the most stable and “superb” OS out there. And it’s Microsoft who made it happen.
Its Win32 API is rock solid and stable; programs coded 15 years ago for Windows 2000 (SP4) still run flawlessly on Windows 7. How many Linux distros can claim such a long API life and stability? Windows 2000 Pro SP4 serves me for almost every purpose I need, even as an advanced user.
Sure, IE7 isn’t supported, only IE6, but who uses IE anymore? Install Opera or Firefox and you get a decent browser. DirectX? Hardly above DirectX 8, since video card vendors don’t support Windows 2000 anymore, but they still provide legacy support. So if you have an old box somewhere, Windows 2000 would be the best match to revive it.
And Windows 2000 Pro offers you enough tweaking possibilities and administration privileges to fine-tune security and accesses. So claiming it’s an old fag of an OS is just stupid. Like this topic says, users are seeking the OS that serves the purpose, and Windows 2000 Pro (SP4) still does it in a marvelous manner.
Do you really need a 40 GB bloated Windows 7 installation to open Firefox or LibreOffice? Play games? Who needs to play games when a game console would do it far better, without the hassle of bloating your hard drive even more, etc. And MAME or ePSXe run flawlessly on Windows 2000 Pro (SP4), so this is absolutely not an issue.
I do use Windows 2000 Pro (SP4; please note how many times I insist on using the latest SP for Windows 2000 Pro, because you are obviously living in an alternate reality where Microsoft stopped at Me) for cross-dev purposes, where the GNU toolchain is just delighted with Windows 2000 Pro (SP4) and doesn’t need a Cray computer.
If your pleasure is to waste your time upgrading and maintaining your setup to keep it up to date, for the sake of it, then follow the Fedora way and experience the “Beta” or “Alpha” syndrome. Windows 2000 Pro (SP4), being now “unsupported”, on the other hand benefits from huge feedback, is well understood, and is stable by default (yeah), but who cares?
I need to upgrade the ARM tool chain, not the underlying OS, if I need to correct code generation. I need to update the browser if I want better HTML5 support, not the underlying OS. Etc. But I hope you get the point.
You can spit your hate all you want on Microsoft and Windows 2000 Pro (SP4), but the pragmatic conclusion is that I have had the fewest issues with this OS. Even though XP benefited from Windows 2000 (well, the NT5 code base), it was more unstable and more crash-prone in its early days (prior to SP2, then SP3). But, may I admit it without pain, Microsoft did a tremendous job at providing a consistent user and coder experience.
Then they destroyed everything with Vista, but it was the right path toward 7. I still don’t get the point of 8, but well, I’m an old fag, I need the job done, not an overly animated, colorful UI to play with.
So, please, Lucas, consider installing a Windows 2000 Pro (SP4) machine once in a while, and discover a 1 GB install partition, 50 MB memory usage at boot, flawless working. Multicore support but no hyperthreading (XP SP3 had it), but well, not bad of an OS, I assure you. Sometimes I see many Linux distros still racing to close the gap to what Windows 2000 Pro (SP4) reached.
Even as a pure desktop experience, Windows 2000 Pro (SP4 – almost forgot to note it, for you to imprint it in your brain, otherwise you’ll talk about 9x again) is a breeze. And for gaming, several later games supported it once they had OpenGL and DirectX 7 support. So your references really are that outdated. Please update your brain someday.
Kochise
Windows 2000 games worked pretty much from the start, from what I recall. I was playing Half-Life, Quake III, Unreal and UT on Win2000 quite early on in the OS’s life (though I seem to recall UT was a little more fussy, as it lacked a lot of the performance it should have had).
OpenGL worked really well on Win2000. As for DirectX, well back then I used to write software using DirectDraw and all of my development was done in Windows 2000. That wasn’t even using the latest version of DirectX either (IIRC I was targeting Dx7).
Back then I was running a dual-processor motherboard (Abit BP6 – awesome board!) and this was long before dual-core CPUs were out. Games ran awesomely in Win2000, even without being specifically coded for dual processors – though some did have run-time flags to enable SMP (Quake III -and all the games built off its engine- did).
So I don’t really get where all this talk of Windows 2000 not being suitable for gaming comes from. Maybe I’ve just been lucky. But as both a gamer and a DirectX developer on that OS, I genuinely didn’t have any problems at all.
As for Lucas’ Microsoft hate, he’s a Windows fanboy that just hates Windows 2000 (go figure)
Being a fanboy is not a problem, provided you have solid clues to back your claims. While XP was pretty good, and thus had a service lifetime of more than 10 years (!), its roots are in Windows 2000. In fact, just remove XP’s theming, select the Windows Classic theme and bing! you get Windows 2000. Everything works alike: settings, accessories, almost everything.
I played Unreal, Battlezone, Black and White and some other games on Windows 2000 without a problem either. And again, I miss the responsiveness and the lightness of the system, since I devoted my Windows 2000 Pro (SP4) license to my dual-core Atom 330 dev board.
Otherwise, Windows XP Pro SP3 and Windows 7 Pro 64-bit SP1. And I’m not a Microsoft fanboy, but it works quite well out of the box, without having to waste time tweaking the system for two or three days of hacking to get something usable.
What baffles me with disbelief is that there is still no platform-independent, XML-like config file that would reconfigure your settings system-wide. Just like bookmarks that can be imported from one browser to another, it would be really cool to be able to import one OS’s settings into another, thus getting a known configuration up and running in no time.
Perhaps I’m asking too much and that will remain sci-fi for quite some centuries. It doesn’t even work across Linux distributions, so asking to do it between Windows and Linux, I must be ill-spirited to dare make such a request.
Kochise
But that is taking into account that Windows 2000 is around 14 years old. Since there’s no longer any support from Microsoft, it’s a “static” OS with no major new changes to break stuff.
Two versions ago I’d have agreed about Xfce, but MATE is a lot improved in the latest versions, and waaaaaaaay better for noobs coming from XP.
I know KDE gets a bit of a bad rap on here, but these days it’s pretty stable and I think it’s another good alternative for users making the switch from Windows. Plus it also follows a desktop layout similar to Windows’s (excluding Win8 obviously), so users can draw a little more from past experience than they perhaps could with other Linux desktop environments.
That said, KDE4 does require a fair chunk of hardware. It’s not one for the low end hardware and is easily one of the most bloated DEs on Linux (personally I make use of most of KDEs features, so it’s not “bloat” to me – but I appreciate everybody’s use cases are different)
I have a love/hate thing with KDE. I agree that the “bloat” is actually just a wealth of features and choices, but that’s also its biggest problem. At one point there were three text editors (KEdit, KWrite, and Kate) and while each of them had some desirable features, none of them had a complete enough feature set to be truly useful as the only editor. Likewise, a KDE native office suite has been a moving target. It always seemed like a K-app would get so close to being feature complete only to be dropped and replaced with a new, featureless app that had to follow the same treadmill towards usefulness.
I’m hoping that razor-qt will become the lean, one-app-per-task DE that it promises to be. I still have a place in my heart for GTK apps, but the fragmentation thanks to Unity and Gnome3 means going all GTK is much more difficult than it should be.
Kwrite is basically just the KDE equivalent of Notepad.exe, and Kate is more of an IDE with Kwrite at its core. So from a developer’s perspective there is quite a difference there. But I can totally relate to the confusion, as I only know this from spending hours of development time inside KDE.
As for Kedit, I only vaguely recall it and it’s not bundled with KDE any longer. I wonder if it was an original name for Kwrite or Kate but legal issues forced a name change (I notice there is another piece of software named “Kedit” that’s not related to KDE). I’m just speculating here though – you may well be right that the KDE devs decided to bundle 3 similar text editors.
Yeah, I don’t particularly rate KOffice myself. I don’t agree with your remark in terms of the wider KDE suite. But it’s certainly true of KOffice.
I’ve not checked out Razor for a while. I really should give it another look. I think our opinions differ with regards to Qt vs GTK though – but then I guess the beauty of Linux (and perhaps its biggest drawback too – in terms of fragmentation) is that you and I can have differing preferences and still run the same OS.
Kedit is no longer part of KDE, though there was a time when all three were available at once, back in the 3.x days. Kedit was a casualty of the move to 4.x, and dropping it was an example (in my mind, highly appropriate) of the slow but steady push towards slimming KDE to something less than a dozen apps for one task.
Take the multimedia player landscape: Dragon Player, Kaffeine, KMPlayer, and KPlayer are all included in default installs. Each is a “multimedia player”, so why four of them? In fact, Dragon Player’s About page describes what KDE itself needs:
Dragon Player is a multimedia player where the focus is on simplicity, instead of features. Dragon Player does one thing, and only one thing, which is playing multimedia files. It’s simple interface is designed not to get in your way and instead empower you to simply play multimedia files.
So, if Dragon Player is all we ever needed to play multimedia files, why three others?
I think a lot of the issues with KDE and its lack of consistency is due to just how large the project is. The 4.x release has steadily improved in a lot of ways, to the point that I actually do find KDE usable on a daily basis. However, there is still an overwhelming feeling of fragmentation and “project X doesn’t know what project Y did so both are broken in this point release” kind of thing.
Actually I don’t mind using both Qt and GTK apps; from years of using Gnome and Xfce I do have an affinity for GTK apps, but I also recognize that Qt is a great toolkit in its own right, and has some great apps too (Amarok being one of my favorites). I also enjoy the fact that KDE is more graceful when handling GTK apps under Kwin than Xfce and Gnome are when handling Qt apps. To borrow slang from a novel I’m reading, KDE is a lot more mesh than the others.
And it was for a very good reason, as KEdit had correct Unicode support which KWrite/Kate did not have at the time. Rather than not supporting some users’ needs, the KDE developers kept KEdit for those use cases.
No, it was not. KEdit was simply dropped when KWrite/Kate gained the missing Unicode functionality, as was the plan the whole time. The reason it coincided with KDE 4 was the switch to Qt 4, which delivered much of it for “free”.
Blame your distribution! KDE comes by default with one multimedia player: Dragon Player. While the 3 others use KDE libraries, they are not part of the default installation; they are simply 3rd-party applications.
For the same reasons you get several 3rd-party browsers, media players etc on other platforms too. Like on Windows: it has both a browser and a media player, but you have several 3rd-party alternatives for both.
Thanks for the clarification. Though, I still see them dropping it as slimming the compilation down a bit. I don’t understand why reducing bloat is considered a bad thing.
Probably because what you call bloat is the very reason some, like myself, prefer KDE over any other *n[iu]x DE/WM. There is a need for some tweaks, but this is way better than having no options at all.
Yep, I’m definitely more of a minimalist. That said, I’ve found KDE to be usable lately, though I do turn off all the flashiness and remove a ton of useless or redundant apps.
Aptosid still has a KDE-lite package http://www.youtube.com/watch?v=rika5DVALjY
Apparently, so does FreeBSD: http://www.freebsdsoftware.org/x11/kde-lite.html
The Klyde project for SUSE is promising: http://blogs.kde.org/2013/04/11/hackweek9-lightweight-kde-desktop-p…
Of course, KDE is open source, so there are many minimal KDE configuration possibilities discussed/recommended in many distro forums.
I think the same, yet I recently found a serious project: an Xfce-class lightweight fork of KDE that aims to stay compatible with the full-fledged version.
http://blogs.kde.org/2013/04/11/hackweek9-lightweight-kde-desktop-p…
http://susestudio.com/a/pRvzFf/minimal-klyde
I didn’t test it, but it seems promising and well supported by the KDE community.
Saw your informative comment only after I posted similar info above.
Unfortunately, now I can’t mod you up.
I had the same “problem” with Unity and Gnome 3 as your users, so nowadays I use Debian stable with XFCE as desktop.
I set it up to look like Gnome 2 with two menu bars, top and bottom. Then I installed some of my preferred apps (gedit instead of Mousepad, Eye of GNOME instead of Xfce’s picture viewer, …).
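For anyone replicating this on Debian stable, the app swaps boil down to a sketch like the one below; the package and desktop-file names are the usual Debian ones (eog is Eye of GNOME) but may vary by release:

    # Pull in the GNOME text editor and image viewer alongside Xfce
    sudo apt-get install gedit eog
    # Optionally make gedit the default handler for plain text files
    # (the desktop-file name may differ on newer releases)
    xdg-mime default gedit.desktop text/plain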
I love it: it does not get in my way while I am working. I don’t need anything else from a user interface…
Thank you so much, XFCE-guys…
http://www.urbandictionary.com/define.php?term=faff
I gave up on Windows because I spent so much time ‘faffing’ about with it.
I moved to OS X for my personal use and basically, I don’t have to faff about with the OS at all. So I get more time to do what I want with the computer rather than what Microsoft, the AV makers, etc. want me to do.
I use Linux in my day job. I also use Windows Server 2008/2012. Neither of these requires the same level of ‘faffing’ about as Windows 7 does on a daily basis. I sometimes wonder if one bit of MS knows what any other bit is doing.
I selected OS X because it was suitable for my needs, not because I’m a fanboi or something. I’ve been writing software for more than 40 years and believe me, after that long, using something that ‘just works’ is a real pleasure.
Linux is getting there but it isn’t there yet. In the meantime, MS is getting left behind in ever so many ways.
I don’t know what “faffing” is supposed to mean but if you’re doing anything to Windows 7 on a daily basis other than using it, you’re doing something wrong.
I’ve been hearing this exact same nonsense for well over a decade… literally. So when is Linux finally going to “get there”? As a desktop it’s never been more than a flash in the pan. Also, in what way is MS getting left behind? They completely dominate the desktop market to the degree that there’s no point in even mentioning any of the competition, if you could justify calling it that.
To put it nicely, your claims are less than convincing.
Based on the context of his whole comment, it seems pretty obvious to me that when he said “Linux is getting there”, he meant “good enough to fulfill my needs”. Anyway, the rest of his claims are pretty straightforward: Windows 7 is not very suitable for his needs. Since you’re calling them unconvincing, I guess you know his needs better than he does?
What he actually said was, “Linux is getting there but it isn’t there yet. In the meantime, MS is getting left behind in ever so many ways.” That sure doesn’t sound like he’s referring only to his needs; it reads as a blanket statement. Making the claim isn’t good enough, make your case instead.
…No, of course not. While he did reference his needs, he also makes sweeping comments. I suppose you thought ignoring that would lend more to your own comments. But, no.
Faffing about is all the things you have to do to get Windows working in a half-decent way: loading all the drivers and overcoming all the so-called helpful things MS does, which, frankly, for anyone who reads OSAlert, just get in the way of doing your work.
Their so-called helpful stuff includes hiding essential directories, making you use more mouse clicks to unmount a USB stick, and so on.
IMHO, the more MS does to effectively dumb down its system, the more useless it becomes (OOTB) to the power user.
Why does a USB stick have its driver loaded fine 99 times out of 100, and then on the 100th time it fails, and the only way to get it working again is to reboot the system?
As I said, Linux is getting there, but it isn’t quite there for MY needs. After all, the topic of the article relates to individual users’ needs. Don’t get me wrong, I really like Linux. I’ve been using Unix since the late 1980s (Ultrix), and I first ran Slackware 1.1 all those years ago.
My main gripe is the usability of Gimp. Fix that and I might consider moving to it full time. No, I don’t use Ubuntu, as personally, I find it unusable. I mostly use CentOS on my Linux systems.
I’ll give you that there may be a person once in a blue moon who has all kinds of trouble installing Windows. The same is true of Linux and probably every other OS. But to suggest you have to jump through a bunch of hoops and do all this “extra work” to get Windows 7 working “half decent”, for example, is utter nonsense.
It takes 2 mouse clicks and all of about 2 seconds to unmount a USB stick. It’s pretty pathetic to complain about that.
MS doesn’t dumb down their OS, they just keep the more advanced stuff out of the way of newbie/novice users, which happens to be a good idea. “Power users” should have no problem finding what they need. I certainly don’t so if you feel like you’re going on a treasure hunt to find something, I would suggest you’re more newbie/novice than power user.
No clue, I’ve never experienced that problem. Perhaps the cause is flaky hardware – often the real culprit behind “Windows problems”. A reboot probably isn’t necessary, but maybe restarting a service is. If I ever have that issue, I’ll post my findings.
I don’t use Ubuntu either, or any other Windows wanna-be distro for that matter. In my opinion there isn’t a Linux desktop that holds a candle to Windows so in the desktop realm, Linux is pretty much a joke. I do use Debian however and it works great for how I’m using it.
Only if you ignore the rest of his comment. Reading it in context, it’s clear he means Microsoft is getting farther from fulfilling his needs (which was really the point of his comment). I guess he doesn’t have much love for Windows 8.
No, I didn’t ignore anything. Quite the opposite, actually. I took the whole comment into account. He wrote a bit about his experience with those operating systems, and then a conclusion. Pretty standard practice.
I used Gnome 3 for a few months, and I’ve now been using Unity for almost a year.
I like them both, but I still find search-based application launching to be a nightmare.
Searching for apps to run is pointless if you don’t know their names. And most users won’t care to remember the fancy names of the applications. Menu-based application selection works best because everything is arranged into intuitive categories (e.g. Graphics, Multimedia, Audio, Office, System Tools).
Both Gnome 3 and Unity have an option to find apps based on categories, but they are so awkward to use that I can’t even describe it.
The search doesn’t just work on the short names displayed – it works on a bunch of metadata for the applications, including the longer descriptions. So you don’t actually need to know the application name – e.g. you can get to the file manager by searching either “nautilus” or “file”; likewise, Firefox either by name or as “browser”.
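That metadata lives in each application’s .desktop file, which you can inspect yourself. A quick illustration (the exact fields and path vary by distro and version; the output shown is typical, not authoritative):

    grep -E '^(Name|GenericName|Keywords)=' /usr/share/applications/firefox.desktop
    # Name=Firefox Web Browser
    # GenericName=Web Browser
    # Keywords=Internet;WWW;Browser;Web;Explorer

A search for “browser” matches the GenericName and Keywords fields even though the word appears nowhere in “Firefox”.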
I tend to favour KDE for my desktops and I like XFCE but for low-spec systems I tend to go for LXDE these days. I’m surprised nobody has mentioned it here yet.
Totally agree, I think LXDE is getting really good.
The issues I have with XFCE4 are:
– I don’t like the restricted positioning of icons on the desktop, so I need to replace xfdesktop with pcmanfm, spacefm, or nautilus (see the sketch after this list).
– I really want easy access to times in different places around the world. I still think the Gnome clock applet is brilliant for that, but gsimplecal can be configured to do that too. It integrates well with LXDE, but I haven’t found a non-hackish way to integrate it into XFCE4, and I don’t consider orage’s globaltime to integrate well!
I don’t have these issues with LXDE, and it’s significantly lighter than XFCE4. It’s also getting reasonably ‘polished’, though it can still be slightly raw in places.
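For the desktop-icon workaround mentioned in the list, a minimal sketch (pcmanfm’s --desktop mode is real; check your xfdesktop version’s man page for the quit flag):

    # Stop Xfce’s own desktop so the two don’t fight over the root window
    xfdesktop --quit
    # Hand desktop drawing (wallpaper plus freely positioned icons) to PCManFM
    pcmanfm --desktop &

And as an applet-free stopgap for world times, any shell will do:

    TZ="America/New_York" date
    TZ="Asia/Tokyo" date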
Yeah, having tired of rebuilding my mum’s spyware-ridden Pentium M ultra-portable, I wanted to find a Linux distro with minimal graphical guff, since all she does is browse the web and grab photos off her camera.
I tried out MATE on very minimal Debian and Ubuntu installs, as well as Ubuntu 10.04 (when they still used GNOME 2), and felt underwhelmed by the performance compared to WinXP on that hardware. So I gave Lubuntu a whirl… it was perfect for her after moving some UI elements around (essentially to mimic Windows), and I’ve not had to fix it since.
If you don’t like the changes that happen in Ubuntu, like the xorg.conf removal, the menu.lst change, etcetera, don’t use Mint: being Ubuntu-based, those changes trickle down. Since you are also running low-cost systems and need solid stability and long-term support, I’d say use the latest stable version of Debian with XFCE.
I agree. The author didn’t like some of the Ubuntu changes, but he’s going to inherit them anyway now that he’s using Mint. He’s probably going to find out in a few years that he should have picked Debian or some other distro.
Mint has a Debian-repo-based version, which means you can avoid all the Ubuntu cruft if you really want.
Harold, I would have thought that Linux Mint Debian Edition (XFCE4) would be a really good fit for you. It has the Debian stability, even has continuous upgrades that are managed for you by Linux Mint in a stable way. And it’s obviously not based on Ubuntu.
Sorry, not ‘Harold’ but Howard Fosdick.
I think there is no way in hell that any of us would willingly switch back to xorg.conf. It required restarting X all the time, even after turning a Wacom tablet on, if it was unplugged when X started. The only reason for editing xorg.conf was that there was no alternative. At first the transition to a more automagical Xorg with hardware detection was confusing, but, like the alsa->PulseAudio and static /dev->udev transitions, it was totally worth it. And it’s backward compatible: an xorg.conf is still honored if you provide one.
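Worth adding for anyone who still needs a manual override: modern X servers also merge drop-in snippets from /etc/X11/xorg.conf.d/ (the exact directory can vary slightly by distro), so a single setting can be pinned without resurrecting the monolithic file. A minimal sketch; the output name and mode here are hypothetical:

    sudo tee /etc/X11/xorg.conf.d/10-monitor.conf <<'EOF'
    Section "Monitor"
        Identifier "HDMI-1"
        Option "PreferredMode" "1920x1080"
    EndSection
    EOF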
I agree that switching to Mint solves nothing. There are MATE packages for both Mint and Ubuntu, which solves the author’s desktop manager problem anyway.
It isn’t Linux Mint, it’s OS4 OpenLinux, http://www.os4online.com. OS4 OpenLinux is built for people who want it simple, stable, and quick. The interface is great, based on panel layouts, and very intuitive. All your codecs and other software are installed, so you can listen to music out of the box, watch Flash video out of the box, and even watch digitally downloaded movies out of the box. If you are a power user, it’s flexible enough for you, offering complete access to everything under the hood. It’s used by software developers, and even some kernel developers use OS4 OpenLinux. I used to be a distro hopper, and I came from Linux Mint. But I have been so happy with OS4 OpenLinux since I landed here, you can’t pry it off my hard drive. In my opinion, OS4 OpenLinux is the only distribution that can truly compete with Windows 8 and Mac OS X. The personal support you get (you talk to the development team personally) is unlike anything I have ever seen with other distributions.
Why on earth would anyone ever want to choose a distro based on another distro for a corporate environment? [Or any stable working environment, for that matter.]
The only way to establish a stable and predictable working environment for the end users is to go with the lowest possible level – which is usually Debian.
And now, with Canonical planning to go for Mir and other such things, it’s better than ever to think about switching to some low-level, predictable distro, which is – again – probably Debian.
Why? Well, it’s simple. Arch changes way too fast and unpredictably, breaking some things here and there, and it needs constant maintenance at the lowest level. The SUSEs are quite chaotic and buggy, and the same goes for RHEL and co. [Fedora, anyone?] Slackware is too poor in packages, though it could serve as a good platform for small, embedded projects. Gentoo is way too time-consuming [although it can be used in conjunction with binary packages].
And that’s about it, folks. Here you have it.
The big truth that the clueless Windows 8, KDE 4, Gnome 3, and Unity developers did not understand is:
people want to actually USE the system.
People want to focus on their tasks or their leisure/fun, NOT on your latest brilliant concept that will be wiped away in a few months by a new fad.
No one cares how hard you try to reinvent the wheel; most people will simply (and righteously) be outraged if your efforts make them spend time and brain power thinking ABOUT the system rather than actually using it.
The best user interface is the one you don’t see.
No one wants your brand new shiny toys; no one buys a machine to admire how good you are. For work or for fun, they want to do what they need to do, not what YOU want them to do!
Brilliant!
Kochise
What is KDE 4 doing wrong that Windows 7 is doing right, then?
Considering they pretty much do things the same way.
A user logs in and programs are primarily launched, either from a “start menu” or clicking on desktop icons.
I ran the latest KDE just fine on my 5-year-old computer until half a year ago.
I think KDE is a classic-style desktop with all the modern add-ons. For example, it still has a classic menu if you like one, but on the other hand there is a keyboard launcher where you can launch anything by typing.
I currently use SolydK (a fork of Mint LMDE, created when Mint stopped supporting KDE). My primary reason for going with KDE was better support and control for multiple monitors. I can have different backgrounds on different monitors, for example, without resorting to NVIDIA/ATI config wackiness.
My only gripe right now is that I have a high-resolution 27″ monitor and the fonts look terrible/unreadable at smaller sizes.
I also prefer the Mint “LMDE” editions, which allow for rolling upgrades instead of “backup/install/restore”.
To Ubuntu’s credit, their package management system makes upgrading to a newer release really easy (for simple configs!).
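For reference, the in-place upgrade path being praised boils down to two standard commands on Ubuntu (do-release-upgrade ships with Ubuntu’s release-upgrader tooling):

    sudo apt-get update && sudo apt-get dist-upgrade   # bring the current release fully up to date
    sudo do-release-upgrade                            # then step up to the next release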
If you want a traditional interface these days, your major choices are KDE, XFCE, and LXDE.
Gnome 3 and Unity have gone off in a different direction, and the Windows 8 UI has gone off the deep end. I don’t know why those behind these systems are convinced that one interface can handle desktops, laptops, tablets, and phones. Apple and Google have wisely decided that these are different computer uses that require different operating systems. It will be interesting to see whether 8.1 brings Windows back from the brink, or whether this view that one operating system can handle all kinds of computers is fatally flawed.