My post Thank you, Ubuntu attracted some attention (hello, Hacker News!). A couple of commenters were skeptical of my assertion that GNU/Linux is more usable than the major proprietary operating systems— and rightly so, since I didn't really elaborate on it at the time. For example, commenter Dubc wrote:
I think Ubuntu is a fine choice indeed, but you are going to either have to use another word than usability, or redefine the term to meet your presumptions.
So I'd like to expand on a few of the things that make Ubuntu such a pleasure to use, for those who aren't familiar with its ins and outs, especially because I so rarely see people take the time to articulate some of these nifty constructs at a high level.
I will happily grant that at the application level, Windows and Mac OS have a much more consistent base of polished applications. And Apple has an attention to detail and psychology that others would do well to learn from.
It's often said that Apple takes the time to get the little things right. Unfortunately, at times I think they should have instead spent a bit more time thinking through the big things in Mac OS. There are some fairly glaring architectural/design/HCI problems on the Mac (and on Windows, too) that hamper users needlessly. And because these are issues with how one interacts with the system at its most basic levels, no amount of polish on the "little things" can really satisfactorily outweigh them.
What that boils down to is this: yes, you will wrestle with GNU/Linux for a few hours or days or weeks when you get it set up. But if you use Windows or Mac OS, you will wrestle with it every minute of every day you're using it. Forever. And that's just not what "usability" means to me.
Here are some illustrative examples:
apt
An anonymous commenter wrote:
in Mac installing software is a single drag-and-drop function
Oh, if only that were the case. The flow for installing software on Windows/Mac looks something like this (some steps omitted for brevity):
- Open a web browser and search for the software you want.
- Navigate to the vendor's web page, fish around for the download link, and start downloading the software. Wait.
- Find the file you downloaded and open the installer/archive.
- Accept the license agreement.
- Drag the application into your Applications folder (Mac only).
- Answer some setup questions. Wait.
- Delete the installer/archive.
These steps are error-prone, and unsophisticated users will quickly find themselves with malware on their systems. We should help users to do the right thing by default when installing software, and subjecting them to the vagaries of the wide-open web isn't the right thing.
On Ubuntu:
- Open Synaptic and search for the software you want.
- Click install. Wait.
Whatever the heck you want, you can find it in Synaptic. And there's no distinction between software that was preinstalled with your system and software that you choose to install later. It all works the same way and gets security updates in the same way. And it looks like Mac OS is finally starting to head in this direction with the App Store for Mac software.
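For the terminal-inclined, the same two-step flow works from the command line too. A rough sketch, using vlc purely as a stand-in for whatever package you're after:

```
# Search the repositories for what you want
apt-cache search "media player"

# Install it; dependencies are pulled in automatically, and security
# updates arrive through the same mechanism as the rest of your system
sudo apt-get install vlc
```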
But, what's more, each of the thousands of pieces of software available in the Ubuntu (or whatever distro you use) repository has an audit trail, and— unlike the apps you get in the Mac App Store, or any "App Store," really— Ubuntu has the means, motive, and opportunity to do whatever it takes to make them "just work," no matter where and when and for whom they were originally written. There is a vast body of software that is just considered to be part of your operating system and is maintained alongside it, and Ubuntu will vouch for that software just as it will vouch for the software at the core of your system.
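To make that audit trail a bit more concrete, here's roughly what you can ask apt about any package (again using vlc as an example; fetching source assumes the deb-src entries in /etc/apt/sources.list are enabled):

```
# Which repository, component, and version does this package come from?
apt-cache policy vlc

# Fetch the exact source the installed package was built from
apt-get source vlc
```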
It's not clear whether you will ever see that sort of large-scale integration work in a proprietary OS. Package management is really one of the highlights of the free software ecosystem.
Remote access
X11 has pretty much decoupled where a program runs from where it can be used. You have a few basic composable tools— SSH, X forwarding, and port forwarding— and all of a sudden the vast majority of applications can be invoked remotely. So you can, very easily, interact with a program that's using the resources (CPU, filesystem, network bandwidth) of a computer sitting somewhere else. There are a few gotchas with respect to X11 (watch out for high-latency network links, for example), but by and large it works.
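As a sketch, with user@remote-host standing in for whichever machine you're reaching out to (and gedit as an arbitrary graphical program):

```
# Run a graphical program on the remote machine, displayed on your local screen
ssh -X user@remote-host gedit

# Forward local port 8080 to a web server listening on port 80 on the remote machine
ssh -L 8080:localhost:80 user@remote-host
```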
"But wait," you say, "Mac OS has SSH and X11 too." Except that most of the apps you want to use can't be invoked remotely, because they aren't X11 apps. So you have a litany of tricks. VNC and NX are appropriate sometimes, but they're clunky. Mac OS has "Back to My Mac" which is a slightly more general tool (and costs $99/year). But typically, whenever you work remotely you have to work differently too. It's a far cry from the situation on GNU/Linux, where you have a critical mass of applications you can use remotely and they just work, more or less transparently.
The result is that Mac and Windows users are typically highly dependent on a single piece of hardware. If you have a nice light laptop, more power to you. They say the best computer is the one you always have with you, after all. But even better than that is not having to carry a computer around. It's incredibly liberating, not to mention convenient. You can actually go places without having to pack a bag.
On GNU/Linux, one barely even perceives the concept of "other computers." Every computer you use, no matter how far away it is, is so accessible to you, so immediate, that it might as well be the one sitting on your lap right now.
Of course, Microsoft and Apple don't really have any incentive to make remote access terribly effective. Since they generally sell you a license to use a particular instance of their software on a particular computer, it would cut into their bottom line to make remote access actually useful. So they won't.
Window managers
The Windows/Mac window managers (the part of your OS that lets you manipulate windows and switch between them) are pretty bad. They weren't always this way— they have become noticeably clunkier on the large displays that are so common today.
On large displays, maximizing (zooming) windows is not useful much of the time. Maximized windows are just too big. Instead you have to move and resize windows to obtain a useful multi-window arrangement on your screen. But Windows and Mac OS give you very little help in this regard. To produce these arrangements you usually have to click on some tiny button/target and manually move windows around.
For example, moving or resizing windows on Windows or Mac OS typically requires you to point at the titlebar, resize handle, or window border, all of which are small targets just a few pixels across.
Fitts's Law? Bueller? Bueller? Enable alt-dragging to move or resize on any capable X window manager (I use Openbox, but even the Ubuntu default window manager supports this), and the entire window becomes your drag target. The entire window. (Alternatively, if you use a tiling window manager, the whole concept of manually moving windows around becomes basically moot.)
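For the curious, this is roughly what the relevant bindings look like in Openbox's ~/.config/openbox/rc.xml (a sketch; the stock configuration ships something very similar):

```
<mouse>
  <context name="Frame">
    <!-- Alt + left-drag anywhere on a window moves it -->
    <mousebind button="A-Left" action="Drag">
      <action name="Move"/>
    </mousebind>
    <!-- Alt + right-drag anywhere on a window resizes it -->
    <mousebind button="A-Right" action="Drag">
      <action name="Resize"/>
    </mousebind>
  </context>
</mouse>
```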
In fairness to Apple and Microsoft, they have taken some steps in the right direction recently. Windows 7 "docks" windows to one side of the screen when you ask it to. You can configure some recent Macs to let you drag windows with three fingers (or so I've heard). And Mac OS recently gained Spaces (i.e. virtual desktops) too.
Unfortunately, I think these are the exceptions that prove the rule. Mac OS and Windows users have waited for years only to get a fraction of the window management facilities that you could have set up in your X11 window manager. Supposing you spent an hour setting up your WM to your liking, I figure you would earn back that time in improved productivity in just a few weeks, rather than having to wait years for Microsoft or Apple to implement a sensible workflow. But more importantly, your blood pressure would go down, immediately.
People really are more effective when they can set up their computers to work the way they want. But on Windows and Mac OS they aren't given the tools they need to do so. I wouldn't disparage these operating systems so much if they came with window managers that could at least be configured to be minimally adequate. But they don't.
The window manager is the one program you're using all the time, even when you're using other programs. It's your primary vehicle for multitasking and your command center for managing your work on a second-by-second basis. So it's really important to get this right.
Conclusion
Having read this post, you might decide that you would brand these issues not as "usability" issues but as something else. Which is completely fine by me. A rose by any other name, after all…
Many of the cumbersome rituals that Windows and Mac OS have whittled away at over the years are completely unnecessary in Ubuntu. And this is why, while I've had occasion to use a Mac (at various times since 2004 or so) and Windows (only occasionally), sitting down at one always feels like death by a thousand cuts. They're superficially simpler, but really quite tiresome once you get beneath the surface. It seems that they are optimized for learnability at the expense of usability. That is exactly the wrong optimization to make, if you use a computer for a living.
Life is just too short for that.
Now, it's not like Ubuntu is a paragon of usability out of the box, either, though some of the things I mention above do just work by default. The difference— the key difference— is that you can make it into one without much hassle.