Anachrony

2009-06-03

Tux the Homewrecker

Filed under: Rants — halbyrd @ 01:35

or: Why Linux is not Ready for the Desktop

Linux is one of the most high-profile successes of the Free & Open Source Software (FOSS) movement to date. Starting as a hobby project of Linus Torvalds while he was studying at the University of Helsinki in 1991, Linux has evolved into a massive, worldwide collaborative effort and the most widely used UNIX-style operating system in the world. It scales from hand-held PDAs and smartphones all the way up to clustered supercomputers, and as of Q1 2007 it accounted for 12.7% of the worldwide server market[*]. Yet for all its power and flexibility, it still hasn’t managed to make a serious mark on the consumer desktop.

There are several reasons for this, some controllable, some not. Inertia is one of the uncontrollables: people are, if not comfortable, at least familiar with Windows. It’s weird, wonky, and sometimes unreliable, but it’s what comes pre-installed on most machines, and it does most of what the average user needs an OS to do. Inertia by itself is not enough to prevent a switchover, as Apple’s consistently upward sales trend can attest. It is a factor to consider, however, and it amplifies the other issues.

Apple’s success is particularly significant, as adopting OSX requires not just adjusting to a new OS, but buying a new computer to go with it. That would seem to be an even worse position than the one Linux is in, yet OSX offers enough advantages over Windows that many are willing to make the switch. Linux offers many of the same advantages: a more stable system, better performance on most day-to-day tasks, and a more securely designed architecture that avoids much of Windows’s vulnerability to malware. Linux also has a significant cost advantage over OSX: not only do you not have to buy a new computer, you don’t even have to pay for the OS itself, just for the CD or DVD to burn the install image onto.

The fault, then, must lie in large part with Linux itself.  Simply put, there are several things that the Linux community—and the desktop distros in particular—are doing wrong.

Before I continue, note that this is about what Linux is doing wrong for the desktop. What a desktop OS needs to do is different from what a server OS needs to do, which is different from what a device-embedded OS needs to do, and so on. Linux can fill all of these roles and more, but I’m not talking about those other roles today.

Installation

The first problem that springs to mind is getting the system up and running. While getting the OS itself is simple enough, as is installing anything in your distribution’s repositories, there is still no way to install arbitrary third-party programs in a simple, consistent way. OSX is the clear winner in this department, with a standard .DMG disk-image format for delivering a program’s components over the internet and a simple drag-and-drop installation. Windows has had the .MSI format since Windows 2000 was current, and it performs much the same function, but most vendors still ship third-party installers such as InstallShield. Linux is the clear loser here, with no real agreement on what installer package format to use (.deb? .rpm?), whether the files should just be tossed into an archive (.tar.gz? .tar.bz2? .rar?), or whether they should be pre-compiled at all.
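
To make the fragmentation concrete, here is roughly what installing the same program looks like across distro families. The package foo and its file names are placeholders, not a real project:

    # Debian/Ubuntu family: .deb packages
    sudo dpkg -i foo_1.0_i386.deb
    sudo apt-get install -f        # pull in any dependencies dpkg complained about

    # Red Hat/Fedora/SUSE family: .rpm packages
    sudo rpm -ivh foo-1.0.i386.rpm # fails outright if dependencies are missing

    # Everything else: a source tarball and crossed fingers
    tar xzf foo-1.0.tar.gz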

Install procedures can range from the very simple (apt-get install <package name goes here>) to the incredibly obtuse (unpack to source directory, find dependencies, find dependencies’ dependencies, compile, link, find out you missed a needed command-line argument, curse, rant, rave, repeat). Each distribution has been attempting to simplify the install process through online program repositories and package-management systems, but there is no consistent standard for what programs should be included, how frequently the repositories should be updated, or what optional features and supplementary packages should be included.
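
For the obtuse end of that spectrum, the classic from-source ritual runs something like this; foo and libbar are stand-ins for whatever you’re actually building:

    tar xzf foo-1.0.tar.gz && cd foo-1.0
    ./configure                        # dies: libbar not found
    sudo apt-get install libbar-dev    # ...which drags in libbar's own dependencies
    ./configure --with-bar=/usr        # the flag you missed the first time around
    make                               # compile and link
    sudo make install                  # scatter files who-knows-where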

There also isn’t a consistent location in the file system structure to put the programs when they are installed—some go in /bin, some go in /usr/bin, some go in /opt, and some go in other random locations like /usr/share/applications. Windows has been consistently providing C:\Program Files as an agreed-upon spot for years now, and MacOS has had an Applications folder for even longer. This inconsistency in Linux makes even a simple operation like putting a program shortcut on the desktop a laborious chore.
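
You can see the scatter for yourself with a few shell commands; firefox here is just an example, and the exact paths differ between distros:

    type -a firefox                   # where does $PATH actually find it?
    ls /bin /usr/bin /usr/local/bin   # the traditional binary directories
    ls /opt                           # where some vendors drop entire app trees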

Another area of shortfall is program removal. Windows provides the Add/Remove Programs interface, which is pretty much Exactly What It Says on the Tin. OSX doesn’t have anything similar, but seems to get by without one thanks to the self-contained nature of most OSX applications. Programs installed on Linux, however, often spread out across multiple directories, and no consistent removal interface whatsoever is provided for programs outside the distribution’s package repository. For programs within the repository, removal is as simple as installation (apt-get remove <insert package name here>); for anything else, it’s back to the bad old days of hunt-and-delete. Oh, and you’d better be sure what you’re getting rid of isn’t being used elsewhere, or you’re screwed.
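
On a Debian-style system the contrast is stark; the package name foo is again a placeholder:

    # Repository-managed package: one clean command
    dpkg -L foo               # first, list every file the package owns
    sudo apt-get remove foo   # then remove them all in one shot

    # Hand-installed program: hunt down the pieces yourself
    find / -name 'foo*' 2>/dev/null   # and hope nothing else still needs them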

Peripherals

Another major issue is the handling of peripherals. USB HID devices like mice and keyboards are handled gracefully enough in their most basic forms, but many devices are handled in a manner that ranges from inconsistent (USB flash drives) to erratic (multimedia keyboards and other non-standard devices) to outright hostile (graphics cards, displays, sound devices). I have a fairly mundane setup as peripherals go: a keyboard, a multi-button mouse, a jog-wheel for volume control, speakers, a microphone, and a pair of monitors. Yet getting these configured in usable fashion is a struggle every step of the way, from getting Linux to recognize the Back and Forward buttons on the mouse, to setting up a dual-display desktop, to getting 3d acceleration turned on for games—or for the much-vaunted Compiz that so many love to praise as Linux’s killer feature.
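
To give a flavor of what “getting the extra mouse buttons recognized” involves, here is the sort of hand-edited xorg.conf stanza forum posts trade around; the values shown are typical for a 7-button mouse, not a universal recipe:

    Section "InputDevice"
        Identifier "Mouse0"
        Driver     "mouse"
        Option     "Protocol"     "ExplorerPS/2"  # needed to see more than 3 buttons
        Option     "Buttons"      "7"
        Option     "ZAxisMapping" "4 5"           # map the scroll wheel to buttons 4/5
    EndSection

With that in place, buttons 6 and 7 usually reach the browser as Back and Forward; when they don’t, the next stop is xmodmap and another round of trial and error.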

Setting up 3d acceleration and dual-display is particularly gruesome, requiring a descent into the arcana of xorg.conf and related configuration files with manpages and Google at the ready. Dynamic display detection, long since considered standard on both Windows and OSX for laptop users, seems to be beyond the ability of Linux’s X.org graphics engine. 3d acceleration, which is turned on automatically when needed in Windows and on all the time in OSX, is a further hassle, compounded by the slow driver release schedules of both Nvidia and ATI. Even something as basic as changing screen resolution can require more prodding of xorg.conf, as X.org still commonly misinterprets or ignores the EDID information provided by modern displays.
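
As a taste of that arcana, a dual-head stanza for Nvidia’s proprietary driver might look like the following. TwinView is Nvidia-specific (ATI and Intel each have their own incantations), and the resolutions are placeholders:

    Section "Device"
        Identifier "Videocard0"
        Driver     "nvidia"       # proprietary driver; required for 3d acceleration
        Option     "TwinView"     "true"
        Option     "MetaModes"    "1680x1050,1280x1024"
        Option     "TwinViewOrientation" "RightOf"
    EndSection

Afterwards, running glxinfo | grep "direct rendering" tells you whether the 3d half of the exercise actually took.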

Another hardware-related source of grief is the difficulty of getting most WiFi chipsets to function. This is largely due to the unwillingness of Broadcom and other chipset manufacturers to release either working drivers or enough documentation for Linux driver developers to write their own. Some of the blame, however, can be laid at the feet of the Linux networking tools, which in most cases still require poking around in config files even for basics like static IP addresses, network SSIDs, and WPA2 keys.
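
What that poking around looks like on a Debian-style system, for a static address on a WPA2 network; the interface name, addresses, SSID, and passphrase are all placeholders:

    # /etc/network/interfaces
    auto wlan0
    iface wlan0 inet static
        address 192.168.1.42
        netmask 255.255.255.0
        gateway 192.168.1.1
        wpa-conf /etc/wpa_supplicant.conf

    # /etc/wpa_supplicant.conf
    network={
        ssid="HomeNetwork"
        psk="not-my-real-passphrase"
        key_mgmt=WPA-PSK
    }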

USB flash drives and external hard drives are a further pain point in Linux, as there is no agreed-upon default location in the filesystem for these drives to appear once mounted. (The /mnt directory is the conventional suggestion, but each device must be given its own empty subdirectory, created beforehand.) Windows’s habit of assigning a drive letter to each partition makes this a non-issue, as does OSX’s habit of putting a drive icon directly on the desktop. Linux, in its default state, forces the user to set these mount points manually, for each drive, and requires user intervention to even begin the mount process. Recent iterations of the Gnome and KDE desktop environments have attempted to automate this, but neither one yet works consistently and predictably.
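
Spelled out, the manual ritual looks like this; /dev/sdb1 is an assumption, and finding the real device node usually means reading dmesg after plugging the drive in:

    sudo mkdir -p /mnt/usbstick          # the mount point must exist first
    sudo mount /dev/sdb1 /mnt/usbstick   # attach the drive's first partition
    # ...copy files around...
    sudo umount /mnt/usbstick            # detach before unplugging, or risk data loss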

Conclusion

Linux is quite possibly the most versatile OS to date. Its scalability, reliability, and distinctive anyone-can-contribute development model, combined with its already vast popularity in the server world, give it a unique position among operating systems. It has the potential to eventually displace Windows as the OS of choice for the desktop, but for now the issues outlined above remain major stumbling blocks. If future development efforts address them, we may be able to truly declare that Linux is ready for prime time in the home.
