When Will Linux Really Go Mainstream?
Despite the Linux and Ubuntu netbook craze, I think there is a simple reason why Linux is not a mainstream solution for most users — yet. Let me explain.
There’s this guy I met a few times at LinuxWorld. At the time, he worked for Palm. I can never seem to remember his name, but I remember this quote from him: “Open Source Software: 80% as good as the last guy who worked on it needed it to be.”
His key point: there’s a difference between the Open Source development process and the commercial one. Commercial entities spend considerable time and money testing software and ensuring that it is relatively easy to use, bug-free, and intuitive. Open Source projects, by contrast, are mostly interested in results and functionality rather than cosmetic tweaks.
And therein lies the reason, in my humble opinion, that Linux has not yet reached critical mass.
Novices Beware?
I have no problems getting around in most any Linux situation, but many non-technical users would have a tough time if they were dumped into a Linux environment.
My favorite example is e-mailing with Outlook users. If the Outlook user is configured to use Microsoft’s RTF format, any e-mail you get from them arrives encoded in a TNEF (winmail.dat) attachment. I used to say this was Microsoft’s fault for using TNEF, which is not a standard file format, but pointing fingers doesn’t make the experience any better for the end user. There are open source tools for decoding TNEF files, but the ones I have run across require you to go through many non-standard steps just to open and extract the contents. A quick search on Freshmeat shows that in most cases you have to use an external program to open these pesky files. In short, opening TNEF e-mails is neither intuitive nor user friendly.
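To make that concrete, here is a minimal sketch of the kind of glue a user ends up writing by hand today. It assumes the open source `tnef` command-line utility is installed (on Ubuntu, from the `tnef` package) and that the attachment has already been saved to disk as winmail.dat; the exact flags may vary by version, so treat this as an illustration rather than a recipe.

```python
#!/usr/bin/env python
"""Unpack a TNEF (winmail.dat) attachment by shelling out to the
open source `tnef` utility. A sketch of the manual steps a user
has to automate today; assumes `tnef` is installed and that the
winmail.dat file has already been saved out of the e-mail."""

import os
import subprocess
import sys


def extract_tnef(winmail_path, out_dir="tnef-extracted"):
    """Extract the contents of a winmail.dat file into out_dir."""
    os.makedirs(out_dir, exist_ok=True)
    # -f names the input file; --directory says where to unpack it.
    result = subprocess.run(
        ["tnef", "--directory", out_dir, "-f", winmail_path],
        capture_output=True, text=True)
    if result.returncode != 0:
        sys.exit("tnef failed: %s" % result.stderr)
    return sorted(os.listdir(out_dir))


if __name__ == "__main__":
    for name in extract_tnef("winmail.dat"):
        print("extracted:", name)
```

Even wrapped up in a script, that is several non-obvious steps that a Thunderbird or Evolution user should never have to know about.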
Another take on the same concept is how applications are installed and configured. If you install an application from a repository (apt, yum, YaST, etc.), the package has been tested and integrated for the distribution it is a part of. That means the basic configuration is done, but it doesn’t necessarily mean the application is configured in a way that is useful to the end user. And if you install an application that is not included in a repository, it may not work properly at all due to inadequate testing, unintuitive configuration options, and sparse documentation.
One great example of this problem is Apache Tomcat. When you install Tomcat in Ubuntu from the apt sources, you get an installed Tomcat server, but it’s hardly functional. There’s no user account defined for the manager interface, there are no pointers to tell you where to drop your application archives so they get loaded, and on and on. I’ve gotten to the point where it’s easier to download the package straight from Apache than to use the version in the repositories. This could easily be addressed with a post-install script that lets the user enter some sane defaults for the various options. That script should be as simple and intuitive as possible, so the user doesn’t need ten years of experience to understand it.
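A post-install step like the one I’m proposing doesn’t have to be complicated. Here is a rough sketch of what it could ask and generate; the file paths and the `manager` role name match a typical Ubuntu tomcat6 layout but should be treated as assumptions, and a real Debian package would do this through debconf rather than a standalone script.

```python
#!/usr/bin/env python
"""Sketch of a friendlier post-install step for Tomcat: ask a
couple of plain-English questions, then write a minimal
tomcat-users.xml so the manager interface works out of the box.
Run as root; paths and role name are assumptions, not gospel."""

import getpass
from xml.sax.saxutils import quoteattr

TOMCAT_USERS = "/etc/tomcat6/tomcat-users.xml"  # assumed location

TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<tomcat-users>
  <role rolename="manager"/>
  <user username=%s password=%s roles="manager"/>
</tomcat-users>
"""


def main():
    print("Tomcat needs an administrator account for its web-based")
    print("manager interface. Choose a username and password now.")
    user = input("Admin username [admin]: ").strip() or "admin"
    password = getpass.getpass("Admin password: ")

    # quoteattr() XML-escapes and quotes the values for us.
    with open(TOMCAT_USERS, "w") as f:
        f.write(TEMPLATE % (quoteattr(user), quoteattr(password)))

    print("Done. Deploy applications by copying .war files into")
    print("/var/lib/tomcat6/webapps (they are picked up automatically).")


if __name__ == "__main__":
    main()
```

Ten lines of questions at install time would save a new user an hour of digging through /usr/share/doc, and that is exactly the kind of last-mile polish I’m talking about.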
Don’t Kill the Messenger
I don’t want to sound like I’m beating up on all developers. Some Open Source apps do provide these sorts of configuration options; MySQL’s Ubuntu package, for example, prompts you for a root password during installation. The point of this article is that we should all spend a little more time trying to disprove the 80/20 rule by finishing the last mile of the application: the polish and the quality assurance. That extra effort is what will make users more comfortable with Open Source in general, and with Linux and Ubuntu in particular. We, as Linux users and developers, need to step back from our projects once in a while and ask ourselves what average people doing average things would think of our software. If we want Linux to be the dominant operating system, then quality assurance and intuitive user interfaces are where the battle will be won.
I don’t like to state a problem and just let it go at that, so let me propose some ideas about what the solution might be.
Quality assurance certainly is a challenge. As developers and technical people, we are bad at testing software because we know not to put nonsensical input into places where it doesn’t belong. So how do you find a sufficient test-bed of people to do alpha/beta testing and QA? I propose that project sites add a new sign-up option for QA and testing volunteers. Many people want to contribute to the Open Source community but do not have programming or other technical skills. These people are ideal candidates for testing and QA precisely because they do NOT know which inputs and settings will break the application (and so don’t subconsciously avoid them). That, then, would be our call to service: ask end users to volunteer some time to tell us, the developers, what we can do to make their experience better.
Finally, to answer the question set forth in the title of this article: When will Linux really go mainstream? When we start following through on our projects with user testing and quality assurance.
Contributing blogger Deven Phillips is a senior systems administrator and software engineer for a major manufacturing company based in Louisville, KY. He has used Linux since 1997, and Ubuntu in both desktop and server settings since 2006.