Another big problem is the requirement to use the terminal to do a lot of the program installations and any advanced settings required to get some programs up and running. It's not as simple as Windows, where it is often a double-click on an icon that opens a wizard to guide you through installation graphically. Also, I think there are too many Linux distros out there, and this harms it because now a user has to figure out if a Linux program will actually run on his system or if he has to compile it from the source code!

(40hz sighs and hauls his butt up on his soapbox once again....)
There are a lot of people, Linux users among them, that would agree that there are too many separate distributions. But a lot of the confusion comes from people thinking that each distribution represents a radically different flavor of Linux.
In actuality, despite the number of distributions, there are really only four major distributions: Slackware; Debian; RedHat; and SUSE. Virtually all the other distros use one of the four 'majors' as the base and build from there. In most cases, the only real differences are: what non-free software gets included; what local languages are supported; and what the preferred desktop manager is.
So in a nutshell, there's significantly less there than meets the eye.
As far as compiling packages goes, most Linux users will never need to do that unless they are installing something so arcane or bleeding edge that distro-specific precompiled packages haven't yet been made. But with approximately 18,000-20,000 (and still growing!) packages available in each of the major distro repositories, it's getting a little difficult to find something the average (or even not so average) user would want that isn't already precompiled.
Furthermore, there are working conversion utilities that allow you to use the packages from one distro's repositories with a completely different distro (e.g. RPM to DEB). I've used such converted packages several times when I couldn't be bothered to wait for something to be included in my preferred distro's repository and I didn't feel like compiling it on my own. And to date, I've never experienced a problem with doing so.
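As a sketch of what that conversion looks like in practice, here's an example using the alien utility, which converts between RPM and DEB formats. The package file name below is made up for illustration, and the exact name of the generated .deb will vary:

```shell
# Convert an RPM package to DEB format with alien.
# 'somepkg-1.0-1.x86_64.rpm' is a hypothetical file name for this example.
sudo alien --to-deb somepkg-1.0-1.x86_64.rpm

# alien prints the name of the .deb it generated; install it with dpkg.
sudo dpkg -i somepkg_1.0-2_amd64.deb   # exact output file name will vary
```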
And compiling isn't really such a big deal. Compiling a Linux app is almost always done in four simple steps:

Step 1: Unpacking
Most packages come compressed when you download them. Unpacking a package in Linux is not very different from unzipping a file in Windows: you copy the file to a directory and unpack it. The usual command to do that is called tar (from the archaic 'tape archive') and usually looks like this:

tar -xvzf package.tar.gz
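To see the unpack step work end to end, here's a self-contained sketch that first builds a throwaway archive to extract (the package name is invented for the example):

```shell
# Create a tiny example archive so the extract step can actually be run.
# 'demo-pkg' is a made-up name standing in for a real source package.
mkdir -p demo-pkg
echo "int main(void){return 0;}" > demo-pkg/main.c
tar -czf demo-pkg.tar.gz demo-pkg
rm -r demo-pkg

# Unpack it: x = extract, v = verbose (list files), z = gunzip, f = from file
tar -xvzf demo-pkg.tar.gz
```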
(Sometimes the file extension is bz2, which means the archive was compressed with bzip2 instead of gzip; in that case use tar -xvjf instead of tar -xvzf.)

Step 2: Configure
Configuration is actually a bit of a misnomer. You aren't configuring something so much as you're checking your system to make sure all the dependencies are in place before you start compiling. This is an important step because this process generates values for system-dependent variables that are needed for something called a makefile. You don't need to know what that is, just so long as you know that the configuration step has to be completed successfully before you actually compile your application.
All you need to do to make all this happen is change to the directory that you unpacked your package in:

cd package

and run the supplied configuration script which is included as part of your package:

./configure
Usually it will execute without a problem. The only time you will normally get an error is if the package is damaged (in which case you need to download it again) or you're missing a dependency, which you can almost always install through your distro's normal package manager (e.g. APT, YUM, Synaptic). This is no different from what you run into in Windows when an application informs you that you need to install the .NET Framework, a Visual Basic runtime, a Java runtime, or update a Windows component before it can be installed.

Step 3: Build the binary
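For example, if ./configure stops with a missing-dependency error, the fix is usually one package-manager command away. The library name below is made up; on Debian-based systems the headers configure needs usually live in a '-dev' package:

```shell
# ./configure complains with something like: "error: libfoo not found"
# Install the development package for the missing library, then try again.
sudo apt-get install libfoo-dev   # 'libfoo-dev' is a hypothetical package name
./configure                       # re-run the checks from the top
```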
Once ./configure exits without errors you can actually compile your binary. This is done with the make command:

make

Step 4: Install the app
Once make completes, all that's left to do is install the binary you just created. That's done with:

make install
And that's it. You've just compiled your very own Linux package from source!
Here's a quick review. We'll use a fictional Linux application named w00t that ships as a tar.gz archive for this example:

# tar -xvzf w00t.tar.gz
# cd w00t
# ./configure
# make
# make install
That's it. Not so hard, right? Especially when you consider that you don't really need to know what any of that actually means so long as you follow the steps. And these four steps are the same for about 99.9% of what's out there. Do it once and you know how to do it for almost everything.
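If you want to try the whole sequence risk-free, here's a self-contained sketch that fabricates a trivial "source package" and then runs the four steps on it. Everything about w00t here (its configure script, its Makefile) is invented for demonstration; a real package ships those files already. It installs to /tmp so no root access is needed:

```shell
# --- Fabricate a toy source package (demonstration only) ---
mkdir -p w00t
printf '#!/bin/sh\necho "checking dependencies... ok"\n' > w00t/configure
chmod +x w00t/configure
# Makefile recipes must be indented with tabs, hence printf with \t.
printf 'all:\n\techo built > w00t.bin\ninstall: all\n\tcp w00t.bin /tmp/w00t.bin\n' > w00t/Makefile
tar -czf w00t.tar.gz w00t
rm -r w00t

# --- The four steps ---
tar -xvzf w00t.tar.gz    # Step 1: unpack
cd w00t
./configure              # Step 2: check dependencies
make                     # Step 3: build the binary
make install             # Step 4: install (a real package usually needs sudo here)
```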
I think Linux could take a step forward by doing away with the terminal and simplifying the installation of some programs. I don't think any user short of a programmer should have to touch the terminal.
The Terminal (brass fanfare as the incense rises...) goes right to the heart of what Linux is all about. It's a philosophical as well as a technical issue for many people, especially once they get some experience and discover just how powerful and useful a tool the command line is.
Much like touch-typing, the command line is a skill set that takes some effort to learn. But once you make that effort, there's just no going back.
I can't say much in response to your suggestion to do away with the command line other than to say not to hold your breath. You can take the command prompt away from many Linux users "when you can pry it from their cold dead fingers" to borrow from an old bumper sticker.
So spend a little time getting acquainted with the bash shell and your terminal app. And make a modest effort to learn how to use Vim, or some other basic editor. You'll be amazed how much power you've gained with nothing more invested than your time.
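As a small taste of the power on offer, here's a sketch of the kind of one-liner the shell makes routine: counting the unique words in a file by chaining three standard tools (the file and its contents are invented for the example):

```shell
# Make a sample file (contents made up for the example).
printf 'the quick brown fox\nthe lazy dog\n' > sample.txt

# Split words onto their own lines, keep only unique ones, count them.
tr ' ' '\n' < sample.txt | sort -u | wc -l   # 6 unique words
```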