
The command line is almost 50 years old, but it’s not outdated. Text-based terminals are still the best way to accomplish many tasks, even in the age of graphical desktops and touch-screen gadgets.

In fact, the command line is becoming more respected than ever with Microsoft creating a powerful new Windows Terminal application. Windows 10’s PowerShell environment is surprisingly powerful, but Microsoft still went out of its way to add support for basically the full Linux command-line environment to Windows 10.

The Command Line Was Once the Only Option

At one time, if you wanted to interact with a computer, you typed. That was it. There was nothing else. That might sound restrictive and archaic, but as a step up from punched cards and perforated paper tapes, typing was radical and transformative. And migrating from teletypewriters with their rolls of paper to terminals with cathode ray tube (CRT) screens was another seismic shift in human-computer interaction.

That step paved the way for the interactive shell to really come into its own. Now you could send instructions to the computer and very quickly have responses displayed on your screen. No more clack-clack-clack as you waited for your paper printout to clatter its way out of your teletypewriter.

Fair enough, but that was then and this is now. Computing is a whole different ball game. Apart from the obvious locked-in cases, such as using a computer that doesn’t have a graphical desktop environment installed, using a remote computer via SSH over a low-bandwidth connection, or controlling a headless or embedded system, why use the command line over a graphical desktop?

Jargon Explained

Terms like command line, terminal window, and shell are used almost interchangeably by some people. That’s sloppy usage. They’re related, but they’re not the same thing.

A terminal window is a window in a graphical desktop environment that runs an emulation of a teletype terminal.

The shell is the program that runs inside the terminal window. It takes your input and, depending on what you typed, tries to interpret and execute the instructions itself, pass them to some of the other utilities that make up the operating system, or find a script or program that matches what you have typed.

RELATED: What's the Difference Between Bash, Zsh, and Other Linux Shells?

The command line is where you type. It is the prompt that the shell presents when it is waiting for you to enter some instructions. The term “command line” is also used to refer to the actual content of what you typed. For example, if you talk to some other computer user about a difficulty you had trying to get a program to run, they may ask you, “What command line did you use?” They’re not asking what shell you were using; they want to know what command you typed.

Altogether, these combine to form the command line interface (CLI).

Why Use the Command Line in 2019?

The CLI can seem retrograde and confusing to those who are not familiar with it. Surely there’s no place in a modern operating system for such a dated and geeky way of using a computer? Didn’t we give all that up decades ago when windows, icons, and mice appeared and graphical desktop environments with graphical user interfaces (GUIs) became available?

Yes, the GUI has been around for decades. The first version of Microsoft Windows was released way back in 1985 and became the PC desktop norm with the release of Windows 3.0 in 1990.

The X Window System, used in Unix and Linux, was introduced in 1984. This brought graphical desktop environments to Unix and its many derivatives, clones, and off-shoots.

But the release of Unix pre-dates these events by more than a decade. And because there was no other option, everything had to be possible via the command line. All human interaction, all configuration, every use of the computer had to be performable via the humble keyboard.

So, ipso facto, the CLI can do everything. A GUI still cannot do everything the CLI can do. And even for the parts that it can do, the CLI is usually faster, more flexible, can be scripted, and is scalable.

And there’s a standard.

They’re Standardized Thanks to POSIX

POSIX is a standard for Unix-like operating systems—basically, everything that’s not Windows. And even Windows has the Windows Subsystem for Linux (WSL). Open a terminal window on any POSIX-compliant (or close to compliant) operating system, and you’ll find yourself in a shell. Even if the shell or distribution provides its own extensions and enhancements, as long as it provides the core POSIX functionality you’ll be able to use it straight away. And your scripts will run.

The command line is the lowest common denominator. Learn how to use it and, regardless of Linux distribution and graphical desktop environment, you’ll be able to carry out all the tasks you need. Different desktops have their own way of doing things. Different Linux distributions bundle various utilities and programs.

But open a terminal window, and you’ll feel at home.
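As a small illustration of that portability, here’s a sketch of a script that sticks to POSIX shell constructs, so it should behave identically under dash, bash, ksh, or any other POSIX-compliant shell (the function name is our own invention):

```shell
# count_files: count the regular files in a directory, using only
# POSIX shell constructs (no bashisms), so it runs unchanged under
# dash, bash, ksh, and other POSIX-compliant shells.
count_files() {
    dir="${1:-.}"              # default to the current directory
    count=0
    for entry in "$dir"/*; do
        [ -f "$entry" ] && count=$((count + 1))
    done
    printf '%s\n' "$count"
}

count_files /etc               # prints how many regular files are in /etc
```

Nothing here depends on a particular distribution or desktop; that is the point of the standard.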

Commands Are Designed to Work Together

Each Linux command is designed to do one particular thing and to do it well. The underlying design philosophy is to add functionality by adding another utility that can be piped or chained together with the existing ones to achieve the desired outcome.

This is so useful that Microsoft went out of its way to add support for the full Linux command line to Windows 10!

For example, the sort command is used by other commands to sort text into alphabetical order. There’s no need to build sorting capability into each of the other Linux commands. Generally, GUI applications do not allow this type of collaborative interworking.

Look at the following example. It uses the ls command to list the files in the current directory. The results are piped into the sort command, which sorts them numerically on the fifth column of data (the file size). The sorted list is then piped into the head command, which by default lists the first ten lines of its input.

ls -l | sort -nk5,5 | head

We get a neat listing of the smallest files in the current directory.

listing of the ten smallest files in the current directory

By changing one command—using tail instead of head—we can get a list of the ten largest files in the current directory.

ls -l | sort -nk5,5 | tail

This gives us our list of the ten largest files, as expected.

list of the ten largest files in the current directory

The output from commands can be redirected and captured in files. The regular output (stdout) and error messages (stderr) can be captured separately.
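A quick sketch of separate capture (the filenames are arbitrary, and one of the paths is deliberately missing so that ls writes to both streams):

```shell
# Capture stdout and stderr in separate files. /etc exists; the second
# path doesn't, so ls produces both normal output and an error message.
# (|| true: ls exits non-zero when a path is missing.)
ls /etc /no/such/dir > listing.txt 2> errors.txt || true

# Or capture both streams together in a single file:
ls /etc /no/such/dir > everything.txt 2>&1 || true
```

After running this, listing.txt holds the directory listing and errors.txt holds only the “No such file or directory” complaint.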

RELATED: What Are stdin, stdout, and stderr on Linux?

Commands can include environment variables. The following command will list the contents of your home directory:

ls $HOME

This works from wherever you happen to be in the directory tree.
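Other environment variables work the same way, and you can set your own for the current session (PROJECTS below is our own invention, not a standard variable):

```shell
# Environment variables are expanded by the shell before the command runs,
# so these work from anywhere in the directory tree.
echo "$HOME"               # your home directory
echo "$PATH"               # the directories searched for commands

# Define your own variable for the current session:
PROJECTS="$HOME/projects"
echo "$PROJECTS"
```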

listing of home directory in the terminal window

If the idea of all that typing still fazes you, techniques like tab completion can reduce the amount of typing you have to do.

Scripts Enable Automation and Repeatability

Humans are prone to errors.

Scripts allow you to standardize on a set of instructions that you know are going to be executed the same way every time the script is run. This brings consistency to system maintenance. Safety checks can be built into the scripts that allow the script to determine whether it should proceed. This removes the need for the user to have sufficient knowledge to make the decision themselves.
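As a sketch of such a safety check (the function name and paths are purely illustrative), a backup script might refuse to overwrite an existing backup rather than trusting the user to remember whether one exists:

```shell
# safe_backup: copy a file to a backup location, but refuse to
# overwrite an existing backup. The script makes the decision,
# so the person running it doesn't have to.
safe_backup() {
    src="$1"
    dest="$2"
    if [ -e "$dest" ]; then
        echo "Refusing to overwrite existing backup: $dest" >&2
        return 1
    fi
    cp "$src" "$dest"
}

demo_dir=$(mktemp -d)
safe_backup /etc/hosts "$demo_dir/hosts.backup"   # succeeds the first time
# Running the same command again prints the warning and changes nothing.
```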

Because you can automate tasks by using cron on Linux and other Unix-like systems, long, complicated, and repetitive tasks can be simplified or, at least, figured out once and then automated for the future.
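For instance, a couple of crontab entries (the script paths here are invented for illustration) are all it takes to run those jobs unattended:

```
# Edit your personal schedule with: crontab -e
# minute hour day-of-month month day-of-week  command
30 2 * * *   /home/dave/bin/nightly-backup.sh     # 02:30 every night
0  9 * * 1   /home/dave/bin/weekly-report.sh      # 09:00 every Monday
```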

PowerShell scripts offer similar power on Windows, and you can schedule them to run from the Task Scheduler. Why click 50 different options every time you set up a computer when you could run a command that automatically changes everything?

The Best of Both Worlds

To get the best out of Linux—or any operating system as a power user—you really need to use the CLI and the GUI.

The GUI is unsurpassed for using applications. Even die-hard command line advocates have to come out of the terminal window and use office productivity suites, development environments, and graphical manipulation programs now and again.

Command line addicts don’t hate the GUI. They just favor the benefits of using the CLI—for the appropriate tasks. For administration, the CLI wins hands down. You can use the CLI to make changes to one file, one directory, a selection of files and directories, or the entire system with an equal amount of effort. Doing the same with a GUI often requires long-winded and repetitive keyboard and mouse actions as the number of affected objects rises.
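For example, changing file permissions scales with almost no extra typing (the directory layout below is created purely for the demonstration):

```shell
# Demo setup: a small tree of text files (purely illustrative).
mkdir -p demo/reports/archive
touch demo/notes.txt demo/reports/q1.txt demo/reports/archive/q0.txt

chmod 600 demo/notes.txt                      # one file
chmod 600 demo/reports/*.txt                  # every .txt in one directory
find demo -name '*.txt' -exec chmod 600 {} +  # every .txt in the whole tree
```

Each step broadens the scope, yet the command barely changes; with a GUI, each broader scope means more clicking.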

The command line gives you the highest fidelity. Every option of every command is available to you. And a lot of the Linux commands have many options. To take just one example, consider the lsof command. Take a look at its man page and then consider how you’d wrap that into a GUI.

There are too many options to present to the user in an effective GUI. It would be overwhelming, unattractive, and clunky to use. And that’s the complete opposite of what a GUI aims to be.

It’s horses for courses. Don’t shy away from the CLI horse. It is often the faster and more agile steed. Earn your spurs, and you’ll never regret it.

Dave McKay
Dave McKay first used computers when punched paper tape was in vogue, and he has been programming ever since. After over 30 years in the IT industry, he is now a full-time technology journalist. During his career, he has worked as a freelance programmer, manager of an international software development team, and an IT services project manager, and, most recently, as a Data Protection Officer. Dave is a Linux evangelist and open source advocate.
Chris Hoffman
Chris Hoffman is Editor-in-Chief of How-To Geek. He's written about technology for over a decade and was a PCWorld columnist for two years. Chris has written for The New York Times and Reader's Digest, been interviewed as a technology expert on TV stations like Miami's NBC 6, and had his work covered by news outlets like the BBC. Since 2011, Chris has written over 2,000 articles that have been read more than one billion times---and that's just here at How-To Geek.