To begin operating a computer, you find its power switch, turn it on, and then what?
What do you type? What do you do? How will the computer respond?
The answers to those questions depend on which operating system your computer uses.
Most IBM clones use an operating system called MS-DOS, supplemented by Windows (which lets you more easily use a mouse). Mac computers use a different operating system instead, called the Mac System. This book explains how to use all three: MS-DOS, Windows, and the Mac System.
Other kinds of computers use different operating systems instead.
Three kinds of user interface
How do you give commands to the computer? The answer depends on what kind of user interface the operating system uses. Three kinds of user interface have been invented.
Command-driven In a command-driven interface, you give commands to the computer by typing the commands on the keyboard.
For example, MS-DOS uses a command-driven interface. To command MS-DOS to copy a file, you sit at the keyboard, type the word ``copy'', then type the details about which file you want to copy and which disk you want to copy it to. To command MS-DOS to erase a file so the file is deleted, you type the word ``erase'' or ``del'', then type the name of the file you want to delete.
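For example, suppose your hard disk contains a file named REPORT.TXT (that name's just an example) and you want to copy it onto the floppy disk in drive A, then erase the original. You'd type these two commands:
copy report.txt a:
del report.txt
The first command copies the file onto the floppy in drive A; the second command deletes the original from the hard disk.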
Menu-driven In a menu-driven interface, you act as if you were in a restaurant ordering food from a menu: you give orders to the computer by choosing your order from a menu that appears on the screen.
For example, ProDOS (an operating system used on some Apple 2 computers) has a menu-driven interface. When you start using ProDOS, the screen shows a menu that begins like this:
1. Copy files
2. Delete files
If you want to copy a file, press the ``1'' key on the keyboard. If you want to delete a file instead, press the ``2'' key. Afterwards, the computer lets you choose which file to copy or delete.
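If you're curious how a programmer builds that kind of menu, here's a rough sketch written in the Python programming language. (The sketch is just an illustration I made up; ProDOS itself isn't written in Python.)

# A made-up sketch of a menu-driven interface (not real ProDOS code).
# The program keeps showing a numbered menu, reads a choice, then acts on it.
def run_menu():
    while True:
        print("1. Copy files")
        print("2. Delete files")
        print("3. Quit")
        choice = input("Your choice? ")
        if choice == "1":
            print("(Here the program would ask which file to copy, then copy it.)")
        elif choice == "2":
            print("(Here the program would ask which file to delete, then delete it.)")
        elif choice == "3":
            break
        else:
            print("Please type 1, 2, or 3.")

run_menu()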
Icon-driven In an icon-driven interface, the screen shows lots of cute little pictures; each little picture is called an icon. To give orders to the computer, you point at one of the icons by using a mouse, then use the mouse to make the icon move or disappear or turn black or otherwise change appearance.
For example, the Mac's operating system (which is called the Mac System) has an icon-driven interface. When you turn the Mac on, the screen gets filled with lots of little icons.
If you want to copy a file from the Mac's hard disk to a floppy disk, just use the mouse! Point at the icon (picture) that represents the file, then drag the file's icon to the floppy disk's icon. Dragging the file's icon to the floppy's icon makes the computer copy the file itself to the floppy itself.
One of the icons on the screen is a picture of a trash can. To delete a file, drag the file's icon to the trash-can icon. When you finish, the trash can will bulge, which means the file's been thrown away. (The Mac doesn't permanently erase the file until you empty the trash.)
Multiuser systems
Our country is run by monsters! Big monster computers run our government, banks, insurance companies, utility companies, airlines, and railroads. To handle so many people and tasks simultaneously, those computers use advanced operating systems. Here's how they arose. . . .
Back in the 1950's, the only kind of operating system was single-user: it handled just one person at a time. If two people wanted to use the computer, the second person had to stand in line behind the first person until the first finished.
The first improvement over single-user operating systems was batch processing. In a batch-processing system, the second person didn't have to stand in line to use the computer. Instead, he fed his program onto the computer's disk (or other kind of memory) and walked away. The computer ran it automatically when the first person's program finished. That procedure was called batch processing because the computer could store a whole batch of programs on the disk and run them in order.
While running your program, the CPU often waits for computer devices to catch up. For example, if your program makes the printer print, the CPU waits for the printer to finish. While the CPU waits for the printer (or another slow device), you should let the CPU temporarily work on the next guy's program. That's called multiprogramming, because the CPU switches its attention among several programs.
In a simple multiprogramming system, the CPU follows this strategy: it begins working on the first guy's program; but when that program makes the CPU wait for a slow device, the CPU starts working on the second program. When the second program makes the CPU wait also, the CPU switches its attention to the third program, etc. But the first program always has top priority: as soon as that first program can continue (because the printer finished), the CPU resumes work on that program and puts all other programs on hold.
Suppose one guy's program requires an hour of computer time, but another guy's program requires just one minute. If the guy with the hour-long program is mean and insists on going first, the other guy must wait an hour to run the one-minute program. An improved operating system can ``psyche out'' the situation and help the second guy without waiting for the first guy to finish. Here's how the operating system works. . . .
A jiffy is a sixtieth of a second. During the first jiffy, the CPU works on the first guy's program. During the next jiffy, the CPU works on the second guy's program. During the third jiffy, the CPU works on a third guy's program, and so on, until each program has received a jiffy. Then, like a card dealer, the CPU ``deals'' a second jiffy to each program, then deals a third jiffy, etc. If one of the programs requires little CPU time, it will finish after being dealt just a few jiffies and ``drop out'' of the game, without waiting for all the other players to finish.
In that scheme, each jiffy is called a time slice. Since the computer deals time slices as if dealing to a circle of card players, the technique's called round-robin time-slicing.
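If you'd like to see the dealing spelled out, here's a rough simulation of round-robin time-slicing, written in the Python programming language. (The two programs and their jiffy counts are made-up examples; a real operating system does the dealing inside itself, not in Python.)

# A made-up simulation of round-robin time-slicing.
# Each program needs a certain number of jiffies of CPU time (60 jiffies = 1 second).
remaining = {"hour-long program": 60 * 60 * 60,   # needs 1 hour of CPU time
             "one-minute program": 60 * 60}       # needs 1 minute of CPU time

jiffies_elapsed = 0
while remaining:
    for name in list(remaining):      # deal one jiffy to each program still in the game
        remaining[name] -= 1          # that program gets one jiffy of CPU time
        jiffies_elapsed += 1
        if remaining[name] == 0:      # the program finished, so it drops out of the game
            print(name, "finished after", jiffies_elapsed, "jiffies")
            del remaining[name]

Run that sketch, and the one-minute program finishes after 7,200 jiffies (about 2 minutes of real time), even though the hour-long program ``went first''; the hour-long program finishes after 219,600 jiffies (about 61 minutes), just a minute later than if it had the computer all to itself.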
To make that technique practical, attach the computer to many terminals, so each guy has his own terminal. The CPU goes round and round, switching its attention from terminal to terminal every jiffy.
If you sit at a terminal, a few jiffies later the CPU gets to your terminal, gives you its full attention for a jiffy, then ignores you for several jiffies while it handles the other users, then comes back to you again. Since jiffies are quick, you don't notice that the CPU ignores you for several jiffies.
That technique's an example of timesharing, which is defined as ``an operating system creating the illusion that the CPU gives you its full attention continuously''.
In that system, if your program needs to use the printer, the CPU sends some data out to the printer but then immediately moves on to the next person, without waiting for the printer to catch up, and without giving you a full jiffy of attention. After the CPU's given the other people their jiffies, the computer returns to you again and checks whether the printer has finished your job yet.
While the CPU works on a particular guy, the state of that guy's program is stored in the CPU and RAM. When that guy's jiffy ends, the CPU typically copies that guy's state onto the disk, then copies the next guy's state from disk to the CPU and RAM. So every time the CPU switches from one guy to the next, the CPU must typically do lots of disk I/O (unless the CPU's RAM is large enough to hold both guys' programs simultaneously). Such disk I/O is ``bureaucratic overhead'' consuming lots of time. To reduce that overhead, switch guys less often. Here's how to make the CPU switch guys less often but still switch fast enough to maintain each guy's illusion of getting continuous attention.
Suppose a guy's a ``CPU hog'': he's running a program that won't finish for several hours. Instead of giving him many short time slices, the CPU should act more efficiently by totally ignoring him for several hours (which will make everybody else in the computer room cheer