A modern computer has many layers of software and hardware between the user and the transistor gates that actually implement the machine. Let's take a quick plunge through a simple action, and see how it passes through the system:
A user has opened the Windows calculator, and has so far entered '5', '+', '3' on the keyboard. They now press the '=' key.
1. An electrical switch closes under the equals key on the keyboard, causing a brief surge of current as an electrical wire changes its voltage potential.
2. The keyboard's small processor, scanning through each row of keys hundreds of times a second, detects the change in voltage levels on its input wires. It decodes the key from the row and column pressed, and sends data describing the event to the main computer.
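To make the scanning concrete, here is a minimal C sketch of how a keyboard controller's firmware might walk the key matrix. The hardware hooks (drive_row, read_columns, send_to_host) and the scan-code layout are hypothetical, and a real controller would also debounce the switches:

#include <stdint.h>

#define NUM_ROWS 8
#define NUM_COLS 16

/* Hypothetical hardware hooks -- on a real controller these would
   read and write I/O ports. */
extern void     drive_row(int row);         /* energize one row line   */
extern uint16_t read_columns(void);         /* sample the column lines */
extern void     send_to_host(uint8_t scancode);

static uint16_t prev[NUM_ROWS];             /* last sample of each row */

/* Scan the whole matrix once; called hundreds of times per second. */
void scan_matrix(void)
{
    for (int row = 0; row < NUM_ROWS; row++) {
        drive_row(row);
        uint16_t cols = read_columns();
        uint16_t changed = cols ^ prev[row];

        for (int col = 0; col < NUM_COLS; col++) {
            if ((changed >> col) & 1u) {
                /* Encode the key as row*columns+column, with the top
                   bit marking a release, and report it to the computer. */
                int pressed = (cols >> col) & 1;
                uint8_t code = (uint8_t)(row * NUM_COLS + col);
                send_to_host(pressed ? code : (uint8_t)(code | 0x80));
            }
        }
        prev[row] = cols;
    }
}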
3. The computer's interrupt controller, a system for picking up and handling external events, receives the alert from the keyboard. Assuming no higher-priority tasks are executing in the computer, the controller issues an interrupt to the control unit, telling it to immediately switch tasks and pay attention to the keyboard.
4. The control unit consults an interrupt vector table stored in nearby memory cells, and issues commands to fetch the next program bytes from the memory location listed for a keyboard interrupt.
5. The interrupt routine, a small piece of independent code, is loaded into the control unit, which begins its execution. The code fetches the keypress information, and places it in a queue data structure in memory for the rest of the operating system. The routine then ends, and the control unit switches back to whatever it was doing before the interrupt.
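Steps 3 through 5 can be sketched in C. The vector table is just an array of function pointers indexed by interrupt number, and the keyboard routine does as little as possible: it grabs the scan code and pushes it onto a queue for the operating system to drain later. The vector number and read_scancode() are invented for illustration:

#include <stdint.h>

#define NUM_VECTORS  256
#define IRQ_KEYBOARD 0x09       /* hypothetical vector number */
#define QUEUE_SIZE   64

typedef void (*isr_t)(void);
static isr_t vector_table[NUM_VECTORS];    /* the interrupt vector table */

/* Queue filled by the interrupt routine, drained by the OS later. */
static volatile uint8_t key_queue[QUEUE_SIZE];
static volatile unsigned head, tail;

extern uint8_t read_scancode(void);        /* hypothetical hardware read */

/* The keyboard interrupt routine: fetch the keypress, queue it, return. */
static void keyboard_isr(void)
{
    uint8_t code = read_scancode();
    unsigned next = (head + 1) % QUEUE_SIZE;
    if (next != tail) {                    /* drop the key if queue is full */
        key_queue[head] = code;
        head = next;
    }
}

/* What the control unit effectively does when the interrupt controller
   signals it: look up the vector and run the routine listed there. */
void dispatch_interrupt(unsigned vector)
{
    if (vector < NUM_VECTORS && vector_table[vector])
        vector_table[vector]();
}

void init_interrupts(void)
{
    vector_table[IRQ_KEYBOARD] = keyboard_isr;
}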
6. Shortly afterward in machine time, and practically instantaneously in human time, Windows executes a basic, low-level routine that checks for new user input. Seeing the keypress, the routine determines which running program is the recipient, and places a keypress event on that program's event queue. Each of these actions takes hundreds of machine-code steps.
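The OS side of this step can be sketched the same way: drain the driver's queue and append each keypress to the event queue of whichever program has keyboard focus. Everything here (the structures, focused_program, driver_pop_key) is a made-up simplification of what Windows actually does:

#include <stdint.h>

#define EVENTS_MAX 64

typedef struct program {
    uint8_t  events[EVENTS_MAX];   /* this program's event queue */
    unsigned head, tail;
} program_t;

extern program_t *focused_program(void);       /* who has keyboard focus?   */
extern int        driver_pop_key(uint8_t *c);  /* 0 when driver queue empty */

/* Low-level routine: move pending keypresses to the recipient program. */
void route_input_events(void)
{
    uint8_t code;
    while (driver_pop_key(&code)) {
        program_t *p = focused_program();
        unsigned next = (p->head + 1) % EVENTS_MAX;
        if (next != p->tail) {                 /* drop if the queue is full */
            p->events[p->head] = code;
            p->head = next;
        }
    }
}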
7. The Windows scheduler, the master controller of which program is allowed to use processing time, passes control to the calculator program.
8. The calculator program's main loop sees the new keypress event, and reads it in. Its internal program logic determines the significance of pressing the equals key, and it begins to process the actual calculation. It may be interrupted at any point by the scheduler to allow more critical routines to execute (such as interrupt routines).
9. In C code, the actual calculation might look like this:
... case EQUALS: result = op1 + op2; break; ...
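In context, that fragment would sit inside the program's main loop. Here is a self-contained sketch of the whole loop, where get_next_event() is a hypothetical stand-in for the real Windows message pump, and the event types are invented; this version simply replays the user's keystrokes:

#include <stdio.h>

typedef enum { DIGIT, PLUS, EQUALS, QUIT } key_kind;

typedef struct {
    key_kind kind;
    int      digit;                 /* valid only when kind == DIGIT */
} event_t;

/* Hypothetical stand-in for the real message pump: replay the user's
   keystrokes '5', '+', '3', '=' and then quit. */
static int get_next_event(event_t *ev)
{
    static const event_t script[] = {
        { DIGIT, 5 }, { PLUS, 0 }, { DIGIT, 3 }, { EQUALS, 0 }, { QUIT, 0 }
    };
    static unsigned i = 0;
    if (i >= sizeof script / sizeof script[0]) return 0;
    *ev = script[i++];
    return 1;
}

int main(void)
{
    long op1 = 0, op2 = 0;
    long *current = &op1;           /* which operand is being entered */
    long result = 0;
    event_t ev;

    while (get_next_event(&ev)) {   /* the calculator's main loop */
        switch (ev.kind) {
        case DIGIT:
            *current = *current * 10 + ev.digit;
            break;
        case PLUS:
            current = &op2;         /* start entering the second operand */
            break;
        case EQUALS:
            result = op1 + op2;     /* the actual calculation */
            printf("%ld\n", result);
            break;
        case QUIT:
            return 0;
        }
    }
    return 0;
}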
10. The calculator program, having been compiled into machine code long before, runs through the computer's control unit step by step. Finally, the calculation itself is reached. The computer must transfer two values from the main memory into an execution unit, which then performs the addition. The result must then be moved back into memory. In assembly language, the code might look like this:
MOV AX, [4B24h]
MOV BX, [4B28h]
ADD AX, BX
MOV [4B2Ch], AX
which, to an assembly programmer, is not hard to read; to most people, it is complete gibberish. The bracketed numbers are the memory locations of the values used in the calculation, written in hexadecimal, or base 16. The MOV instruction tells the computer to move a piece of data from one place to another, and the ADD instruction performs the actual addition.
11. The machine code equivalent for the ADD for an Intel 8086-compatible processor would be 0000 0011 11 000 011, or in hexadecimal notation, 03C3. The control unit in the CPU reads in this value, and interprets it as an addition of two values currently stored in some of the CPU's registers. The control unit sends out signals that transfer the values in the registers to the execution unit, and then signals to the unit to begin addition.
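That field layout can be checked with a few lines of C: 0x03 is the opcode for "ADD r16, r/m16", and the second (ModRM) byte packs mod=11 for a register operand, reg=000 for AX, and r/m=011 for BX:

#include <stdio.h>
#include <stdint.h>

/* Pack the ModRM byte: mod (2 bits), reg (3 bits), r/m (3 bits). */
static uint8_t modrm(unsigned mod, unsigned reg, unsigned rm)
{
    return (uint8_t)((mod << 6) | (reg << 3) | rm);
}

int main(void)
{
    enum { AX = 0, CX = 1, DX = 2, BX = 3 };  /* 8086 register numbers */

    uint8_t opcode   = 0x03;                  /* ADD r16, r/m16 */
    uint8_t operands = modrm(3, AX, BX);      /* mod=11: both in registers */

    printf("%02X%02X\n", opcode, operands);   /* prints 03C3 */
    return 0;
}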
12. The execution unit contains an adder circuit, a fairly standard piece of electronics. It routes the two values to the inputs of the adder, and waits a clock phase or two.
13. Inside the adder circuit, the binary digits, all low/high voltage signals, race through transistors and wires. A simple adder repeats the same one-bit circuit for each pair of digits, with two single-digit inputs, a carry in, a carry out, and a sum output. The outputs (sum, carry out) are determined by the inputs (A, B, carry in) according to the following table:
A B Carry In | Carry Out Sum
0 0    0     |     0      0
0 0    1     |     0      1
0 1    0     |     0      1
0 1    1     |     1      0
1 0    0     |     0      1
1 0    1     |     1      0
1 1    0     |     1      0
1 1    1     |     1      1
Using Boolean logic, the above can be described as
Sum = ((A XOR B) XOR Carry In)
Carry Out = ((A AND B) OR (A AND Carry In) OR (B AND Carry In))
The circuitry can be implemented in several ways, depending on the underlying technology.
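Those two Boolean equations translate directly into C. Here is a minimal sketch that models one full adder bit and chains sixteen of them into a ripple-carry adder, the simplest of those implementations:

#include <stdio.h>
#include <stdint.h>

/* One full adder bit: implements the truth table above.
   Sum = (A XOR B) XOR Cin; Cout = AB OR A*Cin OR B*Cin. */
static void full_adder(int a, int b, int cin, int *sum, int *cout)
{
    *sum  = (a ^ b) ^ cin;
    *cout = (a & b) | (a & cin) | (b & cin);
}

/* Chain 16 one-bit adders: each carry out feeds the next carry in. */
static uint16_t ripple_add(uint16_t x, uint16_t y)
{
    uint16_t result = 0;
    int carry = 0;

    for (int i = 0; i < 16; i++) {
        int sum;
        full_adder((x >> i) & 1, (y >> i) & 1, carry, &sum, &carry);
        result |= (uint16_t)sum << i;
    }
    return result;
}

int main(void)
{
    printf("%u\n", (unsigned)ripple_add(5, 3));   /* prints 8 */
    return 0;
}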
14. At the lowest level, we have a single transistor, one of typically millions on a single CPU. Most digital logic is built with the CMOS process, which uses MOSFET transistors. Here, a P-MOS transistor that is currently conducting receives a high voltage on its gate contact. The gate voltage closes off the conducting channel between the source and drain, switching the transistor off. A short while later, the voltage drops again, turning the transistor back on. Each switch eats up a tiny bit of current.
Multiplied millionfold, the result is a modern CPU, requiring constant active cooling to keep the chip from frying.
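That per-switch cost is usually summarized by the dynamic power rule of thumb P = a*C*V^2*f: activity factor times switched capacitance times supply voltage squared times clock frequency. A back-of-the-envelope sketch with made-up but plausible numbers:

#include <stdio.h>

int main(void)
{
    /* Illustrative numbers only -- not measurements of any real chip. */
    double alpha = 0.1;      /* fraction of gates switching per cycle */
    double c     = 50e-9;    /* total switched capacitance, farads    */
    double v     = 1.2;      /* supply voltage, volts                 */
    double f     = 2e9;      /* clock frequency, hertz                */

    double power = alpha * c * v * v * f;    /* P = a*C*V^2*f */
    printf("dynamic power: %.1f W\n", power);   /* about 14.4 W */
    return 0;
}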
15. Now the result of the calculation percolates upward through the layers of hardware and software, until finally, the user sees some glowing phosphors on their screen, displaying an '8'. Quite a bit of work for a simple result.
It should be clear from the above that even a simple task in a modern computer is quite complex below the surface, requiring hundreds of actions to work together in unison. A single failure anywhere along the path, and the system breaks. Sometimes it is surprising that computers work as well as they do.
Computers are used for millions of applications; they are in everything from toothbrushes to million-dollar factory machinery. And ever-increasingly, they are connected together, allowing information to flow in unprecedented ways. It's hard to say where it'll all lead.
Modern research on computing focuses on several fields. First, many researchers are actively trying to shrink the transistors on silicon chips smaller and smaller. Commercial companies are currently pursuing 0.09-micron technology, where the smallest feature that can be created is 0.09 microns wide. A second field of research is quantum computing, where computations are performed in ways radically different from current digital systems. Computer science research is pushing forward on dozens of fronts, making computers more complex, more powerful, and hopefully more useful every day.
A hundred years ago, computing machines were piles of gears and sprockets, and looked like complex typewriters. Now they rely on results from quantum mechanics (Flash ROM, for example) in their operation, and perform billions of calculations per second.