I dream of rain
I dream of gardens in the desert sand
I wake in pain
I dream of love as time runs through my hand
I dream of fire
Those dreams are tied to a horse that will never tire
And in the flames
Her shadows play in the shape of a man's desire
This desert rose
Each of her veils, a secret promise
This desert flower
No sweet perfume ever tortured me more than this
And as she turns
This way she moves in the logic of all my dreams
This fire burns
I realize that nothing's as it seems
I dream of rain
I dream of gardens in the desert sand
I wake in pain
I dream of love as time runs through my hand
I dream of rain
I lift my gaze to empty skies above
I close my eyes
This rare perfume is the sweet intoxication of her love
I dream of rain
I dream of gardens in the desert sand
I wake in pain
I dream of love as time runs through my hand
Sweet desert rose
Each of her veils, a secret promise
This desert flower
No sweet perfume ever tortured me more than this
Sweet desert rose
This memory of Eden haunts us all
This desert flower
This rare perfume, is the sweet intoxication of the love
"Desert Rose" is a single by Sting from his album Brand New Day (1999). The song peaked at #15 on the UK Singles Chart and #17 on the US Billboard Hot 100.
Thursday, August 27, 2009
Wednesday, August 26, 2009
Prime Ministers of INDIA
PT. JAWAHARLAL NEHRU
The first Prime Minister of independent India, he also holds the record for the longest continuous stay in office (Aug. 15, 1947 to May 27, 1964): about 17 years, or 6,131 days, to be precise!
Born in Allahabad, November 14, 1889; died on May 27, 1964. A prolific writer ("Discovery of India") and a great orator, he was a co-founder of the Non-Aligned Movement. Conferred Bharat Ratna (1955).
GULZARI LAL NANDA
A Gandhian to the core, he was interim Prime Minister on two occasions: after the sudden demise of Pt. Nehru (from May 27, 1964 to June 9, 1964) and again after the sudden demise of Lal Bahadur Shastri (from January 11 to 24, 1966), each time for a period of 14 days.
Born in Sialkot (now in Pakistan), July 4, 1898; died on January 15, 1998, as a centenarian. Was a leading light in the labour movement. Conferred Bharat Ratna (1997).
LAL BAHADUR SHASTRI
Though he came to power in the shadow of the ever-famous Pt. Nehru, he quickly won the hearts of all Indians, especially after routing Pakistan in the 1965 war. India still reverberates with his slogan "Jai Jawan! Jai Kisan!". He held office for 582 days, from June 9, 1964 to January 11, 1966.
Born in Moghalsarai (Varanasi), October 2, 1904; died in Tashkent (USSR) on January 11, 1966, while negotiating peace talks with Pakistan. Conferred Bharat Ratna (posthumous, 1966).
INDIRA GANDHI
She proved her mettle as a chip off the old block; she held office from Jan. 24, 1966 to March 24, 1977 and again from Jan. 14, 1980 to Oct. 31, 1984, for a total of 5,831 days, just 300 days short of her father, Pt. Nehru. The abolition of the privy purses, the nationalization of banks, and the birth of Bangladesh are some of her achievements.
Born in Allahabad, Nov. 19, 1917; died on Oct. 31, 1984. Conferred Bharat Ratna in 1971.
MORARJI DESAI
The first non-Congress Prime Minister since Independence, Morarji Desai headed the Janata Party Government for 857 days, from March 24, 1977 to July 28, 1979. He was one of those rare Gandhians. Born on February 29, 1896, in Bahadesli Village, Gujarat; a centenarian, he passed away on April 10, 1995. Conferred Bharat Ratna in 1991.
CHARAN SINGH
A "kisan" Prime Minister from Uttar Pradesh, he never faced Parliament, even though he was in office for 171 days, from July 28, 1979 to Jan. 14, 1980. Within days of his assuming office, one of the supporting parties withdrew its support, after which he continued as caretaker P.M. till fresh elections were held.
Born in Noorpur, Meerut Dist., U.P., on Dec. 23, 1902; died in 1987.
RAJIV GANDHI
The youngest Prime Minister so far, this grandson of Pt. Nehru assumed office at the age of 40, the day his mother was assassinated (Oct. 31, 1984). In the general elections that followed, he led his party to a big win and continued as P.M. till Dec. 1, 1989, when his party lost at the hustings.
Born in Bombay (now Mumbai) on Aug. 20, 1944; died in Sriperumbudur on May 21, 1991, at the hands of a human bomb. Conferred Bharat Ratna (posthumous, 1991).
V.P. SINGH
Riding the wave of the Janata Dal, V. P. Singh became Prime Minister on Dec. 2, 1989 and continued till Nov. 10, 1990 (344 days). His major decision was the implementation of the Mandal Commission recommendations.
Born in Allahabad on June 25, 1931.
CHANDRA SHEKHAR
The "Young Turk" became Prime Minister on Nov. 10, 1990 and continued till June 21, 1991 (224 days).
Born in Ibrahimpatti Village, Ballia District, U.P., on July 1, 1927.
P. V. NARASIMHA RAO
The first person from the South ever to hold this office, he ruled for a full five years, from June 21, 1991 to May 10, 1996 (1,785 days), despite an initially hung Parliament.
A litterateur, Pamulaparti Venkata Narasimha Rao was born in Vangara, Karimnagar District, Andhra Pradesh, on June 28, 1921. He launched the Economic Liberalisation Programme.
A. B. VAJPAYEE
The first "bachelor" Prime Minister, he was at the helm for 16 days from May 16 to June 1, 1996, creating a record for the shortest stint in office.
Atal Bihari Vajpayee was born in Gwalior (M.P.) on December 25, 1926. A prolific writer and a great orator.
H D DEVE GOWDA
The second Government of the 11th Lok Sabha was headed by Haradanahalli Dodde Deve Gowda, who was in office from June 1, 1996 to April 21, 1997 (for roughly 11 months).
Deve Gowda was born on May 18, 1933, in Haradanahalli Village, Hassan District, Karnataka.
I. K. GUJRAL
Inder Kumar Gujral became Prime Minister on April 21, 1997. He resigned office on Nov. 28, 1997.
Gujral was born on December 4, 1919, in Jhelum (in the undivided Punjab, now in Pakistan).
A. B. VAJPAYEE
He assumed charge for the second time on March 19, 1998, and took the oath again on October 13, 1999, after the General Election. He submitted his resignation on May 13, 2004.
DR. MANMOHAN SINGH
A globally renowned economist and former Finance Minister, Dr. Manmohan Singh assumed charge on May 22, 2004.
Tuesday, August 25, 2009
Quotes of the Day..........
6. To be useful, a system has to do more than just correctly perform some task.[John McDermott]
7. There are many methods for predicting the future. For example, you can read horoscopes, tea leaves, tarot cards, or crystal balls. Collectively, these are known as 'nutty methods.' Or you can put well-researched facts into sophisticated computer models, more commonly referred to as "a complete waste of time."[Scott Adams]
8. Statistics can be made to prove anything - even the truth.[Anonymous]
9. Speech is the representation of the mind, and writing is the representation of speech.[Aristotle]
10. We achieve more than we know. We know more than we understand. We understand more than we can explain.[Claude Bernard]
Monday, August 24, 2009
Computers - From steam power to quantum tunneling in 150 years - Basic Computing Methodology
A modern computer has a multitude of layers of software and hardware between the user and the actual transistor gates which implement the computer. Let's take a quick plunge through a simple action, and see how it passes through the system:
A user has opened the Windows calculator and has so far entered '5', '+', '3' on the keyboard. They now press the '=' key.
1. An electrical switch closes under the equals key on the keyboard, causing a brief surge of current as an electrical wire changes its voltage potential.
2. The keyboard's small processor, scanning through each row of keys hundreds of times a second, detects the change in voltage levels in its input wires. It decodes the key based on the row and column pressed. Data describing the event is sent to the main computer.
3. The computer's interrupt controller, a system for picking up and handling external events, receives the alert from the keyboard. Assuming no higher-priority tasks are executing in the computer, the controller issues an interrupt to the control unit, telling it to immediately switch tasks and pay attention to the keyboard.
4. The control unit consults an internal interrupt vector table, all stored in nearby memory cells, and issues commands to fetch the next program bytes from the memory location listed for a keyboard interrupt.
5. The interrupt routine, a small piece of independent code, is loaded into the control unit, which begins its execution. The code fetches the keypress information, and places it in a queue data structure in memory for the rest of the operating system. The routine then ends, and the control unit switches back to whatever it was doing before the interrupt.
6. Shortly in machine time, and practically instantaneously in human time, Windows executes a basic, low-level routine that checks for new user inputs. Seeing the keypress, the routine determines which executing program is the recipient, and places a keypress event on that program's event queue. Each of these actions takes hundreds of machine-code steps.
7. The Windows scheduler, the master controller of which program is allowed to use processing time, passes control to the calculator program.
8. The calculator program's main loop sees the new keypress event, and reads it in. Its internal program logic determines the significance of pressing the equals key, and it begins to process the actual calculation. It may be interrupted at any point by the scheduler to allow more critical routines to execute (such as interrupt routines).
9. In C-code, the actual calculation might look like this:
...
case EQUALS:
    result = op1 + op2;
    break;
...
10. The calculator program, having been compiled into machine code long before, runs through the computer's control unit step by step. Finally, the calculation itself is reached. The computer must transfer two values from the main memory into an execution unit, which then performs the addition. The result must then be moved back into memory. In assembly language, the code might look like this:
MOV DX, 00014B24
MOV AX, @DX
MOV DX, 00014B28
MOV BX, @DX
ADD AX, BX
MOV DX, 00014B2C
MOV @DX, AX
which, to an assembly programmer, is not hard to read; to most people, it is complete gibberish. The long numbers preceded by zeros are the memory addresses of the values used in the calculation, written in hexadecimal (base 16). The MOV command instructs the computer to move a piece of data from one place to another; the ADD instruction actually performs the addition.
11. The machine code equivalent for the ADD for an Intel 8086-compatible processor would be 0000 0011 11 000 011, or in hexadecimal notation, 03C3. The control unit in the CPU reads in this value, and interprets it as an addition of two values currently stored in some of the CPU's registers. The control unit sends out signals that transfer the values in the registers to the execution unit, and then signals to the unit to begin addition.
12. The execution unit contains an adder circuit, a fairly standard piece of electronics. It routes the two values to the inputs of the adder, and waits a clock phase or two.
13. Inside the adder circuit, the binary digits, all low/high voltage signals, race through transistors and wires. A simple adder circuit has the same circuit per pair of digits, with two single-digit inputs, a carry in, a carry out, and the sum output. The outputs (sum, carry out) are determined by the inputs (a, b, carry in) according to the following table:
A B Carry In | Carry Out Sum
0 0 0        |     0      0
0 0 1        |     0      1
0 1 0        |     0      1
0 1 1        |     1      0
1 0 0        |     0      1
1 0 1        |     1      0
1 1 0        |     1      0
1 1 1        |     1      1
Using Boolean logic, the above can be described as
Sum = ((A XOR B) XOR Carry In)
Carry Out = ((A AND B) OR (A AND Carry In) OR (B AND Carry In))
The circuitry can be implemented in several ways, depending on the underlying technology.
14. At the lowest level, we have a single transistor, one of typically millions on a single CPU. Most digital logic is built with the CMOS process, which uses MOSFET transistors. Here, a P-MOS transistor that is currently on receives a high voltage at its gate contact. The gate voltage causes the conducting channel between the source and drain to close off, switching the transistor off. A short while later, the voltage drops again, turning the transistor back on. This switching eats up a tiny bit of current each time.
Multiplied millionfold, the result is a modern CPU, requiring constant active cooling to keep the chip from frying.
15. Now the result of the calculation percolates upward through the layers of hardware and software, until finally, the user sees some glowing phosphors on their screen, displaying an '8'. Quite a bit of work for a simple result.
It should be clear from the above that even a simple task in a modern computer is quite complex below the surface, requiring hundreds of actions to work together in unison. A single failure anywhere along the path, and the system breaks. Sometimes it is surprising that computers work as well as they do.
Computers are used for millions of applications; they are in everything from toothbrushes to million-dollar factory machinery. And ever-increasingly, they are connected together, allowing information to flow in unprecedented ways. It's hard to say where it'll all lead.
Modern research on computing focuses on several fields. First, many researchers are actively trying to shrink transistors on silicon chips ever smaller. Commercial companies are currently aiming for 0.09-micron technology, where the smallest feature that can be created is 0.09 microns wide. A second field of research is quantum computers, where computations are performed in ways radically different from current digital systems. Computer science research is pushing forward on dozens of fronts, making computers more complex, more powerful, and, hopefully, more useful daily.
A hundred years ago, computing machines were piles of gears and sprockets, and looked like complex typewriters. Now they rely on results from quantum mechanics (Flash ROM, for example) in their operation, and perform billions of calculations per second.
Computers - From steam power to quantum tunneling in 150 years - Modern Computer Traits
All modern computers share the traits listed below:
* Binary Operation: All computers of any complexity have been based on Boolean logic, where there are only two values: True and False, easily represented in electronic circuitry as two different voltages. A base-two system is therefore a natural platform for a computer built on switches, operating with Boolean logic.
* A Memory: The memory is used to store information that the computer is processing, or to archive data that might be accessed later. Computer memory has ranged from hard-wired switches, to punch cards, drum memory, bubble memory, floppy disks, hard drives, optical disks, solid-state memory, and experimental media such as holographs and quantum dots. They allow the storage of arbitrary binary numbers, some with easy rewritability, some without. Capacities, access speeds, and portability vary widely across the different formats.
* Execution Units: These take binary values from the memory, and operate on them, returning the result into the memory. In modern systems, several execution units are placed on each CPU chip. The units are often specialized, some working with integer values, others with real numbers represented in binary. Recent advances include SIMD units (such as the Altivec), pipelined execution, and many other features to increase speed.
* A Program: A program is a sequence of instructions for the computer, telling it what to do with the data stored in its memory. At the lowest level, a program is encoded in machine code, direct binary values that each encode a single action for the computer, such as reading a value from memory, or adding two values together. As computers have evolved in complexity, high-level programming languages (such as C, Ada, and LISP) have been created to simplify the creation of complex programs. These languages hide the lower-level details from the programmer, using compilers to automatically convert abstract program code into the computer's native machine code.
For small, embedded computer systems, programmers sometimes still work in assembly languages, which use simple English mnemonics for each machine code instruction.
* A Control Unit: The control unit takes in the program, reads in the machine code values (or bytes), and then operates the memory and execution units accordingly. The control is often a state machine of varying complexity. Separating the control and the execution units makes designing computers much easier, since the control only needs to tell the execution units 'do this', and the execution units do not need to deal with the details of machine code interpretation. The control and execution units, and sometimes limited amounts of memory, typically sit on a single silicon chip, known as the Central Processing Unit, or CPU.
* A Clock: Almost all modern computers run off very precise quartz crystal clocks, with speeds currently measured in megahertz and, recently, gigahertz. The clock makes the sequencing of operations in a computer simpler, guaranteeing valid results from one stage of computation to the next. However, synchronous designs, as clocked digital computers are called, are not the only modern method. Asynchronous VLSI techniques are also being investigated.
* A User Interface: The ability to compute would be of little use if there were no way to input new data or read out the results. User interfaces for computers have also evolved a great deal over the last fifty years, from blinking rows of lights showing memory values, to modern monitors, keyboards, and mice. Additionally, computers are now typically connected to others through computer networks (such as the Internet, of course), allowing for widespread information exchange.
* Binary Operation: All computers of any complexity have been based on Boolean logic, where there are only two values: True and False, easily represented in electronic circuitry as two different voltages. A base-two system is therefore a natural platform for a computer built on switches, operating with Boolean logic.
* A Memory: The memory is used to store information that the computer is processing, or to archive data that might be accessed later. Computer memory has ranged from hard-wired switches, to punch cards, drum memory, bubble memory, floppy disks, hard drives, optical disks, solid-state memory, and experimental media such as holographs and quantum dots. They allow the storage of arbitrary binary numbers, some with easy rewritability, some without. Capacities, access speeds, and portability vary widely across the different formats.
* Execution Units: These take binary values from the memory, and operate on them, returning the result into the memory. In modern systems, several execution units are placed on the each CPU chip. The units are often specialized, some working with integer values, others with real numbers represented in binary. Recent advances include SIMD units (such as the Altivec), pipelined execution, and many other features to increase speed.
* A Program: A program is a sequence of instructions for the computer, telling it what to do with the data stored in its memory. At the lowest level, a program is encoded in machine code, direct binary values that each encode a single action for the computer, such as reading a value from memory, or adding two values together. As computers have evolved in complexity, high-level programming languages (such as C, Ada, and LISP) have been created to simplify the creation of complex programs. These languages hide the lower level details from the programmer,
using compilers to automatically convert abstract program code into the computer's native machine code.
For small, embedded computer systems, programmers sometimes still work in assembly languages, which use simple English mnemonics for each machine-code instruction.
* A Control Unit: The control unit takes in the program, reads in the machine code values (or bytes), and then operates the memory and execution units accordingly. The control unit is often a state machine of varying complexity. Separating the control and the execution units makes designing computers much easier, since the control only needs to tell the execution units 'do this', and the execution units do not need to deal with the details of machine code interpretation. The control and execution units, and sometimes limited amounts of memory, are typically
on a single silicon chip, known as the Central Processing Unit, or CPU.
* A Clock: Almost all modern computers run off very precise quartz crystal clocks, with speeds currently measured in megahertz and, more recently, gigahertz. The clock makes the sequencing of operations in a computer simpler, guaranteeing valid results from one stage of computation to the next. However, synchronous designs, as clocked digital computers are called, are not the only modern method. Asynchronous VLSI techniques are also being investigated.
* A User Interface: The ability to compute would be of little use if there was no way to input new data, or read out the results. User interfaces for computers have also evolved a great deal over the last fifty years, from blinking rows of lights showing memory values, to modern monitors, keyboards, and mice. Additionally, computers are now typically connected to others through computer networks (such as the Internet, of course), allowing for widespread information exchange.
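The relationship between high-level code and machine-code mnemonics described under "A Program" can be seen directly in Python: its standard `dis` module lists the mnemonic instructions the interpreter executes for a function. Python bytecode is not native machine code, but the layering is the same idea (the exact instruction names vary between Python versions):

```python
import dis

# A one-line high-level operation...
def add(a, b):
    return a + b

# ...compiles down to a sequence of mnemonic instructions,
# each encoding a single low-level action.
for instr in dis.get_instructions(add):
    print(instr.opname, instr.argrepr)
```

Running this prints opcodes such as a LOAD of each argument followed by an add and a return, mirroring how an assembler listing maps one mnemonic to one machine instruction.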
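Several of the pieces above — a memory, an execution unit, a program of encoded instructions, and a control unit stepping through them on each clock tick — can be sketched together as a toy machine. The accumulator-style instruction set here is invented purely for illustration:

```python
def run(program, memory):
    """Interpret a list of (opcode, operand) tuples against a memory list."""
    acc = 0   # a single accumulator register in the "execution unit"
    pc = 0    # the control unit's program counter
    while pc < len(program):
        op, *args = program[pc]    # fetch and decode the next instruction
        if op == "LOAD":           # copy memory[addr] into the accumulator
            acc = memory[args[0]]
        elif op == "ADD":          # add memory[addr] to the accumulator
            acc += memory[args[0]]
        elif op == "STORE":        # write the accumulator back to memory
            memory[args[0]] = acc
        elif op == "HALT":
            break
        pc += 1                    # one "clock tick": advance to the next instruction
    return memory

# Compute 3 + 4 and store the result in memory cell 2.
print(run([("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT",)], [3, 4, 0]))  # [3, 4, 7]
```

The point of the sketch is the separation of concerns noted above: the loop (control) only dispatches opcodes, while each branch (execution) does one simple operation on the memory.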
Computers - From steam power to quantum tunneling in 150 years - Evolution
A history and description of computers is immediately faced with the task of determining what truly constitutes a computer. A basic definition from typical web sources gives:
com·put·er n.
1. A device that computes, especially a programmable electronic machine that performs high-speed mathematical or logical operations or that assembles, stores, correlates, or otherwise processes information.
2. One who computes.
This node concerns itself with definition 1, a broad category of machinery: mechanical, electrical, whatever. While most modern computers are purely electrical devices, the history of computers is steeped in mechanical
contraptions of ever-increasing complexity. A full history would be excessive here, and is well documented in many places over the web. A brief synopsis, therefore, of that long history is presented here:
1. ~3000 BCE: The abacus is invented, apparently in Babylonia. It is simple to construct, yet allows addition and multiplication to be done with relatively high speed.
2. 1623 CE: Wilhelm Schickard develops a calculating clock to help with multiplying large numbers, the first true mechanical calculating machine. No copies of the machine survive today.
3. 1801 CE: Joseph-Marie Jacquard constructs an automatic loom, using punch cards (making them the longest enduring computer data storage format), a precursor to today's robotic assembly lines. The loom could create intricate woven patterns based on sequences of punch cards. Riots later erupt over such machinery, blamed for replacing people with machines.
4. 1823 CE: Charles Babbage is given a grant by the English government to develop a full Difference Engine, a steam-powered polynomial evaluator. Babbage planned to use the Engine to recalculate critical mathematical and navigational tables. Ada Byron, Countess of Lovelace, corresponded with Babbage over the design constantly. Ten years later, Babbage conceived of the Analytical Engine, a true modern computer.
It was to have a memory store and an execution unit, it would operate on formulas (in modern usage, computer programs), and it would be able to calculate any expression. Countess Lovelace documented the design thoroughly, and became the first known programmer when she developed formulas for the planned machine.
Unfortunately, the design was far too ambitious for the technology of the era, and an Analytical Engine was never built. In the end, neither was the Difference Engine; the half-built machine now resides in the Science Museum in London.
Babbage is considered to be the inventor, if not the implementor, of the true computer.
5. 1854 CE: George Boole publishes "An Investigation of the Laws of Thought", developing Boolean logic, a system of logical algebra. Boole created the algebra to evaluate the truth of logical propositions; his system is now the basis of every digital computer.
6. 1939-1944 CE: Konrad Zuse, a German engineer, completes the Z2, a machine using electromechanical relays and Boolean logic, but otherwise very similar in basic design to Babbage's Analytical Engine. Conscripted into the army, he led a team in designing the Z3, which had a 64-number memory, with each number being 22 binary digits (bits) long. The machine could perform a multiplication in a few seconds. The Z3 was finished in 1941,
and was the first-ever working general-purpose programmable computer. Unfortunately, it was destroyed in an air raid on Berlin in 1945. Meanwhile, research in the United States followed similar lines: Howard Aiken constructed the Mark I, using electromechanical relays, with speeds similar to the Z machines. John Atanasoff and Clifford Berry designed, and partially constructed, a fully electronic machine based on electronic valves,
but the Atanasoff-Berry Computer (or ABC) was never completed.
7. 1943 CE: The Colossus is constructed to help break the German Lorenz teleprinter cipher. Built by a group at Bletchley Park, England, where Alan Turing, one of the pioneers of computer science, also worked, the Colossus can be considered the first completely electronic computer, even though it could only perform its specialized code-breaking task.
8. 1945 CE: The ENIAC is completed under a U.S. Government grant. Designed by J. Presper Eckert and John Mauchly, it was the first working fully electronic general-purpose computer. It was also quite large, measuring roughly 100 by 10 by 3 feet, and weighed roughly 30 tons. It could perform a multiplication in less than 3 milliseconds.
9. 1954 CE: The TRADIC becomes the first computer to use transistors as a replacement for valves, setting the stage for the rapid growth in computer complexity predicted by Moore's Law.
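Babbage's Difference Engine (item 4 above) mechanized the method of finite differences, which tabulates a polynomial using only repeated addition — no multiplication at all, which is what made a geared implementation feasible. A minimal sketch of the idea, with helper names of my own choosing:

```python
def difference_table(values, order):
    """Build the leading value and its successive differences from samples."""
    row = list(values)
    table = [row[0]]
    for _ in range(order):
        row = [b - a for a, b in zip(row, row[1:])]  # next difference row
        table.append(row[0])
    return table

def tabulate(table, count):
    """Generate `count` polynomial values using only repeated addition."""
    table = list(table)
    out = []
    for _ in range(count):
        out.append(table[0])
        # Add each difference into the entry above it,
        # like turning the engine's crank once.
        for i in range(len(table) - 1):
            table[i] += table[i + 1]
    return out

# Tabulate p(x) = x**2 + x + 1: a degree-2 polynomial needs 3 seed values.
seed = [x * x + x + 1 for x in range(3)]
print(tabulate(difference_table(seed, 2), 6))  # [1, 3, 7, 13, 21, 31]
```

Once the seed column is set up, every further value of p(x) falls out of additions alone — the calculation Babbage hoped to use for error-free mathematical and navigational tables.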
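Boole's algebra (item 5 above) can be demonstrated by evaluating a proposition under every assignment of truth values — exactly what a digital circuit does, with the two truth values carried as two voltage levels. A small sketch, with the helper name invented for illustration:

```python
from itertools import product

def truth_table(prop, arity):
    """Return (assignment, result) rows for a Boolean function of `arity` inputs."""
    return [(vals, prop(*vals)) for vals in product([False, True], repeat=arity)]

# Example proposition: (A and B) or (not A), i.e. the implication A -> B.
for vals, result in truth_table(lambda a, b: (a and b) or (not a), 2):
    print(vals, result)
```

The table shows the proposition is false only when A is true and B is false — the classical behaviour of implication, and the kind of exhaustive case analysis Boole's algebra makes mechanical.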
Sunday, August 23, 2009
Quotes of the day...
1. There are three kinds of intelligence: one kind understands things for itself, the other appreciates what others can understand, the third understands neither for itself nor through others. The first kind is excellent, the second good, and the third kind useless.[Niccolo Machiavelli]
2. It's not that I am so smart, it's just that I stay with problems longer.[Albert Einstein]
3. Failure is the opportunity to begin again more intelligently.[Moshe Arens]
4. In general we are least aware of what our minds do best.[Marvin Minsky]
5. Nature cares nothing for logic, our human logic: she has her own, which we do not recognize and do not acknowledge until we are crushed under its wheel.[Ivan Turgenev]