A Brief History of The Computer
Computers and computer applications are part of almost every aspect of our daily lives. As
with many ordinary objects around us, we may need a clearer understanding of what they are.
You may ask, "What is a computer?", "What is software?", or "What is a programming language?"
First, let's examine the history.
1. The history of computers begins about 2,000 years ago in Babylonia (Mesopotamia),
at the birth of the abacus, a wooden rack holding two horizontal wires with beads
strung on them.
2. Blaise Pascal is usually credited with building the first digital computer, in 1642. It added
numbers entered with dials and was made to help his father, a tax collector. The basic
principle of his calculator is still used today in water meters and modern-day odometers.
Instead of having a carriage wheel turn the gear, he made each ten-toothed wheel able to be
turned directly by a person's hand (later inventors added keys and a crank), with the result
that when the wheels were turned in the proper sequence, a series of numbers was entered
and a cumulative sum was obtained (a principle sketched in code below). The gear train
supplied a mechanical answer equal to the answer obtained by arithmetic.
This first mechanical calculator, called the Pascaline, had several disadvantages.
Although it offered a substantial improvement over manual calculation, only Pascal
himself could repair the device, and it cost more than the people it replaced! In addition,
the first signs of technophobia emerged, with mathematicians fearing the loss of their
jobs to progress.
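The odometer-style carry that Pascal's machine performed mechanically is easy to simulate.
The following Python sketch illustrates the principle only, not the actual mechanism; the
Pascaline class name and the six-wheel default are invented for the example.

    # Illustrative sketch of the Pascaline's carrying principle: a row of
    # ten-toothed decimal wheels, where a full revolution of one wheel
    # advances the wheel to its left by one tooth (as in an odometer).

    class Pascaline:
        def __init__(self, wheels=6):
            # One decimal digit per wheel; digits[0] is the ones wheel.
            self.digits = [0] * wheels

        def add(self, number):
            """Enter a number on the dials, one decimal place per wheel."""
            position = 0
            while number > 0:
                self._turn_wheel(position, number % 10)
                number //= 10
                position += 1

        def _turn_wheel(self, position, teeth):
            total = self.digits[position] + teeth
            self.digits[position] = total % 10
            # A full revolution nudges the next wheel: the mechanical carry.
            if total >= 10 and position + 1 < len(self.digits):
                self._turn_wheel(position + 1, 1)

        def value(self):
            return int("".join(str(d) for d in reversed(self.digits)))

    machine = Pascaline()
    machine.add(738)
    machine.add(485)
    print(machine.value())  # 1223 -- a cumulative sum, as on the device

Each call to add turns the wheels for one operand, so repeated calls accumulate a running
total, just as repeated dialing did on the real machine.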
3. A step toward automated computing was the development of punched cards, which
were first successfully used with computers in 1890 by Herman Hollerith and James
Powers, who worked for the U.S. Census Bureau. They developed devices that could read
the information punched into the cards automatically, without human help. As a result,
reading errors were reduced dramatically, work flow increased, and, most importantly,
stacks of punched cards could be used as an easily accessible memory of almost
unlimited size. Furthermore, different problems could be stored on different stacks of
cards and accessed when needed.
4. These advantages were seen by commercial companies and soon led to the
development of improved computers using punched cards, created by International
Business Machines (IBM), Remington (yes, the same people that make shavers),
Burroughs, and other corporations. These computers used electromechanical devices in
which electrical power provided mechanical motion -- like turning the wheels of an
adding machine. Such systems included features to:
o feed in a specified number of cards automatically
o add, multiply, and sort
o feed out cards with punched results
5. The start of World War II produced a large need for computing capacity, especially for
the military. New weapons were made for which trajectory tables and other essential
data were needed. In 1942, John P. Eckert, John W. Mauchly, and their associates at the
Moore School of Electrical Engineering at the University of Pennsylvania decided to build a
high-speed electronic computer to do the job. This machine became known
as ENIAC (Electronic Numerical Integrator And Computer).
[Photo: Two men in uniform being trained to maintain the ENIAC; the two women in the
photo were programmers. ENIAC occupied an entire thirty-by-fifty-foot room.]
6. The size of ENIAC's numerical "word" was 10 decimal digits, and it could multiply two of
these numbers at a rate of 300 per second by finding the value of each product in a
multiplication table stored in its memory (a method sketched in code below). ENIAC was
therefore about 1,000 times faster than the previous generation of relay computers. It used
18,000 vacuum tubes, occupied about 1,800 square feet of floor space, and consumed about
180,000 watts of electrical power. It had punched-card I/O, one multiplier, one
divider/square-rooter, and 20 adders using decimal ring counters, which served as adders
and also as quick-access (0.0002-second) read-write register storage. The executable
instructions making up a program were embodied in the separate "units" of ENIAC, which
were plugged together to form a "route" for the flow of information.
7. Early in the 1950s, two important engineering discoveries changed the image of the
electronic-computer field from one of fast but unreliable hardware to one of
relatively high reliability and even greater capability. These discoveries were magnetic
core memory and the transistor circuit element.
8. Many companies, such as Apple Computer and Radio Shack, introduced very successful
PCs in the 1970s, encouraged in part by a fad in computer (video) games. In the 1980s,
some friction occurred in the crowded PC field, with Apple and IBM staying strong. In
the manufacturing of semiconductor chips, the Intel and Motorola corporations were
very competitive into the 1980s, although Japanese firms were making strong economic
advances, especially in memory chips. By the late 1980s, some personal
computers were run by microprocessors that handled 32 bits of data at a time and could
process about 4,000,000 instructions per second.
The first modern computers
The World War II years were a crucial period in the history of computing, when powerful,
gargantuan computers began to appear. Just before the outbreak of the war, in 1938, German
engineer Konrad Zuse (1910–1995) constructed his Z1, the world's first programmable binary
computer, in his parents' living room. The following year, American physicist John
Atanasoff (1903–1995) and his assistant, electrical engineer Clifford Berry (1918–1963), built a
more elaborate binary machine that they named the Atanasoff Berry Computer (ABC). It was a
great advance: about 1,000 times more accurate than Vannevar Bush's earlier analog
Differential Analyzer.
The first large-scale digital computer of this kind appeared in 1944 at Harvard University, built
by mathematician Howard Aiken (1900–1973). Sponsored by IBM, it was variously known as
the Harvard Mark I or the IBM Automatic Sequence Controlled Calculator (ASCC). A giant of a
machine, stretching 15 m (50 ft) in length, it was like a huge mechanical calculator built into a
wall. It must have sounded impressive, because it stored and processed numbers using
"clickety-clack" electromagnetic relays (electrically operated magnets that automatically
switched lines in telephone exchanges)—no fewer than 3304 of them. Impressive they may
have been, but relays suffered from several problems: they were large (that's why the Harvard
Mark I had to be so big); they needed quite hefty pulses of power to make them switch; and
they were slow (it took time for a relay to flip from "off" to "on" or from 0 to 1).
Personal computers
By 1974, Intel had launched a popular microprocessor known as the 8080 and computer
hobbyists were soon building home computers around it. The first was the MITS Altair 8800,
built by Ed Roberts. With its front panel covered in red LED lights and toggle switches, it was a
far cry from modern PCs and laptops. Even so, it sold by the thousand and earned Roberts a
fortune. The Altair inspired a Californian electronics wizard named Steve Wozniak (1950–) to
develop a computer of his own. "Woz" is often described as the hacker's "hacker"—a
technically brilliant and highly creative engineer who pushed the boundaries of computing
largely for his own amusement. In the mid-1970s, he was working at the Hewlett-Packard
computer company in California, and spending his free time tinkering away as a member of the
Homebrew Computer Club in the Bay Area.