Bulaybulay
History of Computers
Early humans were the first to use counting devices, relying on sticks, stones, and bones as counting tools. As human knowledge and technology progressed, more computing devices were produced. The history of computers can be traced back to the invention of the abacus, said to be the earliest computing device and claimed to have been invented by the Chinese about 4,000 years ago. The abacus is a wooden rack with rods on which beads are placed, and it can perform arithmetic calculations.
In the early 1600s, John Napier invented a manually operated device known as "Napier's Bones." He carved logarithmic measures into a set of ten wooden rods and was able to multiply and divide by matching the numbers on the rods. It was also among the first devices to use the decimal point. Building on the concept of Napier's Bones, William Oughtred invented the slide rule in 1621. He created this calculating "machine" by inscribing logarithms on strips of wood, and it remained in use until the mid-1970s, when the first hand-held calculators and microcomputers appeared.
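To see the principle at work, here is a minimal Python sketch (not from the original text; the function name slide_rule_multiply is ours) of how Napier's logarithms and the slide rule turn multiplication into the addition of logarithms:

```python
import math

def slide_rule_multiply(a, b):
    # Adding the logarithms of a and b and converting back gives their
    # product -- the same trick a slide rule performs by sliding one
    # logarithmic scale along another.
    return math.exp(math.log(a) + math.log(b))

print(slide_rule_multiply(6, 7))   # ~42.0, limited only by floating-point precision
```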
Between 1642 and 1644, Blaise Pascal invented the Pascaline, a device that could do addition and subtraction, to help his father, who was a tax accountant. Also known as the Arithmetic Machine or Adding Machine, it is believed to have been the first mechanical and automatic calculator. It was a wooden box with gears and wheels inside: when one wheel completed a full revolution, the adjoining wheel advanced by one step. A series of windows on top of the wheels was provided for reading the totals.
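As a rough illustration of that carry mechanism (our own sketch, not the author's description; pascaline_add is a hypothetical helper), the following Python snippet models each wheel as a decimal digit and passes a carry to the adjoining wheel whenever a wheel turns past 9:

```python
def pascaline_add(digits, amount):
    """digits: one entry per wheel, least-significant wheel first (e.g. [7, 9, 0] is 097)."""
    digits = list(digits)
    digits[0] += amount
    for i in range(len(digits)):
        if digits[i] > 9:                      # the wheel has completed a revolution
            carry, digits[i] = divmod(digits[i], 10)
            if i + 1 < len(digits):
                digits[i + 1] += carry         # ...and nudges the adjoining wheel
    return digits

print(pascaline_add([7, 9, 0], 5))   # [2, 0, 1] -> the windows would read 102
```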
Gottfried Wilhelm von Leibniz improved on Pascal's invention to develop the Stepped Reckoner, or Leibniz wheel, in 1673. Like Pascal's machine it could add and subtract, but it could also multiply and divide. Charles Babbage, the "Father of the Modern Computer," developed the Difference Engine in the early 1820s. It was a steam-powered mechanical calculating machine capable of doing basic computations, used to produce numerical tables such as logarithm tables. In the 1830s Babbage also designed a calculating machine called the Analytical Engine, a mechanical computer that took its input from punched cards. It was intended to be a general-purpose machine that could solve any mathematical problem and store data in permanent memory.
In 1930, Vannevar Bush built an analog device called the Differential Analyzer, one of the first large-scale analog computers in the United States. The machine used electrically driven mechanical components to carry out its computations and could complete about 25 calculations in a few minutes.
The next significant development in computer history occurred in 1937, when Howard Aiken set out to build a machine that could perform calculations on very large numbers. In 1944, IBM and Harvard collaborated to complete the resulting Mark I, often described as the first programmable digital computer.
Computers of the first generation (1946-1959) were slow, large, and costly. Vacuum tubes were the main components of their CPUs and memory, and these machines relied on batch operating systems and punched cards. This generation used magnetic tape and paper tape for input and output. Some of the most popular first-generation computers were the ENIAC (Electronic Numerical Integrator and Computer), EDVAC (Electronic Discrete Variable Automatic Computer), UNIVAC I (Universal Automatic Computer), IBM 701, and IBM 650.
The fourth generation of computers (1971-1980) employed very large-scale integration (VLSI) circuits: single chips containing millions of transistors and other circuit elements. These processors made fourth-generation computers smaller, more powerful, faster, and less expensive. Real-time, time-sharing, and distributed operating systems were used on these machines, along with programming languages such as C, C++, and dBASE. Some of the most popular fourth-generation computers were the DEC 10, STAR 1000, PDP-11, CRAY-1 (supercomputer), and CRAY X-MP (supercomputer).
In fifth-generation computers (1980 to date), VLSI technology was replaced by ULSI (Ultra Large-Scale Integration) technology, which made it possible to manufacture microprocessor chips with ten million electronic components. This generation employs parallel processing hardware and AI (Artificial Intelligence) software, along with programming languages such as C, C++, Java, and .NET. Some of the most popular fifth-generation computers are desktops, laptops, notebooks, ultrabooks, and Chromebooks.