History of Computers - Bulaybulay

The document provides a history of computers, from ancient counting tools like the abacus to modern machines. It details the major generations of computers, from the first generation (1946-1959), whose machines used vacuum tubes and were large and expensive, to the fifth generation (1980 to the present), which uses ultra-large-scale integrated circuits and artificial intelligence. Each generation brought improvements in speed, memory capacity, size, cost, and programming capability as new technologies such as transistors, integrated circuits, and microchips were developed and adopted.

Loren Mae A. Bulaybulay

1st Year BS in Computer Engineering

COE100 W1 Computer Engineering as a Discipline

History of Computers

Early humans were the first to use counting devices, employing sticks, stones, and bones as counting tools. As human cognition and technology progressed, more computing devices were produced. The history of computers can be traced back to the invention of the abacus, said to be the earliest computing device and claimed to have been invented by the Chinese about 4,000 years ago. The abacus is a wooden rack with rods on which beads are placed, and it can perform arithmetic calculations.

In the early 17th century, John Napier invented a manually operated calculating device known as "Napier's Bones." He carved multiplication tables into a set of rods and could multiply and divide by aligning the numbers on the rods; the device is also credited as among the first to use the decimal point. Building on Napier's work with logarithms, William Oughtred invented the slide rule in the early 1620s. He created this calculating "machine" by inscribing logarithmic scales on strips of wood, and it remained in use until the mid-1970s, when the first hand-held calculators and microcomputers appeared.

Between 1642 and 1644, Blaise Pascal invented the Pascaline, a machine that could perform addition and subtraction, to help his father, a tax collector. Also known as the Arithmetic Machine or Adding Machine, it is believed to be the first mechanical and automatic calculator. It was a wooden box with gears and wheels inside: when one wheel completed a full revolution, the adjoining wheel advanced by one position. A series of windows on top of the wheels allowed the totals to be read.
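The wheel-and-carry behavior described above can be sketched in code. This is a hypothetical model for illustration only (the function name and the list-of-digits representation are assumptions, not from the original text): each wheel holds one decimal digit, and completing a full revolution carries one step into the adjoining wheel.

```python
# Illustrative model of the Pascaline's carry mechanism (not a historical design).
# Each wheel holds one decimal digit; when a wheel passes 9 (a full revolution),
# the adjoining wheel advances by one position.

def pascaline_add(wheels, digit, position=0):
    """Add `digit` to the wheel at `position`, propagating carries leftward.

    `wheels` is a list of digits, least-significant wheel first.
    """
    wheels = list(wheels)
    carry, wheels[position] = divmod(wheels[position] + digit, 10)
    while carry and position + 1 < len(wheels):
        position += 1
        carry, wheels[position] = divmod(wheels[position] + carry, 10)
    return wheels

# A machine showing 599 (wheels [9, 9, 5], units first) plus 5:
# the units wheel passes 9 and the carry ripples through the tens wheel.
print(pascaline_add([9, 9, 5], 5))  # [4, 0, 6], i.e. 604
```

The interesting part is the carry ripple: a single added digit can turn several adjacent wheels, which is exactly the mechanical behavior Pascal's gears implemented.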

Gottfried Wilhelm von Leibniz improved on Pascal's invention to develop the Stepped Reckoner, or Leibniz wheel, in 1673. Like Pascal's machine it could add and subtract, but it could also multiply and divide. Charles Babbage, the "Father of the Modern Computer," developed the Difference Engine in the early 1820s. It was a mechanical, steam-powered calculating machine capable of basic computations, intended to produce numerical tables such as logarithm tables. In the 1830s, Babbage also designed a calculating machine called the Analytical Engine, a mechanical computer that took input from punched cards. It was designed to solve general mathematical problems and store data in memory.

In 1890, Herman Hollerith invented the Tabulating Machine, a mechanical tabulator based on punched cards. It could compile statistics and store and organize data, and it was used in the United States Census of 1890. In 1896, Hollerith founded the Tabulating Machine Company, which through later mergers became International Business Machines (IBM) in 1924.

In 1930, Vannevar Bush completed an analog device called the Differential Analyzer, the first large-scale analog computer developed in the United States. The machine used electromechanical components to carry out its computations and could perform about 25 calculations in a few minutes.

The next significant development occurred in 1937, when Howard Aiken set out to build a machine that could perform calculations on very large numbers. In 1944, IBM and Harvard collaborated to complete the Mark I, one of the first programmable digital computers.

A generation of computers refers to a particular stage of advancement in computer technology. Beginning in 1946, electronic circuits took the place of the counting gears and other mechanical elements used in earlier computers. In each generation the circuits became smaller and more sophisticated than those of the generation before, and this miniaturization improved computers' speed, memory, and power. Computers are commonly divided into five generations.

Computers of the first generation (1946-1959) were slow, large, and costly. Vacuum tubes were the main components of their CPUs and memory. These machines relied on batch operating systems and punched cards, and used magnetic tape and paper tape for input and output. Some of the most popular first-generation computers were the ENIAC (Electronic Numerical Integrator and Computer), EDVAC (Electronic Discrete Variable Automatic Computer), UNIVAC I (Universal Automatic Computer), IBM 701, and IBM 650.

Transistor computers dominated the second generation (1959-1965). Because they employed transistors, which were inexpensive, small, and consumed little power, these computers were faster than first-generation machines. Magnetic cores served as primary memory, with magnetic disks and tapes as secondary storage. These computers used assembly language and programming languages such as COBOL and FORTRAN, along with batch-processing and multiprogramming operating systems. Some of the most popular second-generation computers were the IBM 1620, IBM 7094, CDC 1604, CDC 3600, and UNIVAC 1108.

Integrated circuits (ICs) replaced transistors in the third generation of computers (1965-1971). A single IC can hold a large number of transistors, which increased computing power while lowering cost, and computers became more reliable, efficient, and compact. These machines used operating systems that supported remote processing, time-sharing, and multiprogramming, along with high-level programming languages including FORTRAN II to IV, COBOL, PASCAL, PL/1, and ALGOL 68. Some of the most popular third-generation computers were the IBM 360 series, Honeywell 6000 series, PDP (Programmed Data Processor) series, IBM 370/168, and TDC-316.

Very large-scale integration (VLSI) circuits were employed in the fourth generation of computers (1971-1980); a single chip could hold thousands of transistors and other circuit components. These processors made the computers of this generation smaller, more powerful, faster, and less expensive. Real-time, time-sharing, and distributed operating systems were used on these machines, along with programming languages such as C, C++, and dBASE. Some of the most popular fourth-generation computers were the DEC 10, STAR 1000, PDP-11, CRAY-1 (supercomputer), and CRAY X-MP (supercomputer).

In fifth-generation computers (1980 to the present), VLSI technology gave way to ULSI (ultra-large-scale integration) technology, enabling microprocessor chips with ten million or more electronic components. This generation employs parallel processing hardware and AI (artificial intelligence) software, and uses programming languages such as C, C++, Java, and .NET. Some of the most popular fifth-generation computers are desktops, laptops, notebooks, ultrabooks, and Chromebooks.
