Definition: A computer is an electronic device that performs calculation, storage, and processing of data.
The word Computer is derived from the Latin word “computare”, which means “to calculate”, “to count”, “to sum up” or “to think together”.
There are five computer generations known to date. Each generation is discussed in detail below, along with its time period and characteristics.
1. First Computer Generation – Vacuum Tubes (1940 – 1956):
The first generation computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate, and in addition to using a great deal of electricity, they generated a lot of heat, which was often the cause of malfunctions. In this generation, input was based on punched cards and paper tape, and output was displayed on printouts.
2. Second Computer Generation – Transistors (1956 – 1963):
The replacement of vacuum tubes by transistors saw the advent of the second generation of computing. Although first invented in 1947, transistors were not used significantly in computers until the end of the 1950s. They were hugely superior to the vacuum tube, making computers smaller, faster, cheaper and less heavy on electricity use, despite still subjecting computers to damaging levels of heat. Second generation computers still relied on punched cards for input and printouts for output.
3. Third Computer Generation – Integrated Circuits (1964 – 1971):
The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, made of semiconductor material, which drastically increased the speed and efficiency of computers. Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory.
4. Fourth Computer Generation – Microprocessors (1972 – 2010):
The microprocessor brought about the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer—from the central processing unit and memory to input/output controls—on a single chip.
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.
5. Fifth Computer Generation – Artificial Intelligence (2010-Present):
Fifth generation computing devices, based on artificial intelligence, are still in development, though some applications, such as voice recognition, are already in use today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation, molecular computing and nanotechnology will radically change the face of computers in years to come. The goal of fifth generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.