Generations of Computer

• The computer has evolved from a large-sized, simple calculating machine to a smaller but much more powerful machine.
• The evolution of the computer to its current state is described in terms of generations of computers.
• Each generation of computer is designed around a new technological development, resulting in better, cheaper and smaller computers that are more powerful, faster and more efficient than their predecessors.


First Generation
In the late 1940s and early 1950s, computers (EDSAC, UNIVAC I, etc.) used vacuum tubes for their digital logic and mercury delay lines for memory. See early memory, EDSAC and UNIVAC I.

Second Generation
In the late 1950s, transistors replaced vacuum tubes, and magnetic cores were used for memory (IBM 1401, Honeywell 800). Size was reduced and reliability was significantly improved. See IBM 1401 and Honeywell.

Third Generation
In the mid-1960s, computers used the first integrated circuits (IBM 360, CDC 6400) and the first operating systems and database management systems. Although most processing was still batch oriented, using punch cards and magnetic tapes, online systems were being developed. This was the era of mainframes and minicomputers, essentially large centralized computers and small departmental computers. See punch card, System/360 and Control Data.

Fourth Generation
The mid-to-late 1970s spawned the microprocessor and the personal computer, introducing distributed processing and office automation. Word processing, query languages, report writers and spreadsheets put large numbers of people in touch with computers for the first time. See query language and report writer.

Fifth Generation - The Future
The 21st century ushered in the fifth generation, which increasingly delivers various forms of artificial intelligence (AI). More sophisticated search and natural language recognition are features that users recognize, but software that improves its functionality by learning on its own will change just about everything in the tech world in the future. See AI, machine learning, deep learning, neural network, computer vision, virtual assistant and natural language recognition.
