Computer History: Classification of Computer Generations

Updated on June 20, 2018

Alfred is a long-time teacher and computer enthusiast who works with and troubleshoots a wide range of computing devices.

Computer generations describe the history of computers in terms of evolving technology. With each new generation, computer circuitry, size, and parts have been miniaturized, processing power and speed have multiplied, memory has grown larger, and usability and reliability have improved.

Note that the timeline given for each generation is approximate, not definitive. The generations are really based on evolving chip technology rather than on any particular time frame.

The evolution of computers has always centered around the miniaturization of transistors and integrated circuit chips

The five generations of computers are characterized by the component that electrical current flows through during processing:

  • The first: vacuum tubes
  • The second: transistors
  • The third: integrated circuits
  • The fourth: microprocessor chips
  • The fifth unveiled smart devices capable of artificial intelligence.

First Computer Generation: 1940s - 1950s (Vacuum Tubes and Plugboards)

First generation computers were the first general-purpose, truly digital computers. They arrived in time to replace electromechanical systems, which were far too slow for their assigned tasks.

A case in point was the U.S. Army's need for machines that could compute artillery firing tables quickly; existing methods took almost two days per table. The new machines computed the same data in seconds. Fortunately or unfortunately, they became available only in 1946, after the end of World War II.

Vacuum tubes were used in the 1st computer generation

The first computer generation used vacuum tubes for amplification and switching. Each tube was a sealed glass container about the size of a light bulb; inside, current flowed through the vacuum from a heated filament to a metal plate. Because there were no moving parts, the tubes could amplify current reliably, and by switching on and off they acted as the gates of the computer's circuitry.

Besides boasting thousands of resistors and capacitors, these computers could use upwards of 17,000 vacuum tubes, which meant installations covered entire rooms!

Input and output were handled using punched cards, magnetic drums, typewriters, and punched card readers. Initially, technicians perforated the cards by hand; later, computers did this.

Interfacing with first generation systems was done using plugboards and machine language. Technicians wired up electrical circuits by connecting numerous cables to plugboards.

Then they slotted the specified punched cards into the machine and waited, sometimes for hours, for some form of computation, hoping every one of the thousands of vacuum tubes lasted the distance. If a tube failed, they went through the whole procedure again.

A record machine plugboard for IBM 1401

These machines were intended for low-level operations, so programming was done using only the binary digits 0 and 1. The systems could solve only one problem at a time. Assembly language and operating system software did not yet exist.

One of the most outstanding computers of this era was the ENIAC (Electronic Numerical Integrator and Computer), designed and built by engineers John W. Mauchly and J. Presper Eckert of the University of Pennsylvania and assembled by a team of fifty.

It was 1,000 times faster than the electromechanical computers that preceded it, but slow when it came to reprogramming.

Among other things, the ENIAC was used to study the feasibility of thermonuclear weaponry, compute ballistic artillery tables, and study engine thermal ignition; elsewhere, it was used for weather prediction.

The left side of the ENIAC computer

These systems were enormous, occupied entire rooms, and consumed a great deal of electric power, which made them generate unbearable heat.

A list of popular first generation computers:

  • The ENIAC (1946)
  • EDSAC (1949)
  • EDVAC (1950)
  • UNIVAC I (1951)

The UNIVAC (Universal Automatic Computer), also by engineers John W. Mauchly and J. Presper Eckert, was the first computer of the era designed for commercial rather than military use. It handled both alphabetic and numeric data fairly well and was used by the U.S. Census Bureau to enumerate the general population. It was later used for payrolls, records, and company sales, and it even predicted the result of the 1952 presidential election.

Unlike the more than 17,000 vacuum tubes in the ENIAC, the UNIVAC I used slightly over 5,000. It was also half the size of its predecessor, and 46 units were sold.

UNIVAC as exhibited in the Vienna Technical Museum

Characteristics of 1st Generation Computers

They:

  • Used vacuum tubes for circuitry
  • Burned out often, because the electron-emitting metal in the tubes wore out easily
  • Used magnetic drums for memory
  • Were huge, slow, expensive, and often undependable
  • Were expensive to operate
  • Were power hungry
  • Generated so much heat that they were prone to malfunction
  • Solved one problem at a time
  • Took input from punched cards
  • Produced output on printouts
  • Used magnetic tape
  • Had limited primary memory
  • Could be programmed only in machine language

Second Computer Generation: 1950s - 1960s (Transistors and Batch Processing)

These computers used transistors instead of vacuum tubes. They were better than their predecessors in many ways, thanks to their smaller size, greater speed, and lower cost.

Transistors are more or less the building blocks of any microchip, and compared to vacuum tubes they are more reliable, more energy efficient, and able to conduct and switch electricity faster.

The transistor was used in the 2nd computer generation

Just like vacuum tubes, transistors are switches, or electronic gates, used to amplify or control current and to switch electric signals on and off. They are made of semiconductors, materials whose conductivity lies between that of conductors and insulators.
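To see why a reliable electronic switch matters so much, note that all digital logic can be built out of switching elements. Below is a minimal Python sketch (purely illustrative, not period software) that models a transistor as an ideal on/off switch and wires two of them into a NAND gate, from which every other logic gate can be derived:

```python
def transistor(gate_voltage: bool) -> bool:
    """Model a transistor as an ideal switch:
    it conducts when a voltage is applied to its gate."""
    return gate_voltage

def nand(a: bool, b: bool) -> bool:
    # Two transistors in series pull the output low only when
    # both inputs conduct; otherwise the output stays high.
    return not (transistor(a) and transistor(b))

# Any other gate can be composed from NAND, e.g. NOT and AND:
def not_gate(a: bool) -> bool:
    return nand(a, a)

def and_gate(a: bool, b: bool) -> bool:
    return not_gate(nand(a, b))

for a in (False, True):
    for b in (False, True):
        print(a, b, "-> NAND:", nand(a, b), " AND:", and_gate(a, b))
```

Whether the switch is a vacuum tube or a transistor, the logic is the same; the transistor simply made it smaller, cooler, and far more dependable.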

The transistor was invented at Bell Laboratories in 1947 by scientists William Shockley, John Bardeen, and Walter Brattain, but it did not see the light of day in computers until the mid-1950s.

Second generation computers saw advancements in data input and output procedures. Initially, these processes were similar to those of the last 1st gen models, and they were tedious, involving multiple personnel carrying punched cards from room to room.

To speed up the process, the batch system was devised and implemented. Multiple data jobs were collected onto punched cards and fed onto a single magnetic tape using a fairly small and inexpensive system, such as the IBM 1401. Processing, on the other hand, was done by a more powerful system, like the IBM 7094.

When data manipulation was complete, the results were transferred back to magnetic tape. To manage this efficiently, IBM's operating system for the 7094 and the Fortran Monitor System were used; these were the harbingers of the operating system software to come.

Using a smaller system again, say the IBM 1401, the output data was then printed, or punched onto cards, for the users.
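The batch workflow is easy to picture as a three-stage pipeline. The Python sketch below is a loose, illustrative model of that flow, assuming nothing beyond the machine roles described above; the job names and function names are hypothetical:

```python
# A loose, illustrative model of 1960s batch processing
# (hypothetical code, not period software).

def read_jobs_from_cards(card_decks):
    """Small machine (e.g. IBM 1401): spool punched-card jobs onto an input tape."""
    input_tape = []
    for deck in card_decks:
        input_tape.append(deck)  # one job per card deck
    return input_tape

def process_tape(input_tape):
    """Big machine (e.g. IBM 7094): run each job in sequence, writing results to tape."""
    return [f"results of {job}" for job in input_tape]  # stand-in for the real computation

def print_output(output_tape):
    """Small machine again: print or punch the output tape for the users."""
    for result in output_tape:
        print(result)

jobs = ["payroll job", "inventory job", "census tabulation"]
print_output(process_tape(read_jobs_from_cards(jobs)))
```

The design point was economic: the expensive processor never sat idle waiting for slow card readers and printers, because the cheap machine handled all the slow input/output.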

IBM 1401 computer with one circuit card access drawer opened, on display at the Computer History Museum.

Besides the development of operating system software, other commercial applications were also hitting the 'shelves.' This was largely due to the shift from restrictive, binary-based machine code to languages that fully supported symbolic and alphanumeric coding. Programmers could now write in assembly and in high-level languages like FORTRAN, COBOL, SNOBOL, and, from 1964, BASIC.

Punched card as part of a FORTRAN IV program

Characteristics of 2nd Gen Computers

They:

  • Used transistors
  • Were faster and more reliable than first generation systems
  • Were slightly smaller and cheaper
  • Generated heat, though a little less
  • Still relied on punched cards and printouts for input/output
  • Allowed assembly and high-level languages
  • Stored data on magnetic media
  • Were still costly
  • Needed air conditioning
  • Introduced assembly language and operating system software

Operator's console for IBM 7094 at the Computer History Museum

The early mainframes and supercomputers were among the machines that took advantage of transistors. Examples include the UNIVAC LARC mainframe from Sperry Rand (1960), the IBM 7030 Stretch supercomputer (1961), and the CDC 6600 (1964).

Other examples of 2nd Gen computers:

  • IBM 7000 series
  • CDC 3000 series
  • UNIVAC 1107
  • IBM 7094
  • MARK III
  • Honeywell 400

Third Computer Generation: 1960s - 1970s (Integrated Circuits and Multiprogramming)

3rd generation computers used integrated circuit (IC) microchips instead of discrete transistors. A semiconductor IC packed a huge number of transistors, capacitors, diodes, and rectifiers onto a single piece of germanium or silicon, and these chips were then mounted on printed circuit boards.

The implementation of these computers was also in line with Moore's Law (1965), which observed that the number of transistors fitting on a microchip was doubling at a rapid, regular pace and predicted the trend would hold for the next ten years. In 1975, Moore revised the estimate to a doubling roughly every two years.
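As a rough arithmetic check of the revised law, the sketch below (illustrative Python) projects transistor counts forward from the Intel 4004's 2,300 transistors of 1971, a figure cited later in this article, assuming a doubling every two years:

```python
def moore_projection(base_count, base_year, target_year, doubling_period=2.0):
    """Project a transistor count assuming one doubling per `doubling_period` years."""
    return base_count * 2 ** ((target_year - base_year) / doubling_period)

# Intel 4004: 2,300 transistors in 1971 (figure cited later in this article).
projected = moore_projection(2_300, 1971, 2018)
print(f"{projected:,.0f} transistors")
# Prints roughly 27 billion -- the same order of magnitude as the
# 19+ billion transistors in high-end chips mentioned later on.
```

That a back-of-the-envelope projection over nearly five decades lands within a factor of two of reality is exactly why Moore's Law became the semiconductor industry's planning yardstick.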

Integrated circuit on microchip

The IC solved the cumbersome procedures that went into designing transistor circuitry. Manually interconnecting transistors, capacitors, diodes, and rectifiers was time-consuming and not completely reliable.

Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor independently arrived at the integrated circuit, in 1958 and 1959 respectively. Kilby built his IC on germanium, whereas Noyce built his on a silicon chip.

Among the first systems to use the IC was the IBM 360, which was packed with the muscle to handle both commercial and scientific assignments.

Jack Kilby's IC chip

Besides reducing cost, placing multiple transistors on a single chip increased the speed and performance of any one computer tremendously. Since the IC's invention, chip capacity has doubled roughly every two years, shrinking both the size and cost of computers even further.

Almost all electronic devices today use some form of integrated circuits placed on printed circuit boards.

The IC circuitry aside, interaction with computers improved. Instead of punched cards and printouts, keyboards and better input peripherals were used to enter data, which was displayed on visual display units for output.

Computers now used operating system software to manage hardware and resources. This allowed systems to run several applications at a time, thanks to central programs that monitored and allocated memory.

Computers became accessible to a mass audience because of their smaller size and fairer pricing.

This generation also ushered in the concept of the 'computer family,' which challenged manufacturers to build components that were compatible with other systems.

A neat looking IBM 370 mainframe released in 1970

Characteristics of 3rd Gen Computers

They:

  • Used ICs
  • Used parallel processing
  • Were slightly smaller, cheaper, and faster
  • Used motherboards
  • Took input from keyboards
  • Displayed output on monitors
  • Used operating systems, thus permitting multitasking
  • Simplified programming languages, e.g. BASIC

The next generation of mainframes and supercomputers took advantage of integrated circuits. The Scientific Data Systems Sigma 7 (1966) and IBM 360 (1964) mainframes, and the CDC 7600 supercomputer (1969), were examples of these systems.

Other examples of third generation computers:

  • IBM 360
  • DEC PDP (Programmed Data Processor) series
  • IBM 370

IBM 360 Mainframe in the German Museum

Fourth Computer Generation: 1970s to Present (The Microprocessor, OS and GUI)

The birth of the microprocessor was also the birth of the microcomputer, and it fulfilled Moore's Law's prediction of exponential growth in transistors and microchips. This generation was instrumental in ushering in a diversity of devices; the 4th generation computers that began appearing in 1971 are the ones in use today.

In November 1971, Intel, through its engineers Ted Hoff, Federico Faggin, and Stan Mazor, introduced the world's first single-chip microprocessor, the Intel 4004. It boasted 2,300 transistors and measured just 1/8" by 1/16".

What filled an entire room in the first generation could now fit in the palm of the hand.

On its own, the new microchip was as powerful as the ENIAC of 1946. It also merged most of the functions that make up a computer: the central processing unit, memory, and input and output controls.

The Intel C4004 microprocessor initiated the 4th computer generation

Manufacturers soon started integrating these microchips into their new computers.

In 1973, the Xerox Alto computer from Xerox PARC was quietly released. It was a true personal computer, featuring an Ethernet port, a mouse, and a bit-mapped graphical user interface, the first of its kind.

The last feature would later motivate Apple to build a graphical computer of its own. The Xerox Alto was powered by a 16-bit ALU based on the Texas Instruments SN74S181 chip.

Xerox Alto, arguably the first PC, from 1973. It was powered by the TI SN74S181N ALU chip from Texas Instruments

Challenged by the Xerox Alto, serious work began in 1974, when Intel came up with a general purpose 8-bit microprocessor named the 8080. Intel asked Gary Kildall, a consultant, to write an operating system for its new baby, which led to the disk-based operating system known as CP/M (Control Program for Microcomputers).

In 1981, International Business Machines introduced its first computer for the home, which ran an Intel 8088 processor. It was known as the IBM PC, with PC standing for personal computer. IBM partnered with Bill Gates, who bought a disk operating system from Seattle Computer Products and had it distributed with IBM's new computer.

The original IBM PC in 1981

The IBM PC architecture became the de facto market standard model, which other PC makers emulated.

Apple, under Steve Jobs, changed the software game when it released the Apple Macintosh with an improved graphical user interface (GUI) in 1984, using interface ideas first seen at Xerox PARC.

Remember that both CP/M and DOS were command-line operating systems, which required the user to interact with the computer through typed keyboard commands.

The Apple Macintosh of 1984

Following the success of Apple's GUI, Microsoft integrated a shell version of Windows into DOS in 1985. Windows was used this way for the next ten years, until it was reinvented as Windows 95, a true operating system complete with all the right utilities.

Ubuntu running MS Windows 3.11 under VBox

While software became commonplace and corporations began charging money for it, a new movement of programmers emerged. Led by Linus Torvalds, they pioneered a free and open source operating system project called Linux in 1991.

Besides Linux, other open source operating systems and free software were distributed to cater for office, networking and home computers.

Examples of open source and free software:

  • Ubuntu OS
  • Mozilla Firefox browser
  • OpenOffice
  • MySQL
  • VLC media player

From the 1980s through the 2000s, personal computers, and desktops in particular, became commonplace. They were cheap and were installed in offices, schools, and homes. Software for these computers also became readily available, for little money or for free.

Examples of popular personal computer categories:

  • Desktops
  • All-in-one
  • Laptops
  • Workstations
  • Nettops
  • Tablets
  • Smartphones

A desktop computer

Soon, microprocessors moved out of the preserve of desktop computers and into other platforms in businesses and homes: first laptops, then tablets, smartphones, consoles, embedded systems, and smart cards, all made popular by the need to use the internet while on the move.

The proliferation of mobile computing devices soon challenged the dominance of desktops. According to comScore's March 2017 publication Mobile's Hierarchy of Needs, mobile devices accounted for 60% of all digital minutes worldwide.

A kid using an iPad tablet

Characteristics of 4th Gen Computers

They:

  • Used CPUs containing thousands, and eventually billions, of transistors
  • Were much smaller, fitting on desktops, laps, and palms
  • Used a mouse
  • Were used in networks
  • Were cheap
  • Had GUIs
  • Were very fast
  • Pack over 19 billion transistors into high-end microprocessors (compare with 2,300 in the Intel 4004)

The fourth generation of mainframes and supercomputers evolved into powerful systems:

  • IBM z9 (2005), z10 (2008), and z13 (2015) are examples of mainframes.
  • Cray-1 (1975), Fujitsu K (2011), Titan (2012), and Sunway TaihuLight (2016) are examples of supercomputers.

The Cray-1 supercomputer of 1975

Fifth Computer Generation: The Present and The Future

Fifth generation computing builds on the technological advances of the previous generations. It hopes to improve human-machine interaction by harnessing human intelligence and taking advantage of the vast data that has accumulated since the dawn of the digital age.

Referred to as the computing of the future, it arises from the theory, concepts, and implementation of artificial intelligence (AI) and machine learning (ML). AI and ML are not the same, but the terms are often used interchangeably to mean the science of crafting devices and programs intelligent enough to interact with humans, other computers, the environment, and other programs, mining big data to achieve set goals.
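As a tiny, concrete illustration of what "learning from data" means in the ML sense above, the Python sketch below trains a single perceptron to reproduce the logical AND function from examples rather than hand-written rules. It is a toy under stated assumptions, not any particular project mentioned in this article:

```python
# Minimal illustration of machine learning: a perceptron that
# learns the AND function from examples instead of fixed rules.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1 = w2 = bias = 0.0
rate = 0.1

for _ in range(20):                      # a few passes over the data
    for (x1, x2), target in examples:
        prediction = 1 if w1 * x1 + w2 * x2 + bias > 0 else 0
        error = target - prediction
        w1 += rate * error * x1          # nudge weights toward the target
        w2 += rate * error * x2
        bias += rate * error

for (x1, x2), target in examples:
    output = 1 if w1 * x1 + w2 * x2 + bias > 0 else 0
    print(f"{x1} AND {x2} -> {output} (expected {target})")
```

Modern deep learning systems apply the same idea at vastly larger scale: millions of such units, adjusted over enormous datasets.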

The proliferation of computing devices that can learn, respond, and interact in normal and sometimes novel ways, based on acquired experience and their environment, has also given momentum to the Internet of Things (IoT).

At their peak, and with the right algorithms, computers will probably exhibit and process quite high levels of deep learning, from which humans, too, can learn.

Many AI projects are already being implemented while others are still in developmental stages. Pioneers in accelerating AI include Google, Amazon, Microsoft, Apple, Facebook and Tesla.

The initial implementations can now be seen in smart home devices, which are meant to automate and integrate activities in the house through audio and visual equipment, and in self-driving cars, which are already gracing the roads.

Coral (red) version of the Google Home Mini smart speaker


The larger goals of AI are to enable devices to:

  • Understand natural language
  • Recognize human speech
  • See the world in three-dimensional perspective
  • Play interactive games
  • Implement expert input in medical and other complex fields
  • Exercise heuristic classification analysis
  • Implement neural networks

Other areas geared towards making AI possible are developments in:

  • Quantum computing
  • Parallel processing

Ongoing AI projects:

  • Virtual personal assistants e.g. Siri, Google Now and Braina.
  • Smart cars e.g. Tesla's autopilot cars and Google's self-driving cars.
  • News generation tools like Wordsmith are used by Yahoo and Fox to generate news snippets.
  • Computer-aided diagnosis for the detection of cancer.


    © 2017 Alfred Amuno
