
The Evolution of the Computer

Updated on August 2, 2017

Computers are arguably among the most significant inventions in human history, and their development, though rapid in the last century, spans nearly half a millennium. Their storied history is also an excellent example of how scientists, inventors, and mathematicians can build on the work of their predecessors to address the needs of their respective times and shake the world with their creations.

The First Calculator

While the abacus is the world's first calculator, the first mechanical calculator was created in 1642. French polymath Blaise Pascal built a machine with a series of interlocking cogs that could add and subtract decimal numbers, designed to help his father, a tax collector, do sums. This first step towards the modern computer inspired several others, including German mathematician Gottfried Leibniz, who in the late 17th century extended the design to handle more operations, including multiplication and division.

Leibniz is also credited with pioneering the first memory store, or register, as well as with describing binary code, a way of representing any decimal number using only two digits, zero and one. Though his own machines did not use binary, the idea inspired British mathematician George Boole, more than a century later, to create binary, or Boolean, algebra, which would become widely used in computer design and operation.
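
To see the idea in modern terms (a purely illustrative sketch, not anything Leibniz or Boole themselves wrote), a few lines of Python show how a decimal number can be written with only the digits zero and one, and how Boolean algebra combines those two values:

    # Illustrative sketch: decimal-to-binary conversion and basic Boolean algebra.

    def to_binary(n):
        """Return the binary digits of a non-negative integer as a string."""
        if n == 0:
            return "0"
        digits = ""
        while n > 0:
            digits = str(n % 2) + digits  # each remainder is the next binary digit
            n //= 2
        return digits

    print(to_binary(1642))         # 1642 in decimal -> "11001101010" in binary

    # Boolean algebra operates on those same two values: AND, OR, NOT
    a, b = True, False
    print(a and b, a or b, not a)  # False True False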

Charles Babbage

Based on a concept from Hessian army engineer J. H. Müller, Charles Babbage was the first to design an automatic mechanical calculator to tabulate polynomial functions. The Difference Engine, as it was called, was never completed, even with initial funding from the British government. With the assistance of Augusta Ada Byron, who helped to refine his ideas for programming, Babbage went a step further, proposing a mechanical, general-purpose computer referred to as the Analytical Engine.

This machine is the earliest design for a computer similar to what we see today. It would have operated on 40-digit numbers; the "mill", an early counterpart of the CPU, would have had two main accumulators and some auxiliary ones, while the store, or memory, would have held a thousand 50-digit numbers. Several card readers would have interpreted programs and data stored on punched cards, which were chained together, with the motion of the chains reversible. The machine would have performed conditional jumps and would have allowed a form of micro-coding. Powered by a steam engine, it was thought to be capable of addition in seconds and multiplication or division within a few minutes.

The Census Dilemma

In the United States, a census is taken every ten years, but by 1880 the population had grown so large that tabulating the results by hand from journal sheets took eight years. With estimates suggesting the next census could take a whole decade or longer, a competition was held to find a better method.

Census Department employee Herman Hollerith provided the solution with his first tabulating machine. The invention recorded data onto a medium that could then be read by a machine. Earlier machine-readable media, such as the piano roll, had been used for control, not data. Instead of paper tape, Hollerith chose punched cards, with mechanical relays incrementing mechanical counters. Even with the larger population, the 1890 census was completed in six years and reportedly saved the government $5,000,000.
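
A rough modern analogy (hypothetical, not Hollerith's actual card layout) is reading one record at a time and incrementing a counter for each category, much as his relays advanced mechanical counters:

    # Hypothetical illustration: tallying census "cards" by category,
    # the way Hollerith's relays incremented mechanical counters.
    from collections import Counter

    # Each record stands in for one punched card (fields are invented examples).
    cards = [
        {"state": "New York", "occupation": "farmer"},
        {"state": "Ohio", "occupation": "clerk"},
        {"state": "New York", "occupation": "clerk"},
    ]

    counts = Counter(card["state"] for card in cards)
    print(counts)  # Counter({'New York': 2, 'Ohio': 1})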

From this success, Hollerith went on to create the Tabulating Machine Company, which would eventually become International Business Machines, or IBM. Census bureaus around the world and major insurance companies leased his machines and equipment and purchased his cards, funding further development, including a plugboard that could be rewired for different applications.

Warring Computer Development

World War II produced some of the most massive and powerful computers seen up to that time, with progress accelerated by the opposing sides. Konrad Zuse of Germany took the first steps in 1938, building the world's first programmable binary computer, called the Z1, in his parents' home. The following year, American physicist John Atanasoff and electrical engineer Clifford Berry built a more elaborate binary machine they named the Atanasoff-Berry Computer (ABC). These were the first to use electrical switches to store numbers: hundreds or thousands of switches could store many binary digits, unlike their analog predecessors, which stored numbers using the positions of wheels and rods.

Based at Bletchley Park near London, a team of mathematicians, including Alan Turing, sought a way for the Allied forces to crack the ever-changing secret codes of the Germans. They created a computer called Colossus, the first fully electronic computer. Instead of relays, it used a better form of switch known as the vacuum tube, first created by Lee De Forest in 1906. De Forest's tube, known as the Audion, was about as big as a person's thumb and glowed red hot like a tiny electric bulb.

However, as Colossus was part of a top-secret program, the first publicly visible computer built from vacuum tubes was created in 1946 by John Mauchly and J. Presper Eckert. The ENIAC, or Electronic Numerical Integrator And Calculator, contained 18,000 vacuum tubes, nine times more than Colossus, was around eighty feet long, and weighed in at 30 tons. This made it more capable than its predecessor, which was designed exclusively for code breaking, but such machines were still massive, costly, and, at times, unreliable.

Microelectronic Solution

A solution was provided by three physicists working to develop new technology for the American public telephone system. John Bardeen, Walter Brattain, and William Shockley of Bell Telephone Laboratories believed semiconductors could serve as a better form of amplifier than vacuum tubes. This led to the creation of the point-contact transistor, followed by the junction transistor.

Despite the many benefits of using transistors as switches, they had to be hand-wired to connect all the components together, which was laborious, costly, and prone to errors. This prompted Jack Kilby of Texas Instruments to create the first integrated circuit, a collection of transistors and other components that could be manufactured all at once. Robert Noyce of Fairchild Semiconductor in California developed something similar at the same time, but found a way to include the connections between components on the chip itself, automating the entire process.

In 1968, Robert Noyce and Gordon Moore left Fairchild to establish Integrated Electronics, or Intel for short. They originally planned to make memory chips, but while filling an order for chips for a range of pocket calculators, engineers Federico Faggin and Marcian Edward Hoff realized that a single universal chip could be made to work in them all. From there, the general-purpose, single-chip computer, or microprocessor, was born.

Personal Computers

In 1974, Ed Roberts built the MITS Altair 8800 around the popular Intel processor known as the 8080. This computer, with a front panel covered in red LED lights and toggle switches, made Roberts a fortune. It also inspired Steve Wozniak, an employee at Hewlett-Packard and member of the Homebrew Computer Club, to use a 6502 microprocessor to build the Apple I. One of his friends, Steve Jobs, convinced him to go into business with the machine, and the two set up Apple Computer.

In 1981, IBM released the IBM Personal Computer, or PC, built around Intel's 8088 microprocessor. IBM turned to Bill Gates and his small company, Microsoft, to put together an operating system called DOS, based on a product acquired from Seattle Computer Products. Gates retained the rights to a very similar version for his own use and eventually sold it to other computer manufacturers, like Compaq and Dell, who had started making IBM-compatible hardware.

This was the start of the computer as we know it today. Over nearly half a century since, major industry names have continued to compete and innovate, making their products sleeker, smaller, faster, and more powerful. With innovations like Google Glass and continued integration into all parts of society, there is little doubt that the computer will continue to evolve for many years to come.
