Alfred is a long-time teacher and computer enthusiast who works with and troubleshoots a wide range of computing devices.
The history of computers is deeply rooted in humanity's desire to perform basic mathematical calculations. These and other computational tasks remain the core functions of computers even today.
Humans began this quest with readily available tools like fingers, sticks, and stones before adopting more advanced contraptions such as the abacus. Much later came the Turing machine, a theoretical model of a computer that manipulates symbols on a strip of tape to simulate algorithms. The Turing machine paved the way for the electromechanical and electronic machines of the mid-1900s.
Advanced computers as we know them today became mainstream at the turn of the 21st century.
The Abacus Calculator
Initial data manipulation was done using fingers, sticks, and stones. Population growth and the need to handle ever larger calculations meant that a switch to more advanced systems was needed.
The Chinese used the abacus tables and strings during the second century BC. It was the ultimate counting machine at the time.
It assisted its user to calculate with addition, division, subtraction, and multiplication. This was done by sliding beads on its wooden rods.
The use of the suanpan—abacus in Chinese—and other types of abaci and counting frames across Asia was not well documented until around AD 1200. This was when there was an increase in trade across Asia.
Variants of the abaci have remained popular and are still used in parts of Asia and Africa.
How the Abacus Works
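As a rough illustration (not from the original article), the bead-and-rod arithmetic described above can be sketched in a few lines of Python. Each rod of the abacus holds one decimal digit, and carrying a bead to the next rod is exactly the carry step of place-value addition:

```python
# A sketch of abacus-style addition: each rod holds one decimal digit,
# and a carry moves one bead to the next rod to the left.

def abacus_add(a, b, rods=8):
    """Add two non-negative integers digit by digit, like beads on rods."""
    # Represent each number as a list of digits, least-significant rod first.
    rods_a = [(a // 10**i) % 10 for i in range(rods)]
    rods_b = [(b // 10**i) % 10 for i in range(rods)]

    result, carry = [], 0
    for da, db in zip(rods_a, rods_b):
        total = da + db + carry        # slide beads on this rod
        result.append(total % 10)      # beads left showing on the rod
        carry = total // 10            # one bead carried to the next rod
    # Read the rods back as a single number.
    return sum(d * 10**i for i, d in enumerate(result))

print(abacus_add(274, 58))  # 332
```

The same rod-by-rod logic, run in reverse with borrowing, handles subtraction, which is why a skilled abacus user can perform all four basic operations described above.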
Leonardo da Vinci, Charles Babbage, and Alan Turing
The concept of the computing machine was probably hatched by Leonardo da Vinci around 1500 AD. He was fascinated by the meaning of numbers and other computing possibilities. He envisaged a mechanical calculator capable of manipulating abstract and absolute numbers. His dreams never materialized in his lifetime but they remain in the history books.
The 1830s saw the birth of Charles Babbage's design for the Analytical Engine, a general-purpose mechanical computer with an arithmetic logic unit and control flow. These concepts are very much alive in modern computers. While Babbage never completed building any of his machines, a working Difference Engine was built to his designs in 1991.
Babbage's ideas were expanded on in the 1930s by Alan Turing, who envisaged a universal machine that could carry out any computable mathematical task.
The next great leap in computing happened in the 1940s with the creation of working programmable computers. These machines built on earlier theoretical models, most notably the Turing machine, a device designed to manipulate symbols on a strip of tape according to a table of rules.
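A tape-and-rules machine of the kind described above can be simulated in a few lines of Python. The rule table below (a trivial machine that inverts every bit on the tape) is an illustrative example, not a historical design:

```python
# A minimal Turing machine sketch: a tape of symbols, a head position,
# and a table of rules mapping (state, symbol) -> (new symbol, move, new state).
# This example machine simply inverts each bit on the tape, then halts.

RULES = {
    ("scan", "0"): ("1", +1, "scan"),   # write 1, move right, keep scanning
    ("scan", "1"): ("0", +1, "scan"),   # write 0, move right, keep scanning
    ("scan", "_"): ("_", 0, "halt"),    # blank cell: stop
}

def run_turing_machine(tape, rules=RULES, state="scan"):
    cells = list(tape) + ["_"]          # "_" marks the blank end of the tape
    head = 0
    while state != "halt":
        symbol, move, state = rules[(state, cells[head])]
        cells[head] = symbol
        head += move
    return "".join(cells).rstrip("_")

print(run_turing_machine("1011"))  # 0100
```

Despite its simplicity, this table-of-rules scheme is powerful enough, in principle, to express any algorithm, which is what makes the Turing machine the yardstick for "Turing completeness" mentioned later in this article.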
The first digital and programmable computer was manufactured in 1941. It was called the Z3 and was designed by Konrad Zuse. It was a mammoth electromechanical system that could be operated by a team of hands-on experts. It was used for the statistical analysis of wing flutter by the German Aircraft Research Institute.
The Z3 was also later shown to be Turing complete in principle, since it implemented the concepts Turing had described.
This machine was followed by a flurry of equally huge machines which occupied whole rooms. Popular ones were:
- Z4 in 1944.
- Colossus in 1944.
- ENIAC in 1946.
- UNIVAC I in 1951.
- IBM 702 in 1955.
- UNIVAC 1108 in 1964.
The ENIAC (Electronic Numerical Integrator and Computer) was also a Turing-complete machine. It was used by the US Army Ballistic Research Lab to study the feasibility of thermonuclear weaponry, the firing of ballistic artillery, and engine thermal ignition. It was also used for weather predictions.
Its processing abilities were immense and much faster than electromechanical computers designed earlier.
The ENIAC was designed and made by John Mauchly and J. Presper Eckert from the University of Pennsylvania.
It weighed 30 tons and covered about 1,800 square feet.
In the 1960s and 70s, the concept of computers that could be used by individuals was developing roots.
Sooner rather than later, the likes of the ENIAC were miniaturized to fit on top of desks in the 1980s; this was the dawn of the personal computer. By the start of the 21st century, even smaller devices that fit in handbags and pockets had arrived.
The Microprocessor Generation
The invention of the personal computer was made possible by two technical innovations in the field of microelectronics.
- In 1959, the integrated circuit (IC) was developed.
- In 1971, the microprocessor was unveiled to the computing world.
The integrated circuit was a great innovation because it led to the miniaturization of computer circuitry: entire circuit boards were reduced to single silicon chips that fit in the palm of a hand.
In 1971, a team of engineers working for Intel Corporation (Ted Hoff, Federico Faggin, and Stan Mazor) invented the microprocessor, calling it the Intel 4004.
The new chip boasted an equivalent of 2300 transistors on a single silicon chip. This made it possible to reduce computer sizes and components.
The Altair 8800 and Birth of Microsoft
In 1975, a company called Micro Instrumentation Telemetry Systems (MITS) produced the first desktop-size system specifically for personal use. It was called the Altair 8800. This device was not necessarily the best example of a personal computer, since interacting with it demanded a high level of technical sophistication: it was programmed by flipping toggle switches on its front panel.
The birth of the Altair sent a number of computer geeks and enthusiasts into a flurry. Most of them wanted to become part of teams that would be counted in computer history as pioneers of the first personal computers.
While this period was marked with hardware activity, the software industry was also beginning to take shape and there was obvious enthusiasm.
The demand for personal computers meant that someone had to create software to accompany the hardware. Bill Gates and Paul Allen were such enthusiasts.
The two young men offered to write software for the newly made Altair. The original machine could only be programmed in raw machine code; Gates and Allen believed the computer would be far more user-friendly if users could program it with a BASIC interpreter.

The management of MITS agreed, and after almost two months the new software was running on the Altair.
Later that same year, in 1975, Gates and Allen formed a software company called Microsoft and began writing software for new computers.
Demand for personal computers rose immediately, propelling innovation in new hardware and software.
Apple II and the Birth of Apple Computer
In 1976, Steve Jobs and Stephen Wozniak designed a homemade wooden computer with an integrated microprocessor. They named it the Apple I. It was targeted at a growing market of computer enthusiasts.
Apple I featured an integrated motherboard circuit and a handmade wooden casing.
Pushed on by the excitement of this innovation, the two started a computer manufacturing company, Apple Computer, in 1977.
In June 1977, they came out with a brand new personal computer called the Apple II. It featured a system-integrated keyboard, color graphics output, and sound; a monitor and floppy disk drives (the Disk II, released in 1978) were offered separately.
The computer ran an 8-bit MOS 6502 microprocessor at a speed of 1 MHz. It had 4 kilobytes of memory.
Apple Computers went on to become the fastest-growing company in the USA at the time. Its rapid growth quickly inspired a large number of microcomputer manufacturers to churn out their own brands.
By the time the Apple II was phased out around 1993, it had sold over five million units in the US alone, a record in itself.
The new design definitely set the pace for the next line of computers. In one way or another, it set the tone for what the future computer was going to look like. The cue was picked up by many companies; most notably by Tandy Corporation and IBM.
TRS-80 and Computer Storage
Tandy Corporation, selling through its RadioShack stores, introduced the TRS-80 on August 3, 1977.
The company soon became one of the leading PC sellers. It dominated the field because of two attractive features it included in its computers: a keyboard and a cathode-ray tube (CRT) monitor.
The TRS-80 was also popular because it could be programmed and the user was able to store data on cassette tapes.
The IBM PC
The first truly successful personal computer was the IBM PC, which was launched in 1981. What made this machine special and different from Apple II and other models was the simple fact that it was built from a combination of off-the-shelf parts. It came complete with a separate monitor, keyboard, and system unit.
This innovation allowed different manufacturers to produce compatible computer parts that could be assembled into complete systems. The standardized components included keyboards, disk drives, floppy disks, data cables, power cables, and monitors.
The IBM PC used a 16-bit microprocessor. It contained a 4.77-MHz Intel 8088 processor, 64 kilobytes of random access memory, a 5.25-inch floppy drive, a cassette tape interface for storage, and the MS-DOS 1.0 operating system from Microsoft.
This computer initiated the development of faster and more powerful microprocessors. The use of an operating system by Microsoft propelled the PC industry like never before.
Ever since this machine was released, almost all new PCs have remained IBM compatible and able to run Microsoft software.
Smartphones and Tablets
Computers have gone through fundamental changes in both speed and size in a period shorter than 70 years.
When personal computers finally arrived, they offered speed without taking up an entire room.
The reign of desktops and laptops was at its peak during the start of the 21st century. From the highest office in Washington to the smallest hut in an African village, computers had become commonplace.
Then came the mobile phone, which was immediately followed by the tablet computer. This was also the period where the internet was becoming a part of daily life across the globe.
The introduction of the iPhone in 2007 changed the face of computing. Everyone could now carry a computer in the palm of their hand, and business tasks could be accomplished on the road.
PC manufacturers took the cue from this market change, and a flurry of smartphones flooded the market.
Then came the iPad in 2010. Tablets offer a much bigger screen, making it possible to accomplish tasks that could be done on a traditional desktop computer.
Once again, computer manufacturers flooded the market with tablets.
A new phenomenon also emerged: the phablet, a device larger than a typical smartphone but smaller than a tablet, with screens averaging between roughly five and seven inches.
The proliferation of smartphones and other mobile gadgets has forced many leading companies to scale back production of traditional desktop and laptop computers in favor of mobile devices.
This article is accurate and true to the best of the author’s knowledge. Content is for informational or entertainment purposes only and does not substitute for personal counsel or professional advice in business, financial, legal, or technical matters.
© 2012 Alfred Amuno
Alfred Amuno (author) from Kampala on July 12, 2012:
Lucky you Dave, because that must be a prized item. I remember Apple selling an original Apple I board at $374500. Thanks for dropping by.
Davesworld from Cottage Grove, MN 55016 on July 12, 2012:
I still have a working Apple ][ with serial number 1735 (I think) I know it's under 2000. Neener neener.
Alfred Amuno (author) from Kampala on July 12, 2012:
I personally used Pentium 1 to learn computers. Looks like long time back. The feel and look of the computer has changed a great deal. Thanks for reading the hub Linda.
Linda from Texas on July 12, 2012:
I recall several years ago I was working at a newspaper in a small town. They had a dictionary that was over 100 years old, it was HUGE! One day I looked up the word computer and it said simply 'one who computes.'
Your hub is very interesting and informative, I love the photos of the old computers, reminds me of the data processing machines we had in high school, they were the size of a desk. And I just told my age, didn't I?
Thanks for posting!