History of Computers: Timeline & Evolution

Lesson Transcript
Instructor: Paul Zandbergen

Paul is a GIS professor at Vancouver Island University, has a PhD from the University of British Columbia, and has taught statistics and programming for 15 years.

Modern computing has a rich history. Learn about the earliest computing devices developed by humans, the first electronic computers and the development of modern computer components, such as microprocessors and the graphical user interface, in this video lesson.

History of Computers

A computer is an electronic machine that accepts information, stores it, processes it according to the instructions provided by a user and then returns the result. Today, we take computers for granted, and they have become part of our everyday activities. While computers as we know them today are relatively recent, the concepts and ideas behind computers have quite a bit of history - time for a whirlwind tour of how we got to the age of email, YouTube and Facebook.


Early Computing Devices

Attempts by humans to develop a tool to manipulate data go back as far as 2600 BC, when the Chinese came up with the abacus. The slide rule was invented in 1621 and remained widely used until the emergence of electronic calculators in the 1970s. Both of these early devices were mechanical and operated on a human scale.

In the 1830s the English mathematician Charles Babbage conceived the Analytical Engine, which could be programmed with punched cards to carry out calculations. It differed from its predecessors in that it could make decisions based on its own computations, supporting sequential control, branching and looping. Almost all computers in use today follow this basic design laid out by Babbage, which is why he is often referred to as 'the father of computers.' The Analytical Engine was so complex that Babbage was never able to build a working model of his design; more than 150 years later, the London Science Museum did build his simpler Difference Engine No. 2 from his original plans.

The First Electronic Computers

Many different types of mechanical devices followed that built on the idea of the Analytical Engine. The first freely programmable computers were developed by Konrad Zuse in Germany between 1935 and 1941. His Z3, completed in 1941, used electromechanical relays and was the first working, programmable, fully automatic digital computer. The original was destroyed in World War II, but a replica was built by the Deutsches Museum in Munich. Because his machines implemented many of the concepts we still use in modern-day computers, Zuse is often regarded as the 'inventor of the computer.'

Around the same time, the British built the Colossus computer to break encrypted German codes for the war effort, and the Americans built the Electronic Numerical Integrator and Computer, or ENIAC. Built between 1943 and 1945, ENIAC weighed 30 tons and was 100 feet long and eight feet high. Both Colossus and ENIAC relied heavily on vacuum tubes, which act as electronic switches that can be turned on and off much faster than the mechanical switches used until then. Computer systems using vacuum tubes are considered the first generation of computers.

Vacuum tubes, however, consume massive amounts of energy, turning a computer into an oven. The transistor was first patented in 1926, but it was not until 1947 that researchers at Bell Labs developed it into a reliable, solid-state device suitable for use in computers. Like a vacuum tube, a transistor controls the flow of electricity, but it was only a few millimeters in size and generated little heat. Computer systems using transistors are considered the second generation of computers.

It took a few years for transistor technology to mature. In 1954 IBM introduced the 650, the first mass-produced computer, although it still relied on vacuum tubes; fully transistorized machines followed a few years later. Today's computers still use transistors, although they are much smaller. By 1958 it became possible to combine several components, including transistors, and the circuitry connecting them on a single piece of silicon. This was the first integrated circuit. Computer systems using integrated circuits are considered the third generation of computers. Integrated circuits led to the computer processors we use today.

Personal Computers

Computers quickly became more powerful. By 1971 it became possible to squeeze all the integrated circuits that make up a single computer onto a single chip called a microprocessor; Intel's 4004, released that year, was the first commercially available example. Computer systems using microprocessors are considered the fourth generation of computers.

In the early 1970s computers were still mostly used by large corporations, government agencies and universities. The first device that could be called a personal computer was introduced in 1975: the Altair 8800, made by Micro Instrumentation and Telemetry Systems (MITS). It included an Intel 8080 processor and 256 bytes of memory. There was no keyboard; programs and data were entered using switches. There was no monitor; results were read by interpreting a pattern of small red lights.
