History of Computers
A computer is an electronic machine that accepts information, stores it, processes it according to the instructions provided by a user and then returns the result. Today, we take computers for granted, and they have become part of our everyday activities. While computers as we know them today are relatively recent, the concepts and ideas behind computers have quite a bit of history - time for a whirlwind tour of how we got to the age of email, YouTube and Facebook.
Early Computing Devices
Attempts by humans to develop tools to manipulate data go back thousands of years; the abacus, whose earliest forms are traditionally dated to around 2600 BC, is among the oldest examples. The slide rule was invented in 1621 and remained widely used until the emergence of electronic calculators in the 1970s. Both of these early devices were mechanical and operated on a human scale.
In the 1830s the English mathematician Charles Babbage conceived the Analytical Engine, a mechanical computer that could be programmed with punched cards to carry out calculations. It differed from its predecessors because it supported sequential control, branching and looping, which allowed it to make decisions based on its own computations. Almost all computers in use today follow this basic idea laid out by Babbage, which is why he is often referred to as 'the father of computers.' The Analytical Engine was so complex that Babbage was never able to build a working model, and it has never been completed. The Science Museum in London did, however, build a working version of his simpler Difference Engine No. 2 in 1991, more than 140 years after it was designed.
The First Electronic Computers
Many different types of mechanical devices followed that built on the idea of the analytical engine. The first freely programmable computers were developed by Konrad Zuse in Germany between 1935 and 1941. His Z3, completed in 1941 and built from electromechanical relays, was the first working, programmable and fully automatic digital computer. The original was destroyed in World War II, but a replica has been built by the Deutsches Museum in Munich. Because his machines implemented many of the concepts we still use in modern-day computers, Zuse is often regarded as the 'inventor of the computer.'
Around the same time, the British built the Colossus computer to break encrypted German messages for the war effort, and the Americans built the Electronic Numerical Integrator and Computer, or ENIAC. Built between 1943 and 1945, ENIAC weighed 30 tons and was 100 feet long and eight feet high. Both Colossus and ENIAC relied heavily on vacuum tubes, which act as electronic switches that can be turned on and off much faster than the mechanical switches used until then. Computer systems using vacuum tubes are considered the first generation of computers.
Vacuum tubes, however, consume massive amounts of energy and give off enormous heat, turning a computer room into an oven. The concept of a semiconductor transistor was patented as early as 1926, but it was not until 1947 that a reliable, solid-state transistor suitable for use in computers was built. Like a vacuum tube, a transistor controls the flow of electricity, but it is only a few millimeters in size and generates little heat. Computer systems using transistors are considered the second generation of computers.
It took a few years for transistor technology to mature. In 1954 IBM introduced the 650, the first mass-produced computer, though it still relied on vacuum tubes; fully transistorized machines followed later in the decade. Today's computers still use transistors, although they are much smaller. By 1958 it became possible to combine several components, including transistors, and the circuitry connecting them on a single piece of semiconductor material. This was the first integrated circuit. Computer systems using integrated circuits are considered the third generation of computers. Integrated circuits led to the computer processors we use today.
Computers quickly became more powerful. By 1971 it became possible to squeeze all the integrated circuits that make up a computer's central processor onto a single chip, called a microprocessor; the first commercially available one was the Intel 4004. Computer systems using microprocessors are considered the fourth generation of computers.
In the early 1970s computers were still mostly used by larger corporations, government agencies and universities. The first device that could be called a personal computer was introduced in 1975. The Altair 8800 was made by Micro Instrumentation and Telemetry Systems. It included an Intel 8080 processor and 256 bytes of memory. There was no keyboard, and instead programs and data were entered using switches. There was no monitor, and instead results were read by interpreting a pattern of small red lights.
This computer was mostly used by hobbyists and hackers who would take the devices apart and build their own. Two of these hackers, Steve Jobs and Steve Wozniak, created a personal computer in 1976 that could be hooked up to a keyboard and a video display. They called it the Apple I. Around the same time, two other hackers, Bill Gates and Paul Allen, started to develop software for the Altair 8800 and founded Microsoft.
Other computer systems were developed by Radio Shack and Commodore. The next breakthrough came in 1981 when IBM decided to create the personal computer, or PC. Made mostly from already existing components, the IBM PC became a mass-market product in part because the design was made available to other manufacturers. The operating system of the IBM PC was MS-DOS by Microsoft.
The 1980s saw a rapid growth in the use of computer systems. By the mid-1980s, both Apple and Microsoft released operating systems with a graphical user interface. This is when personal computers started to look a lot more like the devices we use today. Since then there have been numerous technological advances and computers have become easier to use, more robust and much faster, but the fundamentals of how personal computers work were developed in this period, from around 1970 to 1985.
Trend in Processing Power
One key development in computing has been the growth in processing power. Modern computers use transistors, which are microscopic switches that control the flow of electricity. Each switch can be turned on or off, which makes it possible to store binary information. More transistors means more information can be stored and processed. The capability of computers has grown from machines built with a handful of transistors in the 1950s to processors containing billions of transistors today.
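To make the binary idea concrete, here is a minimal sketch in Python (the lesson itself contains no code, so the language is our choice) showing how a group of on/off switches stores information: a group of n switches can represent 2^n distinct values.

```python
# Each transistor acts as a switch that is either on (1) or off (0).
# A group of n such switches (bits) can represent 2**n distinct values,
# so adding transistors grows capacity exponentially.
for n in (1, 8, 16, 32):
    print(f"{n} switch(es) can represent {2**n:,} distinct values")

# For example, the eight switches 01000001 encode the number 65,
# which is the ASCII code for the letter 'A'.
bits = "01000001"
print(int(bits, 2))       # 65
print(chr(int(bits, 2)))  # A
```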
This trend is known as Moore's law. Gordon Moore, one of the co-founders of Intel, observed that the number of transistors on integrated circuits doubles approximately every two years. He first described the trend in 1965 and refined the doubling period to two years in 1975, and so far his prediction has proven remarkably accurate.
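As a rough illustration, the doubling can be written as count(year) = count(base year) × 2^((year − base year) / 2). The Python sketch below projects transistor counts starting from the Intel 4004, which had roughly 2,300 transistors in 1971; the smooth doubling is an idealization for teaching purposes, not a model of any actual product line.

```python
def transistor_count(year: int, base_year: int = 1971, base_count: int = 2300) -> float:
    """Project a transistor count using Moore's law (doubling every two years).

    Defaults start from the Intel 4004, which had roughly 2,300
    transistors when it was released in 1971.
    """
    return base_count * 2 ** ((year - base_year) / 2)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(year, f"~{transistor_count(year):,.0f} transistors")
```

Real processors only loosely track these exact numbers, but the exponential shape of the curve matches the industry's history well.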
While the conceptual idea behind the computer was developed in the 19th century, the first electronic computers were built in the 1940s. Early computers used mechanical relays and vacuum tubes, which were replaced by transistors and later by integrated circuits, which in turn led to the microprocessors we use today. Personal computers were introduced in the 1970s, and companies like Apple, IBM and Microsoft developed the hardware and software that shaped the modern computer. The processing power of computers has grown in line with Moore's law, with transistor counts doubling approximately every two years.
After completing this lesson, you should be able to:
- Recall the early history of computing with machines starting with the abacus
- Explain the creation of the first electronic computers for the war effort (1940s)
- Identify the changes made by mavericks like Jobs, Wozniak, Allen and Gates
- Describe the advances made possible by microprocessors