The History and Development of Computers

A computer is a programmable electronic machine that performs very high-speed operations and assembles, stores, correlates, and processes information. The root word of computer, compute, comes from the Latin word computare, which means “to reckon” or “to count”. Therefore, in the normal sense of the word, compute simply means to count or calculate, and computer means a person who counts or calculates, or a device that handles calculations, i.e., a calculator. However, we all know that the word computer has now, more or less, come to represent the electronic computer system.

The history of computers is quite an interesting study. It is generally agreed that the abacus is probably the world’s first manual computer or calculator. It was first used by the Chinese more than 2,500 years ago and later by the Romans and Greeks. A typical Chinese abacus has columns of beads separated by a crossbar. The beads, which represent numbers, are strung on wires or narrow wooden rods attached to the frame. Each column has two beads above the crossbar and five below it. Each upper bead represents five units, and each lower bead equals one unit. It can be used to add, subtract, multiply, and divide, and to calculate square roots and cube roots.
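The bead arithmetic described above can be sketched in a few lines of code. This is only an illustration (the function names are mine, not standard terminology): each column encodes one decimal digit, with upper beads worth five and lower beads worth one.

```python
# Illustrative sketch of how an abacus column encodes a decimal digit:
# each upper bead moved to the crossbar counts 5, each lower bead counts 1.
def column_value(upper_beads: int, lower_beads: int) -> int:
    """Value of one column from the beads moved toward the crossbar."""
    return 5 * upper_beads + 1 * lower_beads

def abacus_value(columns: list[tuple[int, int]]) -> int:
    """Whole abacus, columns listed from most to least significant."""
    total = 0
    for upper, lower in columns:
        total = total * 10 + column_value(upper, lower)
    return total

# The number 172: digit 1 = one lower bead; digit 7 = one upper bead
# plus two lower beads; digit 2 = two lower beads.
print(abacus_value([(0, 1), (1, 2), (0, 2)]))  # → 172
```

Since a column has two upper and five lower beads, it can show every value from 0 through 15, though only 0 through 9 are needed for ordinary decimal counting.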

However, true automatic computing machines were unknown until the 1600’s. In 1642, Blaise Pascal, the famous French mathematician, invented the first automatic calculator. The device performed additions and subtractions by means of a set of wheels linked to each other by gears. The first wheel represented the units, the second wheel represented the tens, the third stood for the hundreds, and so on. When the first wheel was turned ten notches, a gear moved the second wheel forward a single notch. The other wheels became engaged in a similar manner.
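The wheel-and-gear mechanism just described is, in modern terms, carry propagation. A rough sketch (the function name and representation are my own, not Pascal’s) of how a set of decimal wheels carries from one position to the next:

```python
# Sketch of Pascal-style carry: each wheel holds a digit 0-9, and a full
# revolution past 9 advances the next wheel by one notch.
def add_on_wheels(wheels: list[int], addend: int) -> list[int]:
    """wheels[0] is the units wheel, wheels[1] the tens, and so on."""
    wheels = wheels.copy()
    carry = addend
    i = 0
    while carry and i < len(wheels):
        total = wheels[i] + carry
        wheels[i] = total % 10   # position of this wheel after turning
        carry = total // 10      # a complete revolution nudges the next wheel
        i += 1
    return wheels

# 097 + 5 = 102: the units wheel wraps, carrying into the tens,
# which also wraps, carrying into the hundreds.
print(add_on_wheels([7, 9, 0], 5))  # → [2, 0, 1]
```

A carry off the last wheel is simply lost, much as a mechanical register overflows when its highest wheel wraps around.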

Later, in the 1670’s, the German mathematician Gottfried Wilhelm Leibniz improved on Pascal’s work. Leibniz added gear and wheel arrangements that made multiplication and division possible. He also sought a counting system that would be easier for a machine to handle than the decimal system, and developed the binary numeration system.
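Leibniz’s binary system writes every number using only the digits 0 and 1, where each position represents a power of two rather than a power of ten. A short sketch of the conversion:

```python
# Convert a decimal number to its binary representation by repeatedly
# dividing by two and collecting the remainders.
def to_binary(n: int) -> str:
    if n == 0:
        return "0"
    bits = ""
    while n:
        bits = str(n % 2) + bits  # remainder is the next binary digit
        n //= 2
    return bits

print(to_binary(13))  # → "1101", i.e. 8 + 4 + 0 + 1
```

Because each position can hold only one of two values, a machine needs only two distinguishable states per position, which is precisely what made the system attractive for mechanical, and later electronic, computation.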

An important contribution to the development of binary mathematics was made in the mid-1800’s by George Boole, an English logician and mathematician. Boole used the binary system to invent a new type of mathematics. Boolean algebra and Boolean logic perform complex mathematical and logical operations on the digits 0 and 1. Thus, a mechanical representation of binary mathematics would require the representation of only those two digits. This advance shaped the development of computer logic and computer languages.
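Boole’s operations on 0 and 1 can be shown directly. In this small illustration (uppercase names are used only because `and`, `or`, and `not` are reserved words in Python), AND behaves like multiplication, OR like capped addition, and NOT like subtraction from 1:

```python
# Boolean operations on the digits 0 and 1.
def AND(a: int, b: int) -> int:
    return a & b      # 1 only when both inputs are 1

def OR(a: int, b: int) -> int:
    return a | b      # 1 when either input is 1

def NOT(a: int) -> int:
    return 1 - a      # flips 0 to 1 and 1 to 0

# Truth table for the expression a AND (NOT b):
for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, NOT(b)))
```

Every digital circuit in a modern computer is ultimately built from gates that compute exactly these functions on the two digits.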

Joseph Marie Jacquard, a French weaver, made the next great contribution to the development of the computer. In 1801, he invented the Jacquard loom, which used punched cards to automate the weaving process for the first time. The cards had patterns of holes punched in them and were placed between the rising needles and the thread. The presence or absence of a hole could be compared to the two digits of the binary system. Where there were holes, the needles rose and met the thread. Where there were no holes, the needles were blocked. By changing cards and alternating the patterns of punched holes, it became possible to mechanically create complex woven patterns.
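The hole-or-no-hole logic of a card row maps naturally onto binary digits, as this small sketch shows (the representation is my own illustration, not a model of any actual loom):

```python
# One row of a punched card as a row of binary digits:
# a hole (1) lets the needle rise and catch the thread; no hole (0) blocks it.
card_row = [1, 0, 0, 1, 1, 0, 1, 0]

needles = ["raised" if hole else "blocked" for hole in card_row]
print(needles)
# → ['raised', 'blocked', 'blocked', 'raised',
#    'raised', 'blocked', 'raised', 'blocked']
```

Swapping in a different card row changes the needle pattern, just as swapping cards on the loom changed the woven design.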

The Jacquard loom inspired English mathematician Charles Babbage. During the 1830’s, Babbage developed the idea of a mechanical computer that he called an analytical engine. He worked on the machine for almost 40 years. When performing complex computations or a series of calculations, the analytical engine would store completed sets of punched cards for use in later operations. Babbage’s analytical engine contained all the basic elements of an automatic computer: storage, working memory, a system for moving between the two, and an input device. But Babbage lacked funding to build the machine.

In 1888, American inventor and businessman Herman Hollerith devised a punched card system, including the punching equipment, for tabulating the results of the United States census. Hollerith’s machines used electrically charged nails that, when passed through a hole punched in a card, created a circuit. The circuits registered on another part of the machine, where they were read and recorded. Hollerith’s machines tabulated the results of the 1890 census, making it the fastest and most economical census up to that date. In a single day, 56 of these machines could tabulate more census information than could 6 million people together!
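In essence, each closed circuit incremented a counter for one census category. A hedged sketch of that tabulating idea (the category names and card format here are purely illustrative):

```python
from collections import Counter

# Each card is modelled as the list of categories punched into it;
# a hole closing a circuit bumps the counter for that category.
def tabulate(cards: list[list[str]]) -> Counter:
    counts = Counter()
    for holes in cards:
        for category in holes:
            counts[category] += 1
    return counts

cards = [["male", "farmer"], ["female"], ["male"]]
totals = tabulate(cards)
print(totals["male"], totals["farmer"], totals["female"])
```

The speed of the 1890 census came from exactly this pattern: reading a card and updating counters took a machine a fraction of the time a clerk needed to do the same bookkeeping by hand.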

Governments, institutions, and industries found uses for Hollerith’s machine. In 1896, Hollerith founded the Tabulating Machine Company and continued to improve his machines. In 1911, he sold his shares in the company; its name was changed to the Computing-Tabulating-Recording Company and then, in 1924, to International Business Machines Corporation (IBM).

The first special-purpose electronic digital computer was constructed in 1939 by John V. Atanasoff, an American mathematician and physicist. In 1944, Howard Aiken, a Harvard University professor, built another digital computer, the Mark 1. The operations of this machine were controlled chiefly by electromechanical relays, or switches.

In 1945, J. Presper Eckert, Jr., and John William Mauchly, engineers at the University of Pennsylvania, completed one of the earliest general-purpose electronic digital computers. They called it ENIAC (Electronic Numerical Integrator And Computer). ENIAC contained about 18,000 vacuum tubes instead of relays. The machine occupied more than 140 square meters of floor space and consumed 150 kilowatts of electric power during operation. ENIAC operated about 1,000 times faster than the Mark 1. It could perform about 5,000 additions and 1,000 multiplications per second. ENIAC also could store parts of its programming.

Although ENIAC worked rapidly, programming it took a very long time. Eckert and Mauchly next worked on developing a computer that could store even more of its programming. They worked with John von Neumann, a Hungarian-born American mathematician. Von Neumann helped assemble all available knowledge of how the logic of computers should operate. He also helped outline how stored programming would improve performance. In 1951, a computer based on the work of the three men became operational. It was called EDVAC (Electronic Discrete Variable Automatic Computer). EDVAC strongly influenced the design of later computers.

In 1951, Eckert and Mauchly completed a more advanced computer called UNIVAC (UNIVersal Automatic Computer). Within a few years, UNIVAC became the first commercially successful computer. Unlike earlier computers, UNIVAC handled numbers and alphabetical characters equally well. It also was the first computer system in which the operations of the input and output devices were separated from those of the computing unit. Like ENIAC, UNIVAC also used vacuum tubes.

The first UNIVAC was installed at the U.S. Census Bureau in June 1951. The following year, another UNIVAC was used to tabulate the results of the United States presidential election. Based on available data, UNIVAC accurately predicted the election of President Dwight D. Eisenhower less than 45 minutes after the polls closed.

The invention of the transistor in 1947 led to the miniaturisation and production of faster and more reliable electronic computers. Transistors replaced the bulkier, less reliable vacuum tubes. In 1958, Control Data Corporation introduced the first fully transistorised computer, designed by American engineer Seymour Cray. The following year, IBM introduced its first transistorised computers.

Miniaturisation continued with the development of the integrated circuit (a complete circuit on a single chip) in the early 1960’s. This device enabled engineers to design both minicomputers and high-speed mainframes with large memories.

By the late 1960’s, many large businesses relied on computers. Many companies linked their computers together into networks, enabling remote offices to share information.

Computer technology improved rapidly during the 1960’s and by the early 1970’s, the entire workings of a computer could be placed on a handful of chips. As a result, computers became very much smaller.
