
Larry’s Computer Classes


Learn Computers with Videos

Beginner Video Classes

Basic computer skills and settings

Intermediate Video Classes

Microsoft Word, PowerPoint and Excel

Advanced Video Classes

Create and design your own website

Click for Videos

Computer History

Previous tutorials are listed in the left column.

The History of Computers

A computer is a device that can be programmed to carry out a finite set of arithmetic or logical operations. The first electronic digital computers were developed in the 1940s in the United Kingdom and the United States. The earliest machines were the size of a large room, consumed as much power as several hundred modern personal computers, and were used mainly by the military. The history of the modern computer begins with two separate technologies: automated calculation and programmability.

In 1642 Blaise Pascal invented the mechanical calculator, a device that could perform the four basic arithmetic operations. In 1801 Joseph Marie Jacquard improved the textile loom by introducing a series of punched cards: the cards were fed into the loom and directed it to weave different patterns. This was an important step in the development of computers, because the use of punched cards can be viewed as an early form of programmability.

In 1837 Charles Babbage was the first to conceptualize and design a fully programmable mechanical computer, which he called his Analytical Engine. Limited finances prevented him from completing it, but his son, Henry Babbage, completed a simplified version of the Analytical Engine's computing unit in 1888. In the late 1880s Herman Hollerith invented the recording of data on a machine-readable medium. After some initial trials with paper tape, he settled on punched cards, and to process them he invented the tabulator and the keypunch machine. Together with the punched card, these inventions were the foundation of the modern information-processing industry. Large-scale automated data processing of punched cards was performed for the 1890 United States Census by Hollerith's company, which later became the core of IBM. By the end of the 19th century a number of enabling ideas and technologies had begun to appear: Boolean algebra, the vacuum tube (thermionic valve), punched cards and tape, and the teleprinter.

In 1937, while working for Bell Labs, George Stibitz invented and built a relay-based calculator he named the Model K. It was the first machine to use binary circuits to perform an arithmetic operation, making it the first digital calculator.
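To make the idea of "binary circuits performing arithmetic" concrete, here is a minimal sketch in Python of the logic a relay adder like the Model K implemented: a one-bit full adder, chained into a simple ripple-carry adder. The function names are ours, for illustration only.

    def full_adder(a, b, carry_in):
        """One-bit binary full adder: the kind of logic relay circuits implemented.

        Each argument is 0 or 1; returns (sum_bit, carry_out).
        """
        sum_bit = a ^ b ^ carry_in                  # XOR gives the sum bit
        carry_out = (a & b) | (carry_in & (a ^ b))  # carry when two or more inputs are 1
        return sum_bit, carry_out

    def add_binary(x, y, width=8):
        """Add two integers by chaining full adders bit by bit (ripple carry)."""
        result, carry = 0, 0
        for i in range(width):
            s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= s << i
        return result

    print(add_binary(0b0101, 0b0011))  # 5 + 3 prints 8

Each full adder handles one binary digit and passes its carry to the next, which is exactly how chains of relays (and later transistors) add numbers.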

Computers using vacuum tubes remained in use throughout the 1950s, but by the 1960s tubes had been largely replaced by transistor-based machines, which were smaller, faster, cheaper to produce, more reliable, and required less power. The first transistorized computer was demonstrated at the University of Manchester in 1953. In the 1970s, integrated-circuit technology and the microprocessor transformed computing again.

In 1951 the Lyons Electronic Office (LEO) became the first computer to run a regular routine office job. The UNIVAC was the first commercial computer developed in the U.S.; its first unit was delivered to the U.S. Census Bureau. It was also the first mass-produced computer, selling more than 45 units.

The IBM 701, another early commercial machine, was the first mainframe computer produced by IBM, and the IBM 650 became popular because of its smaller size. Later, three microprocessor designs came out at about the same time: Intel's 4004, Texas Instruments' TMS 1000, and Garrett AiResearch's Central Air Data Computer.

The first processors were 4-bit, but 8-bit models came out in 1972. 16-bit models were produced in 1973, and 32-bit models soon followed. AT&T Bell Labs created the first fully 32-bit single-chip microprocessor, which used 32-bit buses, 32-bit data paths, and 32-bit addresses. The first 64-bit microprocessors were in use in the early 1990s in some markets, though they didn't appear in the PC market until the early 2000s.

The first personal computers were built in the early 1970s. Most were limited-production machines based on small-scale integrated circuits and multi-chip CPUs. The Altair 8800 was the first popular computer to use a single-chip microprocessor. It was also sold as a kit, meaning purchasers had to assemble the computer themselves. 1977 saw the Commodore PET, the Apple II, and Tandy Corporation's TRS-80; these three models eventually went on to sell millions. These early PCs had between 4 KB and 48 KB of RAM. The Apple II was the only one with a full-color, graphics-capable display, and it eventually became the best seller, with more than 4 million units sold.

The first laptop built was the Osborne 1, in 1981. It had a 5-inch display and was large and heavy. The first laptop with a flip-top design was produced in 1982, but the first portable computer actually marketed as a "laptop" was the Gavilan SC in 1983. Laptops grew in popularity as they became smaller and lighter. By 1988 displays had reached VGA resolution, and by 1993 they had 256-color screens.

The first mass-produced netbook was the Asus Eee PC 700, released in Asia in 2007 and in the US soon afterward. Other manufacturers quickly followed suit, releasing additional models throughout 2008 and 2009.

Mobile computing is one of the most recent major milestones in the history of computers. It got its start in the 1980s with pocket PCs, which largely fell out of favor by the 1990s and were eventually replaced by smartphones. Smartphones have truly revolutionized mobile computing: most basic computing tasks, such as email, browsing the internet, and uploading photos and videos, can now be done on a phone.

How does a computer multitask? A computer with one processor can run several programs at once. This is done with an interrupt signal, which periodically tells the processor to stop executing its current instructions and do something else. Because the computer remembers what it was executing before the interrupt, it can return to that task later. If several programs are running at the same time, the interrupt generator may fire several hundred times per second. This method of multitasking is called time-sharing.
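The scheduling idea is easy to model in code. The sketch below is a simplified round-robin time-sharing loop in Python (all names are ours, for illustration): each "program" is a generator, and the scheduler "interrupts" it after a fixed time slice, saving its state so it can resume later.

    from collections import deque

    def program(name, steps):
        """A toy program: each yield marks an instruction boundary
        where an interrupt could suspend it."""
        for i in range(1, steps + 1):
            print(f"{name}: step {i}")
            yield  # the scheduler regains control here

    def scheduler(programs, time_slice=2):
        """Round-robin time-sharing: run each program for time_slice
        steps, then interrupt it and move on to the next one."""
        ready = deque(programs)
        while ready:
            current = ready.popleft()
            try:
                for _ in range(time_slice):
                    next(current)      # execute one "instruction"
            except StopIteration:
                continue               # program finished; drop it
            ready.append(current)      # interrupted: save state, requeue

    scheduler([program("A", 5), program("B", 3), program("C", 4)])

Running this prints the steps of A, B, and C interleaved two at a time, which is the effect time-sharing produces: each program makes steady progress even though only one runs at any instant.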

Computers have been transferring information between each other from multiple locations since the 1950s. In the 1970s, computer engineers at research institutions throughout the USA began to link their computers together using telecommunications technology. In time the network spread and became known as the Internet. In the 1990s, the spread of applications like e-mail and the World Wide Web, combined with the development of cheaper, faster networking technologies like Ethernet cards, saw computer networking grow exponentially. Wireless networking, often using mobile phone networks, means that networking now extends even to mobile computing environments.

THANK YOU

For other articles, see the previous tutorials.