This beginners’ guide to the computer is from a 1999 school book.
Computers are machines that store and recall information and make calculations.
In a sense, computers have been with us for a long time. Ancient Sumerians stored data on clay tablets. The Chinese invented the abacus. These days a computer is an electronic machine that processes digital information at high speeds.
Although computers are still new, they have had a profound effect on our lives. Most of our activity depends on computers. You can’t buy a loaf of bread or visit a bank without using a computer. Indeed, the computer business is now the world’s largest industry. It wasn’t always that way.
Computers evolved over many years. Charles Babbage, an English inventor, is the grandfather of modern computing. In the 1830s Babbage designed a mechanical Analytical Engine. It incorporated many ideas familiar to modern computer scientists. Although work started on building the Analytical Engine, it was never finished. In fact, it is doubtful if 19th Century engineering tolerances were good enough for the machine to work.
Babbage’s collaborator was Lord Byron’s daughter, Augusta Ada King, Countess of Lovelace. Ada Lovelace is regarded as the first computer programmer. She wrote notes showing how the engine could do complex mathematical calculations.
At the end of the 19th century, the American government turned to mechanical tabulating machines and punched card readers to measure its fast-growing population. This was the start of data processing.
In the mid-1920s, one of the companies making punched card equipment changed its name to International Business Machines or IBM. IBM and its rivals developed mechanical accounting machines in the first half of the 20th Century. During the Second World War IBM worked with the US government to develop electronic calculating machines to handle specific mathematical calculations. One early digital machine was the Harvard Mark 1.
Digital computing’s breakthrough was to use the binary number system – which can represent any number by using combinations of zero and one (or in electronic circuits, off and on). Simple electronic circuits can process binary numbers at fast speeds. Moreover, all kinds of information: numbers, text, pictures and music can be converted to and from a binary format.
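As a small sketch of this idea, the short program below (written in Python, a modern programming language) shows a number and a piece of text both reduced to patterns of zeros and ones, and then recovered again. The function names are our own, chosen for illustration.

```python
def to_binary(n):
    """Represent a non-negative whole number as a string of bits (0s and 1s)."""
    return bin(n)[2:]

def from_binary(bits):
    """Recover the whole number from its string of bits."""
    return int(bits, 2)

# A number becomes a pattern of ons and offs...
print(to_binary(42))          # 101010
print(from_binary("101010"))  # 42

# ...and text does too: each character has a numeric code,
# so it can also be turned into bits and back.
message = "Hi"
bits = [to_binary(ord(ch)) for ch in message]
print(bits)                                         # ['1001000', '1101001']
print("".join(chr(from_binary(b)) for b in bits))   # Hi
```

The same trick – assigning numbers to things and writing the numbers in binary – extends to pictures and music, which is why one kind of circuit can handle all of them.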
In 1946, ENIAC – the Electronic Numerical Integrator And Computer – was born. ENIAC’s claim to fame was that it was a general-purpose machine; it could be altered to perform different types of calculation. In the late 1940s, a team at Manchester University in the North of England took this a stage further and built the first stored-program computer – that is, one where the set of commands determining the calculation, known as a program, is held in the machine’s own memory.
First generation computers used electronic valves. The second generation in the 1950s were built with transistors. Transistors were smaller and faster, and they didn’t require teams of engineers to replace faulty valves. This made them much cheaper to run, which meant companies could afford to use them.
By the 1970s computers were built with integrated circuits or chips. Early chips had dozens of transistors on a single device. From that point on, computer technology continued to get faster, smaller and cheaper as chipmakers crammed more transistors and hence more computer functions on mass-produced chips.
First and second generation computers were mainframe machines. The 1970s saw the birth of minicomputers and, later, microcomputers. In 1981 IBM sold its first desktop microcomputer, which it called the ‘personal computer’ or PC.
Once engineers figured out how to squeeze usable processing power into a desktop package, the portable computer wasn’t far behind. Portable computers evolved from early luggable machines through laptops and notebooks to modern palmtop handheld computers like the Palm Pilot. In some respects, modern mobile phones represent the logical extension of this trend.
Early computers were isolated outposts of processing power. Engineers realised they could send the binary signals used inside computers to other devices — hence the birth of computer networking. Early computer networks were developed to speed the flow of data to and from a computer. A remote warehouse might send inventory data to a central office. Over time the communications potential of computer networks became more important.
One of the first new uses of computer networking was electronic mail and file swapping. Originally used by scientists to collaborate on projects, it captured the world’s imagination. The scientists used a communications protocol called TCP/IP (Transmission Control Protocol/Internet Protocol); they called their network the Internet.
Members of the public started accessing the Internet in the late 1980s; in the early 1990s it was commercialised, and by the mid-1990s almost every commercial computer network or data service was linked to the Internet. The World Wide Web, a simple, graphical way of displaying and linking information, drove much of this growth.
Until the late 1980s most people thought of computers as electronic copies of humans. We talked of electronic brains and machine intelligence. Worriers lost sleep speculating that one day machines might want to replace us. The focus was on processing power. Then, almost overnight, computers went from being a rival species to a place we visited. The Internet explosion of the late 1980s and early 1990s sparked talk of cyberspace and web-surfing.
More important, computing moved from its application and data-centric model to a communication-centric model. This change altered the way we use computers. Some of today’s computing devices don’t offer much processing – that’s all handled on big remote computers – but they are strong on communications.
Tiny handheld computers using wireless networks are finding business applications as diverse as logistics and technical support – people supporting Coca-Cola equipment use the technology to log customer-reported faults and reorder cans of drink. Email is still the most popular application. Consumer applications include online banking and checking weather, news and sports results.
All these functions can be handled at least as well by mobile phones. The Wireless Application Protocol (WAP) is a set of specifications that lets developers use the Wireless Markup Language (WML) to build networked applications designed for handheld wireless devices. It works within the constraints of limited memory and CPU; small, monochrome screens; low-bandwidth (i.e. slow data) and erratic connections. WAP is a de facto standard, with wide telecommunications industry support.
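To make the idea of WML concrete, here is a minimal illustrative example of a WML “deck” – the page format WAP phones display. The card title and greeting text are invented for illustration; real services would generate decks like this on a server.

```xml
<?xml version="1.0"?>
<!DOCTYPE wml PUBLIC "-//WAPFORUM//DTD WML 1.1//EN"
  "http://www.wapforum.org/DTD/wml_1.1.xml">
<wml>
  <!-- A deck holds one or more cards; the phone shows one card at a time -->
  <card id="home" title="News">
    <p>Hello from a WAP phone.</p>
  </card>
</wml>
```

Because the whole deck is plain text and each card is tiny, it suits the small screens, limited memory and slow connections described above.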
The telecommunications industry is developing new wireless services that are faster and more reliable, so it will eventually be possible to conduct video conferencing using a mobile phone. Another development is the rise of Application Service Providers – a new breed of company that rents out remotely delivered software applications.
All computers, from the humblest processor to the large supercomputers used by scientists share four basic components:
Input and Output – the components that control how information moves between a computer and the rest of the world. Input can be a keyboard, mouse or touch screen. It can also be voice recognition, an optical scanner, digital camera or a data communications link. Output can include speech, screens or printers.
Processor – sometimes called the central processing unit (CPU), this is the part of a computer that orchestrates everything else.
Processors pull instructions from the computer memory in sequence and act on these instructions. They either perform calculations or send commands and data to other computer components.
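The fetch-and-act cycle just described can be sketched as a toy program (in Python). This is not how any real processor is built – the instruction names LOAD, ADD and PRINT are invented for illustration – but it shows the loop: fetch the next instruction from memory, act on it, move on.

```python
def run(program):
    accumulator = 0      # a single working register holding the current value
    pc = 0               # program counter: which instruction comes next
    while pc < len(program):
        op, arg = program[pc]     # fetch the instruction from memory
        if op == "LOAD":          # place a value in the accumulator
            accumulator = arg
        elif op == "ADD":         # perform a calculation
            accumulator += arg
        elif op == "PRINT":       # send data to an output component
            print(accumulator)
        pc += 1                   # move on to the next instruction in sequence
    return accumulator

run([("LOAD", 2), ("ADD", 3), ("PRINT", None)])  # prints 5
```

Real processors do the same thing in electronic circuitry, millions of times a second, with a much richer set of instructions.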
Memory – the place where data or program instructions are stored while they are not being used by the processor. Modern computers have different types of memory. Cache memory stores small amounts of vital information that is currently in use. Cache is fast semiconductor memory and can be located on the same chip as the processor. RAM (random access memory) is a fast semiconductor memory where information in use – but not in immediate use – is stored. It tends to be many times larger than cache memory. Disc memory is slow, but tends to be larger again.
Software – Computer software is the set of sequenced instructions controlling the hardware. The name comes from a joke used in the early days of computing to distinguish programs from hardware; it’s the bit that doesn’t hurt if it drops on your foot.
Software comes in a variety of forms; the most important are system software and applications software — though the distinction between these two types is becoming blurred.
System software controls the way a computer manages its internal processes. The best-known type of system software is the operating system. Historically an operating system was simply a suite of programs that managed input and output, and controlled the way application programs were dealt with. Increasingly modern operating systems contain utility programs and components of application software.
Applications software refers to programs, such as word processors or email clients, that perform tasks for the computer user.