This story was written for an online encyclopedia project during the dotcom boom.
Computers are machines that automatically store and recall information and make calculations. In a sense they’ve been with us for a long time. For example, the ancient Sumerians stored data on clay tablets and the Chinese invented the abacus. However, these days the term ‘computer’ tends to apply to electronic machines that process digital information at very high speeds.
Although computers are relatively new, they have had a profound effect on our lives. Most of our activity depends on computers. You can’t buy a loaf of bread or visit a bank without using a computer. Indeed, the computer business in its many guises is now the world’s largest industry. It wasn’t always that way.
Computers evolved over many years. Charles Babbage, an English inventor, is widely regarded as the grandfather of modern computing. In the 1830s Babbage designed a mechanical Analytical Engine. It incorporated many ideas familiar to modern computer scientists. Although work started on building the Analytical Engine, the project was never finished. In fact, it is doubtful if 19th Century engineering tolerances were stringent enough for the machine to work.
Among Babbage’s collaborators was Lord Byron’s daughter, Augusta Ada King, Countess of Lovelace. Ada Lovelace is generally regarded as the first computer programmer because she wrote notes showing how the engine could be used for various complex mathematical calculations.
At the end of the 19th century, the American government turned to mechanical tabulating machines and punched card readers to cope with measuring its rapidly growing population. This was the start of data processing. In the mid-1920s, one of the companies making punched card equipment changed its name to International Business Machines or IBM. IBM and its rivals developed mechanical accounting machines during the first half of the 20th Century. During the Second World War IBM worked with the US government to develop electronic calculating machines to handle specific mathematical calculations. One early digital machine was the Harvard Mark I.
The main breakthrough of digital computing was to use the binary number system – which can represent any number by using combinations of zero and one (or in electronic circuits, off and on). Simple electronic circuits can process binary numbers at incredibly fast speeds. Moreover, all kinds of information: numbers, text, pictures and music can be quickly converted to and from a binary format.
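The idea above can be demonstrated in a few lines of Python (a minimal sketch; the specific values are chosen purely for illustration). Any number can be written in base 2, and text reduces to numbers (character codes), which in turn reduce to bits:

```python
# Any number can be represented as a combination of zeros and ones.
number = 42
bits = bin(number)       # '0b101010' - the same value in base 2
back = int(bits, 2)      # converting back yields 42 again

# Text reduces to numbers (character codes), and those numbers to bits.
text = "Hi"
codes = [ord(c) for c in text]                 # [72, 105]
as_bits = [format(c, '08b') for c in codes]    # ['01001000', '01101001']

# The round trip restores the original text, showing the conversion
# loses nothing.
restored = ''.join(chr(int(b, 2)) for b in as_bits)
print(restored)  # prints Hi
```

The same principle extends to pictures and music: pixels and audio samples are just more numbers, so they too can be stored and processed as patterns of on and off.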
In 1946, ENIAC – the Electronic Numerical Integrator And Computer – was born. ENIAC’s main claim to fame was that it was a general-purpose machine; it could be altered to perform different types of calculation. In the late 1940s a team at Manchester University in the North of England took this a stage further and built the first programmable computer – that is, one where the type of calculation is determined by a set of commands known as a program.
First generation computers used electronic valves; the second generation in the 1950s were built with transistors. Transistors were smaller and faster, and they didn’t require teams of engineers to constantly replace faulty valves. This made them much cheaper to operate, which meant companies could afford to use them.
By the 1970s computers were being built with integrated circuits or chips. Early chips had dozens of transistors on a single device. From that point on, computer technology continued to get faster, smaller and cheaper as chipmakers crammed more transistors and hence more computer functions onto mass-produced chips.
First and second generation computers were ‘mainframe’ machines; the 1970s saw the birth of minicomputers and, later, microcomputers. In 1981 IBM sold its first desktop microcomputer, which it called the ‘personal computer’ or PC.
Once engineers figured out how to squeeze useable processing power into a desktop package, the portable computer wasn’t far behind. Portable computers evolved from early luggable machines through laptops and notebooks to modern palmtop handheld computers like the Palm Pilot. In some respects, modern mobile phones represent the logical extension of this trend.
Early computers were isolated outposts of processing power, but designers quickly realised that the binary signals used inside computers could be easily transmitted to other devices — hence the birth of computer networking. Early computer networks were mainly developed to speed the flow of data to and from a computer. For example, a remote warehouse might send inventory data to a central office. Over time the communications potential of computer networks became more important.
One of the first new uses of computer networking was electronic mail and file swapping. Originally used by scientists to collaborate on projects, it quickly captured the world’s imagination. The scientists used a communications protocol called TCP/IP (Transmission Control Protocol/Internet Protocol); they called their network the Internet.
Members of the public started accessing the Internet in the late 1980s, in the early 1990s it was commercialised, and by the mid-1990s almost every commercial computer network or data service was linked to the Internet. The World Wide Web, a simple, graphical way of displaying and linking information, drove much of this growth.
Until the late 1980s most people thought of computers as being electronic copies of humans. We talked of electronic brains and machine intelligence. Worriers lost sleep speculating that one day machines might want to replace us. The focus was on processing power. Then, almost overnight, computers went from being a rival species to a place we visited. The Internet explosion of the late 1980s and early 1990s sparked talk of cyberspace and web-surfing.
More importantly, computing moved from its old application and data-centric model to its new communication-centric model. This change has seen the way we use computers change dramatically. Some of today’s computing devices don’t offer much in the way of processing – that’s all handled remotely on big computers – but they are very strong on communications.
Tiny handheld computers using wireless networks are finding business applications as diverse as logistics and technical support – for example, people supporting Coca-Cola equipment use the technology to log customer-reported faults and reorder cans of drink. Email is still the most popular application. Consumer applications include online banking and checking weather, news and sports results.
Just about all these functions can be handled at least as well by suitably equipped mobile phones. The Wireless Application Protocol (WAP) is a set of specifications that lets developers use the Wireless Markup Language (WML) to build networked applications designed for handheld wireless devices. It has been specifically designed to work within the constraints of limited memory and CPU; small, monochrome screens; low bandwidth (i.e. slow data) and erratic connections. WAP is a de facto standard, with wide telecommunications industry support.
At the time of writing the telecommunications industry is developing a new generation of wireless services which can deliver faster and more reliable services so that, for instance, it will be possible to conduct video conferencing using a mobile phone. Another development is the rise of Application Service Providers – a new breed of computer software utility that rents out remotely delivered applications.
All computers, from the humblest processors controlling a kitchen fridge to the large supercomputers being used by rocket scientists share the same four basic components:
Input and Output – the components that control the way information passes between a computer and the rest of the world. Among other possibilities, input might be a keyboard, mouse or touch screen. It can also be voice recognition, an optical scanner, a digital video camera or a data communications link. Output can include voice synthesis, a screen display and a printer.
Processor – sometimes called the central control unit, this is the smart part of a computer’s hardware that orchestrates everything else. The processor pulls instructions from the computer memory in sequence and acts on these instructions, either performing calculations or sending commands and data to other computer components.
Memory – the place where data or program instructions are stored while they are not actually being used by the processor. Modern computers have a number of different types of memory. Cache memory stores small amounts of vital information that is currently in use. Cache is a very fast kind of semiconductor memory and can be located on the same chip as the processor. RAM (random access memory) is a fast semiconductor memory where information that is in use – though not in immediate use – is stored. It tends to be many times larger than cache memory. Disc memory is much slower, but tends to be much larger.
Software – Computer software is the set of sequenced instructions controlling the hardware. The name comes from a joke used in the early days of computing to distinguish programs from hardware; it’s the bit that doesn’t hurt if it drops on your foot.
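The way these components work together – a processor pulling instructions from memory in sequence and acting on each one – can be sketched as a toy simulation in Python. Everything here is invented for illustration: the three-instruction program, the instruction names and the single accumulator register are not any real machine’s design.

```python
# A toy processor. Memory holds a tiny program; the processor fetches
# each instruction in sequence and acts on it.
memory = [
    ("LOAD", 5),     # put 5 in the accumulator
    ("ADD", 3),      # add 3 to the accumulator
    ("STORE", None), # copy the accumulator to the output device
]

accumulator = 0   # the processor's working register
output = None     # stands in for an output component
pc = 0            # program counter: which instruction to fetch next

while pc < len(memory):
    op, arg = memory[pc]   # fetch the next instruction from memory
    if op == "LOAD":       # decode and execute it
        accumulator = arg
    elif op == "ADD":
        accumulator += arg
    elif op == "STORE":
        output = accumulator
    pc += 1                # move on to the next instruction

print(output)  # prints 8
```

Real processors do essentially this, only in hardware, with far richer instruction sets, and billions of times per second.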
Software comes in a variety of forms; the most important are system software and applications software — though the distinction between these two types is becoming increasingly blurred.
System software controls the way a computer manages its internal processes. The best-known type of system software is the operating system. Historically an operating system was simply a suite of programs that managed input and output, and controlled the way application programs were dealt with. Increasingly modern operating systems contain utility programs and components of application software.
Applications software refers to the programs, such as word processors or email clients, that actually perform tasks for the computer user.