Difference Between Analog and Digital Computers

Main Difference – Analog vs. Digital Computer

Analog and digital are two different types of computers. Historically, analog computers were widely used, but today they are much less common. The main difference between analog and digital computers is that analog computers work with continuous signals, while digital computers can only work with discrete signals.

What are Analog Computers

Technically speaking, the term analog computer can refer to a wide range of devices. Even slide rules, used for calculations decades ago, can be considered a type of analog computer. The characteristic feature of an analog computer is that it works with analog signals: signals that can take any value along a continuous scale.

Central to the electronic analog computer was a device known as the operational amplifier. Operational amplifiers can perform mathematical operations by producing an output voltage that depends on the input voltages supplied to them. Theoretically, analog computers could be infinitely precise, but this is impossible to achieve in practice. Electronic analog computers typically consisted of several dials and a number of connectors. Depending on the application, results could be read off using an oscilloscope or electronic meters such as voltmeters.
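As a rough illustration of the kind of operation an op-amp can perform (not a description of any particular machine), an ideal op-amp integrator with input resistor R and feedback capacitor C produces an output proportional to the running integral of its input: V_out(t) = -(1/RC) ∫ V_in dt. The short Python sketch below mimics that behaviour numerically; the component values and the input signal are arbitrary assumptions chosen only for illustration.

import numpy as np

# Hypothetical component values, chosen only so that RC = 1 s
R = 1.0e6                    # input resistor, ohms
C = 1.0e-6                   # feedback capacitor, farads

dt = 1.0e-3                  # time step, seconds
t = np.arange(0.0, 2.0, dt)  # two seconds of "computation"
v_in = np.ones_like(t)       # constant 1 V input

# Ideal integrator: V_out(t) = -(1/RC) * running integral of V_in
v_out = -(1.0 / (R * C)) * np.cumsum(v_in) * dt

print(v_out[-1])             # about -2.0 V: minus the integral of 1 V over 2 s

In a real analog computer, chaining integrators, summers and coefficient dials of this kind is what allowed differential equations to be solved directly in hardware.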

An analog computer in use (1968)

When it comes to modeling complex phenomena, analog computers have an advantage. However, building complex analog computers is a difficult and expensive process. Analog computers were widely used during the 1950s and 1960s, but the advent of integrated circuits meant that many digital components could be packed into smaller spaces. Digital computers became cheaper, more powerful and less power-hungry, and they allowed users to obtain results more precisely. Today, the overwhelming majority of computers in use are digital.

What are Digital Computers

Digital computers (i.e. practically anything that is called a “computer” in modern usage) cannot handle analog data. If you perform a calculation on a digital computer, it may seem to be working with analog data, but this is merely an illusion. In reality, digital computers can deal only with discrete signals. All signals supplied to the computer need to be converted into a series of 1’s and 0’s (in practice, a series of electric pulses). Since a computer can only work with a finite number of 1’s and 0’s at a time, a digital computer cannot, even in theory, represent data with infinite precision. In practice, however, digital computers tend to be better than their analog counterparts on most counts. Even though analog computers can in theory be infinitely precise, they are limited by the finite ranges of their components and by the difficulty of reading data accurately from a continuous scale.
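A simple way to see what “discrete” means in practice is to quantise a continuous voltage into a small number of bits, roughly the way an analog-to-digital converter would, and then convert it back. The Python sketch below does exactly that; the 3-bit resolution and the 0–1 V range are arbitrary assumptions used only to make the rounding visible.

# Quantise a continuous voltage into an n-bit code and back again.
# The bit width and the voltage range are arbitrary assumptions.
def quantize(voltage, bits=3, v_min=0.0, v_max=1.0):
    levels = 2 ** bits - 1                          # highest code value
    code = round((voltage - v_min) / (v_max - v_min) * levels)
    code = max(0, min(levels, code))                # clamp to valid codes
    return code, v_min + code / levels * (v_max - v_min)

for v in (0.1, 0.333, 0.72):
    code, reconstructed = quantize(v)
    print(f"{v:.3f} V -> code {code:03b} -> {reconstructed:.3f} V")

No matter how many bits are used, some input values fall between two codes, so the reconstructed value is only an approximation; adding bits shrinks the error but never removes it entirely.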

Difference Between Analog and Digital Computers

Types of Signals

Analog computers can handle analog signals.

Digital computers cannot handle analog signals; they only handle digital signals.

Electrical Noise

Analog computers have noise in their circuitry, and the noise is always carried along with the data.

Digital computers have relatively lower levels of noise, and it is easier to separate out noise from data.

Storage

In analog computers, storage is not as straightforward as it is in digital computers. Capacitors can be used to hold values, but the stored value is subject to drift over time.

In digital computers, transistors store data discretely, in the form of 0’s and 1’s. 
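To give a feel for why analog storage drifts, the sketch below models a charged capacitor slowly discharging through an assumed leakage resistance, following V(t) = V0 · e^(-t/RC). The component values are hypothetical and chosen only for illustration.

import math

# Hypothetical values chosen for illustration only
V0 = 5.0             # initially stored voltage, volts
C = 1.0e-6           # storage capacitor, farads
R_leak = 1.0e9       # leakage resistance, ohms (time constant RC = 1000 s)

for t in (0, 60, 600, 3600):                  # seconds elapsed
    v = V0 * math.exp(-t / (R_leak * C))      # exponential drift of the stored value
    print(f"after {t:5d} s: {v:.3f} V")

A digital memory cell, by contrast, is read back as a discrete 0 or 1, so small amounts of drift or noise do not change the stored value.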

Image Courtesy

“Man working at analog computer, 1968” by Seattle Municipal Archives (Item 78757, City Light Photographic Negatives, Record Series 1204-01) [CC BY 2.0], via Flickr
