Tuesday, November 11, 2008

Digital Versus Analog --- What's It All Mean?

By Dan Shanefield

Until the 1980s, most music recordings were described as "analog." That includes 33⅓ rpm "LP" discs, tape recordings, etc. Also, telephones and TVs were analog, and many people had never heard the phrase "digital recording."

Then the compact disc ("CD") became popular in the world of music, and it was said to be "digital." In the near future (February 2009), most American over-the-air television broadcasting will have to be digital, by Federal law, so sets that are not hooked up to cable will need a digital tuner or a converter box. Why is this drastic change being forced upon us, and what's the difference?

First of all, the main problem with the old analog technology was poor accuracy. For example, if your bank's ATMs were analog, a lot of errors would be made. If you requested "one dollar," the ATM might send a signal to the bank saying "one volt of electricity." Using the old-fashioned analog technology, the signal is proportional to its meaning, so one volt might mean one dollar, and 0.9 volt would mean ninety cents. On a hot day the resistance of the wires could change, and the signal going along the wires would be only 0.9 volt, giving you only ninety cents when you requested a dollar. It is very difficult to keep a long wire's resistance from changing with temperature, and the supply voltage can also drift as time goes on.
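For readers who like to see the arithmetic in concrete form, here is a little sketch in the Python programming language (the one-volt-per-dollar scale is invented, just matching the story above) of how a proportional analog signal turns a small voltage drift directly into a money error:

    # Analog signaling: the voltage IS the value, so any drift becomes an error.
    # The one-volt-per-dollar scale is made up, matching the story above.

    VOLTS_PER_DOLLAR = 1.0

    def analog_send(dollars):
        # Encode a dollar amount as a proportional voltage.
        return dollars * VOLTS_PER_DOLLAR

    def analog_receive(volts):
        # Decode the received voltage back into dollars.
        return volts / VOLTS_PER_DOLLAR

    signal = analog_send(1.00)      # you request one dollar -> 1.0 volt
    drifted = signal * 0.9          # hot day: the wires sag the signal by 10%
    print(analog_receive(drifted))  # 0.9 -- you get ninety cents, an error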

With modern "digital" technology, the signal meaning "one dollar" is no longer a voltage proportional to its meaning. Instead, the signal sent from the ATM to the bank is changed to a code, something like the Morse code in a telegraph. The digital code consists of a series of "zeroes" and "ones." In each little bit of time there is either no voltage at all sent over the wire (a "zero"), or else a one-volt signal is sent (a "one"). This is called "binary," because, for each individual "digit," there are only two possibilities.

To minimize errors, a voltage anywhere from 0.8 to 1.2 volts is considered to be a "one." This system is very tolerant of small changes, because many types of variation do not change the message. For example, a ten percent drop in the voltage, from one volt down to 0.9 volt, would still count as a "one," and it would not lead to an error. (These examples are just for explanation, and the voltages actually used differ from system to system, but they illustrate the idea.)
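Here is the same ten-percent sag as in the sketch above, but with this threshold rule applied (the 0.8-to-1.2 window is only the illustration used here, not any particular standard). The drifted voltage still decodes to the correct bit:

    def decode_bit(volts):
        # Illustrative thresholds from the text: anything from 0.8 to 1.2
        # volts counts as a "one"; anything near zero counts as a "zero".
        if 0.8 <= volts <= 1.2:
            return 1
        if -0.2 <= volts <= 0.2:
            return 0
        raise ValueError("ambiguous voltage: %s" % volts)

    print(decode_bit(1.0 * 0.9))  # 1 -- the 10% sag does NOT change the message
    print(decode_bit(0.0))        # 0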

In the digital world, each place in the code is worth twice the place to its right, instead of ten times as in ordinary decimal numbers. So a request for "two dollars" could be a one followed by a zero, "four dollars" could be a one followed by two zeroes in a row, and "eight dollars" could be a one followed by three zeroes.

In order to transmit a number like $279, the code still has to be limited to only zeroes and ones, something like Morse code. But it uses a lot of them, in a fairly complex manner, too involved to explain fully in this short article. However, the good news is that the American-invented "integrated circuits" can easily and cheaply compute all these things, and they can also send messages containing millions of zeroes and ones each second ("megabits per second").
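As a rough illustration (a sketch of plain binary only; real bank messages wrap the number in many extra bits for addressing and error checking), here is 279 turned into ones and zeroes and back:

    n = 279

    bits = bin(n)[2:]    # '100010111' -- 279 written using only ones and zeroes
    print(bits)

    # Decoding: each place is worth a power of two (256, 128, 64, ...), just
    # as decimal places are worth powers of ten. 256 + 16 + 4 + 2 + 1 = 279.
    print(int(bits, 2))  # 279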

Another accuracy advantage of digital is that you can get further precision just by adding more digits, like saying $279.03 instead of rounding it off to $279. In scientific work a digital number often carries a long string of decimal places, in order to be very precise. An example is 279.0316, which would almost certainly pick up at least a small error if it had to be communicated in analog form.
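A quick sketch of that trade-off (the one-percent analog accuracy is just an assumed, and rather optimistic, figure for a long wire):

    value = 279.0316

    # Analog: if the channel is only good to about 1%, the error swamps
    # the fine decimal places entirely.
    print(value * 0.01)            # about 2.79 -- the .0316 is hopelessly lost

    # Digital: each extra bit halves the worst-case rounding error, so
    # precision is bought just by sending a few more ones and zeroes.
    for bits in (8, 16, 24):
        step = 1000 / (2 ** bits)  # resolution when coding a 0-to-1000 range
        print(bits, "bits ->", step)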

The original concept of using only zeroes and ones came from the old telegraph (not telephone) equipment, where there was either a certain fixed voltage or none at all, and never a fraction. The idea was later used by automated telephone systems, where switching from one caller to another was done by "relays," little electromagnetic switches that were either all-the-way on or all-the-way off. Bell Labs assembled these relays into primitive digital computers. Later, it was found that even with the newer "transistors" in place of relays, accuracy was much improved if signals were still kept in binary code, so that is what modern computers use, for many things other than just voice or music recordings.

Nowadays, your voice transmitted over the telephone starts out as an "analog" signal, with the voltage proportional to the loudness (like 1.7 volts or 2.9 volts). However, in a big city your voice signal gets "digitized" when it reaches the telephone company's office, where it is "encoded" into a lot of short "on or off" digital segments and transmitted that way, in order to be accurate. At the other end of the wire, the signals have to be "decoded" (changed back to analog, with the voltage proportional to the loudness), because our ears are strictly analog. That analog signal is what goes into the other person's earpiece. Why do they bother with all this? Because with efficient "integrated circuits," it is both cheaper and more accurate to do it this way. In a competitive world, we have to do what is better, even if it is complicated, as long as it's not expensive.
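Here is a toy version of that round trip (the 8,000 samples per second and 8-bit scale are standard-ish telephone figures, but this is only a sketch, not how any real phone office actually does it):

    import math

    # A made-up snippet of analog voice: voltage versus time, sampled
    # 8,000 times per second.
    samples = [2.5 + 1.5 * math.sin(2 * math.pi * 440 * t / 8000)
               for t in range(8)]

    def encode(volts, lo=0.0, hi=5.0, bits=8):
        # "Encoding" sketch: round the voltage to one of 2**bits levels,
        # giving an integer that travels down the wire as ones and zeroes.
        return round((volts - lo) / (hi - lo) * (2 ** bits - 1))

    def decode(level, lo=0.0, hi=5.0, bits=8):
        # "Decoding" sketch: turn the integer back into a voltage
        # for the earpiece at the other end.
        return lo + level / (2 ** bits - 1) * (hi - lo)

    for v in samples:
        print(round(v, 3), "->", encode(v), "->", round(decode(encode(v)), 3))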

Similarly, the music recorded on a CD or iPod or DVD gets changed to digital format for much-improved accuracy ("high fidelity"), but at the loudspeaker it goes back to analog for your ears. With TV, likewise, a "digital" picture has its brightness and color broken down into billions of short segments, all just zeroes and ones when transmitted over the air, for the same reason ("high definition," or "HD"). However, your eyes are strictly analog, just like your ears, so your TV receiver has to decode the digital signal back to analog (voltage proportional to brightness) before the picture goes from the screen to your eye.
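The same round trip happens for every dot of the picture. A toy sketch (assuming a common 8-bit brightness scale; real broadcast formats are far more elaborate):

    brightness = 200                          # one dot: 0 = black, 255 = white

    over_the_air = format(brightness, "08b")  # '11001000' -- eight ones and zeroes
    print(over_the_air)

    # The receiver decodes the bits back into a proportional voltage for
    # the screen, because the eye, like the ear, is strictly analog.
    level = int(over_the_air, 2)
    print(level, level / 255)                 # 200, then a 0-to-1 drive level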

Now in the twenty-first century, there is the additional ability of digital technology to cram a truly enormous amount of information onto the tiny area of a DVD disc. (The zeroes and ones take up very little space.) A few years ago this would have been too expensive, but with modern electronics it can be done very cheaply, with very high quality as well. It takes a lot of technical knowledge to do all these things, but we have that now. It is philosophically debatable whether we really "need" all this, but many consumers do seem to want it, especially for watching sports on "HD" TV and for "high fidelity" music, all at relatively low cost. In modern scientific and financial work, digital numbers have become truly necessary.
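To give a feel for the numbers, here is a back-of-the-envelope calculation (4.7 gigabytes is the usual figure for a single-layer DVD; the CD-quality rate is the standard 44,100 samples per second, 16 bits, two channels):

    dvd_bits = 4.7e9 * 8                 # a single-layer DVD, in bits
    print("%.2e bits" % dvd_bits)        # about 3.8e10 -- tens of billions

    cd_bits_per_second = 44100 * 16 * 2  # uncompressed CD-quality stereo
    hours = dvd_bits / cd_bits_per_second / 3600
    print(round(hours, 1), "hours of uncompressed CD-quality sound")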

--------------------------------------------------------

(To see my résumé, search Google for shanefield CV.
To read some true stories, search Google for shanef28 and click on People.)