Digital audio
A sound wave, in gray, represented digitally, in red (after a zero-order hold but before filtering)

Digital audio uses digital signals for sound reproduction, encompassing analog-to-digital conversion, digital-to-analog conversion, storage, and transmission. Strictly speaking, a system commonly called digital is a discrete-time, discrete-level representation of what was previously a continuous electrical analog signal. While modern systems can be quite subtle in their methods, the primary advantage of a digital system is that, because the signal is discrete in both time and amplitude, errors can be corrected without loss once the signal is digital, and the signal can be reconstituted exactly. This discreteness in both time and amplitude is the key to such reconstitution, which is unavailable for a signal in which time, amplitude, or both are continuous. Hybrid systems (part discrete, part continuous) exist but are no longer used in new designs.


Overview of digital audio

Sampling and 4-bit quantization of an analog signal (red) using Pulse Code Modulation.

Digital audio has emerged because of its usefulness in the recording, manipulation, mass-production, and distribution of sound. Modern distribution of music across the Internet via on-line stores depends on digital recording and digital compression algorithms. Distribution of audio as data files rather than as physical objects has significantly reduced the cost of distribution.

From the wax cylinder to the compact cassette, analog audio storage and reproduction have been based on the same principles upon which human hearing is based. In an analog audio system, sounds begin as physical waveforms in the air, are transformed into an electrical representation of the waveform via a transducer (for example, a microphone), and are stored or transmitted. To be re-created as sound, the process is reversed, through amplification and then conversion back into physical waveforms via a loudspeaker. Although its nature may change, analog audio's fundamental wave-like character remains the same throughout its storage, transformation, duplication, and amplification.

Analog audio signals are susceptible to noise and distortion, unavoidable due to the innate characteristics of electronic circuits and associated devices. In the case of purely analog recording and reproduction, numerous opportunities for the introduction of noise and distortion exist throughout the entire process. When audio is digitized, distortion and noise are introduced only by the stages that precede conversion to digital format, and by the stages that follow conversion back to analog.

The digital audio chain begins when an analog audio signal is first sampled and then (for pulse-code modulation, the usual form of digital audio) converted into binary signals—‘on/off’ pulses—which are stored as binary electronic, magnetic, or optical signals, rather than as continuous-time, continuous-level electronic or electromechanical signals. The signal may then be further encoded to combat any errors that might occur in storage or transmission, though this encoding is for error correction and is not strictly part of the digital audio process. Such "channel coding" is essential to the ability of a broadcast or recorded digital system to avoid loss of bit accuracy. The discrete time and level of the binary signal allow a decoder to recreate the analog signal upon replay. An example of a channel code is eight-to-fourteen modulation (EFM), as used on the audio compact disc.
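The quantization step of PCM described above can be sketched in a few lines of Python. This is an illustrative sketch only (the function name is an invention for this example); real converters also apply dithering and anti-aliasing filtering:

```python
import math

def pcm_encode(signal, bits=16):
    """Quantize samples in the range [-1.0, 1.0] to signed integers
    of the given bit width, as in linear PCM."""
    max_code = 2 ** (bits - 1) - 1
    return [round(max(-1.0, min(1.0, s)) * max_code) for s in signal]

# One millisecond of a 1 kHz tone sampled at the CD rate of 44.1 kHz.
rate, freq = 44100, 1000
samples = [math.sin(2 * math.pi * freq * n / rate) for n in range(44)]
codes = pcm_encode(samples)   # 16-bit codes in [-32767, 32767]
```

A 16-bit encoder maps full scale to ±32767, which is why CD audio is stored as 16-bit signed integers per channel.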

Sound quality

The goal of any sound reproduction system is to reproduce the original sound exactly, which cannot be achieved in practice due to a number of unavoidable effects:

  • Non-uniform frequency response, the result of reactive effects that are present in all electrical circuits. Variations in frequency response affect analog signals and cause the apparent loudness of music or speech at any given frequency to deviate from the original as perceived by the listener. The digitizing process can further introduce frequency response errors due to resolution limits innate to the analog-to-digital encoding scheme.
  • Hum, an undesirable byproduct of powering audio devices from AC mains power sources, as well as operating them in the vicinity of AC powered apparatus. Hum can be minimized by proper circuit design, as well as optimum placement of components in the device enclosure.
  • Noise, caused by random signals that are often injected into the audio device from external sources. Noise may also arise from switching transients in semiconductors and from other components within the device. In vacuum tube equipment, the tubes themselves generate thermionic noise, which appears in the output as a hissing sound. Noise may also be introduced by quantization errors in the capturing circuitry, as well as by mechanical imperfections in microphones and speakers. Both hum and noise affect analog signals and, if present at the input end of the chain, are carried through the entire recording/reproduction process.
  • Compressed dynamic range, due to amplification limits in analog signals and the encoding scheme used in the digitization process. The resulting effect is that the range from minimum to maximum loudness in the reproduction process will not be as great as that of the original.
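The dynamic-range limit imposed by the encoding scheme can be estimated with the standard rule of thumb for an ideal N-bit quantizer driven by a full-scale sine wave, roughly 6.02·N + 1.76 dB. A minimal sketch (the function name is illustrative):

```python
def quantization_snr_db(bits):
    """Theoretical signal-to-quantization-noise ratio of an ideal
    N-bit quantizer with a full-scale sine input: 6.02*N + 1.76 dB."""
    return 6.02 * bits + 1.76

# 16-bit audio (CD) allows roughly 98 dB of dynamic range;
# 24-bit studio formats allow roughly 146 dB.
print(round(quantization_snr_db(16), 2))  # 98.08
print(round(quantization_snr_db(24), 2))  # 146.24
```

Each additional bit of resolution therefore buys about 6 dB of dynamic range.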

Achieving high fidelity requires high-quality components and careful circuit design, which increases overall cost.

Conversion process

A digital audio signal starts with an analog-to-digital converter (ADC) that converts an analog signal to a digital signal. The ADC runs at a specified sampling rate and converts at a known bit resolution. For example, CD audio has a sampling rate of 44.1 kHz (44,100 samples per second) and 16-bit resolution for each of its two (stereo) channels. If the analog signal is not already bandlimited, an anti-aliasing filter is necessary before conversion, to prevent aliasing in the digital signal. (Aliasing occurs when frequency content above the Nyquist frequency is not removed before sampling; it instead appears, folded down, as audible artifacts at lower frequencies.)
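The folding behaviour of aliasing can be computed directly: a tone above the Nyquist frequency reappears reflected into the band from 0 to half the sampling rate. A minimal sketch (the function name is illustrative):

```python
def alias_frequency(f, rate):
    """Frequency at which a pure tone of frequency f (Hz) appears
    after sampling at `rate` with no anti-aliasing filter: the
    spectrum folds into the band [0, rate/2]."""
    f = f % rate
    return min(f, rate - f)

# A 30 kHz ultrasonic tone sampled at 44.1 kHz aliases down to
# 14.1 kHz, squarely inside the audible band.
print(alias_frequency(30000, 44100))  # 14100
# A 1 kHz tone is below Nyquist (22.05 kHz) and passes unchanged.
print(alias_frequency(1000, 44100))   # 1000
```

This is why the anti-aliasing filter must run before the sampler: once the fold has happened, the 14.1 kHz artifact is indistinguishable from a genuine 14.1 kHz tone.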

Some audio signals, such as those created by digital synthesis, originate entirely in the digital domain, in which case analog-to-digital conversion does not take place.

After being sampled with the ADC, the digital signal may then be altered in a process called digital signal processing (DSP), where it may be filtered or have effects applied.

The digital audio signal may then be stored or transmitted. Digital audio storage can be on a CD, a digital audio player, a hard drive, USB flash drive, CompactFlash, or any other digital data storage device. Audio data compression techniques — such as MP3, Advanced Audio Coding, Ogg Vorbis, or FLAC — are commonly employed to reduce the file size. Digital audio can be streamed to other devices.
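The savings that compression delivers follow from simple data-rate arithmetic on the uncompressed PCM stream. A minimal sketch (the function name is illustrative):

```python
def pcm_bytes_per_second(rate, bits, channels):
    """Raw (uncompressed) PCM data rate in bytes per second."""
    return rate * bits * channels // 8

# CD audio: 44.1 kHz, 16 bits, stereo.
cd = pcm_bytes_per_second(44100, 16, 2)
print(cd)            # 176400 bytes/s, i.e. roughly 10 MB per minute
# A typical 128 kbit/s MP3 stream, by comparison:
print(128_000 // 8)  # 16000 bytes/s
```

Lossy codecs such as MP3 thus shrink CD-quality audio by roughly an order of magnitude, at the cost of discarding perceptually less important detail; lossless codecs such as FLAC typically achieve around half the raw rate with no loss.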

The last step is for the digital audio to be converted back to an analog signal with a digital-to-analog converter (DAC). Like ADCs, DACs run at a specific sampling rate and bit resolution, but through the processes of oversampling, upsampling, and downsampling, this sampling rate may not be the same as the initial sampling rate.
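Upsampling can be illustrated with the simplest possible scheme, inserting linearly interpolated samples between neighbours to double the rate. This is a toy sketch (the function name is illustrative); real oversampling DACs use polyphase interpolation filters:

```python
def upsample_2x(samples):
    """Double the sample rate by inserting the midpoint between
    each pair of neighbouring samples (linear interpolation)."""
    out = []
    for a, b in zip(samples, samples[1:]):
        out.extend([a, (a + b) / 2])
    out.append(samples[-1])
    return out

print(upsample_2x([0, 2, 4]))  # [0, 1.0, 2, 3.0, 4]
```

Raising the sampling rate before conversion pushes the DAC's imaging artifacts to higher frequencies, which relaxes the requirements on the analog reconstruction filter that follows.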


History of digital audio use in commercial recording

Commercial digital recording of classical and jazz music began in the early 1970s, pioneered by Japanese companies such as Denon, by the BBC, and by the British record label Decca, which in the mid-1970s developed digital audio recorders of its own design for mastering its albums, although experimental recordings exist from the 1960s. The first 16-bit PCM recording in the United States was made by Thomas Stockham at the Santa Fe Opera in 1976 on a Soundstream recorder. In most cases there was no mixing stage: a stereo digital recording was made and used unaltered as the master tape for subsequent commercial release. Such unmixed digital recordings are still described as DDD, since the technology involved is purely digital. (Unmixed analog recordings are likewise usually described as ADD, denoting a single generation of analog recording.)

Although the first digital recording of a non-classical piece, Morrissey-Mullen's cover of the Rose Royce hit "Love Don't Live Here Anymore" (released in 1979 as a vinyl EP), was made in 1978 at EMI's Abbey Road studios, the first entirely digitally recorded (DDD) popular music album was Ry Cooder's Bop Till You Drop, recorded in late 1978. It was unmixed, recorded straight to a two-track 3M digital recorder in the studio. Many other top recording artists were early adopters of digital recording.

Digital audio technologies

Digital audio broadcasting systems include DAB (Digital Audio Broadcasting), HD Radio, and Digital Radio Mondiale (DRM).

Storage technologies include the compact disc (CD), digital audio tape (DAT), MiniDisc, DVD-Audio, and hard-disk and flash-memory recorders.

Digital audio interfaces

Audio-specific interfaces include AES3 (AES/EBU), S/PDIF, ADAT Lightpipe, MADI, and I²S.

Naturally, any digital bus (e.g., USB, FireWire, and PCI) can carry digital audio. Also, several interfaces are engineered to carry digital video and audio together, including HDMI and DisplayPort.


