Klarinet Archive - Posting 000512.txt from 1997/11

From: Jonathan Cohler <cohler@-----.net>
Subj: Re: Recording Quality of Clarinet
Date: Thu, 13 Nov 1997 11:23:21 -0500

I've been off the list for a couple of weeks, and it looks like I've missed
an interesting discussion.

See my comments below.

-------------------
Jonathan Cohler
cohler@-----.net

>> Jrykorten@-----.com wrote:
>> >
>> > Regarding the degradation that occurs during the recording process: I
>> > agree wholeheartedly. For me, ever since CDs predominated the market, I
>> > have had a great deal of difficulty listening to either the piano or the
>> > clarinet. Turns out the CD wipes out all the upper harmonics! (A 10kHz
>> > sine wave on a CD looks like a triangle wave). It's probably for this
>> > reason that I can't get excited about any clarinetist's sound from a CD.
>> > In fact Shifrin's sound - which used to excite me so - now sounds like
>> > anybody else when I listen to him on CD. And I'm sure this is because of
>> > the CD recording process.

If I understand what you're saying here, then it is incorrect. First, let's
consider your statement that "the CD wipes out all the upper harmonics" and
therefore a "10kHz sine wave on a CD looks like a triangle wave".

A CD doesn't wipe out any harmonics. It is merely a storage device for
digital information. Therefore, I assume you mean that some Analog-to-Digital
conversion process that occurs between the microphones and the digital
storage format is somehow "wiping out all the upper harmonics". Before I
address that, let me point out that a triangle wave has upper harmonics (in
fact all the odd harmonics -- just like a square wave, but with different
coefficients), and a sine wave has NO harmonics (by definition). If anyone is
interested in the mathematical formulas for square waves and triangle waves,
check any elementary text on waves.
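
For anyone who doesn't want to dig out a textbook, here is a small
Python/NumPy sketch (my own illustration; the overall 4/pi and 8/pi^2
scale factors are left out) that builds a square wave and a triangle
wave out of their odd harmonics, next to a pure sine:

    import numpy as np

    t = np.linspace(0.0, 1.0, 1000, endpoint=False)  # one period of a 1 Hz fundamental
    f0 = 1.0

    sine = np.sin(2 * np.pi * f0 * t)  # a single frequency: no harmonics, by definition

    odd = range(1, 40, 2)  # first 20 odd harmonics
    square = sum(np.sin(2 * np.pi * n * f0 * t) / n for n in odd)
    triangle = sum((-1) ** ((n - 1) // 2) * np.sin(2 * np.pi * n * f0 * t) / n**2
                   for n in odd)

Plot "square" and "triangle" against "sine" and the point is obvious: the
extra terms are exactly the upper harmonics that a pure sine does not have.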

Now, it is certainly true that a bad Analog-to-Digital (A-to-D) converter
can create digital data that doesn't accurately represent the original
sound. But it is also true that the high-quality A-to-D converters, which
are plentiful on the market today, create digital data that essentially
"perfectly" represents the original sound. By "perfectly" I mean as far as
the human ear can tell.

The other step in the CD recording and playback chain is the conversion of
the digital data stored on the CD back into sound through a
Digital-to-Analog (D-to-A) converter, which then drives a pre-amplifier, an
amplifier and ultimately some speakers. It is certainly true that bad D-to-A
converters can produce bad sound, but once again there are many essentially
"perfect" D-to-A converters on the market today.
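
As a rough illustration of how small the error from these two conversion
steps can be, here is a Python/NumPy sketch (the 1 kHz tone, the 0.9
full-scale level and the 16-bit scaling are my own choices) that "records"
a sine into 16-bit samples and converts it back:

    import numpy as np

    fs = 44_100                                   # CD sampling rate, samples per second
    t = np.arange(fs) / fs                        # one second of time points
    x = 0.9 * np.sin(2 * np.pi * 1_000 * t)       # 1 kHz tone, just below full scale

    codes = np.round(x * 32767).astype(np.int16)  # "A-to-D": 16-bit integer samples
    y = codes / 32767.0                           # "D-to-A": back to the +/-1 range

    err = x - y
    snr_db = 10 * np.log10(np.mean(x**2) / np.mean(err**2))
    print(round(snr_db, 1))                       # roughly 97 dB for this tone

After the full round trip the error sits roughly 97 dB below the signal,
which is what "perfect as far as the human ear can tell" means in practice.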

Therefore, the only differences between the digital and analog recording
processes are the type of information that is stored and the A-to-D and
D-to-A steps, which exist in digital but not in analog. In the case of
analog, there are many major problems with the storage medium, such as tape
hiss (due to bias current), smaller dynamic range, and generational loss as
one makes copies. In the case of digital, the storage medium is essentially
perfect, there is no generational loss, and there is no equivalent of tape
hiss.

So it all comes down to the D-to-A's and the A-to-D's. Any record company
worth its salt these days is using converters that are "perfect".

The big difference in recording clarinet (and other woodwind instruments)
is in microphone placement. Because of the radiation patterns of sound
from woodwind toneholes, a small change in microphone placement can make a
large change in the tone quality that is captured on the recording. That's
why if you lift your clarinet up and point it at the listener it sounds
much brighter, and if you move it from side to side that quality of sound
changes dramatically.

I would agree with you that most recording companies do a VERY poor job of
capturing the clarinet sound. But virtually all of the problems are due to
poor microphone placement, poor choice of microphones, and poor acoustic
spaces in which the recordings are made; and virtually none of the problems
are due to their A-to-D converters. Of course, it is the consumer who owns
the D-to-A converter (in his/her CD player or other equipment), and
therefore if the user owns a cheapo playback device, that could be the
problem too (just as a cheapo turntable/pickup doesn't make LPs sound as
good).

Conrad Josias wrote:

> Since nearly everyone in the audience knew Information Theory,
>Boyck then immediately acknowledged that the Nyquist Frequency, the
>highest-frequency component measurable without aliasing (seeming to be
>of a different frequency, like the difference between the source and
>sampling frequencies), was half the sampling frequency, or 20 kHz.
>But that component would be reconstructed not as a sine wave, but as a
>two-point triangular wave with artificial and unpredictable amplitude
>reduction and phase displacement. And a 10-kHz component would have
>appreciable stair-case-type amplitude distortion as well as phase
>distortion.

This is not correct. The Nyquist Theorem says that if an analog signal is
digitally sampled at a frequency F, then from those digital samples one can
PERFECTLY reconstruct all frequency components of the original signal below
1/2 * F. The statements about triangular waves above are just wrong. It is
a mathematical fact.
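
This is easy to check numerically. Here is a Python/NumPy sketch (a toy
example of my own) that samples a 10 kHz sine at 44.1 kHz and then
reconstructs it between the samples using the ideal sinc-interpolation
formula on which the theorem is based:

    import numpy as np

    fs = 44_100.0                  # CD sampling rate, Hz
    f = 10_000.0                   # test tone, well below fs/2 = 22.05 kHz
    n = np.arange(4096)            # sample indices
    x_n = np.sin(2 * np.pi * f * n / fs)

    # Reconstruct on a fine grid in the middle of the window (avoids edge effects)
    t = np.linspace(0.4, 0.6, 200) * (len(n) / fs)
    x_rec = np.array([np.sum(x_n * np.sinc(fs * ti - n)) for ti in t])
    x_true = np.sin(2 * np.pi * f * t)

    print(np.max(np.abs(x_rec - x_true)))  # small, and it shrinks as the window grows

The reconstructed values land right back on the true 10 kHz sine; nothing
triangular about it. (The formula is exact for an infinitely long
band-limited signal; the finite window is the only source of error here.)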

That is why the sampling rate of 44.1 kHz was chosen for CDs. The logic was
that human beings can only hear up to 20 kHz in extreme cases. Therefore,
by sampling any sound at 40 kHz one can reproduce PERFECTLY any audible
sound. The extra 4.1 kHz in the chosen rate was due to some technicalities
that are beyond the scope of this discussion.

Now, there is a movement afoot to double the sampling frequency to the
range of 96 kHz. The logic behind this is that although human beings cannot
hear individual sine waves above 20 kHz, there is some newer evidence
suggesting that content above 20 kHz may make a marginally noticeable
difference to tones whose fundamental frequency lies in the audible range
below 20 kHz. Sampling at 96 kHz allows one to PERFECTLY reproduce all
harmonic components up to 48 kHz, which is FAR above the audible range for
human beings.

It will be many years before this becomes commercially viable.

>
> The hue and cry from the audience included, "Who cares? My
>middle-aged hearing doesn't even go up to 10 kHz."

Wrong. Most middle-aged people can hear up to 15 kHz.

> Boyck told those people that, even with their compressed
>frequency range, they could somehow perceive higher components and that,
>like most other people, their hearing would be adversely affected by
>subtle additions of high-frequency phase distortion associated exclusively
>with the digitizing process.

Wrong. "The digitizing process" has no high-frequency phase distortion
associated with it. Some bad A-to-D converters may have some
high-frequency phase distortion. Certainly, phase distortion problems are
much more prevalent in analog recording equipment.

>
> He then presented the results of several experiments that
>involved making music recordings, digital and analog, from the
>same analog audio feeds. Everyone agreed that the digital recordings,
>which used commercial sampling rates and resolution were inferior
>soundwise to the analog recordings.

They may agree, but they would be wrong. Today's CDs store 16 bits of data
per sample. This corresponds to 96 dB of dynamic range, far superior to the
dynamic range of analog recording media. As explained above, the sampling
rate allows for essentially perfect reproduction of any sound up to 20 kHz,
the limit of human hearing.
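
The 96 dB figure is just arithmetic, using the usual 20*log10 definition
of dynamic range (a quick check in Python):

    import math

    bits = 16
    dynamic_range_db = 20 * math.log10(2 ** bits)  # ratio of full scale to one step
    print(round(dynamic_range_db, 1))              # 96.3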

> The digital recordings, to be sure,
>were quieter and of generally good quality. But everyone agreed that
>an indefinable something was missing. In a remarkably sobering moment,
>a large majority of the audience agreed that, for want of a more
>quantitative term, the analog recording had superior "presence."
>

This was a meaningless, non-scientific exercise. No double-blind study has
ever been done to prove the assertions made above.

> Professor Boyck concluded that part of his lecture by predicting
>improved results from the digitizing process as sampling rates
>approached 500 kHz. Although I don't follow the audio engineering field
>closely, I am aware that contemporary commercial sampling rates are
>increasing, which would affect the design of new CD and DAT players.

As mentioned above, all commercially available CDs are sampled at 44.1 kHz.
There are the beginnings of a movement to sample at 96 kHz (or thereabouts),
and even this increase is of highly questionable value (it will make
recordings more expensive, for sure), but certainly there is nothing
approaching 500 kHz, which would be a complete waste of storage capacity
for NO discernible difference.

I hope this clarifies some of the mythology that is so fervently spread by
the "audiophile" zealots.

I believe and adhere to the principle that if some aspect of sound is real,
then more than one person can hear it, and they can hear it without being
told that they are supposed to hear it! Bring out your double-blind studies!

Cheers.

-----------------------
Jonathan Cohler
cohler@-----.net

   