Friday, September 18, 2009

MP3 Player Performance

I purchased a Zune HD earlier this week and I intend to fully review the device here on this blog over the next little while.  So this is the first of several posts with my findings and thoughts on the device.  This first one represents the technical findings of how the Zune HD compares with other common MP3 players.

So as part of that I’m going to try to explain a little bit about the technology behind why a particular MP3 player sounds good or bad, beyond the technical specifications given by the manufacturer.

This post will be a little techie in nature, but I’m going to try to break the technological jargon down in such a way that anyone can understand it, should they so desire.  I think it’s important to put some of this information out there, as there are a lot of misconceptions about which players have the best sound quality.

For this test I included all of the players I have at my disposal.  If someone would like me to test another player, I’d be happy to do so, as long as you are willing to loan it to me for a couple of days.

The players I have included in this test are an iPod Classic (80GB), iPod Touch (2nd Generation), Zune 8GB (flash-memory based), Zune 80GB (hard disk drive), and the brand-new Zune HD introduced this week (flash memory). 

Frequency Response

The frequency response of a player is a measurement of how evenly it reproduces different frequencies across the audible range.  It is expressed as a frequency range (usually 20 Hz to 20 kHz, the typical range of human hearing) and a response within that range (expressed in +/- decibels).

Decibels are a bit of a tricky thing to understand.  They are logarithmic in nature, so a 3 dB change doesn’t sound like one third of a 10 dB change.  In terms of human hearing, a 10 dB increase represents a perceived doubling of volume, and since decibels are logarithmic, a 20 dB increase is perceived as twice as loud as that, or four times as loud as the original.  The smallest change that an untrained ear can detect is generally about 2-3 dB.  (3 dB is considered “barely perceptible,” while 5 dB is “clearly noticeable.”)
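To put numbers on that, decibel differences come from a simple logarithm.  Here’s a quick sketch in Python of the arithmetic (this is just the standard formula, nothing specific to these players):

```python
import math

def level_db(a1, a0):
    """Level difference in decibels between two signal amplitudes."""
    return 20 * math.log10(a1 / a0)

# Doubling the amplitude raises the level by about 6 dB...
print(level_db(2.0, 1.0))        # ~6.02 dB
# ...but a perceived doubling of loudness takes roughly +10 dB,
# which corresponds to about a 3.16x increase in amplitude.
print(level_db(10 ** 0.5, 1.0))  # 10 dB
```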

In practical terms, frequency response is manifested as a difference in volume between different notes.  If a middle C is heard at 80 dB, but the C in the next higher octave is heard at 76 dB, there is a 4 dB difference in volume between the two.  That difference would be heard, but would not seem particularly significant.  In an ideal world, frequency response would be +/- 0 dB from 20 Hz to 20 kHz; the measured volume of a device’s output wouldn’t vary at all across that entire range of frequencies.  Fortunately, most electronics these days deliver something very close to that.

In fact, all of the devices I tested were within +/- 0.5 dB over the entire 20 Hz to 20 kHz range.  The Zune HD was the only one with a measurable variation in its response, dropping 0.5 dB between 17 kHz and 20 kHz.  But considering that (A) this range sits at the very top of human hearing, where even the most highly trained ears just aren’t very sensitive, and (B) the change is only 0.5 dB, which very few people could pick up on, this is essentially negligible.  The Apple devices, while generally flat, did exhibit a very slight exaggeration of frequencies above 10 kHz, but the numbers didn’t suggest anyone would be able to detect it audibly.  As far as I am concerned, all of the devices tested do extremely well in this area, at least in lab tests, with extremely little difference between them.

Harmonics

Before I post too much on the results of testing, I need to explain harmonics a little bit.

For each note on the scale, and every sound that we hear, there is a fundamental (primary) frequency, plus harmonic frequencies that define the sonic character of the sound.  The fundamental is counted as the first harmonic, so the first harmonic above it is the second harmonic: a sound produced at exactly twice the frequency of the fundamental.  The 3rd harmonic is at three times the fundamental frequency, and so on.

A piano sounds different from a guitar playing the same note primarily because of the different harmonics of the two instruments.  Different instruments emphasize different harmonics to different degrees.

The relationship between fundamental frequencies and harmonics is, well, complicated.  But there are a few general rules:

  • The fundamental frequency defines the note that we hear, such as a middle C.
  • Even-numbered harmonics (2, 4, 6 times the fundamental frequency) are generally considered pleasing.
  • Odd-numbered harmonics (3, 5, 7 times the fundamental frequency) are generally considered harsh.

In terms that musicians will understand, take a look at the harmonic frequencies created based on a fundamental, in this case a C2. 

Harmonics Based on a C2 Fundamental

    Even Harmonics                        Odd Harmonics
    Harmonic   Equivalent Note            Harmonic   Equivalent Note
    2nd        C3                         3rd        G3
    4th        C4                         5th        Between Eb4 and E4
    6th        G4                         7th        Between A4 and Bb4
    8th        C5                         9th        D5
    10th       Between Eb5 and E5         11th       Between F5 and Gb5
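The table above can be reproduced with a few lines of Python, mapping each harmonic of C2 (about 65.4 Hz) to the nearest equal-tempered note.  This is pure arithmetic, not part of the test setup; where the table lists “between” two notes, the code shows the nearest note with its offset in cents:

```python
import math

NOTE_NAMES = ["C", "C#", "D", "Eb", "E", "F", "F#", "G", "Ab", "A", "Bb", "B"]

def nearest_note(freq_hz):
    """Return (name, cents_off) for the closest equal-tempered note (A4 = 440 Hz)."""
    midi = 69 + 12 * math.log2(freq_hz / 440.0)
    nearest = round(midi)
    cents = (midi - nearest) * 100
    return NOTE_NAMES[nearest % 12] + str(nearest // 12 - 1), cents

C2 = 440.0 * 2 ** ((36 - 69) / 12)   # ~65.41 Hz

for n in range(2, 12):
    f = n * C2
    name, cents = nearest_note(f)
    print(f"{n}th harmonic: {f:7.1f} Hz -> {name} ({cents:+.0f} cents)")
```

Notice that the odd harmonics land far from any note on the keyboard (the 5th, 7th, and 11th are 14 to 49 cents off), which is part of why they clash.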

So if you play a C2 on an instrument with a strong 3rd harmonic, a substantial portion of what you hear is the same frequency as a G3.  If you are playing something in the key of C major and the most common chord is the I (C) chord, this might not be too bad, because G is one of the notes of that chord.  The harmonic actually creates some degree of harmony with the chord being played.  In the case of the I chord, even the 5th harmonic isn’t too awful, as it falls near an E, also a note in that chord.  But when you move to another chord like the ii, iii, IV, V, or vi, these odd-numbered harmonics start to clash with the chord being played.  The IV chord, for example, is made up of F, A, and C.  Adding the 3rd harmonic of the C (a G) clashes pretty badly with the F and A.

Even-numbered harmonics, on the other hand, tend to fall at octave intervals.  In fact, the 2nd and 4th harmonics fall exactly one and two octaves above the fundamental.  So they tend to sound pleasing, and an instrument with strong even harmonics will be pleasant and easy to listen to.

In terms of electronics, though, harmonics are difficult to avoid, particularly when it comes to digital devices.  Most inexpensive digital devices have trouble reproducing recorded sounds without adding harmonics, especially the undesirable odd harmonics, just as part of their very nature.  This is one of the reasons that digital doesn’t sound as warm and friendly as old analog recordings, like tapes and records.  Digital devices “like” to produce the ugly odd harmonics, and it can be difficult to bring them under control.  It can be done, but it is expensive to design and build electronics that avoid this problem.

With that said, each device manufacturer has to make a compromise between affordability and high quality sound output.  Different devices fall at different places on this scale, and this is what this post is about.  With that, it’s time to introduce the test results.

PMP Performance - Sweep Harmonics

What I did here was to create a file that plays all frequencies from 20 Hz to 20 kHz at equal level.  I then played that file on each of the devices I tested, using the best possible audio file the device would support.  The diagonal yellow line for each device indicates fidelity compared to the original signal, and anything else in the image (purple) is additional, unwanted sound added by a device.
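I don’t know the exact tool used to generate a file like this, but a logarithmic sine sweep covering 20 Hz to 20 kHz can be sketched in a few lines of Python (the file name, length, and output level here are my own choices for illustration):

```python
import array
import math
import wave

SR = 44100              # sample rate (Hz)
DUR = 30.0              # sweep length (seconds)
F0, F1 = 20.0, 20000.0  # start and end frequencies

k = math.log(F1 / F0)
samples = array.array("h")
for i in range(int(SR * DUR)):
    t = i / SR
    # Phase of a logarithmic sweep whose instantaneous frequency
    # rises smoothly from F0 to F1 over DUR seconds.
    phase = 2 * math.pi * F0 * DUR / k * (math.exp(t * k / DUR) - 1)
    samples.append(int(32000 * math.sin(phase)))  # a little headroom below full scale

with wave.open("sweep.wav", "wb") as w:
    w.setnchannels(1)   # mono
    w.setsampwidth(2)   # 16-bit
    w.setframerate(SR)
    w.writeframes(samples.tobytes())
```

A logarithmic (rather than linear) sweep spends equal time in each octave, which matches how we hear and makes the spectrogram’s yellow trace a straight diagonal line.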

The first block is the original, unaltered file, to show what the output of an absolutely perfect device would look like.  A perfect device would output the original recording exactly as it had been created.  No such device really exists, but this block shows what it would look like if it did.

The second block is the Control.  This shows the response of the sound device I used in testing, which is an $800 sound card device I use for recording in my studio.  It adds just a hair of background noise (hiss), in the form of the general purple background.  It also adds a bit of 3rd order harmonic, seen as a faint line sloping upward above the yellow line.  You can also see reflections of that harmonic as lines that alternate downward and upward starting below the “ro” in the word Control.  These are artifacts introduced by the audio capture hardware, and generally would not actually be heard when listening to the output.  As long as the reflections remain faint lines, they can be ignored as they do not represent the output of any given device.

The next block is the output of the brand-new Zune HD released this week.  It adds a little bit more 3rd order harmonic than my sound card, but it is generally very faint as well.  A 5th order harmonic is also faintly visible.  Beyond that no detectable harmonics are being added.  I was actually pleasantly surprised how well the device did, considering the relatively low cost of the Zune hardware, especially when compared to the Control device.

The 4th block is a little bit troubling to me.  It represents the output of the iPod Touch (2nd generation).  Not only does it output quite a bit of noise (hiss), represented by the bright purple background, but its 3rd and 5th harmonics are very prevalent as well.  In real-world terms this means that the player sounds not only noisy (again, the bright purple background) but harsh (the bright purple lines).  Output of the iPod Touch was disappointing, and definitely subpar for a modern music player.  It was also hard to test, as it was difficult to find a proper balance between optimal volume, the least background noise, and the least distortion in the form of those evil odd harmonics.  The more I turned it up, the brighter the harmonic lines became, but when I turned it down, the background noise level increased relative to the test signal.  What you see here is the best balance I could get out of the device, with the volume level at about the 67% point.  (All other devices could be tested at maximum volume.)

The next two blocks represent my other Zunes, the 80GB hard-drive and 8GB flash memory-based players, respectively.  These two devices supposedly use the same audio hardware, and leaked information about the Zune HD indicated that it used the same audio yet again, and the test results seem to confirm that.  Performance on these two Zunes was quite good, differing only in the frequency response from the HD model.

The last block is the output of the iPod Classic.  It is generally pretty good as well.  Compared to the Zunes, it adds a measurable amount of 2nd order and 4th order harmonics, with similar amounts of 3rd order harmonics.  In an absolutely ideal world, a device shouldn’t add any harmonics at all, but if it has to, it is going to sound better if they are even rather than odd.  The odd harmonics on the Classic are stronger than the even harmonics, but neither are out of control.  The measurable amount of even harmonics may make the device sound “warmer” than other players, though technically speaking this isn’t “accurate” or faithful to the original recording.

Overall most devices did pretty well on this test.  The notable exception is the iPod Touch, which was generally quite noisy and added what I consider to be unacceptable amounts of distortion in the form of the unpleasing odd harmonics.  In the real world this means that the iPod Touch sounds more harsh and grating than the other players.  The distortion and noise levels were high enough that they would be audible in A/B testing with any of the other devices.
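For those curious how harmonic levels like these are actually read off, an FFT of the captured audio makes them easy to measure.  A rough illustration, using a simulated signal rather than my actual captures (the -60 dB distortion figure is invented for the example):

```python
import numpy as np

SR = 48000   # sample rate
N = SR       # analyze one second of audio

t = np.arange(N) / SR
# Simulated "player output": a 1 kHz tone plus a small amount (-60 dB)
# of 3rd-harmonic distortion, standing in for a real capture.
signal = np.sin(2 * np.pi * 1000 * t) + 0.001 * np.sin(2 * np.pi * 3000 * t)

# Window the signal to limit spectral leakage, then take the magnitude spectrum.
spectrum = np.abs(np.fft.rfft(signal * np.hanning(N)))

def peak_near(freq_hz):
    """Largest spectral magnitude within a few bins of freq_hz."""
    b = int(round(freq_hz * N / SR))
    return spectrum[b - 3 : b + 4].max()

fundamental = peak_near(1000)
for h in (2, 3, 4, 5):
    rel_db = 20 * np.log10(peak_near(h * 1000) / fundamental)
    print(f"{h}x harmonic: {rel_db:6.1f} dB relative to the fundamental")
```

On a real capture you would sweep the fundamental across the audible range and repeat this measurement, which is essentially what the spectrogram images show all at once.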

While the consistency of performance across Zune devices was predictable, the inconsistency between the two iPods was not expected.  There has been a rumor that over the years Apple has made more and more compromises on the audio hardware in their iPods with each passing generation, and the results of this test seem to confirm that this may be true, as the older Classic had considerably better performance than the newer iPod Touch. 

Many sites on the Internet have praised the Zune for its generally high audio quality, and shunned the iPod line for its poor audio quality with respect to other devices on the market.  While these reputations may be at least partially true, the actual difference between the two isn’t that significant, with the notable exception of the Touch, whose performance was actually disappointing.  But for the most part the performance on the iPod Classic was fine, nearly equaling that of the Zunes, at least in the areas of background noise and harmonic distortion.

Distortion Under Load

The above test was taken with essentially no load on the players, meaning they weren’t having to work to drive the little speakers in a pair of earphones or headphones.  The numbers therefore represent an ideal situation, where the connected headphone doesn’t introduce a “load” on the player.  That seemed too idealistic to me, so I decided to also test under the most brutal of all conditions: the players turned up to the maximum volume that produced acceptable amounts of distortion, with an inefficient pair of headphones attached.  While the previous test represents the best-case scenario, this test represents the worst.
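For reference, part of the reason a load matters at all: the headphone and the player’s output stage form a voltage divider, so a low-impedance headphone pulls the output level down and works the output stage harder.  A toy calculation (the impedance figures here are invented for illustration, not measured from any of these players):

```python
import math

def load_loss_db(z_out, z_load):
    """Level drop in dB when a source with output impedance z_out (ohms)
    drives a headphone of impedance z_load (ohms)."""
    return 20 * math.log10(z_load / (z_out + z_load))

# Hypothetical 5-ohm output stage driving common headphone impedances.
for z_load in (16, 32, 300):
    print(f"{z_load:3d} ohm load: {load_loss_db(5.0, z_load):6.2f} dB")
```

The harder the output stage has to work to make up that loss, the closer it runs to its limits, which is where the extra distortion in this test comes from.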

PMP Performance - Sweep Harmonics - Load

Things start to get a little more interesting here.  The performance of the devices begins to diverge.

The Zunes introduced a great deal more distortion at maximum volume than they did with no load attached.  (The distortion level on the iPods didn’t change a whole lot.)  What is even more interesting to me, though, is the amount of even-harmonic distortion found in the Zunes.  That generally isn’t seen in electronics quite like this.  It is entirely possible that the headphones themselves were creating these distortion lines, as mechanical devices are more likely to produce even harmonics.  But I don’t have any way to measure that, so it will have to be left as conjecture rather than an observed phenomenon.

Generally speaking, though, I did notice a trend.  It looks like Microsoft allows the Zunes to be driven to volume levels where distortion is introduced, while Apple was more conservative in the design of its players.  Apple essentially designed more headroom into the iPods.

With that said, though, the volume levels being used for this particular test would be excessive and even dangerous for anyone’s ears for any sort of listening beyond very short bursts.  The players should never be turned up to this level in the real world, lest hearing damage occur.  At more acceptable levels, the amount of distortion on all players, both Zune and iPod, began to resemble the previous test much more closely.  In the end, while this test for distortion at maximum load was interesting, it ends up not being meaningful.  In truth, the previous no-load test much better represented how devices will actually sound at healthy listening volumes.  So interesting to look at, yes.  Meaningful or significant, no.

What does it all mean?

In short, aside from the iPod Touch, any of the tested devices will be capable of delivering pretty high quality audio.  I wouldn’t hesitate to recommend any of the Zunes, or the iPod Classic to anyone worried about sound quality.  The differences are barely measurable, let alone audible.

The iPod Touch, on the other hand, is another case entirely.  Not only is it excessively noisy, it adds what I would consider unacceptable amounts of unpleasing distortion to the audio signal, at least for a modern device.  Its performance would have been acceptable (or even “good”) just ten years ago, but it doesn’t meet the expectations I would have for a piece of consumer electronics in 2009.  It is too easy and too cheap to produce high quality audio these days for Apple to have let this slip. 

When it boils down to it, I believe it was probably a financial decision on their part.  While higher-quality audio electronics probably would have added only a couple of dollars per unit, when you consider how many millions of these things Apple has sold, those few dollars add up to quite a lot of money in the end.  It would have been nice if Apple had put more emphasis on sound quality than on the bottom line.  They would have had a better product.

With that out on the table, though, I’m sure that most people aren’t interested enough in sound quality to even care that compromises have been made.  Most people are content enough with the downright awful earphones that Apple ships with iPods (to the detriment of their ears), so why would they care about the sound quality of the device?  For those that do care, though, the Touch is probably better avoided, while the other players would probably be just fine.
