We spend a lot of time watching and listening to our smartphones and tablets. The younger you are, the more likely you are to turn to one of them, rather than an actual TV, to watch a movie or TV show. For many of us a phone or tablet is our primary source of music, whether from our own libraries or from streaming services. Yet when new phones or tablets are announced, companies very rarely place any emphasis on audio quality.

Display quality also used to receive very little attention. As more outlets began reporting on display performance, more companies started to take notice. Now benefits like “Full sRGB gamut” or “dE < 3” are touted on new products. With that in mind, we are introducing a new set of tests for smartphones and tablets: audio performance.

To do this right we went to the same company that the manufacturers themselves go to: Audio Precision. Based in Beaverton, OR, Audio Precision has been producing the best audio test equipment available for over 25 years. From its two-channel analog roots, the company now also tests multichannel analog, HDMI, optical, coaxial, and even Bluetooth. Its products offer resolution that no one else can match, which is why you will find them in the test and production rooms of almost any company.

Just recently they introduced a brand new set of audio tests for Android devices. Combined with one of their audio analyzers, these allow us to provide performance measurements beyond what has been possible before. Using an Audio Precision APx582 analyzer, we set out to analyze a selection of Android phones to see what performance differences we can find. More phones and tablets will follow as these tests can be run on them.

The Test Platform

The test platform is the Audio Precision APx series of audio analyzers. For this initial set of tests I used an APx582, which has two analog outputs and eight channels of analog input. The outputs are not necessary, as all of the test tones are provided by Audio Precision for playback on the devices. For each set of tests we can add a load, simulated or real, to see how the device handles more demanding headphones. For this article I am sticking with a set of the updated Apple Earbuds; they are probably the most common headphones out there and are easy to acquire for anyone wanting to duplicate our testing. Future tests will add AKG K701 and Grado SR60 headphones as loads. Both models are popular, and I happen to own them.
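Load impedance matters because the power an output stage can deliver into headphones falls as impedance rises, for a fixed output voltage. As a rough illustration (this is not Audio Precision's methodology; the impedances and the 1 Vrms output level below are nominal values assumed for the example), a short sketch:

```python
# Sketch: power delivered into different headphone loads for a fixed
# output voltage, using P = V^2 / Z for a purely resistive load.
# All impedance and voltage figures are assumed nominal values for
# illustration; check each headphone's actual spec sheet.

def power_mw(v_rms: float, impedance_ohms: float) -> float:
    """Milliwatts delivered into a resistive load at a given Vrms."""
    return (v_rms ** 2) / impedance_ohms * 1000.0

loads = {
    "Apple Earbuds (~23 ohm, assumed)": 23.0,
    "Grado SR60 (~32 ohm, assumed)": 32.0,
    "AKG K701 (~62 ohm, assumed)": 62.0,
}

v_out = 1.0  # assume a 1 Vrms maximum output for illustration
for name, z in loads.items():
    print(f"{name}: {power_mw(v_out, z):.1f} mW at {v_out} Vrms")
```

The same source voltage delivers roughly a third less power into the K701 than into the Earbuds, which is one reason harder-to-drive headphones make a more demanding test load.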

There are a few main tests we are going to use for all of these reviews: maximum output level, Total Harmonic Distortion + Noise (THD+N), frequency response, dynamic range (as defined by AES17), and crosstalk. These are the exact same tests that manufacturers run to verify their products. Most of them will be run at maximum output level; most amplifiers perform best close to their maximum level, as the residual noise decreases relative to the signal, so that is what they are typically tested at.
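To make the THD+N figure concrete: conceptually, the analyzer plays a pure tone, removes the fundamental from the captured output, and reports everything left over (harmonics plus noise) relative to the total. The sketch below illustrates that idea on a synthetic 1 kHz tone rather than a real device capture; a hardware analyzer like the APx582 works differently (calibrated notch filtering), so treat this purely as an illustration of the concept.

```python
import numpy as np

# Illustrative THD+N measurement: synthesize a 1 kHz tone with a small
# 2nd harmonic (-60 dB) and a low noise floor, remove the fundamental,
# and compare the residual to the total.
fs = 48000              # sample rate (Hz)
n = fs                  # one second of samples -> 1 Hz FFT bins
t = np.arange(n) / fs
f0 = 1000               # fundamental frequency (Hz); bin index == Hz here

rng = np.random.default_rng(0)
signal = (np.sin(2 * np.pi * f0 * t)
          + 0.001 * np.sin(2 * np.pi * 2 * f0 * t)   # 2nd harmonic, -60 dB
          + 0.0001 * rng.standard_normal(n))         # noise floor

power = np.abs(np.fft.rfft(signal)) ** 2
total = power.sum()
fundamental = power[f0 - 2:f0 + 3].sum()  # fundamental bin +/- 2 bins
thd_n = np.sqrt((total - fundamental) / total)

print(f"THD+N: {20 * np.log10(thd_n):.1f} dB ({thd_n * 100:.4f}%)")
```

With the distortion dominated by that single -60 dB harmonic, the measured THD+N comes out close to 0.1%, which is roughly the level where distortion starts to be considered audible on music material.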

We might add more tests as we decide they are relevant to our testing. I will also attempt to go back and fill in as much data as possible from previously reviewed devices as time permits. Now to look at the tests and see our results for our initial set of phones.

Comments

  • DanNeely - Sunday, December 8, 2013 - link

    I'm curious how well your Grado headphones are holding up. I bought a pair of SR80s for use at work last winter, but the wire started to develop damage a month or two ago. If I move the wrong way while wearing them I can get brief bursts of static in one ear, and can mute that ear by pinching the cable just above the Y. I suspect the damage was caused by the post in the headband allowing the earcups to spin freely, combined with the unmarked cable making it hard to notice anything less than a half dozen or so revolutions of twist. I'm wondering how much of this is bad luck on my part vs poor design/manufacturing.
  • Marovincian - Monday, December 9, 2013 - link

    I had similar concerns with my Grados (actually mine are Alessandro MS1i). I sent Grado an email suggesting that they put a stripe on the "Y" wires so that you could more easily straighten them out. They said that they would pass it along to their design team. Then they sent me a free T-shirt. Classy company for sure.
  • ManuLM - Sunday, December 8, 2013 - link

    Quite a good initiative, thanks; it is too hard to get these numbers nowadays.

    I would suggest you guys build up a database of phone performance over time (see headphoneinfo's awesome job, for instance).
    I also suggest that you add the maximum output delivered (power or voltage swing into the load) to your tests. This is interesting because if a phone clips at high volumes but its output power is 10 dB above the others on average, then the normal user will simply not notice the drawback (although I admit this is a poor job by the company in tuning the audio system in the first place).
    It also helps to identify the brands that deliver lower output power, which can turn into a problem on more demanding headphones (high impedance requiring higher voltage swing). Some users will fancy some extra power on their headphone output (even if this might not be safe for their ears).
    Last point: some high-end IEMs have quite low impedance, which demands fairly high current, especially in the energy-dense low frequencies, and can create bass roll-off. A simple frequency response check on a low-impedance IEM would show this.
  • RandomUsername3245 - Sunday, December 8, 2013 - link

    I like the idea of audio testing, but I am disappointed by the methods used in this article: why would you bother testing a device at maximum volume when you know it is clipping badly? You should reduce the volume to a setting where it does not clip and then continue the review. You can then report the maximum useable volume setting on the device.

    The maximum volume on an iPhone is reported to be in excess of 100 dB. Listening at this volume for even a short period (15 minutes) on a consistent basis will permanently damage your hearing. Why not test these devices at reasonable volume levels?

    (hopefully not too flawed analogy follows...)
    If you are comparing two overclocked computers for maximum performance, you set them to their highest stable clock rate and then benchmark. You do not set one to a clock rate that causes continual crashing, and then report that it failed several of the benchmarks. I think this is comparable to audio review for the clipping cellphones. You might argue that the device should support any user-accessible volume level, but historically it is very common for audio amplifiers to allow users to adjust the gain until the output clips. Apple is an unusual case that limits the user to only access non-clipping gain settings.
  • ManuLM - Sunday, December 8, 2013 - link

    audio systems are tested at max performance (there are many reasons for that, including the fact that when you sell something, the whole usage range of the system should be good), so the analogy with overclocking is not ideal.
    I agree with you, though, that testing at nominal volume could help, but only in addition to max-volume testing.
  • eio - Monday, December 9, 2013 - link

    yes, drive power is a good factor in a benchmark, but performance at different loads should not be compared directly.
  • eio - Monday, December 9, 2013 - link

    an ideal test might have several series of performance graphs at several steps of increasing load...
  • RandomUsername3245 - Wednesday, December 11, 2013 - link

    Late reply...

    Like I said in my previous comment, it is common for audio amplifiers to allow you to adjust the gain past where the amplifier will start to clip. You should never expect a car stereo or home theater amplifier to allow you to run at maximum gain without clipping, so why should you expect a phone's headphone amplifier to behave differently?

    The proper way to run this test is to adjust the amplifier to maximum non-clipped gain and then run the test.
  • willis936 - Sunday, December 8, 2013 - link

    The day has finally arrived? Good data with some surprising results. I think I'm mostly surprised at how well all of the devices perform. I think dynamic range is perhaps the most important test here simply because most people won't be listening at max volume on headphones and pushing the noise floor down as low as possible is important for quiet listening.

    Were these tests done with the AKG K701? That is well known as a difficult pair of cans to drive without an amp. If a phone can drive those loudly with good measurements then it's certainly good enough for anything I'd use it for. Testing should be done worst case and, if there's time, in more typical cases. When using my phone as a line out I'll typically leave it 3 steps below max because I expected there to be output stage power issues (seen as dramatic clipping on LG's stuff :x) on my phone. Any lower and, as you noted, the static noise floor lowers the SNR.

    I was a little surprised at the weak channel separation in the otherwise amazing iPhone. Channel separation is already a pretty big issue. Even with expensive headphones it's easy to test and ballpark a crosstalk worse than -60 dB by ear, just from the jack to the drivers.

    I'd like to request some data from testing older devices (1 iPhone and 1 iconic Android per year?) going backwards, to see a progression (or maybe lack thereof) of audio quality in smartphones over the past 4 or 5 years.
  • willis936 - Sunday, December 8, 2013 - link

    Oh, and thanks for the excellent write up and all of your hard work! I'm looking forward to future data.
