Mea-Culpa: It Should Have Been Caught Earlier

Section By Andrei Frumusanu

As stated on the previous page, I had initially seen the effects of this behaviour back in January, when I was reviewing the Kirin 970 in the Mate 10. The numbers I originally obtained showed worse-than-expected performance from the Mate 10, which was being beaten by the Mate 9. When we discussed the issue with Huawei, they attributed it to a firmware bug and pushed me a newer build that resolved the performance issues. At the time, Huawei never explained what that 'bug' was, and I didn't push the issue, as performance bugs do happen.

For the Kirin 970 SoC review, I went through my testing and published the article. Later on, in the P20 reviews, I observed the same lower performance again. As Huawei had previously told me it was a firmware issue, I attributed the poor performance to a similar cause, and expected Huawei to 'fix' the P20 in due course.

In hindsight, it is pretty obvious that there has been some less-than-honest communication from Huawei. The newly detected performance issues were not actually issues – they were the real representation of the SoC's performance. As the results were somewhat lower, and Huawei was insisting the chip was highly competitive, I never suspected that these numbers were the genuine ones.

It's worth noting here that I naturally test with our custom benchmark versions, as they allow us to collect more data from the tests than just a simple FPS value. It never crossed my mind to test the public versions of the benchmarks to check for any discrepancy in behaviour. Suffice it to say, this will change in our future testing, with numbers verified on both versions.
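The cross-check described above can be reduced to a simple rule: run each test on both the public build and a renamed/custom build, and flag any device whose public-build score is implausibly higher. The sketch below illustrates the idea; the device names, scores, and the 10% threshold are all hypothetical, not measured figures.

```python
# Minimal sketch (hypothetical data): flag devices whose public-build
# benchmark score diverges upward from a custom/renamed build of the
# same test. A large gap suggests the firmware is detecting the
# public package name and boosting performance for it.

def divergence(public_fps: float, custom_fps: float) -> float:
    """Relative difference of the public-build score vs. the custom build."""
    return (public_fps - custom_fps) / custom_fps

def flag_suspects(results: dict[str, tuple[float, float]],
                  threshold: float = 0.10) -> list[str]:
    """Return device names whose public score exceeds the custom score
    by more than `threshold` (10% by default, beyond run-to-run noise)."""
    return [name for name, (pub, cust) in results.items()
            if divergence(pub, cust) > threshold]

# Illustrative numbers only.
results = {
    "Device A": (38.6, 19.7),   # public build ~2x the custom build: suspect
    "Device B": (21.4, 21.1),   # within run-to-run noise
}
print(flag_suspects(results))  # → ['Device A']
```

The threshold would in practice be calibrated per benchmark from the normal run-to-run variance of known-clean devices.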

Analyzing the New Competitive Landscape

With all that being said, our past published results for Kirin 970 devices were mostly correct, as we had used a variant of the benchmark that wasn't detected by Huawei's firmware. There is one exception, however: we weren't using a custom version of 3DMark at the time. I've now re-tested 3DMark and updated the corresponding figures in past reviews to reflect the correct peak and sustained performance figures.

As far as I could tell in my testing, the cheating behaviour was only introduced in this year's devices; phones such as the Mate 9 and P10 were not affected. To be more precise, it seems that only devices running EMUI 8.0 and newer are affected. In our discussions, Huawei told us that this was purely a software implementation, which corroborates our findings.

Here is the competitive landscape across our whole mobile GPU performance suite, with updated figures where applicable. We are also including new figures for the Honor Play, and the new introduction of the GFXBench 5.0 Aztec tests across all of our recent devices:

3DMark Sling Shot 3.1 Extreme Unlimited - Graphics 

3DMark Sling Shot 3.1 Extreme Unlimited - Physics 

GFXBench Aztec Ruins - High - Vulkan/Metal - Off-screen

GFXBench Aztec Ruins - Normal - Vulkan/Metal - Off-screen

GFXBench Manhattan 3.1 Off-screen 

GFXBench T-Rex 2.7 Off-screen

Overall, the graphs are largely self-explanatory. The Kirin 960 and Kirin 970 lag in both performance and efficiency compared to almost every device in our small test here. This is something Huawei hopes to address with the Kirin 980, and with features such as GPU Turbo.


84 Comments


  • sing_electric - Tuesday, September 4, 2018 - link

    At some point, Huawei (and other Chinese OEMs) need to decide whether they want to build their brands globally or just in their home market.

    "Other Chinese OEMs lie so we've got to as well" ends up doing nothing but providing ammunition for those that say that Chinese phones are "cheap," under-performing knock offs.

    The Nexus 6p showed many years ago that Huawei can make good hardware. HiSilicon's chips obviously aren't doing them many favors in the GPU department, but that just means they need to target appropriate segments where they are competitive (I'm convinced that there's a large niche of people who want stylish devices that feel premium but don't really care much about performance), rather than lying and perpetuating a stereotype that will hurt their brand long after they've abandoned those practices.
    Reply
  • A5 - Tuesday, September 4, 2018 - link

    Calling the 6P "good" hardware is a bit generous. The battery subsystem has a devastating defect rate, especially since the phone is sealed.

    At one point Google ran out of refurbs and had to give out Pixel XLs to people as replacements.
    Reply
  • ventrolis - Tuesday, September 4, 2018 - link

    Are the charts for Aztec Normal/High flipped? Somehow I imagine the 'High' test would be more difficult and have lower frame rates than 'Normal'. Reply
  • Andrei Frumusanu - Tuesday, September 4, 2018 - link

    Thank you for pointing it out, indeed the labels were flipped. Reply
  • CityZ - Tuesday, September 4, 2018 - link

    Why not simply do a combined performance & power test where you run the benchmark continuously until the phone shuts down? If a phone maker tries to cheat for the performance side, they'll look bad on the run-time side. If a phone throws up a "I'm running too hot" screen, consider that the end of the test. Such a test not only shows how fast your game may perform, but also for how long you can game. Reply
  • Andrei Frumusanu - Tuesday, September 4, 2018 - link

    Screen resolution, V-Sync and other device differences makes this kinda hard. In my view there's no added value over just peak & sustained performance as well as just measuring power. Reply
  • wow&wow - Tuesday, September 4, 2018 - link

    "A Cheating Headache"

    The worst one has to be Intel repeatedly not following the specs, which caused the "Meltdown" problems (requiring OS memory relocation, an industry first and only) and "Foreshadow", where the mitigation can only "reduce" the risk, not eliminate it!
    Reply
  • Xex360 - Tuesday, September 4, 2018 - link

    Because I don't consider phones to be gaming devices, for that I have a PC and a console, so benchmarks are worthless to me, the most important things in a phone are the OS (Unfortunately I'm stuck with android, iOS isn't well suited for my use), screen (high resolution and no notch) and finally battery life (around one day, an OLED screen). Reply
  • A5 - Tuesday, September 4, 2018 - link

    Android increasingly uses the GPU to render the OS, and apps like Google Maps use it extensively as well. Sustained GPU performance isn't just relevant to gamers. Reply
  • eastcoast_pete - Tuesday, September 4, 2018 - link

    First and foremost: Thanks Andrei and Ian! This kind of article is why I come to Anandtech again and again (and more frequently than other computer tech websites). Yes, those benchmarks are not only misleading, they also steer the manufacturers towards optimizing for an artificial use (benchmarking), often at the expense of actually optimizing their smartphones for real world use. Who knows just how much better Huawei's phones could have been for everyday use if the time and energy invested in cheating for benchmarks would have instead gone into optimizing their phones for productivity, real world applications, and battery life (Huawei gets a bonus for historically having large capacity batteries, though!). As they are, those benchmarking suites would probably come in handy if one needs to use the phone as a hand warmer in winter.

    One minor edit: the "High" and "Normal" labels on the Aztec Ruins graphs are probably switched around. The fps numbers for "high" are really high for all devices. However, that is a very minor, cosmetic, point in an otherwise very good article!
    Reply
