Depending on which model you get, the ZenFone 2 will come with either Intel's Atom Z3560 or Atom Z3580 SoC. The former has a max boost clock of 1.8GHz, while the latter has a max of 2.33GHz. It's also important to note that the more expensive model with the Z3580 comes with 4GB of LPDDR3-1600 memory, while the Z3560 models sport only 2GB. While it's not unheard of to see one of Intel's Atom SoCs in an Android device, this is the first time I've seen it in a smartphone with such wide availability. I'll talk more about x86 on Android in the final review, but the main point is that the nature of Java applications means users don't need to be concerned about whether their device uses ARM, MIPS, or x86.
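
Since the point about architecture independence is worth making concrete: Dalvik/ART bytecode is the same regardless of the underlying ISA, and only apps that ship native (NDK) libraries need per-ABI builds. The sketch below is plain Java rather than Android code, and the `family` helper is purely illustrative, not part of any API; a real Android app would consult `Build.SUPPORTED_ABIS` instead.

```java
public class ArchCheck {
    // Map a raw architecture string to a coarse ISA family. Pure-Java apps
    // never need this -- the same bytecode runs on every family -- but it
    // illustrates exactly what the runtime abstracts away.
    public static String family(String arch) {
        String a = arch.toLowerCase();
        if (a.contains("arm") || a.contains("aarch64")) return "ARM";
        if (a.contains("mips")) return "MIPS";
        if (a.contains("86") || a.contains("amd64")) return "x86";
        return "unknown";
    }

    public static void main(String[] args) {
        // On a desktop JVM; an Android app would inspect Build.SUPPORTED_ABIS.
        System.out.println(family(System.getProperty("os.arch")));
    }
}
```

The same compiled APK would report "x86" on the ZenFone 2 and "ARM" on most other phones, without any change to the application code.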

The model we have for testing is the high end Z3580 + 4GB configuration. While I can't make any concrete statements, based on the thermal constraints of smartphones I would expect the Z3580 to outpace the Z3560 in race-to-sleep scenarios, where the SoC runs at a high speed for a short period. In more sustained workloads I expect the two phones to perform similarly as they throttle down to manage heat. The extra 2GB of RAM may also be a factor, but it feels more like a future-proofing measure, as we don't even see the 3GB of RAM in flagship devices being saturated. Android RAM requirements are also trending downward rather than upward, as the majority of Android sales are inexpensive devices.
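
To illustrate why a higher peak clock matters mainly in bursts, here's a toy race-to-sleep calculation. The clock speeds are the two SoCs' rated peaks mentioned above, but the cycle count for a "burst" of work is invented purely for illustration; real workloads vary wildly.

```java
public class RaceToSleep {
    // Time to finish a fixed burst of work at a given clock; in a
    // race-to-sleep scenario the chip runs flat out for this long,
    // then drops back to an idle state.
    static double burstSeconds(double workCycles, double freqGHz) {
        return workCycles / (freqGHz * 1e9);
    }

    public static void main(String[] args) {
        double work = 2e9; // hypothetical cycle count for one UI action
        System.out.printf("Z3580 @ 2.33GHz: %.3f s%n", burstSeconds(work, 2.33));
        System.out.printf("Z3560 @ 1.80GHz: %.3f s%n", burstSeconds(work, 1.80));
    }
}
```

With these made-up numbers the Z3580 finishes the same burst in roughly 0.86s versus 1.11s, which is the gap that shows up as responsiveness. In sustained workloads both parts throttle toward a similar thermal ceiling, so the gap should shrink.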

To evaluate the performance of the ZenFone 2, I've run it through our standard suite of web and native benchmarks. This section mixes together the more CPU-bound and GPU-bound benchmarks, and for many of the tests I've only posted the overall score. In the full review I'll look at the subtest scores to pinpoint exactly which areas the phone handles better than others.

Basemark OS II 2.0 - Overall

The overall score in Basemark OS II paints the ZenFone 2 in a good light. Considering every aspect of the hardware, its performance ends up in the same range as the 2014 flagship smartphones. Some of the newest devices pull ahead, with the Galaxy S6 very far ahead due to its UFS storage. Every device ahead of the ZenFone 2 also costs upwards of $500, $600, or even $700, so ASUS and Intel should be proud.

PCMark - Work Performance Overall

PCMark is a benchmark that emphasizes real-world tests. It performs actions like playing and seeking within videos, writing text, and working with files. All of these are race-to-sleep scenarios, where the CPU should run at its highest frequency for a short period and then return to a low frequency to save power. The quad-core Atom Z3580 with its 2.33GHz peak frequency pulls ahead of every other smartphone, including the Galaxy S6, the previous leader. I'm very impressed with the ZenFone 2's performance here, and it does indeed translate into the same real-world tasks that PCMark runs.

3DMark 1.2 Unlimited - Overall

The 3DMark benchmark has both GPU and CPU components. In this test the ZenFone 2 does surprisingly well, with a score that sits among the newest flagship devices.

GFXBench 3.0 Manhattan (Offscreen)

Last, but not least, we have GFXBench's Manhattan benchmark. Here the ZenFone 2 performs roughly as well as the iPhone 5s, with a gap small enough to attribute to run-to-run variance. This makes sense, as both phones share the same PowerVR G6430 GPU. Since the clock speed of the GPU in Apple's A7 chip is unknown, it's not clear whether the ZenFone 2 holds any clock speed advantage or disadvantage, or how much driver differences between iOS and Android factor in.

Overall I'm more than pleased with the performance of the ZenFone 2. It's not even worth comparing it to other phones in the $200-300 price range, as it performs at the level of last generation and in some cases current generation flagship smartphones. The GPU in Intel's Moorefield line is definitely the area that needs the most improvement when the next generation of mobile Atom parts comes around. Intel will undoubtedly continue to use GPUs from Imagination Technologies, and it will be interesting to see if Intel moves to the PowerVR Series7 with their next generation of SoCs.

Initial Thoughts

Based on the time I've spent using the ZenFone 2 so far, I'm impressed with what ASUS and Intel have managed to create within their price constraints. In some respects, the ZenFone 2 competes with phones that cost substantially more money. In others, it doesn't do quite as well, but it usually outperforms other devices in the $200-300 range by a significant margin, and this is where its real strength seems to lie.

There's a lot to like about the ZenFone 2. While it doesn't sport an aluminum or steel chassis, the design is still very nice. These 5.5" devices are still a bit too big for me to use comfortably, but that's a personal preference. The display is also very sharp, although its accuracy could be improved. I understand that CABC can be helpful in certain cases to improve battery life, and even Apple employs it in certain scenarios such as full-screen video. That being said, employing it constantly can be problematic, and ultimately I don't think the display issues it can introduce are worth it.
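
For readers unfamiliar with CABC (content adaptive backlight control), here's a toy model of the idea. The peak-luminance heuristic and the 20% backlight floor are invented parameters of my own; real panel controllers use histogram analysis and vendor-tuned curves. The clipping in `compensate` is where the accuracy problems described above come from.

```java
public class Cabc {
    // Pick a backlight level (0..1) from the frame's peak luminance:
    // a dark frame lets the panel dim the backlight and save power.
    static double backlight(double[] pixels) {
        double max = 0.0;
        for (double p : pixels) max = Math.max(max, p);
        return Math.max(max, 0.2); // arbitrary floor so dark scenes stay visible
    }

    // Boost a pixel value to compensate for the dimmer backlight. Values
    // that would exceed 1.0 clip, which distorts bright detail -- the
    // visible accuracy cost of running CABC constantly.
    static double compensate(double pixel, double backlight) {
        return Math.min(pixel / backlight, 1.0);
    }
}
```

A mid-gray pixel on a dark frame comes out perceptually unchanged (half the backlight, double the pixel value), but anything near the frame's peak gets crushed toward white, which is why per-frame backlight changes are visible in some content.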

In terms of performance, the CPU is at times competitive with this year's fastest flagship devices. Apple, Qualcomm, and Samsung should probably be a bit concerned about ASUS and Intel's efforts to get this kind of performance into devices that cost only $200 or $300, as it significantly compresses the gap between them and $650 flagship phones. Meanwhile, the GPU is similar in performance to last year's flagship smartphones and tablets, and it should definitely be the main area of focus for Intel with future SoCs. Compared to more similar phones in the $200-$300 range, the ZenFone 2 is well ahead in both CPU and GPU performance.

Based on what I've seen so far, choosing between the ZenFone 2 and other devices at the same price looks easy if you're concerned with performance first and foremost. The bigger question to tackle for our full review may well be exactly what tradeoffs you make between the ZenFone 2 and the major manufacturers' flagship phones. At $199/$299, ASUS isn't intending to go head-to-head with the likes of the iPhone and Galaxy S6, but if the ZenFone 2 can deliver enough of the flagship experience then it has the potential to be a spoiler for users who can accept the compromises in exchange for significant savings.

Wrapping things up, there are still many aspects of the ZenFone 2 to explore. I'm currently in the process of evaluating the camera, battery life, NAND performance, and ASUS's Zen UI for Android. I hope to be able to present that information to all of you soon, and I hope that you'll join me again when the full review of the ZenFone 2 is completed.

Comments

  • lilmoe - Monday, May 18, 2015 - link

    I wish Intel would focus more on GPU in all of their lineup of silicon.
  • MonkeyPaw - Monday, May 18, 2015 - link

    That's what they did with Cherry Trail. They shrunk the die, kept the CPU largely unchanged, and jumped to Gen 8 graphics with more EUs. Surface 3 has a newer GPU than Surface Pro 3!
  • lilmoe - Tuesday, May 19, 2015 - link

    Yea they did, but it's still not enough. It falls short of smartphone SoCs ATM.
  • Alexvrb - Thursday, May 21, 2015 - link

    It's a Windows x86 device... with a new GPU. A major driver update could turn things around, especially combined with DX12. They still need to pick up the pace though. Case in point:

    The Atom chips in question are powered by PowerVR graphics, not Intel's own. The G6430, to be specific, so they should be reasonably quick. But Intel should be more aggressive about adopting the latest graphics cores from PowerVR; these should have been using the GX6450, as seen in the iPhone 6. The G6430 was what Apple used last gen.
  • akdj - Saturday, May 23, 2015 - link

    Hi Alex. I agree with all you say, and it's wise to point out this isn't a graphics solution from Intel. That said, I'm not so sure about the second sentence. Driver updates certainly can help, but as you say, it's last year's PVR silicon and there's a ceiling on headroom, just as there is with the CPU's TDP; the SoC's overall temperature is governed by the GPU more often than the CPU (as it's literally 'all display' ;)).
    However, seeing the advances Intel has managed on the desktop and in the laptop, I'm not betting against them, nor nVidia. While they 'missed the train' on its first stop, they've both got the resources, knowledge, and know-how to catch up. Intel iGPUs over the lifespan of the Core 'i' series have gone from barely able to draw your desktop without mouse tracers to today's Iris Pro 6xxx series. While not 'monsters' in comparison to the discrete lineup of GPUs on the market, they're certainly comparable to the discrete cards of yesterday. For example, I've got a 15" rMBP from 2012: a Core i7/2.7GHz with 16GB of RAM, a 768GB SSD, and the Intel HD 4000 plus nVidia 650m (which Apple, with nVidia's tutoring, clocked to near parity with the 660m). The latter has 1GB of dedicated VRAM, while the integrated GPU shares, I believe, up to 3/4 of a GB of the system RAM, possibly more. The newer Iris Pro 5200 gets 1.25GB for the iGPU, versus 2GB on the 750m discrete from 2013 and 2014.
    Today's 6xxx-series Iris Pro iGPUs, in the passively cooled low power 4.5W Core 'm' chips, can computationally equal the 650m when the environment keeps the GPU sustained (and cooled), and in some cases they best it at manipulating large PS RAW batches, crunching genome projects, even video rendering, transcoding, or encoding. In a couple of these cases the newest iGPU is quicker than the 2GB 750m that has remained in the current flagship 15" rMBP. That's pretty amazing progress from Sandy Bridge's HD 3000 (28nm? Or was it 32?) to the HD 4000 in Ivy Bridge's die shrink to 22nm, truly blossoming in the 5xxx and 6xxx series. And to match beefy discrete cards from a couple of years ago at 4.5 watts (I'm not ignorant of the fact that Apple's using an ancient discrete solution in their flagship laptop, but Intel's 35/45W quads that fit into those rigs haven't appeared; last of the batch, apparently). Remember that the rMBP sold in 2014 as a $1999 option without discrete graphics, relying solely on the Iris Pro 5200, and it's incredibly fluid. They're making a helluva run 'catching up', integrating a nice graphics package onto their silicon as they continue shrinking to 14nm. If I were a betting man, I'd wager that within five years they're as dominant and influential in mobile SoC design as they are today in x86 desktop and laptop chip design. But I don't think there will be only a single 'AMD'-style competitor. Imagination and Qualcomm remain, and as you point out, the PowerVR series is still 'off the shelf' and perhaps the last piece of Apple's 64-bit 'A' series design. Samsung's Exynos SoCs are fast as HELL, and they're seemingly the only other chip manufacturer or OEM able to turn around 64-bit in a generation.
    Qualcomm, with its 810 woes and the concurrent development of the 32-bit 808 as a stopgap, showed just how off guard the A7 caught them, now twenty months ago.
    As for Tegra (nVidia) and the Atom/Celeron lines (did I read that 'Pentium', the word as a descriptor, will also be resurrected?), Intel obviously has choices, and Bay Trail, Cherry Trail, and whatever line follows ;) will keep benefitting simply from shrinking silicon and increasing transistor counts, along with their differing process tricks like tri-gate. Intel is now wise to the massive exodus folks are going through, away from space-wasting, power-hungry desktops and heavy laptops that last 67 minutes unplugged. We've all got 'the box' at the grind, right? So why do you need one at home when there's an iPad, a Samsung Note Pro 12.2" (couldn't resist; I've still got the Xoom and long for the days tablets had unique names, lol), or a two/three pound laptop with PCIe SSD storage that reads and writes at a GB/s and lasts ten to 12 hours on a charge (I routinely get 13-15 on my 2014 13" MacBook Air)? Or an iPhone 6+, Note 4, Nexus 6, a new HTC or LG, the S6, or the Chinese brands like Huawei and Xiaomi; a 5"+ phone (I'm a Note user strictly for my business, an iPhone for personal; I've been ambidextrous since 2008) is nice, large, and legible. Add to the recipe the display technology progression over the last decade, the miniaturization of sensors (accelerometers, gyros, barometers, proximity), consumer GPS access, and phenomenal LTE speeds that surpass many folks' home ISP bandwidth. 'Mobile', regardless of what you and I and the rest of us geeks 'need' for our tasks, is enough for today's normal, reasonable person. Leave the computer at home along with the work it contains; even on vacation, your smartphone and/or tablet has the power, connectivity, and reliable, efficient speed to run every piece of what used to be called software and is now the 'app'.
    I did my taxes on mine this year, on my iPad, and I own a home, a cabin, and a business; plenty of 'stuff' that doesn't allow me to use the 1040EZ any longer ;). I can check email, LinkedIn, Twitter, and FB, work on projects through MS's Office suite, Adobe's Creative Suite, or Apple's iWork. I can record an album. Draw a picture. Edit photos or video and save, share, or Photoshop someone's face onto the cop's. Read a book, surf, respond to texts and tweets, and take calls, even make calls if so inclined. Calendars and calculators, a 'viewfinder' of 9.7" of pure, perfect display to monitor your multi-cam setup and adjust exposure, ISO, shutter speed, and aperture, as well as 'mix' your broadcast through multicam editing and transitions of viewpoint to keep a viewer compelled. Write a novel. Fly a plane and file a flight plan, with diversion airports, real-time weather and traffic, fuel required, weights and CG balances. (I fly for a living and have for almost thirty years in Alaska; the iPads have brought a jump in technology that rivals the transition from steam gauges to glass 'pits.) There's no END to what one can do with today's smartphones and tabs. They're not tablets. They're not phones. They're pocket computers faster than the laptops we were using just five years ago.
    Intel isn't standing idly by. They've got a lot invested in the 'ultrabook' sector, some $300,000,000 to help boost sales. And we're still at the genesis. Some may call the new Yoga or MacBook underpowered or 'slow' in comparison to their previous machine. The truth is it's as fast as the machines we were very content with two years ago, using less than a third of the power at 4.5 watts. That's amazing dexterity, and they'll continue to make inroads graphically to the point where they'll be a major player IMHO.

    Sorry for the novel. My bad
  • javier_machuk - Tuesday, May 19, 2015 - link

    I agree; a 5-inch version of this phone at this price would be a hot seller, with far better specs than the rest of the pack at that price. The size is the only thing that puts me off.
  • Hrel - Wednesday, May 20, 2015 - link

    It IS a 5" phone. They used a very thin bezel to put a 5.5" screen in a 5" chassis. Compare the device dimensions to the 5" LG G2, it's smaller :)

    If you go to Asus's website they explain this in more detail on the product page for this phone.
  • asfletch - Wednesday, May 20, 2015 - link

    Er...Asus may claim they've made it like a 5", but the LG G2 is waaay ahead on screen to bezel ratio. It's about 138mm tall by 71mm wide. The ZenFone 2 is about 152mm tall by 77mm wide. That's noticeably bigger than even the G3.
  • SoC-IT2ME - Thursday, May 21, 2015 - link

    Hrel - your post is full of fail. This is a much larger phone than G2.
  • blzd - Thursday, May 21, 2015 - link

    Um no. The LG G3 is a small 5.5" phone; this is a larger 5.5" phone.
