Power Consumption

There's a lot of uncertainty around whether Kepler is suitable for ultra low power operation, especially given that we've only seen it in PCs with relatively high TDPs compared to tablets and smartphones. NVIDIA hoped to put those concerns to rest with a quick GLBenchmark 2.7 demo at SIGGRAPH. The demo pitted an iPad 4 against a Logan development platform, with Logan's Kepler GPU clocked low enough to equal the performance of the iPad 4. The low clock speed does put Kepler at an advantage, as it can run at a lower voltage as well, so the comparison is definitely one you'd expect NVIDIA to win.

Unlike Tegra 3, Logan includes a single voltage rail that feeds just the GPU. NVIDIA instrumented this voltage rail and measured power consumption while running the offscreen 1080p T-Rex HD test in GLB2.7. Isolating GPU power alone, NVIDIA measured around 900mW for Logan's Kepler implementation running at iPad 4 performance levels (potentially as little as 1/5 of Logan's peak performance). NVIDIA also attempted to find and isolate the GPU power rail going into Apple's A6X (using a similar approach to what we documented here), and came up with an average GPU power value of around 2.6W. 
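For reference, measuring power this way amounts to sampling voltage and current on the instrumented rail during the benchmark run and averaging the instantaneous products. The sketch below is a hypothetical illustration of that calculation; the sample values and function name are my own assumptions, not NVIDIA's actual tooling:

```python
def average_rail_power(samples):
    """Average power (W) from (voltage_V, current_A) pairs sampled
    on a dedicated GPU voltage rail during a benchmark run."""
    if not samples:
        raise ValueError("need at least one sample")
    # instantaneous power is V * I; average over the run
    return sum(v * i for v, i in samples) / len(samples)

# Illustrative samples: a rail near 0.9 V drawing roughly 1 A
# averages out to about 0.9 W, in line with the ~900 mW figure
# quoted for Logan's Kepler GPU at iPad 4 performance levels.
samples = [(0.90, 1.02), (0.89, 0.98), (0.91, 1.00)]
print(round(average_rail_power(samples), 2))
```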

I won't focus too much on the GPU power comparison as I don't know what else (if anything) Apple hangs off of its GPU power rail, but the most important takeaway here is that Kepler seems capable of scaling down to below 1W. In reality NVIDIA wouldn't ship Logan with a < 1W Kepler implementation, so we'll likely see higher performance (and power consumption) in shipping devices. If these numbers are to be believed, you could see roughly 2x the performance of an iPad 4 in a Logan based smartphone, and 4 - 5x the performance of an iPad 4 in a Logan tablet - in as little as 12 months from now if NVIDIA can ship this thing on time.
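Those multipliers can be sanity-checked with the article's own numbers. The sketch below assumes GPU power budgets of roughly 2.5W for a smartphone and 5W for a tablet (my assumption, drawn from the 2.5 - 5W range mentioned later in this piece) and scales linearly from the ~0.9W measurement. Since higher clocks also require higher voltage, real scaling is sublinear, so treat these as upper bounds rather than predictions:

```python
# Back-of-envelope headroom estimate using only figures from the text:
# ~0.9 W of GPU power at iPad 4-level performance, against assumed
# GPU power budgets for phone and tablet form factors.
MEASURED_W = 0.9  # Logan GPU power at iPad 4 performance

for label, budget_w in [("smartphone", 2.5), ("tablet", 5.0)]:
    headroom = budget_w / MEASURED_W  # linear scaling: an upper bound
    print(f"{label}: up to ~{headroom:.1f}x iPad 4 performance")
```

The ~2.8x and ~5.6x linear bounds square with the 2x and 4 - 5x estimates above once sublinear voltage/frequency scaling is taken into account.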

If NVIDIA's A6X power comparison is truly apples-to-apples, then it would be a huge testament to the power efficiency of NVIDIA's mobile Kepler architecture. Given the recent announcement of NVIDIA's willingness to license Kepler IP to any company who wants it, this demo seems very well planned. 

NVIDIA did some work to make Kepler suitable for low power, but it's my understanding that the underlying architecture isn't vastly different from what we have in notebooks and desktops today. Mobile Kepler retains all of the graphics features of its bigger counterparts, although I'm guessing things like FP64 CUDA cores are gone.

Final Words

For the past couple of years we've been talking about a point in the future when it'll be possible to start playing console class games (Xbox 360/PS3) on mobile devices. We're almost there. The move to Kepler with Logan is a big deal for NVIDIA. It finally modernizes NVIDIA's ultra mobile GPU, bringing graphics API parity to everything from smartphones to high-end desktop PCs. This is a huge step for game developers looking to target multiple platforms. It's also a big deal for mobile OS vendors and device makers looking to capitalize on gaming as a way of encouraging future smartphone and tablet upgrades. As smartphone and tablet upgrade cycles slow down, pushing high-end gaming to customers will become a more attractive option for device makers.

Logan is expected to ship in the first half of 2014. With early silicon back now, I think 10 - 12 months from now is a reasonable estimate. There is the unavoidable fact that we haven't even seen Tegra 4 devices on the market yet and NVIDIA is already talking about Logan. Everything I've heard points to Tegra 4 being on the schedule for a bunch of device wins, but delays on NVIDIA's part forced it to be designed out. Other than drumming up IP licensing business, I wonder if that's another reason why we're seeing a very public demo of Logan now - to show the health of early silicon. There's also a concern about process node. Logan will likely ship at 28nm next year, just before the transition to 20nm. If NVIDIA is late with Logan, we could have another Tegra 3 situation where NVIDIA is shipping on an older process technology.

Regardless of process tech, however, Kepler's power story in ultra mobile seems great. I really didn't believe the GLBenchmark data when I first saw it. I showed it to Ryan Smith, our Senior GPU Editor, and even he didn't believe it. If NVIDIA is indeed able to get iPad 4 levels of graphics performance at less than 1W (and presumably much more performance in the 2.5 - 5W range) it looks like Kepler will do extremely well in mobile.

Whatever NVIDIA's reasons for showing off Logan now, the result is something that I'm very excited about. A mobile SoC with NVIDIA's latest GPU architecture is exactly what we've been waiting for. 


141 Comments


  • jasonelmore - Wednesday, July 24, 2013 - link

    Keplar has the best performance per watt in the world. No matter which platform you're talking about.

    This only makes sense, now keplar is fully scale-able.
  • Refuge - Thursday, July 25, 2013 - link

    Thing is, it's coming to market too late if you ask me, and to be perfectly honest I wouldn't expect it to do much more than perform equally with its competition.

    The only thing I can see saving this and making it a huge success would be if the 20nm yields are crap and they can't make enough for demand.
  • Arnulf - Wednesday, July 24, 2013 - link

    The name is Kepler, as in Johannes Kepler:

    http://en.wikipedia.org/wiki/Johannes_Kepler
  • takeship - Wednesday, July 24, 2013 - link

    See also for reference: Tegra 2 performance claims, Tegra 3 power claims.
  • godrilla - Wednesday, July 24, 2013 - link

    This thing can run crysis 1 six years later 100x more efficient than my 6 year old 8800 ultra, now that is impressive.
  • chizow - Wednesday, July 24, 2013 - link

    I guess Logan is the reason why Nvidia was so confident in its IP licensing overtures. Hopefully we'll see it licensed out to some of the bigger players out there. It'd be a shame to see Kepler mobile tech wasted away if it's necessarily tethered to Tegra.
  • happycamperjack - Wednesday, July 24, 2013 - link

    I believe that this is nvidia's marketing push to Apple. Everything nVidia's doing lately (such as opening up mobile GPU licensing) is pointing to nVidia wanting to bed with Apple products. If they are successful, we can expect Apple products to feature nVidia GPU in their late 2014 or 2015 lineups. Their PowerVR chips are lagging behind competitions in terms of cutting edge features like latest OpenGL implementations. Apple is no doubt looking into updating their GPU lines. Nvidia's GPU makes a lot of sense for Apple (experiences with TSMC SoC, lack of Android phones using nVidia's GPU this year).
  • Krysto - Thursday, July 25, 2013 - link

    I'm sure Apple would evaluate it. But I think they'll just wait for Maxwell, a year after that.

    I was already thinking Apple might quit Imagination in the next few years, because Imagination will be making their own MIPS chips, and try to get more into the Android world, and I don't think Apple will like that very much.

    Plus, there's the technical reasons. I don't think Imagination will match Kepler/Maxwell anytime soon, probably not even in performance, let alone in features. It's really REALLY hard to support all the latest and advanced OpenGL features - see Intel, who's had tons of trouble making proper drivers over the years, and they're still barely at OpenGL 4.0 with Haswell.
  • happycamperjack - Thursday, July 25, 2013 - link

    I guess it will all depend on how good PowerVR series 6 GPU will be. And who knows, might even use intel SoC in 2015 as the new 4.5w Haswell has been very impressive! If it has the performance near the i5 in Surface Pro, it's a no brainer for Apple to seriously consider Haswell or its successors. One thing for sure, it's gonna be harder than ever for Apple to figure out which road to take in the next 2 years.
  • watersb - Friday, July 26, 2013 - link

    Apple currently has the resources to take all the "roads", then pick the one that works best for them. All before releasing a product.

    Their culture of secrecy is mostly because they do not want clumsy ideas presented to the end-user.
