Intel Haswell GT3e GPU Performance Compared to NVIDIA's GeForce GT 650M
by Anand Lal Shimpi on January 9, 2013 4:22 PM EST
Posted in: Trade Shows, CES 2013
Haswell isn't expected to launch until the beginning of June in desktops and quad-core notebooks, but Intel is beginning to talk performance. Intel used a mobile customer reference board in a desktop chassis featuring Haswell GT3 with embedded DRAM (the fastest Haswell GPU configuration that Intel will ship) and compared it to an ASUS UX15 with on-board NVIDIA GeForce GT 650M.
Despite the chassis difference, Intel claims it will be able to deliver the same performance from the demo today in an identical UX15 chassis by the time Haswell ships.
The video below shows Dirt 3 running at 1080p on both systems, with identical detail settings (High Quality presets, no AA, vsync off). Intel wouldn't let us report performance numbers, but subjectively the two looked to deliver very similar performance. Note that I confirmed all settings and ran the game on both systems myself, independently of the demo. You can be the judge using the video below:
Intel wouldn't let us confirm clock speeds on Haswell vs. the Core i7 (Ivy Bridge) system, but it claimed that the Haswell part was the immediate successor to its Ivy Bridge comparison point.
As proof of Haswell's ability to fit in a notebook chassis, it did have another demo using older Haswell silicon running Call of Duty: Black Ops 2 in a notebook chassis.
Haswell GT3e's performance looked great for processor graphics. I would assume that overall platform power would be reduced since you wouldn't have a discrete GPU inside; however, there's also the question of the cost of the solution. I do expect that NVIDIA will continue to drive discrete GPU performance up, but as a solution for some of the thinner/space-constrained form factors (think 13-inch MacBook Pro with Retina Display, maybe 11-inch Ultrabook/MacBook Air?), Haswell could be a revolutionary step forward.
Comments
mrdude - Saturday, January 12, 2013
Almost certainly the higher-end and uber-expensive models. eDRAM implementation isn't cheap, so expect to pay a pretty penny for it. There will only be a small set of SKUs with GT3 + eDRAM.
Going forward, I think it's currently the most logical solution for CPU/GPUs, as dual-channel RAM is probably going to go the way of the dodo.
Spunjji - Monday, January 14, 2013
I'm pretty sure it's the latter. Cerise thinks otherwise. I'm going on past Intel history; Cerise is going on... hmm.
nicolbolas - Sunday, January 13, 2013
The odds are this part (with eDRAM) will only show up in products that usually cost over $300-400.
If an OEM wants to, they might be able to get GT3 into an ultrabook for $1250, maybe $1150: you would look at the i7 model and add $150-250 more (~$50 for GT3 over GT2, plus $100-200 for eDRAM). However, the bigger issue is that Intel might only offer the eDRAM on the highest-end models...
If they offer it with all i7s and it is only $150 or so more for GT3 AND eDRAM, then it could be as low as $1050/1100 without discounts (based on Newegg).
thebluephoenix - Thursday, January 10, 2013
It's a DDR3 version of the GT 650M, I presume, not GDDR5 like in the MacBook Pro 15 with Retina Display or the Alienware M14x R2.
shiznit - Thursday, January 10, 2013
Anand, did they tell you how much eDRAM the chip has? Is it on-die or on-package?
Will the regular mobile SKU (13" MacBook Pro) have the eDRAM?
Wolfpup - Thursday, January 10, 2013
I might have to stop insulting Intel.
Of course we'll have to see what parts this actually ships in, how it actually performs, how the drivers are (for new AND old games), and also how many corners they cut with image quality... as it stands it's not even a valid comparison, given that Intel is doing less rendering work than Nvidia and AMD and still getting worse FPS on Ivy Bridge.
Of course even if this is all true, I'll STILL be disappointed as they're now blowing an enormous number of transistors on a GPU that should be *optional*. All of them could be spent on CPU, or even just making the chip cheaper.
rootheday - Thursday, January 10, 2013
I am tired of people pulling out their old preconceptions about Intel's graphics drivers - yes, 3 years ago Intel had a lot of game compatibility issues.
Not so any more. As far as I know, Ivy Bridge works with basically any recent or older game. The few remaining issues are mostly with a couple of old games that have coding bugs (e.g. Fallout 3) or that don't understand that "dedicated" graphics memory doesn't mean anything on integrated parts (e.g. GTA IV, Empire: Total War, PES 2009...).
Re: "corners cut with image quality" and "not even valid ... Intel's doing less work" - show me side-by-side screenshots or YouTube videos where there is any difference in image quality between Ivy Bridge vs. AMD vs. NVidia with identical game settings. On Sandy Bridge the anisotropic filtering quality was lower, but Ivy Bridge fixed that. Intel doesn't do the sort of game-profile "cheats" to texture or render target formats that AMD and NVidia do.
nicolbolas - Sunday, January 13, 2013
The real problem was that Intel had only ~100 games officially supported (even non-supported ones often work, however). That means if a non-officially-supported game doesn't work, or stops working, you have no way of knowing if/when Intel will fix it.
The driver-quality complaints are still overstated, as you said, but it's not as bad as the old "AMD's (ATI's) drivers are horrible vs. Nvidia's" situation.
rootheday - Monday, January 14, 2013
Full disclosure: I work for Intel on the graphics driver team.
There is a lot of misunderstanding about that list of 100 or so games. I am assuming you are referring to this page:
This list isn't about "officially supported" games. Rather it is about performance (playable frame rate).
If you read the wording on the page carefully you will see that this list is about which games were known to deliver a playable experience INCLUDING ON HD2500 (GT1). The list that is playable on HD4000 was much longer and included more demanding games but unfortunately only one web page was published to cover both HD2500 and HD4000 so the list there was culled to those that were playable on the lowest common denominator. Obviously we wouldn't include any games on the list if they had compatibility issues - that is a given.
We take compatibility very seriously.
Besides the list above (and the longer list I mentioned for HD4000), where we test for both functionality and performance, there are also hundreds of other, older games where we run automated tests for compatibility, but the testing approach impacts frame rate such that we can't use that data to make performance claims.
Intel also has application engineers and our testing labs working with game developers to test hundreds of new games each year on Intel hardware and drivers before the games are released to ensure they are compatible and provide feedback to the game developers on how to tune them for performance.
We take any reports of game compatibility issues seriously for current hardware. Due to resource constraints and the code freeze for releases, we may not be able to fix user-reported issues immediately, but I can assure you that quite a number of issues reported to communities.intel.com have been addressed within a couple of driver releases this past year. When the issue turns out to be a game bug, we contact the game developer to see if they will issue a patch. In a few cases, we have found the issue to be an OEM BIOS bug and have referred users to the motherboard vendor's website for a BIOS update.
Message to the gaming community: Intel wants to deliver the best gaming experience we can. Please let us know of any issues you see, providing good steps on how to reproduce the issue.
nathanddrews - Thursday, January 10, 2013
You people act as though Intel must either:
1. Make a 5W chip for a $500 laptop that gets over 60fps in stereoscopic mode whilst playing BF3 @ 1080p with ultimate details and max AA/AF.
2. Not even try.
What is Intel to do? Everyone hates on them because their IGP sucks, so they improve it to the point where it can play a relatively modern and popular game at 1080p with high settings and fluid framerates. WTF are you complaining about?
Wait, nevermind, I don't really want to know.