Did Intel Admit that GPUs are Faster than CPUs?

Tagged: intel, nvidia, Computer Hardware, Gaming
Source: techgage
Posted: 6 years 17 weeks ago

For most people, there is no such thing as a CPU vs. GPU battle, but for companies like Intel and NVIDIA, there is such a thing, and it's heated. Naturally, gaming wouldn't be stellar if the only acceleration came from the CPU, and likewise, we wouldn't get too far with our OS running off of a GPU, but in between, there exists a battleground where neither side is prepped to come out victorious.

At an event held in France called the International Symposium on Computer Architecture (don't worry, this is the first I've heard of it also), Intel made a rather interesting statement that could be taken one of a few ways. The gist is that the company debunked claims of GPUs being 100x faster than CPUs by stating that they're only 14x faster.

With a statement as interesting as this, not to mention one that seems to give a hearty nod to GPGPU, it's little wonder that NVIDIA wasted little time in broadcasting Intel's statement online. The result was a blog post by the company's General Manager of GPU Computing, Andy Keane. In it, there is much poking at Intel, but the company also backs up the "100x" claim with the help of figures from various organizations.

Based on those figures, 100x is a modest number: the Massachusetts General Hospital found a 300x increase in performance when running a Monte Carlo simulation for photon migration on a GPU versus a CPU. The exact product models used to reach that conclusion are not listed, however.

The highlight of the story, though, isn't the fact that Intel downgraded NVIDIA's claims, but rather that it admitted that GPUs are capable of delivering 14x the performance of a CPU. How often does something like that happen?

Long-time readers of our site have of course known that this has long been the case, but nothing has changed in recent memory in the CPU vs. GPU battle. For most tasks, the CPU excels, something even NVIDIA would have a difficult time discrediting. In other cases, though, such as applications and scenarios that can make good use of a heavily parallel processor and can get by with little help from the CPU, performance gains, sometimes huge ones, are going to be seen.
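
To make that concrete, here's a minimal CUDA sketch of the sort of "heavily parallel" workload those big speedup claims come from: a SAXPY loop where every element is independent. This is an illustration only; none of the figures above were measured with it.

    // SAXPY: y[i] = a * x[i] + y[i]. Each element is independent, so the
    // GPU can spread the work across thousands of threads at once.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void saxpy(int n, float a, const float *x, float *y)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)                      // one thread per element, no coordination
            y[i] = a * x[i] + y[i];
    }

    int main()
    {
        const int n = 1 << 20;
        float *x, *y;
        cudaMallocManaged(&x, n * sizeof(float));  // unified memory keeps the demo short
        cudaMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
        cudaDeviceSynchronize();

        printf("y[0] = %.1f\n", y[0]);             // expect 4.0
        cudaFree(x);
        cudaFree(y);
        return 0;
    }

On a single CPU core, that same loop walks the array one element at a time, which is exactly where the lopsided comparisons come from.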

I'm personally still waiting for more proof on the GPGPU front, because up to now, we've seen multimedia encoding and not much else. There has been the odd password cracker and medical application, but for most end-users, the reason to move over to a GPGPU mindset has yet to come. I can't wait until we reach the day when we can run something as robust as games on our GPUs! Oh wait...

As NVIDIA's post puts it, the real myth here is that multi-core CPUs are easy for any developer to use and see performance improvements from. Undergraduate students learning parallel programming at M.I.T. disputed this when they looked at the performance increase they could get from different processor types and compared it with the amount of time they needed to spend rewriting their code.
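
For a toy illustration of that rewriting cost (my own sketch, not anything from the M.I.T. coursework): a running sum looks trivially loop-shaped, but each iteration depends on the one before it, so it can't simply be split across cores or GPU threads as written.

    // A loop-carried dependency: out[i] needs out[i-1], so the iterations
    // cannot be handed to separate threads as-is.
    #include <cstdio>

    int main()
    {
        const int n = 8;
        float in[n]  = {1, 2, 3, 4, 5, 6, 7, 8};
        float out[n];

        out[0] = in[0];
        for (int i = 1; i < n; ++i)
            out[i] = out[i - 1] + in[i];   // running sum: serial as written

        // Running this in parallel means rewriting it as a prefix-sum
        // ("scan") algorithm -- a different structure entirely, which is
        // the kind of effort being weighed against the speedup.
        printf("out[%d] = %.0f\n", n - 1, out[n - 1]);   // expect 36
        return 0;
    }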



Joined: 04/05/2010
Posts: 236

Um, isn't GPGPU somewhat the same as what AMD is trying to do with Fusion? They're trying to make a CPU combined with a GPU to let them work very closely, and later they will make it a homogeneous unit (the APU).

So the GPU does the things the CPU is bad at, and the CPU does the things the GPU is bad at.

D3PyroGS
Joined: 06/08/2010
Posts: 2

The problem is that general-purpose programs written for the GPU must be highly parallelizable. This is because graphics cards are, obviously, made first to do operations specifically for 2D and 3D applications, and only then for more general purposes. That's a good thing: the more the hardware is built for generic programming, the less catered it will be to gaming (the thing we all use these cards for the most). Something we talked about in our game development class was the idea of things starting out specific, slowly becoming more generic at the cost of performance in the thing they were originally made for, and then being replaced by some new thing created, again, specifically for that purpose. And the cycle continues.
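
To put "very parallelizable" in 2D terms (a toy sketch of my own, not anything from the class): brightening an image gives every pixel its own independent thread, which maps naturally onto hardware built for graphics work.

    // One CUDA thread per pixel: an access pattern straight out of the
    // hardware's 2D graphics heritage.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void brighten(unsigned char *img, int width, int height, int delta)
    {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x < width && y < height) {
            int i = y * width + x;              // row-major pixel index
            int v = img[i] + delta;
            img[i] = v > 255 ? 255 : v;         // clamp to the 8-bit range
        }
    }

    int main()
    {
        const int width = 64, height = 64;
        unsigned char *img;
        cudaMallocManaged(&img, width * height);
        for (int i = 0; i < width * height; ++i)
            img[i] = 100;                       // a flat mid-gray "image"

        dim3 block(16, 16);
        dim3 grid((width + 15) / 16, (height + 15) / 16);
        brighten<<<grid, block>>>(img, width, height, 40);
        cudaDeviceSynchronize();

        printf("pixel 0: %d\n", img[0]);        // expect 140
        cudaFree(img);
        return 0;
    }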

hollowtek
Joined: 06/24/2010
Posts: 25

That is interesting. Definitely something I'm going to keep up with.