I've been a journalist and reviewer in the 3D graphics industry for over a decade. I can still remember walking through Fry's Electronics, seeing Western Digital's Paradise Tasmania 3D, and actually getting excited about its Yamaha-powered graphics chip. Chris Angelini, the managing editor of Tom's Hardware US, and I go way back; our first jobs in online journalism date back to 3DGaming.com more than a decade ago.
Having been there from the beginning, I've seen the rise and fall of countless graphics manufacturers: S3, 3DLabs, Rendition, 3dfx, as well as board manufacturers like Orchid, STB, Hercules, the original Diamond, and Canopus. But as wild and crazy as the last decade was for visual computing, the next decade is going to be even more exciting, not only in what technology will offer to consumers, but in the upcoming arms race in visual computing.
A lot has been said about the impending death of the dedicated GPU. History shows that dedicated upgrade products for consumer PC technology all eventually reach the point of diminishing returns and are then integrated. However, while it is inevitable that the dedicated GPU will eventually disappear, that's not going to happen in the next decade.
Integration of computer technology only happens after a component's evolution reaches the point of diminishing returns on quality and performance. We can see evidence of this with sound cards, video processing, and even monitors.
What follows is a discussion of the future of 3D graphics. Is the GPU on its deathbed? Will AMD, Intel, and Nvidia continue to be relevant? This is purely an opinion piece, but it is based on more than a decade of experience.
Copyright 2016 © Godem Online Inc.