The PC: simply less interesting than it was in 1998-2003

Tagged: PC, Computer Hardware
Source: Ars Technica

Readers periodically contact me to ask why I no longer write the kinds of in-depth CPU architecture articles I once did. In light of the current discussion about editorial content at Ars, it seemed appropriate to comment on how I see things in 2010.

In a nutshell, I can sum up my take as follows: "computing" has changed radically since 2003. CPUs have not.
The good old days

When we first started Ars, the Pentium Pro had recently launched and commodity microprocessors were undergoing a radical architectural shift from fairly primitive (in-order, dual-issue) to very complex (out-of-order, very wide issue). Clockspeed increases were driving commodity hardware into performance territory that had previously been the sole province of high-end RISC workstations. PCs were well into the very exciting middle of their CPU-driven move from the office desktop into the home, on the one hand, and into the server closet, on the other.

After a brief flirtation with hyperpipelining in the first half of the decade, Intel returned to the Pentium Pro lineage that it had abandoned after the Pentium III, and the chipmaker began focusing on incremental improvements instead of radical, Pentium 4-style overhauls. Today's Nehalem and Westmere processors are direct evolutionary descendants of the original Pentium Pro.

As for the other big CPU makers, they're in the same boat. The basic microarchitecture of AMD's x86 processors is even closer to the original K8 of 2003 than Nehalem is to the Pentium III (though AMD will be making a significant departure from this venerable design with its upcoming Bulldozer core).

The IBM POWER7's individual cores are descendants of the once-novel POWER4 architecture that I described in my articles on the PowerPC 970.

As for Itanium's progress since 2003... little needs to be said. Sun was doing some interesting work, at least on paper, but that's all over as of the Oracle acquisition.

Then there are the game consoles, which I covered in detail prior to their launch, and which haven't changed a lick since.

Processor microarchitecture is just less interesting, and it hasn't changed substantially since I was actively doing in-depth articles on the topic. It's also much less important in the multicore era. Per-core performance is nice, but the main factors affecting real-world application performance are all located outside the cores themselves. The sizes and placements of on-chip caches; the types and speeds of I/O available to a processor; the way that the system is partitioned among CPU, GPU, main memory, graphics memory, and storage—all of these factors must be tuned and balanced carefully, and any one of them is just as important as microarchitecture in delivering performance and power efficiency.

Incidentally, one can see the impact of these trends at ISSCC, where microprocessor sessions are now 90 percent devoted to power management and fabrication process issues—core microarchitecture issues like pipeline depth and block diagrams will typically get a single slide and a mention in passing. This is in stark contrast to the days when presenters would spend most of their time walking attendees through the details of pipelines and functional units, talking about branch prediction, issue queues, instruction latencies, and the like.

All of this power management stuff is, frankly, quite boring. I can't muster the interest to write much about it, and even if I could, I promise that you couldn't muster the energy to read it.
The mobile scene: still boring

You might object that the mobile processor space is red hot. Intel's Atom is a brand-new microarchitecture, and then there's the Cortex family from ARM. But Atom looks enough like the classic Pentium MMX that this three-pager I did in 2008 covers it pretty well.

As for Cortex A8 and A9, they're probably worth a lengthier treatment than I've given them so far, but to my knowledge there's nothing really new or exotic about them. They're conventional from a microarchitecture perspective, with most of the interesting stuff going on at other levels of abstraction and design.
Other stuff is more interesting than PCs and CPUs

All of this isn't to say that there's nothing new on the horizon for CPUs. I mentioned AMD's Bulldozer above, which is suitably weird and worth looking closely at. The CPU + GPU fusion efforts also hold promise—when these devices share more than just a die (like, maybe sharing a memory space), then they'll be a lot more interesting. And, as I said above, I have yet to really dive into either of the Cortex designs. Finally, I've never given programmable GPUs the kind of detailed coverage that they deserve, so there's plenty to be done there.

So why don't I take two or three weeks and sequester myself to produce a deep dive into one of the above topics? Because, as I said above, the CPU just doesn't matter as much any more, and neither does the PC.

In 1998 and for a few years afterwards, covering "the PC" was covering computing—at least, the part of computing that was on offense and not defense—and covering microprocessors and motherboards was covering the PC. PCs, motherboards, CPUs, and fixed-function GPUs were the main things going on in computers, at least from a hardcore geek perspective. But this is profoundly no longer the case.

Programmable GPUs, enterprise storage, mobile gadgets, data centers, and more have all made both commodity and business computing a much richer and more varied ecosystem. This fact makes the GPU the only class of semiconductors I have real guilt about not doing in-depth tech articles on.

For the rest, I'd rather cover mobiles, cloud architecture, display and interface technology, Internet-scale software, and similar innovations on a day-to-day basis than disappear for some number of weeks to produce another CPU tech article. This has been the case since at least 2007, and it looks like it will be the case for the near-term future.

Comments

Anonymous

for me, the pc DID NOT become less interesting. it had the opposite effect.
as a pure pc gamer, I find gaming on a pc is by far the best because of the detail and depth possible in a pc game.
game developers are losing sight of this, shipping out games that appeal too much to the flash-and-fluff crowd and forgetting that the pc audience consists of people who have vivid imaginations and enjoy games with more substance than a button masher.

if not for pcs, how would most or all console games be programmed?

for my money, give me a pc game with something where it counts (the story) and less flash and fluff.

Anonymous

I am a GAMER.

I hate cellphones/smartphones/mobiles, really I do. I only carry a super cheap cell to make calls because there are no more pay phones around.
I don't enjoy having to charge the battery, the small screens (not 17"), the way you interface with them, the horrific games, or paying through the nose to get any semi-useful feature.

Cloud computing sucks for gaming because I am one of those who like to purchase games as a physical product, whether it be on a disk or other medium. If they could figure out a way for us to have both at the same time, then and only then will I care about the cloud.

Tiv

You're pretty much right. We expect a certain number of advancements these days, as corporations take their sweet time pacing each launch. Every now and then something cool comes along, but most of the time it's either not as good as we expected or long overdue.


Administrator

I think we have just become desensitized to the sheer pace of technological progression...