AMD Radeon HD 5450 Versus Oldest To Almost Decent High-End Video Cards

No replies
GraysonPeddie's picture
Joined: 10/29/2006
Posts: 570

Hi. There's something on my mind that makes me wonder how far video cards have come. Sure, the 5450 is not a video card for hardcore gamers, but does it really hurt to take a step back through time and see how well the 5450 stacks up against old video cards? I certainly wouldn't use any video card that is better than a 5450, as I don't think that would be a fair comparison. Sure, comparing the 5450, or even a Radeon HD 4350, to old video cards like the ATI Radeon 8500 or 9700 (which came before the Radeon X800) is certainly unfair too, but note that a Radeon 9700 would certainly beat an X300 and perhaps an X1300, though maybe not the Radeon HD 2400? I'm not sure. Of course, I won't get everyone started talking about ancient video cards like the ATI Rage Fury...LOL! I do like playing old games, like MechWarrior 2, and even console ports that everyone loves to hate, such as Final Fantasy VII and VIII, but gee, man, I've got to stop talking about those games and those old video cards -- that's so late 1990s. Heh heh! :)

So, I'm interested in very-high-end 2000-2005-era video cards (no dual-video-card or dual-GPU configurations) that can be compared with a Radeon HD 5450. My question is: how well does the 5450 stack up against those high-end video cards?

Please pardon me if my question isn't clear and concise enough for anyone to understand. I'm not expecting everyone to do my homework for me, and I really wish I had every high-end video card going as far back as the ATI Radeon 9700 up to the more recent Radeon HD 3850 or so (well, I'd need to look for benchmarks for the 5450 and the 3850, but then how would I compare the video cards with the same computer specs? I'm not sure...). I know it's a lot to ask and ponder, though. I really don't know how well the Radeon HD 5450 can keep the frame rates up in Doom 3 or the original Far Cry, but then I'm a single-player, console-type gamer who doesn't feel comfortable using a mouse for FPS games, so I wouldn't ever think about joining a multiplayer club. :)

Oh, here's one more question. Do you think the GeForce 8800 stacks up a bit higher than a 5450 -- something comparable to an HD 5570? I would like to play The Last Remnant, a JRPG, at 1920x1080, which is my TV's native resolution, but I could probably go with 1280x720, as I'm not really worried about blocky text (no matter how small or big my TV is -- I have a 50" LG 50PK950 HDTV), since I used to play emulated games like Super Mario RPG and Chrono Trigger on a Super Nintendo emulator. However, I'd much prefer nice-looking, detailed graphics. I know my question should probably go in the Gaming forum, but when it comes to system requirements, it really comes down to this: how does GPU X stack up against GPU Y in terms of generation? Can I play a game whose system requirements mention GPU X using GPU Y instead? Or even GPU Z, if it's built on a smaller manufacturing process (40 nm vs. 55 nm)? Am I making any sense here?

PC: Tt Core V21; Kaveri APU, 16GB RAM, GTX 960, Arch Linux
Server: Rosewill Legacy V6-S, AMD Athlon 5350 APU, 8GB RAM, 90W DC-IN PSU, Ubuntu Server