How I Screwed Up My PC Upgrade Path
by MrCHUP0N
Here's a lesson for all you neophyte PC hardware tinkerers: when a technology is getting phased out, don't keep on investing in it.
Case in point: me, and the AGP slot.
You see, way back in the mid-'90s, graphics cards went into the same slots -- PCI -- that sound cards and internal modems plugged into. PCI slots shared bandwidth with each other; in plain English, everything plugged into them had to share the same "highway". 3D graphics acceleration for personal computers received a big boost when the Accelerated Graphics Port (AGP) was introduced onto motherboards. AGP was essentially a service ramp that gave graphics data its own little free road on which to travel.
When I started college, I wanted a card that both accelerated my 3D graphics and allowed me to pipe external video sources into my monitor, so that I could play my game consoles at my own desk if someone was watching TV out in the common room. That card ended up being the ATI All-in-Wonder 128. It was perfect: it made Unreal Tournament look great, and I could play and record bouts of Soul Blade from my desk.
As the years went on, I continued to prioritize the ability to play console games on my computer screen without sacrificing the fidelity of my PC games. ATI released the All-in-Wonder Radeon 8500, then the All-in-Wonder Radeon 9800 Pro, then the All-in-Wonder X800 XT. I hungrily bought each of these as they came down in price, keeping myself up to date with games like Doom 3.
The catch was that these cards all came out for the AGP standard, even as PCI-Express was quickly on the rise. Meanwhile, I stubbornly clung to the notion that I'd be able to record console footage at my whim. Never would I give up buying these All-in-Wonder products! Never would I give up on the AGP standard! And other such nonsense.
During all of that commotion, I upgraded from the old Athlon XP processors to AMD's new kid on the block: the dual-core Athlon 64 X2 3800+, which fit into a Socket 939 motherboard. This was sweet: I had an insanely powerful chip for a great price. The catch was that, by the time I bought it, I could only find one or two motherboards that supported both the chip AND my AGP video card. Instead of giving up right then and there, I clung fast to the antiquated standard. After all, I was saving money by not having to buy a new PCI-Express video card. Right? ...Right?
Fast-forward to today, when the latest and greatest video cards hardly ever come out for the AGP standard, and when Socket 939 motherboards with PCI-Express are scarce. This traps me in two ways. If I want to upgrade my video card to something substantial, I have to switch to PCI-Express, which means a new motherboard, and because so few of those boards fit my processor, a new processor too. Likewise, if I want to upgrade my processor, I have to either find a new motherboard that still uses the AGP standard or get rid of my current video card.
Funny thing: during all my stubborn refusal to give up the ability to record console game footage, two things happened. 1) Adaptec's GameBridge USB capture device came out. 2) I realized I wasn't recording much footage anymore anyway. Had I come to my senses while Socket 939 boards were still abundant, I would have just ditched the All-in-Wonder after it had run its course and made the shift to PCI-Express. By the time I finally gave up on the old, dying standard, though, it was too late.
The silver lining is that I'm still happy with my video card. It does play Crysis, albeit at medium settings with many of them turned down to low. I'm just mad that I don't even have the option of upgrading only the part that I really need to. I won't need new RAM or a new processor for another 18 months, but I'm pretty sure I'll be ready for a video card upgrade in six. I should be able to just pull out my current video card and drop in a new one without subjecting my machine to massive transplant surgery... but because I'm a fool, I can't.
The moral of the story is not that you shouldn't play games on your PC (because that's a flat-out fallacy and you know it), but that you shouldn't be an idiot like me and cling to a dying standard whilst messing around in an incredibly fast-moving industry.