Thursday, September 13, 2007

What I don't get about video game consoles

All three of the next-gen video game consoles on the market today must have cost a small fortune to develop. They all have custom CPUs and GPUs, proprietary controllers, protected disc formats, blabbety blabbety. In fact, the last I heard, the 'profit margin' on each PS3 sold was -$200. That's right, the best thing you could do for Sony's bottom line right now is buy a 360 or a Wii. That's how much this stuff is costing.

So let's say you're Sony, and you've just finished the main components for your video game console. CPU jointly developed with IBM for supercomputers? Check. Custom GPU developed by NVIDIA, capable of outputting at 1080p? Check. Brand-spanking-new high-density optical drive that also plays high-definition movies? Check. Now we just need to put RAM in the system... wait, we're only giving it 256 megs of video RAM and 256 megs of system RAM?

Microsoft is just as bad as Sony. They were a little smarter, going with a unified memory architecture that lets the GPU access more than 256 megs of RAM if it needs to (and let's face it, most of your RAM is going to be used for textures). But they still put the same measly 512 megs of RAM in their system.
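To make that concrete, here's a quick back-of-envelope sketch in Python. The 256 and 512 meg pool sizes are the consoles' headline numbers; the 300 meg texture budget and the CPU reservation are numbers I made up purely for illustration.

```python
# Hypothetical sketch: will a scene's textures fit in video memory?
# Pool sizes are the consoles' headline figures; the texture budget and
# the CPU reservation are made-up assumptions for illustration.

def fits_split(texture_mb, vram_mb=256):
    # Split pools (PS3-style): textures have to fit in the video RAM pool,
    # no matter how much system RAM is sitting idle.
    return texture_mb <= vram_mb

def fits_unified(texture_mb, total_mb=512, cpu_reserved_mb=96):
    # Unified pool (360-style): the GPU can use whatever the CPU side
    # doesn't need at the moment.
    return texture_mb <= total_mb - cpu_reserved_mb

textures_mb = 300  # hypothetical texture budget for one level
print(fits_split(textures_mb))    # False: blows past the 256 MB video pool
print(fits_unified(textures_mb))  # True: 300 <= 512 - 96
```

Same total amount of memory, but the unified pool bends where the split one breaks.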

So here's my question for the console designers out there: why do you guys spend so much on custom chips and then cheap out on RAM? For goodness' sake, it isn't like RAM is expensive. You could buy a gig of RAM retail for $30. Even assuming you didn't get any wholesale or bulk discounts on the stuff, would it really kill Sony's bottom line to lose an extra $30 on each system?
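Here's the rough math, using the retail price above and a made-up production run; real wholesale pricing would only make the number smaller.

```python
# Back-of-envelope: what would doubling the RAM cost Sony?
# $30 is the retail price quoted above; the 10 million unit run is a
# hypothetical figure chosen purely for scale.
extra_ram_per_console = 30      # dollars, ~1 gig at 2007 retail
consoles_shipped = 10_000_000   # hypothetical production run
loss_per_console = 200          # the rumored per-unit loss from above

print(f"extra RAM:    ${extra_ram_per_console * consoles_shipped:,}")
print(f"current loss: ${loss_per_console * consoles_shipped:,}")
# extra RAM:    $300,000,000
# current loss: $2,000,000,000
```

Next to what they're already eating per console, the RAM line item looks pretty small.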

You might be wondering why it matters so much to me. It's simple: all the fancy-pants hardware in the world doesn't amount to a hill of beans if the system doesn't have enough RAM to store high-res textures. Take the PS3. It easily overpowers the 360 in terms of raw horsepower. Its CPU is the same Cell processor IBM puts in its blade servers and supercomputers. The PS3's GPU runs at a faster clock speed than the 360's and has more pixel and vertex shader pipes (although, to be fair, the 360 uses its pipes more efficiently thanks to its unified shader architecture). The PS3 also uses Blu-ray discs, meaning each disc can hold 40 freakin' gigs more data than each 360 disc. And yet, when you compare cross-platform titles between the two consoles, the 360 versions tend to look a little better.

Why is this? There are theories out there, but my guess is it's all about the RAM. Sony's GPU could run at double the clock speed of Microsoft's and it wouldn't matter: if you can't store high-res textures, you can't display them. And without high-res textures, all you've got is a PS2 with more polygons and bloom effects.
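To put some rough numbers on it (the texture sizes and compression figures below are generic illustrations, not specs from any actual game):

```python
# How quickly do high-res textures eat a 256 MB pool?
# Generic assumptions: 2048x2048 textures, full mip chain adding ~33%.

def texture_mb(width, height, bytes_per_pixel, mipmapped=True):
    base_bytes = width * height * bytes_per_pixel
    if mipmapped:
        base_bytes = base_bytes * 4 // 3   # mip chain is roughly 1/3 extra
    return base_bytes / (1024 * 1024)

uncompressed = texture_mb(2048, 2048, 4)  # 32-bit RGBA: ~21.3 MB each
compressed   = texture_mb(2048, 2048, 1)  # DXT5-style, 1 byte/pixel: ~5.3 MB

pool_mb = 256
print(int(pool_mb // uncompressed))  # ~12 uncompressed textures fit
print(int(pool_mb // compressed))    # ~48 compressed ones, before geometry,
                                     # framebuffers, and render targets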

Oh, and let's not leave the video game wunderkind, the Wii, out of this. It's clocking in at an impressive 91 megs of RAM (good luck trying to figure out how it's all configured, by the way). 91 megs! Just about every PC built in the past several years has shipped with at least 128 megs of RAM. Now, I realize the Wii doesn't need as much RAM since it only outputs at 480p, but seriously, less than 100 megs is ridiculous. Don't expect any Wii games to blow you away any time soon.
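To be fair to the resolution argument, here's the rough framebuffer math (32-bit color, double-buffered, ignoring depth buffers and the embedded framebuffer memory these consoles actually have):

```python
# Rough framebuffer cost at each output resolution: 32-bit color,
# double-buffered, no depth/stencil. Real hardware complicates this,
# but the order of magnitude is what matters here.

def framebuffer_mb(width, height, buffers=2, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * buffers / (1024 * 1024)

print(round(framebuffer_mb(640, 480), 1))    # ~2.3 MB at 480p
print(round(framebuffer_mb(1920, 1080), 1))  # ~15.8 MB at 1080p
```

The lower output resolution saves a dozen megs or so of framebuffer (and lets you get away with smaller textures), but it doesn't make less than 100 megs comfortable.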

If anyone from the console industry would like to explain why you guys prefer investing in expensive custom silicon and exotic disc formats at the expense of RAM, please post here or send me an email. I'm waiting.
