I posted a blog on DirectX 10 a few weeks ago. I was mainly ranting about how Microsoft is forcing gamers to upgrade to Vista to enjoy the graphical bliss of DirectX 10. Well, I did some research and here are the findings:
Part of DirectX 10's original design allowed graphics cards to share memory with the system. This means that if a game needed more memory than the graphics card could provide, it would simply borrow system memory.
ATI had no problems implementing this feature in their DX 10 graphics cards, but Nvidia had some issues. So what was supposed to be a required feature of DX 10 became an optional one.
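For what it's worth, games don't manage this sharing themselves; on Vista the new driver model's video memory manager shuffles resources between video RAM and system RAM behind the scenes once you have a DX 10 device. Here's a rough C++ sketch (assuming the DirectX SDK headers are installed) of creating such a device; all the memory juggling happens below this level:

#include <d3d10.h>
#pragma comment(lib, "d3d10.lib")

int main()
{
    // Ask for a hardware DX 10 device. Under Vista's driver model, the
    // video memory manager can page resources between video RAM and
    // system RAM for us; the game never handles the "borrowing" itself.
    ID3D10Device* device = NULL;
    HRESULT hr = D3D10CreateDevice(
        NULL,                        // default adapter
        D3D10_DRIVER_TYPE_HARDWARE,  // real GPU, not the reference rasterizer
        NULL,                        // no software rasterizer module
        0,                           // no creation flags
        D3D10_SDK_VERSION,
        &device);

    if (FAILED(hr))
    {
        // No DX 10-class hardware or driver here (for example, plain old XP).
        return 1;
    }

    device->Release();
    return 0;
}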

This feature was the reason that DX 10 could not work on XP. In order for a graphics card to share system memory, XP would have to undergo serious changes to its display driver model. Since this feature is optional, why couldn't MS release a version of DX 10 for XP without the memory sharing feature included? Gamers are already shelling out a ton of money to play the newest games. We need new graphics cards, sometimes new motherboards, etc.
Maybe I would feel better about Vista if I didn’t feel so backed into a corner. I just want to play the games.
I'd like to close this rant with a confession: I've upgraded to Vista. I'm still dual-booting with XP, but Vista seems to be working fine. I did have to kick its butt a little by turning off some of the annoying features. I'll show you how to do that soon.
[Image: crysis-compare.jpg]
This is a shot from the soon-to-be-released game Crysis, in which they compare in-game shots to real photographs. Yep, this insanity is rendered with DX 10, for Vista only.