Video Game Talk (Part 2)
06-21-2013, 04:51 PM
Join Date: Mar 2011
Originally Posted by
The jump from DirectX 7 (PS2 era) to DirectX 9 (PS3/360) was enormous. It was an astronomical gain that completely changed paradigms around lighting and shader models.
The jump from DirectX 9 to DirectX 11 is very incremental.
Skyrim from Xbox360 to PC would be a great example of the gains, but it wasn't exactly at the highest end of what was achievable on the Xbox 360.
Next-Gen is equivalent to a low/mid gaming PC.
The jump from 480p to 720p was MASSIVE
The jump from 720p to 1080p is hardly noticeable unless they're compared side by side on a 50" TV.
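For what it's worth, the raw pixel-count arithmetic behind those two jumps is closer than the quote suggests; the perceived difference has more to do with pixel density and viewing distance than the ratios themselves. A quick sanity check (a minimal sketch; "480p" is assumed here to be 640x480, since widescreen 854x480 would shrink the first ratio to about 2.25x):

```python
# Pixel counts for each common resolution step.
# Assumption: 480p is taken as 4:3 (640x480); exact 480p dimensions vary.
resolutions = {
    "480p": (640, 480),
    "720p": (1280, 720),
    "1080p": (1920, 1080),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

# Ratio of total pixels between successive steps.
jump_480_to_720 = pixels["720p"] / pixels["480p"]    # 3.0x the pixels
jump_720_to_1080 = pixels["1080p"] / pixels["720p"]  # 2.25x the pixels

print(f"480p -> 720p:  {jump_480_to_720:.2f}x the pixels")
print(f"720p -> 1080p: {jump_720_to_1080:.2f}x the pixels")
```

So both steps roughly double or triple the pixel count; whether that registers on screen depends on how far away you're sitting.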
Same goes for how a lot of the art is made.
The jump from faked global illumination to real-time global illumination probably won't even be noticed, but will have cost millions to implement. Slightly crisper shadows, and maybe a few extra post-processes.
I would LOVE to be proven wrong though.
I'm certainly in no position to argue with you, as my only experience with graphics in games comes from playing them. I do appreciate all the insight, as I find it very interesting.
I was just skeptical of your claim that 720p and 1080p are tough to distinguish, as often when I boot up a game for the first time, my computer defaults it to 720p (albeit with other settings lowered as well) and I just go "wtf, this looks terrible". Then after switching to 1080p and pulling the other settings back up, it looks fantastic. I suppose the differences are much easier for me to notice with my face a foot from my screen than they would be for someone playing on a console from 10 feet away.
Just out of curiosity (this may be an awkward question, but I'm not sure how to word it), what is it about having a better GPU that makes the biggest difference in how a game looks? I'm just thinking specifically of how games look way better on a high-end desktop rig than even on my laptop, let alone a console.
Find More Posts by Neatman