In what ways is console gaming holding PC gaming back?


I recently tweeted - to some degree in frustration after reading the same tired complaint yet again - "For all those who insist console gaming is holding PC gaming back, I'd like to know what that might be other than slightly nicer graphics." In other words, we continue to hear talk that this almost-six-year-old console generation is responsible for holding back what the state of the art in PC gaming can be. But really, keeping in mind that both the Xbox 360 and PS3 are capable of 1080p and full surround sound, and have default controllers with lots of buttons, how exactly are consoles holding PC game designs back? Sure, PCs have more memory, storage, and polygon potential, as well as more buttons thanks to their default keyboards, but really, what game designs would we be getting exactly if consoles didn't exist? Flashier versions of current games don't count.

What games would PC developers be giving us if they weren't "held back" by consoles? How much more power is really needed given the designs currently being unleashed? I can't think of one game released where I thought, "boy, more processing power/memory/storage would really make this game so much better". If a dev said, "I have this really radical idea, but I can't do it because consoles are holding me back," THEN I'd listen and maybe even agree. Wanting more polygons is not a design issue.

On Facebook - where my tweets also automatically go - we're having an interesting discussion about some of the possibilities, but I don't buy what's being said. For instance, even though Civilization V was designed expressly for the PC, a commenter thought that its interface design was held back by the influence of consoles on the designers' thought process. In other words, Civilization IV, which was apparently designed at a time when console ports (or console originals) were a less pervasive presence, was not influenced by the idea that interfaces should be simplified and/or get out of the way as much as possible, and as a result featured a more sophisticated and better interface than Civilization V. To me, any perception that Civilization V's interface was somehow dumbed down is incorrect. If there's any issue with the interface, it's just bad design, period, and has nothing to do with whether consoles exist in the world or not.

I also don't think any of the Civilization games are a good example for anything, simply because the original Civilization was perfected right out of the box. Sure, the rules became more refined and sophisticated, as did the artificial intelligence and options, but all the essentials were in place way back in 1991 (and that engine could arguably accommodate most of the new rules and additions), so technological limitations have little to do with anything in the case of the Civilization series.

So, what are your thoughts on this multi-layered, hot button issue?

Comments

Bill Loguidice
Countering
Xan wrote:

Point 2 can be refuted thus: if there were no consoles, then developers would enhance games for the PC as they did back in the days of 3DFX graphics.

Well, first of all, it's not really consoles you have to worry about so much as smartphones and tablets. The extraordinary resolution of the new iPad aside, developer energy is going to be diverted more and more to those even more limited platforms, so you'll have both consoles AND mobile to divert interest away from the PC. With that said, the PC is in no way hurting for games from what I can see, with a particularly vibrant indie community and more interest in development (even if many times they're just ports) than there has been in YEARS. Part of that is also attributable to what you consider a downside - that there are enough PCs of sufficient minimum spec out there to make it worthwhile.

Also, it's unfair to compare the 3DFX days - which came just after the initial boom of dedicated PC 3D cards, at a time when developers were making the complete switch from classic 2D sprite-based work to full polygons (a switch, I'd like to point out, driven to a large degree by the original PlayStation, a console) - to today's far more mature market. Audio-visuals have "stabilized", with no known paradigm shift in sight like the ones we had from text to simple sprites, then to mega colors and resolutions, then to polygons of ever-increasing quality.

Look, clearly, most of us (me included) DO want better technology, but I think it's a hollow argument that it will get us noticeably better-looking games, let alone better-playing games. Whatever happens, it's still all incremental. We're a long way off from the next "wow" moment and will perhaps need glasses-free 3D and/or hologram technology before there's another paradigm shift equivalent to past ones.

Xan (not verified)
It will get us a lot better

It will get us a lot better-looking games. Have you seen those photo-realistic renders game developers send out as previews for games? With a modern graphics card you could play with that level of fidelity if you optimized a game for it. Never on the console, but if a developer made a game exclusively for, for example, the GTX 600 series graphics cards, we'd have an amazing photo-realistic game. However, there is no profit in that, especially when you bring consoles into the mix. Take consoles away, have only one graphics card, and I'll wager technology would improve tenfold.

Bill Loguidice
More comments
Xan wrote:

It will get us a lot better-looking games. Have you seen those photo-realistic renders game developers send out as previews for games? With a modern graphics card you could play with that level of fidelity if you optimized a game for it. Never on the console, but if a developer made a game exclusively for, for example, the GTX 600 series graphics cards, we'd have an amazing photo-realistic game. However, there is no profit in that, especially when you bring consoles into the mix. Take consoles away, have only one graphics card, and I'll wager technology would improve tenfold.

Again, I LOVE technology and always want MORE and BETTER, but I don't feel like I'm missing out on anything at the present time, and I think others feel the same way. I'm actually kind of enjoying something of a breather from a moving technological target. Everything looks good now. Sure, we CAN and WILL do better, but I don't see a particular need to "rush." Also, target renders and source renders are one thing, but it's a whole other issue trying to put it into an actual game when you have to worry about performance issues related to frame rate, gameplay, AI, etc. (i.e., the other technology in the computer).

It's too much to expect to take away consoles, take away mobile, and just have a single graphics card on the PC. The first two won't happen, of course, and as for the last, that's not how the PC has worked since the '80s; in fact, standardizing on anything like a single video card would make the PC nothing more than a more finicky console, defeating your whole goal.

clok1966
yes and no
Xan wrote:

Just use logic. If you want to make the latest game and need the latest technology to do so, why would you make it for six-year-old technology?

This is very true... but look at the tech today versus a game released today: it was designed on tech from a year or two ago. You can guess at current tech, but you will never have it while making the game. Unfortunately, that's not how it's looked at... it comes down to SALES and development costs.

Sales = consoles... there are almost no games that sell better on PC than on console, so from a PURE money standpoint you develop for console and port to PC. And even if you're looking at the PC side, go look at Steam and its machine specs: the AVERAGE machine is a long way from the top-end one, so if you develop only for the greatest hardware, you limit your sales big time.

Where things even out is graphics... we're reaching a ceiling here. There's always room for improvement, but the major leaps are gone for a while; we're seeing tiny changes now, and consoles and PCs can hang together. The consoles only need HD, so they can keep up with a PC running twice the resolution. Where consoles will possibly fall behind is the pure brute horsepower required for physics and such, but they have a small advantage there too: because they're custom hardware, they can sometimes offload some of that, where a PC has to brute-force it - which the high-end stuff can do, but the average PC might not be able to.

I love my PC and will stand by it 100%, but consoles are great at what they do.

Matt Barton
I guess the question here is

I guess the question here is to what extent having a known, standardized platform trumps better hardware. The big thing is coding, specifically, the kind of close-to-the-machine stuff where a tiny bit of optimization can make a huge difference.

I'm not convinced that 6 years is enough time for full optimization to take place. If they had another 6 years to work with, I'm sure you'd see a steady improvement in quality as coders found ways to squeeze more power from the engine. I'd be surprised if current games for the 360 and PS3 are fully tapping the power of those platforms; it's probably more like 60-70% efficiency.

It's helpful to look at any popular system and compare the early games with the mid-life games and finally the after-market stuff.
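
To make the close-to-the-machine point concrete, here is a rough, hypothetical sketch in C, using x86 SSE intrinsics purely as a stand-in for whatever fixed SIMD unit a given console actually exposes. On hardware whose vector width, alignment rules, and cache behavior never change, a hand-tuned loop like the second function can be written once and relied on for the life of the platform, which is much harder to justify across thousands of different PC configurations:

#include <stddef.h>
#include <xmmintrin.h>  /* SSE intrinsics; stands in here for a console's fixed SIMD unit */

/* Portable scalar version: works everywhere, but leaves the vector units idle. */
void scale_add_scalar(float *dst, const float *src, float k, size_t n)
{
    for (size_t i = 0; i < n; i++)
        dst[i] += k * src[i];
}

/* Hand-vectorized version: processes 4 floats per iteration. On a fixed
 * platform the SIMD width is a known constant, so this tuning is done once;
 * on PC you would need runtime dispatch for many different CPU variants. */
void scale_add_vec(float *dst, const float *src, float k, size_t n)
{
    __m128 vk = _mm_set1_ps(k);
    size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 d = _mm_loadu_ps(dst + i);
        __m128 s = _mm_loadu_ps(src + i);
        _mm_storeu_ps(dst + i, _mm_add_ps(d, _mm_mul_ps(vk, s)));
    }
    for (; i < n; i++)  /* handle the leftover elements */
        dst[i] += k * src[i];
}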

Xan (not verified)
I'm sorry but that's just

I'm sorry, but that's just simply not true. The consoles at this moment are already operating at 120% efficiency, if anything. They have to use tricks and abuse draw distance and field of view just to get some performance. There is no way to get more power from the hardware; it's not as if there are secret areas that have yet to be unlocked. It is being used fully.

Anonymous (not verified)
Exactly, that's why one of

Exactly, that's why one of the most impressive games to this day is the first Crysis, which COULD NOT run on the consoles. After they used CryEngine 3 for Crysis 2, there were tricks to reduce and play with resolution in order for the game to play smoothly on the consoles (and don't forget low AA, filtering, and shaders). Why do people think Crysis 2 had to release an updated pack with better textures? Because Crysis 2 paled in comparison to Crysis 1, and the backlash they received from PC gamers was enormous. Sure, it "looked" shinier and newer, but they (and the developers even ADMITTED THIS) had to "dumb down" the game so it would effectively be playable on the consoles. Crysis 3, which is in production, will have a full split: there will be a console version, then a far superior PC version of the game. PC preorders are through the roof on that game because they will optimize it for the PC's superior quality. Even Bethesda is looking into doing the same with the next Elder Scrolls game, quoting "PC gamers have been ignored long enough. It's time we give them what they deserve. Better textures, smoother models, controls made specifically to handle mouse & keyboard, and an all around amazing experience."

Metro 2033 looked terrible on Xbox and PS3: jagged edges, motion blur cranked to the max to keep up the FPS, flatter textures. On PC, well, it was very impressive. The sound sucked the same on PC, Xbox, and PS3, though - the guns were very unrealistic sounding.

I was reading and saw something about iPad and iPhone games, or portable devices in general. Is it me, or have I NEVER been excited to run home and fire up that new game I just got on my iPad/iPhone? "Oh boy! Can't wait to play that new Angry Birds series!" Those games are good for when you're on the toilet or your 5-year-old nephew is being a pain in the A. Annnnnnd... the controls suck on touchscreen devices. I made the mistake of downloading Final Fantasy and Sonic to my iPad. Horrible.

Bill Loguidice
More
Anonymous wrote:

I was reading and saw something about iPad and iPhone games, or portable devices in general. Is it me, or have I NEVER been excited to run home and fire up that new game I just got on my iPad/iPhone? "Oh boy! Can't wait to play that new Angry Birds series!" Those games are good for when you're on the toilet or your 5-year-old nephew is being a pain in the A. Annnnnnd... the controls suck on touchscreen devices. I made the mistake of downloading Final Fantasy and Sonic to my iPad. Horrible.

Again, it's an oversimplification and a generalization that all games on mobile are Angry Birds-like and that the controls suck. The two examples you gave for controls sucking were games optimized for something other than a touchscreen - they were traditional controller conversions. As with anything, if a game is optimized for a platform's particular strength - be it motion tracking, touchscreen, or whatever - it will be superior to traditional controls. The reverse is also true.

Clearly you're rather PC-centric, an evangelist if you will. Nothing wrong with that, but personally, I prefer a broader, more interesting spectrum of gaming interests where I'm indeed exposed to constant innovation beyond just pretty pictures...

Anonymous (not verified)
For me it's all about the

For me it's all about the whole experience and everything to the max. There are great games out there (Gears of War, and I guess Modern Warfare?) with lacking graphics and just gameplay in mind, and I accept that, BUT if consoles could do better graphics, don't you think the developers would be making games that are better looking? Graphics haven't improved simply because they can't. Like someone said before me, consoles are already running at 120% and have to reduce AA, texture sizes, and resolution just to get to an acceptable 30fps (the minimum acceptable FPS for gaming). And don't forget they add a huge amount of motion blur to console games, as the GPUs simply can't process the data at an acceptable FPS.

Have you seen what the new Batman game looks like on PC vs. console? Remarkable differences. The consoles have fewer shadows integrated and less particle matter in the physical environment; fire looks more plastic than anything. On PC everything is spot on, and with PhysX added to the game, it looks incredible.

Maybe I'm just too much of an enthusiast, to the point where everything right down to the last detail has to be spot on - or at least give me the option to maximize my PC gaming experience with advanced settings. Don't hold my experience back because others can't do what I can.

Bill Loguidice
Discussion points
Anonymous wrote:

For me it's all about the whole experience and everything to the max. There are great games out there (Gears of War, and I guess Modern Warfare?) with lacking graphics and just gameplay in mind, and I accept that, BUT if consoles could do better graphics, don't you think the developers would be making games that are better looking? Graphics haven't improved simply because they can't. Like someone said before me, consoles are already running at 120% and have to reduce AA, texture sizes, and resolution just to get to an acceptable 30fps (the minimum acceptable FPS for gaming). And don't forget they add a huge amount of motion blur to console games, as the GPUs simply can't process the data at an acceptable FPS.

Have you seen what the new Batman game looks like on PC vs. console? Remarkable differences. The consoles have fewer shadows integrated and less particle matter in the physical environment; fire looks more plastic than anything. On PC everything is spot on, and with PhysX added to the game, it looks incredible.

Maybe I'm just too much of an enthusiast, to the point where everything right down to the last detail has to be spot on - or at least give me the option to maximize my PC gaming experience with advanced settings. Don't hold my experience back because others can't do what I can.

You have to understand the fundamental difference in user experience between playing a game on a console via a TV - even an HDTV - and on a PC with a monitor. Typically, TVs are viewed from a good distance back on a very, very large screen, whereas PC monitors are viewed up close on, at MAX, a 27" - 32" screen. Because of that viewing distance and other factors, you don't actually need the same sharpness/visual fidelity on the console side that you need on the PC side.
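
As a rough back-of-the-envelope illustration of that viewing-distance point, here is a small C sketch (the screen sizes and distances are just assumed, typical numbers) that computes pixels per degree of visual angle for a 1080p image. A 50" TV viewed from 8 feet packs roughly twice as many pixels into each degree of your view as a 27" monitor viewed from 2 feet, which is why the same 1080p output needs less raw fidelity to look sharp from the couch:

#include <math.h>
#include <stdio.h>

/* Pixels per degree of visual angle for a 16:9 display.
 * Sizes and viewing distances below are illustrative assumptions. */
static double px_per_degree(double diag_in, double dist_in, int h_pixels)
{
    double width_in = diag_in * 16.0 / sqrt(16.0 * 16.0 + 9.0 * 9.0);  /* panel width */
    double fov_deg  = 2.0 * atan((width_in / 2.0) / dist_in) * (180.0 / 3.14159265358979);
    return h_pixels / fov_deg;
}

int main(void)
{
    printf("50\" 1080p TV at 8 ft:      ~%.0f px/deg\n", px_per_degree(50.0, 96.0, 1920));  /* ~75 */
    printf("27\" 1080p monitor at 2 ft: ~%.0f px/deg\n", px_per_degree(27.0, 24.0, 1920));  /* ~37 */
    return 0;
}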

I'm sorry your personal PC experience feels watered down, but the market is not going to change, and frankly I don't really want it to. Again, I like the idea of developers getting to know a platform intimately and being able to focus on elements other than visuals for a period of time. It's also a far more inclusive approach, meaning MORE PC gamers can play games on their PCs, rather than just the few who have the resources to update their computers on an annual basis. Of course, we always have to be careful of going too far toward the lowest common denominator (which is a given, I think) - for instance, with "Minecraft" just released on the Xbox 360, a vocal minority is bellyaching over the inability to play splitscreen multiplayer on an SD TV. My answer: get a frickin' HDTV finally - they're cheap enough - but heck, if that's what they have to play it on, who am I to say how it should be?

