Website CVG is reporting that Crytek boss Cevat Yerli has claimed that developers' focus on the PS3 and 360 is holding back game quality on the PC, a format he believes is already "a generation ahead" of modern-day consoles. I say, "Too bad, Yerli, it's good for us gamers!" I've been pining for a reasonably stable PC spec to stop the technological arms race since the days of the 486, but it's never happened. Ever since the PC took a back seat to consoles in software sales, it's been console hardware that's dictated what kind of big-budget software appears on PCs (outside of a few high-profile exceptions from companies like Blizzard). Why do I consider such a scenario a win? Simply because we NEED periods of 5+ years of stability for software developers to catch up to the hardware and start to butt up against the limits of what's possible. If the hardware remains a moving target, there's less chance for the everyday coding challenges to be minimized, and it's minimizing those challenges that opens up opportunities for innovation: more focus can go to design rather than to wrestling with the technology. With budgets already in the millions of dollars and team sizes in the hundreds, access to more power is obviously not the answer to the call for better games. Despite what some would like us to believe, there is no noticeable (i.e., real-world, not benchmark) technological divide between high-end PCs and a PS3 or Xbox 360 outputting 1080p. Modest platforms like the iPhone and Nintendo DS have long since proven that it's not necessarily power that succeeds, it's clever design. With that said, no matter what side of the debate you're on, it's hard to argue with how pleasant an idea it is that the hardware we own right now should be able to play the latest games for at least a few more years before requiring an upgrade, right?
One thing about modern consoles and computers is that they are beneficiaries of each other's advances. The PC platform basically refines the hardware technologies that filter down to the consoles (audio/visual advances, processor technologies, online connectivity, etc.). The consoles then provide a stable platform for game developers, whose popular games often filter back to the PC platform (GTA, Call of Duty, etc.).
The PC provides many "alternative" gaming experiences, such as adventure games, deep RPGs, freeware games, MMOs, and so forth. The whole "motion-sensing" genre of gaming just hasn't hit the PC mainstream yet, though, and remains pretty much console-exclusive. There's definitely a great deal of crossover between the two worlds, where great games can be had on both platforms.
Top-of-the-line PCs may indeed have much more power than a current console, but maybe that's the problem: the average home PC isn't that much more powerful, and often doesn't even equal the consoles' capabilities. I'm talking about the onboard "generic" graphics and sound chips installed in typical budget PCs. And not many people are going to plunk down the money for a decked-out "gaming" PC when, in two weeks' time, their system will be obsolete.
As for the "Crytek" comment, it appears Mr. Yerli is commenting on the "tech." But how much tech do you really need for a great game? Many might argue that the consoles are refining and perfecting the gameplay experience for the masses, not hindering the gameplay experience!
I'm a PC gamer, but it's hard not to notice that there are great game advances on the console front. Heck, we have people jumping around in front of their TV sets now to play a game (with no handheld controller)! How is that not an advance? How is that holding the PC back???
Right. The question becomes just what would a developer do targeting the highest of high end PCs in lieu of Xbox 360 and PS3? Yerli seems to imply more polygons or something, which we know is not the answer to better games.
While I certainly stand behind the fact that you don't need cutting edge hardware to make a good game, I STRONGLY disagree that we need years of stability in order for programmers to "catch up." That is what I call "going stale."
Programmers of years past pushed the limits of their hardware to get the most out of it, but that's just not a good model for success these days. I fully acknowledge the ability of programmers to push the limits of old hardware and make something better than the next guy (or gal), but they HAD to do it, because the hardware was just ...primitive... from the moment of its release. How often do memory, storage, or... integer size factor in these days? Not much. It is simply a matter of "making a good game."
You can certainly compare games from the start of a platform's life to those that come out at the end of it, but that isn't true progress - it is just squeezing more out of aging hardware. Would you have liked to do that at the start of the console's life? Most definitely. But I do not see this as a function of the programmers catching up, but rather of the console hardware makers creating an easier dev kit for the programmers to use and tweak.
I would like to see developers continue to make great games AND push the hardware. These things do not have to be mutually exclusive. We don't have to limit the hardware in order to make good games. Likewise, we don't have to keep inventing new hardware and ASSUME that better games are going to come from it.
In the business world, developers are always looking for the next thing to come out to make their lives easier and cut costs. Suddenly that function you had to write yourself is available as part of the dev kit, can be tweaked to your needs, and does the job in half the time/memory/etc.
None of this should have anything to do with how a game is fundamentally designed, but new hardware is going to make developers scratch their chins a bit - not because they don't know what to do, but because they want to take advantage of the new hardware, since it ultimately comes down to selling games. People buy new computers and new consoles all the time, and they want the software to take advantage of the hardware. This doesn't only excite the consumer - it excites the programmers as well.
The fact that a company would hold back on PC development is pretty sad - there should be developers out there pushing the limits of PC gaming at all times. Besides, it isn't like those who code a game are also the ones who design it. Those days are long gone.
Being a pure PC gamer (I once owned a PS2 for three weeks before selling it), I must say that the worst thing about consoles "holding back" PC games isn't the advances in technology, but the fact that so many games are designed for both types of system. The design of a console game is much different from that of a PC game, especially when it comes to the user interface.
Take Oblivion or Fallout 3 and compare them to older, PC-only RPGs. The UI in these modern games is terrible compared to the old ones. I understand that you need large, easily readable fonts and controller-based input when playing on a console, but the design philosophy for the PC is so much different, and yet we get a copy-paste from the lowest common denominator.
Dragon Age: Origins did try to fix this a bit by having different UIs depending on the platform, but it still had a big console feel to it.
Now, some may say that this is a necessary evil in modern gaming, the cost of producing games means you have to seek out every possible market segment, hence creating cross platform games. Maybe this is true, I don't know. But it sure does nothing to enhance the appeal for playing on a PC.
/nostalgic rant over
To me, it's simple numbers. There are about 90 million Xbox 360s and PS3s in the world. How many millions of PCs are out there with even the equivalent power of an Xbox 360 or PS3, let alone MORE power? So naturally the consoles will drive most games' technology, not the highest-end PCs. Again, though, I'm trying to fathom what the highest of high-end PC users would be missing out on right now versus if their specs were the source platform. That's the part I'd love to know. All three platforms support HD visuals, surround sound, online multiplayer, etc., so the only things I can think the Crytek guy would be complaining about would be going beyond HD resolutions (and do we REALLY have to go beyond the 1920x1080 that the consoles offer at this stage?) or maybe more support for things like physics co-processors, which need to be scalable on the PC anyway (as in Mafia II) because not everyone will have one. I could see it if this were Wii versus PC, but we're talking PS3/Xbox 360 versus a high-end PC, and there just isn't enough of a technological difference that my non-developer brain can figure out to justify any pining for developers to target the highest possible PC spec over consoles.
Issues of interface are a different matter, of course. Naturally, interfaces should be optimized for either consoles or PCs, and I don't get why PC users so often get the interface shaft. While console interfaces are often very usable, you can do quite a bit more when you don't have to worry about distance from the screen and you're guaranteed a mouse. So in that area I agree 100%, and certainly of all the things to change in a game, menus should be among the easiest and cheapest to customize per platform.
It's an interesting dilemma for sure. As Bill says, the numbers pretty much guarantee that all major games will be designed for consoles, with the PC getting hasty ports or the occasional game such as Starcraft II that just wouldn't make sense on a console. On the other hand, I don't understand why folks like Crytek can't make games for both that simply look much better on the PC. Why not just scale down all the graphics and such for consoles? Maybe then 360 or PS3 owners might notice the stark contrast and decide to improve their game.
I would love nothing more than for the console market to dwindle and die, but I'm afraid it's not going to happen anytime soon. There are just too many people who can't wrap their heads around the PC and simply lack the intelligence to ever upgrade. That's pretty much always been the case; only a fraction of the population has the brainpower to be a PC gamer (and the market adjusts to that accordingly, providing more complex games like Civilization V for the PC than for consoles). In any case, I don't know why any PC gamer should be fascinated by console gaming except for a few high-quality exclusives such as Nintendo's franchises and fun novelties such as Kinect. Consoles are best suited for kids and teens; after that, you either have the education to move up to PC gaming or languish in arrested development (pun intended).
In any case, I agree with Bill that long generations are the best for innovation at the software level. We all know how much better C-64, NES, and SNES games looked as those generations ended than when they first began. As good as PCs might be as a platform, their open-ended nature is a blessing and a curse. I was hoping that the Windows performance index (or whatever they call it) would ameliorate this situation, but it doesn't seem to have done much. The biggest culprits, in my opinion, are Nvidia and ATI, who seem to be producing new GPUs a *lot* faster than they are developing solid drivers for them.
What I want to know is why hardware is moving so much faster than software.
"In any case, I agree with Bill that long generations are the best for innovation at the software level. We all know how much better C-64, NES, and SNES games looked as those generations ended than when they first began."
But this is not the era of C-64, NES, and SNES. Things are much different now.
"What I want to know is why hardware is moving so much faster than software."
It takes longer to develop software than hardware, and both are driven by money. Hardware developers aren't going to be held back by the software developers. The two are almost completely independent, despite the fact that we would like them to work together quite a bit more.
I don't agree with that, Matt. I don't think there's a way to make games look much better on the PC than on consoles, at least in a real-world way that someone might actually notice. Honestly, there are so many configurations that at best you might get a marginal visual improvement (and, again, some additional physics boost for those with compatible hardware). To me, the smart gamers are on the console side, where they can enjoy 1920x1080 visuals on a giant screen with proper surround-sound speakers and far greater stability. There are so many limitations on the PC side that it seems like it's pretty much for masochists, for enthusiasts of certain game types (Starcraft II, Civ V), or for those who really like warez.
I agree with him 100%. When making PC games in the past, many times they didn't run well on "current" hardware during the development cycle, but by release they worked on the current stuff. If you're an old-school gamer (hehe, I think most here are), back in the day 30 FPS was considered quite good in most games. The human eye sees about 24 FPS... of course that's a very general statement. Most films are 24-25, but here's the trick: they use motion blur to look smooth. Old games (Quake) didn't use motion blur and needed speed to make up for it... but again, most people felt anything over 30 FPS was "good". My, how it's changed. Nowadays reviews will rant and make a game sound like utter crap if it isn't a consistent 60 FPS. Since when did gameplay consist of FPS? I thought gameplay was about presentation (yes, FPS is part of that), control, style, fun...
With consoles, you read all the time about a game not being ready for release because it can't do 60 FPS; it's not about how cool or new the ideas are, it's about the FPS the game can achieve. Game reviews will DESTROY a game if it isn't doing 60 FPS; they'll mention it over and over and beat it into the reader's head. It's even worse now that we have two consoles fighting for top dog (while sadly both are behind the real top dog): the 360 and the PS3. The two are programmed almost totally differently. The 360 follows the traditional PC style (which 90% of game designers/programmers are using), and the PS3 uses the Cell, which most agree is the future of computer tech (multi-core is a form of it, really) but may be just a bit too early for a "game machine".
Right now the user base is close to the same on both consoles (last I read, the PS3 was about 15% behind worldwide, and probably ahead if you consider that MS counts RROD machines as still "sold"), but the 360 has a better user base on games (a higher attach rate). That's mostly because the USA is by far the largest purchaser of 360s; the PS3 normally sells better in all other countries (the USA is actually quite behind on new tech; most new gadgets are in other countries for years before reaching the US). And the US market is driven by first-person shooters, the one area where frame rates matter most. So the FPS sales pitch game reviewers are shoving down readers' throats is starting to get burned into buyers' mindsets. One CLEAR example of this is Bayonetta: it's slower on the PS3 and has lower-res graphics in spots. The people who ported it to the PS3 have publicly apologized for it; it's a well-known "crap" port if you read game mags and reviews, supposedly so horrible it's almost unplayable. I have it for the 360 and the PS3 (it was $4.99 on closeout; seems it wasn't such an awesome game even when it was running "well"), and I compared them both on my 55" TV (I liked the game). I can honestly say that side by side, yes, the 360 has the upper hand. But if I'd never seen the 360 version, the PS3 version played just fine; only when playing both did I notice. And it never affected gameplay... well, there were slowdowns in spots, but both systems had those. The game was 100% playable on both systems, yet if I'd believed the HYPE, the game was worthless on the PS3. YES, it was better on the 360, but again, if you believe the reviews, it was a smoking turd on the PS3 and perfect in every way on the 360.
Sorry, my point is: with CONSOLES it's all about frames per second, whether the textures are hi-res or upsampled, whether it runs upscaled or in native 1080p, not about how good the game is. The reviews read like tech manuals. In the PC world, with so many configurations and speeds, games had to be about content and gameplay, since many players never saw those bells and whistles; some or many had to be turned off.
Consoles are catching up, but up to about 2-3 years ago the PC was king of innovation. The marketplace has changed that; some truly cool games are showing up on consoles (most were made on the PC first). The big problem with consoles is the add-ons: plastic musical instruments, fake record players, cameras, wands, bigger hard drives, wireless dongles. It's not horrible to have the options; it's horrible to see a 200 GB hard drive sell for 5x its value (the 360's 120 GB drive used to be $180! How can you justify that? It's not development costs, like Kinect, which is $50 worth of parts sold for $150). PC prices have done nothing but go down, while console prices have done nothing but skyrocket. A console game is $60; the same game on a PC is $50. Start buying 10 or so games a year, and suddenly that cheap console is costing you more than you thought; then add that up through a 5-year life cycle. Take a $400 console (original price of the 360), plus $500 more (10 games a year for 5 years, each costing $10 more than its PC counterpart), a wireless dongle ($100), an upgraded hard drive ($100), a second controller ($50), and 5 years of Live to play online ($250), and suddenly that "cheap" console is now $1,400. That would buy a lot of PC power (and PCs do a lot more than consoles, so you get double duty out of them), and maybe even an upgrade or two...
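The tally above can be sanity-checked with a quick script. The prices are the commenter's rough estimates from memory, not verified retail figures:

```python
# Rough 5-year cost-of-ownership tally for an Xbox 360, using the
# commenter's estimated prices (assumptions, not verified figures).
console = 400                 # original 360 launch price
game_premium = 10 * 5 * 10    # 10 games/yr * 5 yrs * $10 console markup per game
wifi_dongle = 100             # wireless networking adapter
hard_drive = 100              # upgraded hard drive
second_controller = 50        # one extra controller
xbox_live = 5 * 50            # 5 years of Live, ~$50/yr

total = (console + game_premium + wifi_dongle + hard_drive
         + second_controller + xbox_live)
print(total)  # 1400
```

The numbers do add up to the $1,400 figure quoted, so whatever one thinks of the individual estimates, the arithmetic holds.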
Consoles are easy to use, plug and play; anybody can do that. The living room used to be their thing, but PCs work there too now, and are arguably a better media player (with the amount of free software out there, and far more options for how they work than the consoles' this-way-or-no-way systems).
Oops, way off track again. My whole point was just that consoles have gotten into an arms war and are about specs, not gameplay, anymore. Face it, right now the only thing that sells well to the "masses" is first-person shooters, yet the PC still enjoys far more INDIE hits (Minecraft, to point out one). 99% of the cool indie games on the console marketplaces (or phones) are something that was on the PC first, or a variation of some PC freeware game.
Crytek: "PC 'a generation ahead' of PS3 and 360, but being held back" - Captain Obvious is being obvious.
Seriously though, yes, PCs are being held back by consoles. The plain and simple fact is that games are made for the consoles and then quick-ported over to the computer, crappy controls and all. It's not like the old days, when games were made on the PC and then ported carefully (or crappily - Diablo on the PS) to the console. This doesn't mean there aren't games out there that take advantage of newer PC hardware and look absolutely stunning. Look at Civ V: that game will bring a fast computer to its knees on a huge continents map with 12 computer players and 24 city-states (trust me on that one). More and more games are slowly starting to take up advanced graphics and utilize them while also making a console release. Look at Metro 2033: that game will make most computers cry when run in DX11 mode (even on Nvidia's GeForce cards, which it was programmed for). But the ratio is still woefully low, and I don't see it going anywhere in the near future. When the next generation of consoles comes out with newer video hardware and everything else, then we'll see better games being released on the PC. Hell, even Gears of War on the PC was pretty damn good when it came out. Shame that Epic went out like a crybaby and said they'll never sell another game on the PC because of piracy.