Tobold's Blog
Wednesday, October 26, 2011
Are hardware requirements back?

Heartlessgamer posted the recommended PC specs for Battlefield 3, and I couldn't help but notice that the relatively expensive computer I bought a few months ago is just barely above those specs. This arms race between hardware availability and PC game specs went on for many years, but was very much subdued in the last couple of years. Now it appears that games have caught up with the hardware again. That isn't necessarily a good thing, because I don't think everybody has a Windows 7 64-bit computer with a quad core, 4 GB of RAM, and a GeForce GTX 560 / ATI Radeon 6950 or better just yet. Of course those are the recommended specs; the minimum requirements are lower. But having to tone down the graphics settings to make a game run smoothly is not something that makes you feel good about your PC.

Sometimes I wonder whether companies like Nvidia subsidize game companies to make them produce games which don't run well on an average computer. Or is there really a demand for much more photo-realistic games out there? I often like cel-shaded or comic-style graphics more than photo-realistic ones. And I'd rather have my weapons cause some sparkly special effect than realistic-looking blood and gore. But that might just be me.

Are games with high-end hardware requirements attractive to you? Do they make you want to upgrade or replace your computer? Or are you happily playing games with much lower requirements?

I suspect that increasing PC specs are another sign that game companies are pushing back against the current console cycle. I think it's a good sign that the PC is on the ascendancy again. Hopefully that means Microsoft and Sony will have an incentive to move forward with new consoles.
Yes and no. The era of having a killer game that absolutely requires an upgrade is gone. The jumps are too small for that. For example, id Software's and LucasArts' games heralded the arrival of VGA, 32-bit and sound cards, while Quake started the age of GPUs. By contrast, current-era upgrades are much more incremental, and the focus has somewhat shifted away from polygons and toward pixel shaders. Sharp edges on models were much more visible than the fact that the reflection off the blood isn't quite right.

I did expect physics coprocessors to be the next leap, but the demo games treated it as additional eye candy rather than an integral part of gameplay. For example, the physics-enabled version of Unreal Tournament 3 added hail and movable debris. That's not worth paying for. The hype fizzled out and physics processing was quietly refactored to use existing GPUs instead.

I do pre-emptively upgrade my system every few years, though. The last upgrade was two years ago, and I'm in the habit of buying slightly more memory than recommended, so I'm still good for a while. I do admit that it feels odd playing stuff like Cave Story, Eversion or Minecraft on it. But I also impulsively purchase AAA titles like Bulletstorm or Space Marine and expect them to run smoothly without additional upgrades.
It's very hard to make a game so graphically Spartan that it reduces my enjoyment. Age of Empires II, gem of my childhood, is about the threshold at which bad graphics drive me away from a game.

That said, artistic style impacts my enjoyment of a game's graphics far more than technical specs. I find Minecraft and Terraria beautiful, despite their paltry technological demands.

I may not be a good representative of gamers at-large though; I willingly play text-based adventures even to this day. I'm sure that's a niche hobby, even within gaming.
It's crazy, isn't it? I am thinking of buying my partner a copy of Arkham City for Xmas, but I'm already not sure whether his PC will run it.
I'll just name one title here: Skyrim. Or rather, The Elder Scrolls. If you see how its graphics improved over time, it's pretty damn amazing how much that adds to immersion (and awesomeness).

So yes, I'll buy new hardware, even if it were solely for Skyrim. Because the amount of immersion is already big in that game, and it's even bigger with proper hardware :)
I actually have a Windows 7 x64 quad-core with 4 GB of RAM and an ATI HD 5700. It's no surprise that playing the latest games requires a beefy computer.

Game requirements will plateau over time, with the addition of recent post-processing shaders and multi-core machines distributing more work.
I think those games have their place, and I am happy the engineers have more incentive to move forward again.
For me, ridiculously high recommended specs are usually a reason not to buy a game. Why should I reward a company for this kind of nonsense?
I only play MMOGs and RTS games on my computer... for all the rest I've got my 360. Took some time to get used to a gamepad. And let's be honest... most of the games today are console games that are ported to the PC.
I strongly dislike high-end requirements. I'm not so affected by looks that I get pleasure from seeing a higher polygon count in a model. After all, I played back in the days when game models were made of large, clearly visible squares.

All high requirements do is mean that sometimes a game won't run on one of my machines. Usually I notice in advance but occasionally I buy a game I can't run.

I think WoW has proved that, in the field of MMOs at least, low requirements are a massive advantage.
That is why I'm playing more and more on the PS3 rather than the PC. With the exception of RTSs and exclusive PC titles, I think I will stay on the consoles from now on.

And they are quite hardcore now; check out Dark Souls. :)

Tell you what: if you try Dark Souls and write a good (good as in quality, even if you bash the game) article about it, I'll buy you a coffee.
Yes it is... I could spend money to upgrade my PC to the max to play the games I love with super graphics, but graphics alone cannot draw me to a game.

I will not play Battlefield 3, for example, because it is not the style of game I like. But if you ask me whether I would like MMO games to have super graphics, the answer is yes, and I am willing to spend money on better graphics. I always want to play the games I love at full settings to gain what little extra shine I can.
Second post... I cannot edit my first, but if you can merge them, that would be nice :P

I recently read an article about DirectX. The article says that consoles don't have powerful graphics cards but can still provide much higher-quality graphics than PCs with much better graphics cards. The reason for this is DirectX.

On consoles, game developers can "communicate" directly with the graphics card and get 100% of its power, while in PC games developers have to "communicate" with DirectX, and DirectX tells them what they can do, what they cannot do, and how to do it... and of course DirectX doesn't let them use graphics cards in a good way.

It also says that many game developers have asked Microsoft to somehow remove DirectX. I don't have much knowledge of PCs, so I don't know how it works, but if they are right, do you think there is a marketing game behind this, meant to make us constantly buy and upgrade our graphics cards even though our graphics cards are already very strong?
Hah, remember when we were told that PPUs were the next big thing that all computers were going to need for nextgen games?
I noticed it too. The Witcher 2 supported high-end PCs, BF3 surely does, the new Oblivion game does...

But it doesn't come as a surprise. The consoles are 6 years old, and really, we shouldn't be happy with graphics that are 6 years old anymore. I'm glad that I finally have a reason to upgrade my PC again; for too long I have felt that all PC games were just console ports.

If you want to create a timeless game, then yes, you do have to aim for a cartoony look. WoW did this, and it's running fine on older PCs while still looking very good to this day. Go for realistic graphics and there's no way your game will still look fine seven years from now.

Shooters like BF3 and CoD, however? They're meant to be played for exactly one year, until the next game in the series comes out. So it makes sense to go for the latest and greatest graphics.
I do hope that hardware requirements for games don't start rising drastically again. Especially since in terms of graphics we're already at a point where I can barely perceive technological changes making any difference to my play experience anymore. I see people comment that "this new game looks so much better than [some old game]" and I simply don't see it. Either I like the style or I don't, but in terms of fidelity the existing technology has already been able to satisfy me for years.
Much as I have enjoyed building and upgrading my own gaming PCs over the years, I am not sure that I really want to go back to the bad old days when you needed a major upgrade every year to play the latest games.

Perhaps I am just getting old and crotchety, but the upgrades don't seem to make as much difference any more in any case. When I replaced my 486 processor with a 586, Duke Nukem 3D went from freeze-frame to smoothly playable. When I replaced my TNT2 with a GeForce, Half-Life 2 felt like a completely different game. Now when I upgrade, I maybe get slightly better shadows... meh.
Some of it is just advertising, I think. Bump up the hardware requirements on your not-so-special game, and you tap the market of people who buy needlessly powerful state-of-the-art computers and need a game with those requirements to show them off.
Lassie: Skyrim's recommended spec is a GTX 260 or higher. That's not a high spec; it's almost 3 1/2 years old.
I don't think the requirements are that high. There are quad-core CPUs for less than $100 (AMD Phenom II X4 830). 4 GB of memory is $25. I looked at the specs of the new PC that you posted, and your graphics card is a GTX 570. This is an upper-mid-range graphics card, but it was released in December 2010, almost a year ago. The fact that a one-year-old mid-range graphics card is still comfortably above the recommended specs is an indication that the requirements are not absurdly high.
Playability has always been more important to me than graphics. Having played games since the Apple ][ days, more eye candy is nice but not a requirement. That said, if a game is good but its lowest graphics setting still results in bad framerates, it does get distracting. For MMOs, solid bandwidth and low latency are more important to me, especially in twitchy raids or PvP.
Shooters have always been hardware pushers, and in a way they should be. A huge part of the draw is having the game be the best-looking thing out. The gameplay has more or less remained the same for years now.

Plus, if one genre is raising the average gaming PC up, that's good for all genres. MMOs looking a decade old at release is not exactly a huge plus, IMO.
I think games that need high end rigs are a niche product, just like any other game targeted at a niche audience. For something like Battlefield 3 it can make sense, as they want the multiplayer to be played for years to come, so the graphics need to be strong at launch to remain looking good in 2-3 years.

The other thing is, very occasionally, a development team actually WANTS to do something high end. id Software does it all the time (gameplay usually suffers), along with DICE and Crytek. They really seem to enjoy making amazing-looking games that require massive specs to run perfectly. If they have a market and it's felt they can make money, why not?
I have a quad-core processor, Windows 7 64-bit, 6 GB of RAM, and my computer is 3 years old. It runs anything I can throw at it, including any game at 1280x1024. The problem is that the video card is the weak link (it's a low-power card), so if I want to play at 1920x1200, I can't. Traditionally I have always upgraded some components when Diablo 1/2/expansion were released, so with Diablo 3 I may yet again upgrade something, and that would be the video card and perhaps the RAM to 8 GB. RAM is so cheap these days that there's almost no excuse for less than 8 GB if you are a gamer.
I'll leave this with you:

"Modern Warfare 3 and Battlefield 3 will help sales of Xbox 360s and PS3s," said Pollak. "More importantly, and rarely covered by the press, Battlefield 3 is driving upwards of a billion dollars in PC builds and upgrades this year alone. No other title since Crytek's Crysis had such an anticipatory impact on PC hardware sales."

Aside from the natural human drive to consistently advance and improve, I'm sure there are financial gains for software developers who constantly push hardware requirements.

It's like modern vs. classic electronics. We live in a throwaway society now. If GE made a toaster that lasted 100 years, you'd never have to buy a new toaster and it would limit their profits.
The only time I have upgraded stuff for a game was that I upgraded memory in an old desktop in order to play EQ2 on it at release, so 7 years ago now. That computer played Morrowind and SWG just fine before the upgrade, and I even was complimented on the quality of my SWG screenshots at times when I shared them.

I do upgrade my computer periodically, but not for games, in spite of games being the primary reason I have a computer...
@Spinksville - Consoles are such an insanely good value vs. PCs, especially for games like Skyrim, Batman, Dark Souls, and FPS titles. You can get a PS3 for less than a mid-range video card these days, and it will outlive that video card by 3-5 years. Not to mention it is an awesome media centre that requires far less tweaking over time.
I think this is a misleading title. High hardware requirements are not back. Recommended hardware requirements are on the rise, but a computer that is 2-3 years old can still run just about any game out there, assuming you are willing to lower the graphics settings.

I bought my computer during the WAR beta and it runs BF3 fine. I have the graphics turned down, and I don't feel that I'm missing out on anything. Two of my friends did full upgrades last week for BF3, and we are all in the same game, on the same team.
It's actually a ploy by Microsoft. Here's my reasoning: every time you replace your CPU or motherboard, you have to reactivate Windows. But you only get a couple of these activations, and you might have already blown through them during the first install or due to reinstalling after a virus.

So in addition to that shiny new quad-core CPU, you are being forced to buy a new Windows installation for software you already own.

Gone are the days when you bought it and had it installed on 2 or 3 computers that you owned. With activation, you are forced to buy a copy for every computer, and they end up tying it to a CPU or motherboard.
The PC won't be around all that much longer, so it soon won't be an issue.
That has been said for around 20 years now. I don't buy that the PC is going away.

To be honest, I think all the recent changes requiring more powerful hardware have to do with DirectX. All the games list DirectX 9.0, but I know that Skyrim has parts written to take advantage of DX 10.0.
I didn't mean to make multiple posts, but I forgot an important point. PCs are much better for games like Skyrim than an Xbox or PS3. The reason is user mods. Tons of great stuff was written for Morrowind and Oblivion that a console player never got to see. It extended the life of the game many times over.
You know you're on The Elder Scrolls desktop computer upgrade plan when...

Your most recent computer replacement was to play Skyrim.

The one before that was to play Oblivion (or possibly Fallout 3).

And the one prior to that was to play Morrowind.

Give each a mid-life kicker of a solid bunch of parts upgrades about halfway between TES releases, and you always know you're good to go!

Just pray you don't have my awful luck with video cards and manage to avoid dying the death of a thousand artifacts at random intervals outside the schedule :)
"I recently read an article about DirectX. The article says that consoles don't have powerful graphics cards but can still provide much higher-quality graphics than PCs with much better graphics cards. The reason for this is DirectX."
The article was misleading. The Xbox 360 uses Direct3D (a part of DirectX) just like most PC games do, and the PS3 offers an OpenGL-based API.

Direct access to graphics hardware can theoretically yield better performance, but it comes at a high price: incompatibility. Before Direct3D and OpenGL emerged as the de facto standards, early 3D games were practically exclusive to a specific graphics card. With massive tinkering, you could sometimes get them to start on other cards, but they would often be slow and buggy.
We’ve been enduring the effects of console-centric development for too long. Shitty PC ports masquerading as simultaneous releases are the slow asphyxiation of innovation in the industry. Game technology advances these days are focussed entirely on how to squeeze a little extra quality or performance over the static, stagnant hardware that everyone’s pandering to, instead of pushing the envelope on advances we can’t even conceive yet.

It’s like owning a high-end, luxury sports car and filling it with bog-standard unleaded from a gas station which doesn’t stock premium because that’s not where the money is. We are already experiencing the knocks and rattles in the engine by way of design decisions made entirely on console limitations.

The below effects are some of the more annoying examples of completely unnecessary console-born ‘features’ that are detrimental to the game, and could easily be overcome on a PC.

* Disappearing bodies to save on RAM. (Too many culprits to mention. Even BF3’s campaign does this. I just killed a terrorist and he faded into smoke. What the?! Am I fighting VAMPIRE terrorists now?! More importantly… if the fantasy humanoids I’m killing turn to ash when I kill them, how the hell am I fighting skeletons later?)

* Gamepad-oriented controls and/or no keyboard re-mapping. If your quicktime events still have red, green, blue, yellow indicators or even XABY (Force Unleashed), or the ‘controls’ section of the help file still shows a 360 control pad, you can tell the porting team was just plain lazy or understaffed. The WORST offenders (Dead Space, I’m looking at you) haven’t even caught on to the fact that a mouse doesn’t act like an analogue stick.

* Inventory/map/vendor/whatever GUIs that show you virtually nothing on one page, and force you to use tab or some other bumper equivalent to switch between the dozens of unnecessary menu pages. My face is maybe 1-2 feet from the screen, you don’t need to fill it with text large enough to satisfy Hans Moleman.

* Checkpoints/save points instead of saving whenever you want/need to. No ability to rename save files. (RAGE: Savegame 12: Wasteland. Oh yes, I remember that unique point in time where I was in… the wasteland.)

* Graphics options limited to ‘brightness’ and ‘resolution’. (Anti-aliasing? Aniso-whatnow?) Low-poly environments with higher-rez texture packs sewn on… if you’re lucky. Sometimes even post-launch as a PATCH.

And one of the best FUs: “Please do not turn off your conso—er, PC while the saving indicator is on.”