
The Hardware Hardships of the 2010s

This blog entry started out as a reply to a message by robtek on the Il-2 Battle of Stalingrad(/Moscow) forums concerning the limitations of the Digital Nature engine used in Rise of Flight and BoS/BoM. You would be forgiven for thinking I’m sometimes easy to launch into full rant mode, since that’s exactly what happened this time as well – and I don’t even really disagree with robtek! However, I do believe there’s an interesting and important hardware-related perspective when comparing the likes of Il-2 BoS and, to an extent, Digital Combat Simulator to older titles such as Il-2 and Falcon 4.0. So, let’s take a trip down memory lane for a look at how things have changed – and how they’ve stayed the same – for simulation gamers over the last couple of decades.

The original Il-2 wasn’t born as the extremely versatile and flexible game we know today. In addition to improvements in the code itself (by Maddox Games and later modders), the original Sturmovik was helped immensely by Intel, AMD, ATI and Nvidia. I have a rather strong opinion about many things in BoS that I see as design flaws (to be honest, I don’t even really consider it a matter of opinion), but to do the developers justice you also have to consider how much of an assist Mr. Maddox and his crew (like everyone else in the early 2000s) got from the hardware industry giants over the years.

Let’s get one thing straight: I’m not actually making a case for the current Digital Nature engine being the best thing ever. In fact, the 32-bit engine is something that could very possibly hold BoS/BoM back massively. The DirectX 9 dependence has been a hot topic on the forums lately, but in practice that’s probably the less serious limitation. It’s actually quite interesting how many games are still DX9-based some 13 years after the API was released. The situation is by no means ideal, for a number of reasons, but BoS/RoF are far from the only games still relying on the relatively ancient technology.
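
To put the 32-bit worry in concrete terms: no matter how much RAM you install, a 32-bit process can only address 2^32 bytes, and on Windows the application only gets part of that by default. Here’s a minimal back-of-the-envelope sketch (the numbers are the general Windows limits, nothing specific to Digital Nature):

```python
# Back-of-the-envelope look at what a 32-bit game process can address.
# These are general Windows limits, not anything Digital Nature specific.

GIB = 2 ** 30

address_space = 2 ** 32        # 4 GiB: everything a 32-bit pointer can reach
default_user_space = 2 * GIB   # default application share on 32-bit Windows
laa_on_64bit_os = 4 * GIB      # large-address-aware 32-bit app on 64-bit Windows

print(f"Total 32-bit address space:      {address_space / GIB:.0f} GiB")
print(f"Default usable by the game:      {default_user_space / GIB:.0f} GiB")
print(f"Best case (LAA flag, 64-bit OS): {laa_on_64bit_os / GIB:.0f} GiB")

# Terrain, textures, meshes and mission state all have to fit inside that
# window - installing 16 GB of RAM doesn't raise the ceiling one bit.
```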

I think the really interesting question here is how such an ancient programming interface can still be so relevant in gaming. The answer is probably a combination of the goodenoughitis that has plagued the hardware industry in recent years and some genuine technological limitations that have held back hardware development.

As an interesting thought experiment, think about the computers we had five years before 2001, the Year of the Sturmovik. There were Pentiums, and Pentium IIs if you were really lucky, at something like 200MHz at most. Many still ran 486s, and maybe even a few 386s were around. A gaming computer had something like 2–16MB of memory, and the very high-end ones even had video cards that could assist the CPU with basic 3D rendering. Keep in mind that, being a hardcore gamer, I’ve of course always been interested in the state-of-the-art high-end stuff, so my memories are probably rather biased and not exactly representative of a typical computer at any given time.

Five years later, in 2001 (or Year One Anno Sturmovik), the gigahertz race was over and it was a contest between the AMD Athlon XP and the somewhat infamous Pentium 4 by Intel. The latter sacrificed everything to reach insanely high clock speeds, up to 2GHz at the time – making AMD introduce the “something plus” naming convention (for example Athlon XP 1900+, which actually ran at 1.6GHz) to advertise the fact that their CPUs ran faster clock for clock, even though their actual clock speeds were lower than the name implied. This was a rather interesting time for hardware enthusiasts, since the two CPU giants took more separate routes in CPU design than ever before. Computers had in the region of 256+MB of memory (IIRC) and a computer without an advanced 3D video card wasn’t acceptable as a gaming device anymore.

The increase in computing power in the five years preceding the original Il-2 (approximately 1996–2001) was absolutely astounding. Gaming had changed forever, and stuff only a few years old was seriously retro. This was also the time gaming went truly mainstream (partly assisted by the massive technological development) – and simulators were relegated to a niche far from the top of the charts.

Five years later – which brings us to about 2006, the year that saw the final commercial release of the original Il-2 franchise, Il-2 Sturmovik: 1946 – you could get a quad-core CPU at about 2.4GHz doing lots of operations per cycle. Clock speed had already lost its usefulness as a measure of CPU performance a while back, but you can use that number as a very rough comparison to later generations; the quest for clock speed at all costs was pretty much over and the designs were more balanced. The true powerhouse of a well-built gaming computer was an advanced video card. For example, you could get a GeForce 7900GTX, the pinnacle of DirectX 9 cards – DirectX 10 hardware only arrived at the very end of that year with the GeForce 8800, and the power to run DX10 effects properly arguably wasn’t there yet anyway. A gaming computer had 2–4GB of memory; together with the video card’s half a gig, that was already starting to bump into the limits of 32-bit operating systems.

The increase in capability was again quite massive, although not on the scale of what had happened during the five years before Il-2. Lock On: Flaming Cliffs, the first Eagle Dynamics title with the “modern” flight modeling and a predecessor to the Digital Combat Simulator platform as we know it today, was released in 2005. On the mainstream side of things, this period was a war between Sony’s immensely successful PlayStation 2 and Microsoft’s Xbox.

Now, let’s fast forward five years from the release of Il-2: 1946, to about 2011 – the time Rise of Flight was really beginning to take off after a relatively slow and humble start. 64-bit systems were becoming the norm and computers had 8 or even 16GB of memory. CPUs were still four-core, but they were getting rather fast, at over 4GHz with overclocking (Intel i5 2500K and up) and with high per-clock efficiency. Video cards were rather powerful beasts, capable of running most games handily even in Full HD – which was basically the best monitors could display anyway (except in some very special cases). Full HD was an important culmination point for resolution, since it’s what good televisions displayed back then and mostly still do.

This five-year period was not exactly the quantum leap the previous ones were, but you still got quite a bit more power during those years, pushing the boundaries of what was possible and allowing Il-2 to do far greater things than were possible at the beginning of the millennium.

Since we’re already at 2011, the next approximately five-year leap forward will get us quite close to the present day. So what does a hardware-crazy hardcore gamer like me run in 2015? Note that this is a very important period for the games based on the Digital Nature engine.

Apart from a video card from 2012, the answer is the exact same thing I was running in 2011. Well, I do have a few SSDs now, and those are rather wonderful things, but if I went to a computer store with a suitcase full of cash and asked for all the gaming power they have, I couldn’t get much. Another video card would probably help a bit in some cases (although perhaps not that much in simulators), but since 2011 (2012) I’ve been running an i7 2600K @ 4.2GHz and a GTX 690, which will give the $1k+ Titan X of 2015 a run for its money (and that’s quite a bit of money). Of course it’s cheating a bit, the 690 being two GPUs on one card, but the point stands.

I can’t really say that the last four or so years haven’t brought progress in hardware, but I’ll still go ahead and do just that. Because compared to the previous revolutions in computing capability – especially the earlier years – that’s exactly what’s been going on. Nothing. The average computer is probably better than four years ago, but the state of the art hasn’t changed much.

One of the reasons is technological: pushing the boundaries of performance is getting harder and harder, at least without a revolution in manufacturing technologies, quantum computers or some other massive technological advance. Part of the problem is the goodenoughitis I mentioned earlier. Gaming in general isn’t pushing technology like it used to. Modern computers have been able to make Call of Duty look pretty damn good for a while now, and everything has to be multi-platform anyway, so we’re mostly limited by what the consoles can do – and they don’t tend to get updated quite as often as PCs did at the turn of the millennium.

The icing on the cake is that simulators are getting both ends of the shortest stick at the same time. Technological advances have raised production costs massively, and we’re unlikely to ever see a single simulator with as many planes as Il-2: 1946, because it just takes so much more effort to create a plane to modern standards. Unfortunately, income per plane doesn’t seem to scale in a similar fashion. Ironically, at the same time as development costs reach never-before-seen heights, progress in hardware development (or the lack thereof) isn’t helping to boost the capabilities of the simulators that do see the light of day despite those costs.

The level of expectation regarding graphics, AI and flight modeling has increased so much that the cost in CPU and GPU cycles would be completely unimaginable to the computers that ran the original Il-2 more or less happily back in the day. Now multiply that sophistication by a factor of lots, because instead of the usual separation between the player’s flight model and the AI’s, Il-2 BoS (and RoF) run a high-end flight model for every AI unit as well. Having suffered decades of AIs piloting F-109s and Eurofighter Spitfires, I can only commend 777 Studios / 1C Game Studios for taking this step. Well, you know I’m going to berate them for it soon after, but that’s just because I’m me. Having a level playing field against the AI is something I’ve dreamed of for a long time, and it’s a bold and necessary step forward for flight simulators.
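
To make that multiplication concrete, here’s a toy sketch of the difference between the classic approach (simplified AI physics) and the RoF/BoS one (full flight model for everyone). The names and cost figures are invented for illustration; nothing here comes from the actual engine:

```python
# Toy model of per-frame physics cost; all numbers are invented.

FULL_FM_COST = 100    # arbitrary units: aerodynamics, engine, damage physics
SCRIPTED_AI_COST = 5  # the classic shortcut: waypoints and a canned envelope

def frame_cost(n_ai: int, full_fm_for_ai: bool) -> int:
    """Physics cost of one frame: the player plus n_ai AI aircraft."""
    per_ai = FULL_FM_COST if full_fm_for_ai else SCRIPTED_AI_COST
    return FULL_FM_COST + n_ai * per_ai

for n in (7, 31, 99):
    print(f"{n:3d} AI planes: full FM {frame_cost(n, True):6d}, "
          f"scripted AI {frame_cost(n, False):6d}")

# Both approaches scale linearly with unit count, but the full-FM slope is
# 20x steeper in this toy example - big missions hit the CPU wall fast.
```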

Of course it’s also a controversial thing to do, considering that no simulator has ever managed to replicate the thousands upon thousands of planes and other units needed to fully simulate a World War II-era battlefield. Not even close. You’d be right to ask if it’s a wise approach at all given the hardware limitations, and perhaps it isn’t – but this is precisely how progress happens. Someone does something different, and if it works, it becomes the new norm. Sometimes it takes a bit of time, though, and the pioneers don’t get to reap the rewards if they were too far ahead of their time. I really don’t know if 777 / 1C Game Studios have made the right compromises or not, but in this case I definitely won’t blame them for trying.

Note that I’m far from qualified to evaluate whether the engine is actually putting all the hardware resources to good use (I am worried about the 32-bit limitation I mentioned earlier, however). To be honest I’m not especially impressed with any of the modern combat simulators’ engines; we’ll have to see how the long-awaited EDGE (the new Digital Combat Simulator renderer) pans out. I did an intriguing test with Rise of Flight a few years back by creating missions with more and more AI planes around until my computer just gave up – in spectacular fashion. Usually when DCS or other incarnations of Il-2 lag and chop and in general don’t run very well, I look at the CPU just to see it’s mostly doing nothing, while the video card gives me the “don’t look at me, I’m basically idling here” routine. At least when Rise of Flight ran my computer aground, every single one of the four cores was screaming for mercy, and I have to respect that, even if I can’t tell whether the CPU was actually doing its work efficiently.
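
For the record, the test itself was nothing fancy; I just kept adding planes in the mission editor. Conceptually it boils down to the harness below, where run_mission() is a made-up stand-in for manually flying each test mission (Rise of Flight has no such scripting interface):

```python
# Conceptual "add AI until it chokes" harness. run_mission() is a made-up
# stand-in; in reality each data point was a hand-built mission.

TARGET_FRAME_TIME = 1 / 30  # below roughly 30 fps the sim isn't flyable

def run_mission(n_ai_planes: int) -> float:
    """Toy stand-in: pretend frame time grows linearly with unit count."""
    return 0.005 + 0.0004 * n_ai_planes

n = 4
while True:
    frame_time = run_mission(n)
    print(f"{n:3d} AI planes: {1 / frame_time:5.1f} fps")
    if frame_time > TARGET_FRAME_TIME:
        break  # found the ceiling for this CPU
    n *= 2    # double the load and try again
```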

There are basically indefensible design choices in the Il-2 “Battle of” series, but I won’t go into details at this time; if you happen to be interested in my view on them, there’s the article I wrote earlier. It may be that the game is also hindered by non-compromises such as the AI running a full flight model just like the player(s), but that approach has definite upsides that I felt were worth mentioning. The main point of this blog post, however, is to point out one more reason why comparing modern simulators to the likes of Il-2 Sturmovik (2001) is somewhat unfair: reasons beyond the developers’ control. The stagnation of hardware development is something that affects all simulator hobbyists. There’s so incredibly much we just can’t do in our simulators because the hardware isn’t there yet – and may not be for a long while.

Disclaimer: the correctness of the included bits of hardware history depends largely on my memory, which is not actually all that dependable.


3 thoughts on “The Hardware Hardships of the 2010s”

  1. extreme0n3 says:

    I think the point about hardware not changing much since 2011 is quite incorrect.

    Compare the top-of-the-line GPU in 2011 (GTX 580) to today’s powerhouses and there is a huge difference in performance and available VRAM.

    Run one of today’s most intensive games at maximum settings on a GTX 580 and the game will be barely playable due to poor frame rates.

    In fact, if I run a game that pushed hardware to the limits in 2011, my PC will still break into a sweat today – and I’m NOT running the very top-spec hardware available.


    • It is true that video cards have kept improving at a higher rate than CPUs. However (without actual statistics at hand), their development rate has slowed down considerably compared to, for example, the early 2000s. Back in the day, monsters such as the Radeon 9700 family absolutely crushed the existing cards. At the high end, video cards would turn almost obsolete in a year; nowadays it’s more of an incremental improvement. That may actually change now that even the best cards are struggling at 4K resolutions.

      But as I admitted in the text, my video card comparison was a bit of a cheat, since I’m using SLI on one board – and the card is actually from 2012. It’s still a somewhat interesting comparison: if you take, for example, the Radeon 9700 Pro from around 2002 (close to the original Il-2) and go three years back, it’s up against a GeForce 256. I didn’t find comparisons with a quick search, but if those could do SLI, I think you’d need a whole bunch of them to get even into the same ballpark as the 9700’s performance.

      Anyway, and perhaps most importantly as far as this article is concerned (it started out as a response to a claim that the Digital Nature engine can’t handle large numbers of units), at the most common 1080p resolution the video card is usually the less important factor in simulator gaming. Especially when talking about the number of units in a mission, the video card doesn’t help very much at all. Progress from 2011 up to today isn’t zero (which I kind of already admitted in the main text), but if you look at it from a simulator gaming perspective, take the CPU into account as well and compare with what was going on during the early Il-2 days, the difference is massive.

      I don’t really know what to make of your last point, since I think it’s pretty much what I’ve been trying to say. Games from 2011 (RoF, DCS, iCoD etc.) will make modern computers break a sweat pretty much as they did back in the day, because hardware development hasn’t been that great. However, if you go back to 2001, when Il-2 was released, a computer from that era would probably handle 1997 games like Quake 2 without breaking a sweat.

