A few days ago, Epic went ahead and revealed Unreal Engine 5 through a demo running on a PlayStation 5. It was our first real look at the capability of the next generation of consoles and, admittedly, even a single screenshot would have been a colossal reveal compared to what we've gotten so far. For all its talk of "Nanite technology" and obsession over triangles -- to the extent that Pythagoras himself might have become aroused -- there was no talk of how Artificial Intelligence (AI) would evolve and, of course, no show of it either. We were treated to a Lara Croft-like character traversing impressively lit and well-carved caves, but it was a solo, guided experience meant to showcase the visual advancements the new engine will bring.
All of which brought the idea for this blog to my mind. Let's start with a question.
Have you ever wondered which of the games you've played have the best, or most uniquely behaving, AI? Now that I've planted that question in your head, can you separate those games by console generation? And lastly, now that all the picking apart is done, do you realise that (most likely) a huge chunk of them is not from the current, soon-to-be-previous generation of consoles?
And, if so, we're in the same boat. I find it a damn shame.
Let's go back almost 20 years. Halo: Combat Evolved was unleashed upon the world, and for all its praise, one thing still stands out today when you pick it up for a quick playthrough: its AI. Enemies would react to you and your actions; they would coordinate, take cover, evade grenades or throw them back. The Grunts would run around like headless chickens should you choose to take out their leader first. To this day, it is refreshing. You feel as though the battlefield isn't a series of fancy whack-a-mole sessions.
Go a little further down the line, to 2005's exceptional horror-shooter F.E.A.R., which took things a few steps further. F.E.A.R. stands as a technological achievement, being among the first games to implement context-sensitive behaviour in its characters; to put it simply, they weren't merely responding to your actions, but actively trying to outsmart you. Instead of being omnipotent, though, they were simply clever. They would flush you out of cover with grenades and try to flank you, barking orders to their squadmates while you shook in your boots, listening in on their chatter. It would not be a stretch to say F.E.A.R. still acts as a blueprint for today's AI design, since many of today's mainstays can be traced back to it.
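F.E.A.R.'s soldiers were famously driven by Goal-Oriented Action Planning (GOAP): rather than running a fixed script, each one searches for a chain of actions whose effects satisfy a goal, which is exactly why their behaviour felt context-sensitive. Here is a toy sketch of the idea; the facts, action names and data layout are my own illustration, not the actual engine's:

```python
from collections import deque

# Toy world: facts are plain strings; an action is (preconditions, adds, removes).
ACTIONS = {
    "take_cover":    ({"in_open"},                 {"in_cover"},       {"in_open"}),
    "throw_grenade": ({"in_cover", "has_grenade"}, {"player_flushed"}, {"has_grenade"}),
    "flank":         ({"player_flushed"},          {"flanking"},       set()),
}

def plan(start, goal):
    """Breadth-first search for a sequence of actions that satisfies the goal."""
    frontier = deque([(frozenset(start), [])])
    seen = {frozenset(start)}
    while frontier:
        state, steps = frontier.popleft()
        if goal <= state:          # every goal fact holds in this state
            return steps
        for name, (pre, add, rem) in ACTIONS.items():
            if pre <= state:       # action is applicable here
                nxt = frozenset((state - rem) | add)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, steps + [name]))
    return None                    # no sequence of actions reaches the goal
```

Starting from `{"in_open", "has_grenade"}` with the goal `{"flanking"}`, the planner chains take_cover, throw_grenade and flank on its own; the flanking manoeuvre emerges from the search rather than being scripted.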
Later on, S.T.A.L.K.E.R. -- a game with an acronym so painful to type out that I'll only refer to it by name once -- also built upon F.E.A.R.'s blueprint to produce something unique: enemies would drag their dead off the battlefield, and packs of animals would abandon their attack should you kill some of them. In group fights, enemies would cover their comrades with suppressive fire to advance slowly but safely towards the player. It all contributed to a more flowing, believable experience. You weren't shooting at unthinking paper cutouts resembling enemies.
There are many more games you could point to and recall otherwise normal situations that were made extraordinary by the AI. The Killzone series always had a knack for smart enemies, something that elevated the tension of its brutal firefights. Strategy games also had to hone their craft in order to provide something truly challenging but ultimately fair, like the excellent Civilization V. And who could forget Far Cry 2? Apart from those who lamented malaria and the dreaded jamming guns, of course. I'm on the other side, with those who were not fazed by these randomly occurring -- and sometimes crippling -- events, because the overall mix of gunplay was enhanced by superb AI.
But as the years went by, these innovative games became fewer and fewer. Games with weird acronyms became harder to come by, and this scarcity of periods, so to speak, took its toll on AI.
In the dying years of the previous console generation, some games still managed to shine. Halo: Reach was a personal highlight and a great point of reference when talking about the evolution of AI within Halo. Some years later, Alien Isolation sat in a league of its own: the relentless alien hunting you down in unscripted, do-or-die situations where no way out was easy or pre-planned. It remains one of my most fondly remembered games for that reason alone. ARMA 3 also had pretty mindful enemies, always noticing small hints around them and investigating. Splinter Cell: Blacklist, an entry in a series I love, also rose above its peers with smart, observant enemies that remembered your actions; just like The Phantom Pain after it.
It is around that time when, in hindsight, I believe games took a sharp left on their supposedly forward-heading timeline, and instead of advancing their offerings, they stagnated, if not took a few steps back. I would think, perhaps foolishly, that with the advent of significantly more advanced consoles, with 16 times the RAM to handle complex tasks and more advanced processors, AI would benefit. I do not know the specifics of how game AI works or where its bottleneck lies, but I dare believe that, at the very least, systems multiple times more powerful would spell an advancement in that regard.
Alas, looking at a generation on its last legs, I can't help but feel let down by how little things improved over a span of seven years. I find it pointless to talk about the many missed chances, so I will focus on the most pivotal of games, the ones I deem responsible for the stagnation.
The prime culprit, for me, was The Elder Scrolls V: Skyrim. When you take a step back from player-created stories, like randomly venturing into a cave and escaping with your life hanging by a thread, you will probably see what I saw: a dumbed-down, completely passive, unchanging world. For every immersive trip to a cave, there was a city guard who took an arrow to the head (subverting expectations, I am) but promptly forgot about it. For every memorable Dark Brotherhood quest, there was a Radiant Quest that occupied your time instead of something meaningful. Instead of actually evolving past Oblivion, Bethesda doubled down on the senseless, random nature of so many AI-related aspects. With merely functional companions and most of the interesting happenings being player-triggered or player-created, Skyrim relied on randomness and allowed mediocrity to creep in. It is no easy task to hand-craft such a big world, accounting for every single outcome, but ultimately, I believe the game suffered for it. And, going out on a limb here, I'll say that Skyrim being devoid of any tangible personality -- outside of what the player gave it and endured throughout their playthrough -- paved the way for The Witcher III's universal praise; it helped make the "game world as a character" argument all the more valid, in my humble opinion.
But Skyrim was just one of many symptoms. Game developers were moving away from organic worlds and characters, towards more desolate, "personal experiences". Take, for example, BioShock Infinite. To get it out of the way: I enjoyed the game, albeit less than the first because of its nonsensical story, but that's not the point of this article. What I did not like was the AI companion, Elizabeth; and I am in no way talking about her personality, which I liked. Elizabeth, as a companion, had various states, all the way from acting as a prop to outright cheating. She was an entity of her own -- and not in a good way. She could devastate enemies when left to her own devices, but I never had any sense of synergy when working alongside her. The counterargument is simple: a bad AI that does not interfere with the player's actions is less likely to be noticed and break the experience. While that is a respectable answer, I believe it still remains bad AI. The fact that I did not have to babysit Elizabeth, or worry about her safety and possible game overs, was indeed good in the grand scheme of things; but still, that she was not an obstacle is not an achievement to me.
Then you have the likes of The Last of Us, a game that wears its cinematic nature on its sleeve. While the story is, basically, about escorting Ellie (to an extent) from point A to point B, that is not the case in the gameplay. She can hold her own, has a few handy tricks up her sleeve and can actually be quite helpful. But, again, she is an entity of her own, in all the wrong ways. While Joel was hiding, stealthily approaching for the kill or simply wishing to pass by unnoticed, Ellie would move directly into enemies' line of sight and they would not notice her. Again, this is a case of bad AI not completely ruining the experience for the player, but it did wreck my immersion time and again. This is a cinematic game, made and unmade by its ability to engross the player in the experience.
On the other end of the bad-AI spectrum is Resident Evil 5's Sheva. For all the good she brought to the table, namely feeling like a more realistic character in the crazy universe of Resident Evil, she was quite untrustworthy. On one hand, she was capable of raising you from certain death in an instant and covering your back, making every bullet count against the zombified enemies she and Chris faced. On the other hand, she was also capable of running your supplies into the ground, blazing through shared ammunition and medical items and leaving you quite vulnerable in a tight situation. Sheva is what I consider an imperfect middle ground in terms of AI: on her day, she's awesome. When not on her day, well, call it a day and play something else.
Of course, bad AI is not simply incompetent or unskilled AI. You also have omniscient AI, primarily in fighting and strategy games, which is just as bad. Street Fighter II is a culprit, while Starcraft II -- although transparently so -- also cheats on higher difficulty levels. By "cheating", I mean acting instantaneously upon your input with inhuman reflexes (fighting games), or having access to functions the player doesn't, like seeing through unscouted terrain (strategy games). In shooters, ultra-competent AI always aims for your head, even when you're behind walls and it has absolutely no way of knowing where your head will be when you step out of cover. It will also target you specifically, even if you're part of a four-man squad, and blow you to kingdom come without hesitation; which works about as well as you can imagine when paired with incompetent AI squadmates. This synergy of bad and omnipotent AI very often leads to pain-inducing, unfun playthroughs, although some people get a kick out of the odds being stacked against them, and that is commendable -- I too enjoy victory in some such games even when I know I'm being cheated, but that's the exception, not the rule. Omniscient enemies that telepathically inform an entire army the moment they spot you, for example, can be found even in games with otherwise well-designed AI; after that, it's up to the player to decide whether the payoff is worth putting up with such AI.
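The instant-reflex half of this problem has a well-known fix: don't let the AI see the player's input on the frame it happens. Here is a minimal sketch of that idea, with perception events queued behind a human-like reaction delay; the class, its parameters and the frame counts are all my own illustration, not any particular engine's API:

```python
import random

# Hypothetical sketch: the AI is only told about player actions after a
# human-like reaction delay (plus some jitter), instead of instantaneously.
class DelayedPerception:
    def __init__(self, delay_frames=12, jitter_frames=4, seed=0):
        self.rng = random.Random(seed)     # seeded for reproducibility
        self.delay = delay_frames
        self.jitter = jitter_frames
        self.pending = []                  # list of (frame_visible_at, event)

    def observe(self, frame, event):
        # The event only becomes visible once delay + jitter frames elapse.
        visible_at = frame + self.delay + self.rng.randint(0, self.jitter)
        self.pending.append((visible_at, event))

    def visible_events(self, frame):
        # Hand the AI everything whose delay has elapsed; keep the rest queued.
        ready = [e for t, e in self.pending if t <= frame]
        self.pending = [(t, e) for t, e in self.pending if t > frame]
        return ready
```

A fighting-game opponent built on top of this still reads your jump, but only a dozen or so frames late, which is roughly what a human could manage; the jitter keeps the delay from being perfectly exploitable.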
There is a lot of ground to cover between bad and cheating AI, though, and that is my main point. Not too long ago, a game called Mount & Blade II: Bannerlord was released, and to my surprise, it does AI very well. For example, soldiers move from point to point while raising their shields in a different direction, to stay covered against enemy arrows. In most games, they would simply get blindsided and fall dead. But why is Bannerlord, a game by TaleWorlds Entertainment, such an achievement in AI while colossal companies seem incompetent? I am in no way devaluing TaleWorlds' achievement here, but they simply are not of the same caliber as companies like, say, Naughty Dog, Capcom, 2K, EA and so forth. They might not have the best graphics, monetization schemes or live services, but they have a game I crave to play, because it respects my intelligence and makes each victory hard-fought and fair. After such an experience, it's difficult to go back to playing so-so action RPGs that wow me for a period but fade into obscurity afterwards.
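What that shield behaviour implies, mechanically, is that movement direction and facing direction are decoupled: the soldier walks toward his waypoint while his orientation (and thus his shield) tracks the threat. A minimal sketch of that decoupling, with function and parameter names of my own invention:

```python
import math

# Illustrative sketch: advance toward the waypoint, but face the threat.
def step(pos, waypoint, threat, speed=1.0):
    # Move one step along the direction to the waypoint...
    dx, dy = waypoint[0] - pos[0], waypoint[1] - pos[1]
    dist = math.hypot(dx, dy) or 1.0               # avoid division by zero
    new_pos = (pos[0] + speed * dx / dist, pos[1] + speed * dy / dist)
    # ...but compute facing (and shield direction) toward the threat instead.
    facing = math.atan2(threat[1] - new_pos[1], threat[0] - new_pos[0])
    return new_pos, facing
```

A soldier at the origin heading east to his waypoint, with archers due north, ends up moving east while facing north; games that skip this small separation are the ones whose troops eat arrows in the back on the way to a position.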
All that said, it continues to sadden me how many people -- myself included -- get wowed to no end by ray tracing and meticulously rendered caves, but do not realise that such upgrades are only skin deep, and that the big heads are getting away with it. What good is a photorealistic model for your companion if they still act like a brainless chicken? What is the point of a 1:1 recreation of a World War II battle, to the point where you can't separate renders from reality, when the enemy soldiers act like Dr Manhattan and vaporize you on sight? Simply put, all that increase in computing power should surely translate to advancements in behaviours, not only graphics, right? If developers are really aiming for total immersion through visuals, audio, acting and script, why do they keep leaving the brain out of the equation? It's just that the thought of "what could have been" turns me off when watching the next big thing in graphical advancements, because I know it will not necessarily mean a more immersive experience.
The answer to all this seems crazily simple to me: eye candy is easier to sell than state-of-the-art AI. With the undisputed rise in the number of multiplayer games, AI companions are becoming more and more of a luxury. And in those esoteric, immersive "personal experiences", the story is the crutch holding everything together, with no need for any widely applied intelligence in the grunts you mow down on your way to the next cutscene. I guess it's just a sign of the times. Maybe AI isn't really needed nowadays, for all the reasons I mentioned and possibly many more I'm not seeing, like the cost of developing it. Maybe it's a thing of the past, maybe I'm simply more demanding, or maybe it has simply peaked without me realising it.
It is not that I wish advancements in graphics had stayed in the previous decade in favour of more sophisticated AI, don't get me wrong. But, as more eloquent people have put it before me...