
In this article by Peter L Newton, author of the book Learning Unreal AI Programming, we will explore the fundamental techniques and practices of game AI. These will be the building blocks for developing amazing and interesting game AI.



While not all of the following components are necessary to achieve AI navigation, each contributes critical feedback that can affect navigation. Navigating within a world is limited only by the pathways within the game.

Navigation for AI is built up of the following things:

  • Path following (path nodes): A solution similar to the navigation mesh described next; path nodes designate the space in which the AI traverses.
  • Navigation mesh: Using tools such as Navigation Mesh, also known as NavMesh, you can designate areas in which the AI can traverse. NavMesh generates a plot of grids that is used to calculate the path and cost during navigation. It’s important to know that this is only one of several pathfinding techniques available; we use it because it works well in this demonstration.
  • Behavior trees: Using behavior trees to influence your AI’s next destination can create a more interesting player experience. The AI not only calculates its requested destination, but also decides whether it should enter the scene with a no-hands cartwheel double backflip or try the triple somersault into jazz hands.
  • Steering behaviors: Steering behaviors affect the way the AI moves while navigating, helping it avoid obstacles. Steering can also be used to create formations with the fleets you have set to attack the king’s wall. It can influence the movement of a character in many ways.
  • Sensory systems: Sensory systems can provide critical details, such as players nearby, sound levels, cover nearby, and many other variables of the environment that can alter movement. It’s critical that your AI understands the changing environment so that it doesn’t break the illusion of being a real opponent.
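To make the pathfinding idea concrete, here is a minimal, engine-agnostic sketch of grid-based pathfinding, the same plot-of-grids idea a NavMesh builds on. It is plain C++ with illustrative names (`ShortestPathSteps`, the 0/1 grid encoding), not a UE4 API, and it uses breadth-first search rather than the cost-weighted search a real navigation system would use:

```cpp
#include <array>
#include <queue>
#include <utility>
#include <vector>

// Hypothetical 2D grid: 0 = walkable cell, 1 = blocked.
// Returns the number of steps in the shortest 4-connected path
// from Start to Goal, or -1 if no path exists.
int ShortestPathSteps(const std::vector<std::vector<int>>& Grid,
                      std::pair<int,int> Start, std::pair<int,int> Goal)
{
    const int Rows = static_cast<int>(Grid.size());
    const int Cols = static_cast<int>(Grid[0].size());
    std::vector<std::vector<int>> Dist(Rows, std::vector<int>(Cols, -1));
    std::queue<std::pair<int,int>> Frontier;
    Dist[Start.first][Start.second] = 0;
    Frontier.push(Start);
    const std::array<std::pair<int,int>, 4> Moves{{{1,0},{-1,0},{0,1},{0,-1}}};
    while (!Frontier.empty()) {
        auto [R, C] = Frontier.front();
        Frontier.pop();
        if (std::make_pair(R, C) == Goal) return Dist[R][C];
        for (auto [DR, DC] : Moves) {
            const int NR = R + DR, NC = C + DC;
            if (NR >= 0 && NR < Rows && NC >= 0 && NC < Cols &&
                Grid[NR][NC] == 0 && Dist[NR][NC] == -1) {
                Dist[NR][NC] = Dist[R][C] + 1;  // one step farther from Start
                Frontier.push({NR, NC});
            }
        }
    }
    return -1;  // goal unreachable
}
```

A production pathfinder would use A* with per-cell traversal costs; breadth-first search keeps the sketch short while showing the same frontier expansion over a grid plot.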

Achieving realistic movement with steering

When you think of what steering does for a car, you would be right to imagine that the same idea applies to game AI navigation. Steering influences the movement of AI elements as they traverse to their next destination. The influences can be supplied as necessary, but we will go over the most commonly used ones. Avoidance is used, essentially, to avoid colliding with oncoming AI. Flocking is another key steering behavior; you commonly see it while watching a school of fish. Flocking is useful for simulating interesting group movement, whether a complete panic or a tight school of fish. The goal of steering behaviors is to achieve realistic movement within the player’s world.
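As a rough, engine-agnostic sketch of the avoidance idea, the snippet below blends a "seek" direction toward the destination with an "avoid" push away from a nearby obstacle. All names (`FVec2`, `Steer`, the linear falloff) are illustrative assumptions, not UE4 types:

```cpp
#include <cmath>

// Minimal 2D vector for the sketch.
struct FVec2 { float X = 0.f, Y = 0.f; };

static FVec2 Add(FVec2 A, FVec2 B) { return {A.X + B.X, A.Y + B.Y}; }
static FVec2 Sub(FVec2 A, FVec2 B) { return {A.X - B.X, A.Y - B.Y}; }
static float Length(FVec2 V) { return std::sqrt(V.X * V.X + V.Y * V.Y); }

static FVec2 Normalized(FVec2 V)
{
    const float Len = Length(V);
    return Len > 0.f ? FVec2{V.X / Len, V.Y / Len} : FVec2{};
}

// Seek: a unit vector pointing from the agent toward its destination.
FVec2 Seek(FVec2 Agent, FVec2 Target)
{
    return Normalized(Sub(Target, Agent));
}

// Avoid: pushes away from the obstacle, stronger the closer it gets,
// contributing nothing outside AvoidRadius (linear falloff assumption).
FVec2 Avoid(FVec2 Agent, FVec2 Obstacle, float AvoidRadius)
{
    const FVec2 Away = Sub(Agent, Obstacle);
    const float Dist = Length(Away);
    if (Dist <= 0.f || Dist >= AvoidRadius) return {};
    const float Strength = (AvoidRadius - Dist) / AvoidRadius;  // 0..1
    const FVec2 Dir = Normalized(Away);
    return {Dir.X * Strength, Dir.Y * Strength};
}

// Combined steering force for this frame.
FVec2 Steer(FVec2 Agent, FVec2 Target, FVec2 Obstacle, float AvoidRadius)
{
    return Add(Seek(Agent, Target), Avoid(Agent, Obstacle, AvoidRadius));
}
```

Flocking extends the same pattern: separation, cohesion, and alignment forces are computed per neighbor and summed with weights, just like `Seek` and `Avoid` here.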

Creating character with randomness and probability

Randomness and probability are what give a bot’s decision making character. If a bot always attacked you the same way, always entered the scene the same way, and annoyed you with its laugh after every successful hit, it wouldn’t make for a unique experience; the AI would always do the same thing. By using randomness and probability, you can instead make the AI laugh based on probability or introduce randomness into its choice of skill. Another great by-product of randomness and probability is that they let you introduce levels of difficulty: you can lower the chance of a skill cast missing or allow the bots to aim more precisely. If you have bots that wander around looking for enemies, their next destination can be chosen at random.
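One way to sketch this, assuming a hypothetical skill list and a difficulty scalar (none of this is a UE4 API), is a weighted random pick where difficulty skews the weights toward the bot’s stronger skills:

```cpp
#include <cmath>
#include <random>
#include <string>
#include <vector>

// Illustrative skill entry: a name and a base selection weight.
struct FSkill { std::string Name; float Weight; };

// Picks a skill at random, proportionally to Weight^Difficulty:
// Difficulty = 1.0 preserves the base odds; higher values increasingly
// favor the heavily weighted (stronger) skills.
std::string PickSkill(const std::vector<FSkill>& Skills,
                      float Difficulty, std::mt19937& Rng)
{
    std::vector<float> Scaled;
    float Total = 0.f;
    for (const FSkill& S : Skills) {
        const float W = std::pow(S.Weight, Difficulty);
        Scaled.push_back(W);
        Total += W;
    }
    std::uniform_real_distribution<float> Roll(0.f, Total);
    float R = Roll(Rng);
    for (std::size_t i = 0; i < Skills.size(); ++i) {
        if (R < Scaled[i]) return Skills[i].Name;
        R -= Scaled[i];  // walk past this skill's slice of the total
    }
    return Skills.back().Name;  // floating-point edge-case fallback
}
```

The same roll-against-weights idea covers the probabilistic laugh: give the laugh a weight and a "stay silent" entry a weight, then pick between them after each hit.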

Creating complex decision making with behavior trees

Finite State Machines (FSMs) allow your bot to transition between states: it can go from wandering to hunting and then to killing. Behavior trees are similar but allow more flexibility. A behavior tree is a hierarchical FSM, which introduces another layer of decisions: the bot decides between branches of behaviors that define the state it is in. UE4 provides a tool called Behavior Tree, whose editor allows you to modify AI behavior quickly and with ease.
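The wandering-to-hunting-to-killing flow can be sketched as a plain FSM with a transition table; the state and event names below are illustrative, not engine types:

```cpp
#include <map>
#include <utility>

enum class EBotState { Wandering, Hunting, Killing };
enum class EBotEvent { EnemySpotted, EnemyInRange, EnemyLost };

// Maps (current state, event) to the next state; unknown pairs keep
// the bot in its current state.
EBotState NextState(EBotState Current, EBotEvent Event)
{
    static const std::map<std::pair<EBotState, EBotEvent>, EBotState> Table = {
        {{EBotState::Wandering, EBotEvent::EnemySpotted}, EBotState::Hunting},
        {{EBotState::Hunting,   EBotEvent::EnemyInRange}, EBotState::Killing},
        {{EBotState::Hunting,   EBotEvent::EnemyLost},    EBotState::Wandering},
        {{EBotState::Killing,   EBotEvent::EnemyLost},    EBotState::Wandering},
    };
    const auto It = Table.find({Current, Event});
    return It != Table.end() ? It->second : Current;
}
```

A behavior tree replaces this flat table with a hierarchy: each branch groups the decisions that belong to one state, which is what makes larger AIs manageable.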

The following sections show the components found within UE4’s Behavior Tree.


Root

This node is the starting node: it sends the signal to the next node in the tree, connecting to the composite that begins your first tree. You may notice that you are required to use a composite first to define a tree and then create the tasks for that tree. This is because a hierarchical FSM creates branches of states, and each state is populated with other states or tasks. This allows easy transitions between multiple states.


Decorators

This node attaches to another node as a “decoration”, modifying how that node runs. This could be, for example, a Force Success decorator on a child of a Sequence composite, or a Loop decorator that repeats a node’s actions a number of times. I used a decorator in the AI we will make that tells it to update to the next available route. Consider the following screenshot:

In the preceding screenshot, you see the Attack & Destroy decorator at the top of the composite, which defines the state. This state includes two tasks, Attack Enemy and Move To Enemy, the latter of which also has a decorator telling it to execute only when the bot state is searching.


Composites

These are the starting points of the states. They define how the state behaves, with different return values and execution flows.

In our example there is a Selector, which executes each of its children from left to right and returns success as soon as one of its children succeeds, failing only if every child fails. This makes it good for a state that doesn’t require all of its nodes to execute successfully.

The Sequence executes its children in a similar fashion to the Selector, but returns failure as soon as one of its children fails. This means every node must return success for the sequence to complete.

Last but not least is Simple Parallel. This allows you to execute a task and a tree at essentially the same time, which is great for creating a state that requires another task to always be running. To set it up, you first connect the main task it will execute. The second task or subtree connected to it continues to be called alongside the main task until the main task returns a success message.
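The Selector and Sequence return rules reduce to a few lines of plain C++ if you model a node as a callable that reports success or failure (a simplification of UE4’s actual node classes, for illustration only):

```cpp
#include <functional>
#include <vector>

// A node is just a callable: true = success, false = failure.
using FNode = std::function<bool()>;

// Selector: succeed as soon as any child succeeds; fail if all fail.
bool RunSelector(const std::vector<FNode>& Children)
{
    for (const FNode& Child : Children)
        if (Child()) return true;
    return false;
}

// Sequence: run children in order; fail as soon as any child fails.
bool RunSequence(const std::vector<FNode>& Children)
{
    for (const FNode& Child : Children)
        if (!Child()) return false;
    return true;
}
```

Real behavior tree nodes also return a "running" status so a task can span multiple frames; the two-value version above keeps the control flow visible.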


Services

Services run as long as the composite they are added to stays active. They tick on the interval you set in their properties, and a second float property lets you add deviation to that interval. Services are most often used to modify the state of the AI, because they are called continuously. For example, in the bot that we will create, we add a service to the first branch of the tree so that it’s called without interruption, allowing it to maintain the state the bot should be in at any given moment.

This service, called Detect Enemy, runs on a deviating cycle that updates Blackboard variables, such as State and EnemyActor:


Tasks

Tasks do the dirty work and report success or failure where necessary. Two nodes are used most often when scripting a task: Event Receive Execute, which receives the signal to run the connected script, and Finish Execute, which sends the signal back, returning true or false for success. This is important when making a task meant for the Sequence composite.


Blackboards

Blackboards store the variables used within the behavior tree of the AI. In our example, we store an enumeration variable State for the bot’s current state, TargetPoint for the currently targeted enemy, and Route for the route position the AI has been requested to travel to, just to name a few.

Blackboards work just by setting a public variable of a node to one of the available Blackboard variables in the drop-down menu. The naming convention shown in the following screenshot makes this process streamlined:
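Conceptually, a Blackboard is just a shared key-value store that every node in the tree can read and write. A minimal sketch, with illustrative names and a single value type for brevity:

```cpp
#include <map>
#include <optional>
#include <string>

// A Blackboard reduced to its essence: tasks, services, and decorators
// all read and write shared keyed values.
template <typename T>
class TSimpleBlackboard
{
public:
    void SetValue(const std::string& Key, T Value) { Values[Key] = Value; }

    // Returns the stored value, or an empty optional for an unset key.
    std::optional<T> GetValue(const std::string& Key) const
    {
        if (const auto It = Values.find(Key); It != Values.end())
            return It->second;
        return std::nullopt;
    }

private:
    std::map<std::string, T> Values;
};
```

UE4’s Blackboard supports several key types (objects, enums, vectors, and so on); the single template parameter here stands in for that variety.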

Sensory system

Creating a sensory system depends heavily on the environment in which the AI will be fighting the player. The AI will need to find cover, evade the enemy, get ammo, and do whatever else you feel will create an immersive opponent for your game. Games with AI that challenges the player create a unique, individual experience, and a good sensory system contributes the critical information that makes AI reactive. In this project, we use the sensory system to detect pawns that the AI can see. We also use functions to check the enemy’s line of sight, check whether another pawn is in our path, and check for cover and other resources within the area.
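A sight check is a typical sensory building block. The sketch below tests distance and field of view only; the names and parameters are illustrative, and a real check would also line trace for occluders such as walls:

```cpp
#include <cmath>

struct FVector2D { float X = 0.f, Y = 0.f; };

// Hypothetical "can the bot see the target?" check: the target must be
// within SightRange and inside the bot's field-of-view cone.
// BotFacing is assumed to be a unit vector.
bool CanSee(FVector2D BotPos, FVector2D BotFacing,
            FVector2D TargetPos, float SightRange, float FovDegrees)
{
    const float DX = TargetPos.X - BotPos.X;
    const float DY = TargetPos.Y - BotPos.Y;
    const float Dist = std::sqrt(DX * DX + DY * DY);
    if (Dist > SightRange) return false;
    if (Dist <= 0.f) return true;  // standing on the target

    // Cosine of the angle between facing and direction-to-target,
    // via the dot product; compare against the half-FOV cosine.
    const float Dot = (BotFacing.X * DX + BotFacing.Y * DY) / Dist;
    const float HalfFovRad = FovDegrees * 0.5f * 3.14159265f / 180.f;
    return Dot >= std::cos(HalfFovRad);
}
```

Hearing can reuse the same shape: replace the cone test with a sound-level threshold that falls off with distance.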

Machine learning

Machine learning is a branch of its own. This technique allows AI to learn from situations and simulations, with inputs taken from the environment, including the context that allows the bot to take decisive actions. In machine learning, the inputs are fed into a classifier, which can predict a set of outputs with a certain level of certainty. Classifiers can be combined into ensembles to increase the accuracy of the probabilistic prediction. We don’t dig deeply into this subject, but I will provide some material for those interested.
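To illustrate the classifier-and-ensemble idea without any real learning, here is a toy majority-vote ensemble of one-feature threshold classifiers (entirely hypothetical; in practice the thresholds would be learned from data rather than hand-set):

```cpp
#include <vector>

// A trivial "classifier": fires when one input feature exceeds a threshold.
struct FThresholdClassifier
{
    int FeatureIndex;
    float Threshold;

    bool Predict(const std::vector<float>& Features) const
    {
        return Features[FeatureIndex] > Threshold;
    }
};

// The ensemble predicts true when a strict majority of its members do,
// which tends to be more accurate than any single member.
bool EnsemblePredict(const std::vector<FThresholdClassifier>& Ensemble,
                     const std::vector<float>& Features)
{
    int Votes = 0;
    for (const FThresholdClassifier& C : Ensemble)
        if (C.Predict(Features)) ++Votes;
    return Votes * 2 > static_cast<int>(Ensemble.size());
}
```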


Tracing

Tracing allows one actor within the world to detect objects by casting rays. A single line trace is sent out, and if it collides with an actor, that actor is returned along with information about the impact. Tracing is used for many reasons; one common use in FPS games is detecting hits. Are you familiar with the hit box? When your player shoots in a game, a trace is fired, and if it collides with the opponent’s hit box, it determines the damage to your opponent and, if you’re skillful enough, results in their death. Other shapes are available for traces, such as spheres, capsules, and boxes, which allow tracing for different situations. Recently, I used a box trace for my car in order to detect objects near it.
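The core of a line trace is a ray-versus-volume intersection test. Below is a minimal 2D ray-versus-axis-aligned-box "slab" test with illustrative names; engine traces do the same thing in 3D against physics geometry and also return impact details:

```cpp
#include <algorithm>
#include <optional>

struct FRay2D { float OX, OY, DX, DY; };          // origin + direction
struct FBox2D { float MinX, MinY, MaxX, MaxY; };  // axis-aligned box

// Returns the entry distance along the ray on a hit, or empty on a miss.
std::optional<float> LineTrace(const FRay2D& Ray, const FBox2D& Box)
{
    float TMin = 0.f, TMax = 1e30f;
    const float Origin[2] = {Ray.OX, Ray.OY};
    const float Dir[2]    = {Ray.DX, Ray.DY};
    const float Lo[2]     = {Box.MinX, Box.MinY};
    const float Hi[2]     = {Box.MaxX, Box.MaxY};
    for (int Axis = 0; Axis < 2; ++Axis) {
        if (Dir[Axis] == 0.f) {
            // Ray parallel to this axis: must already lie inside the slab.
            if (Origin[Axis] < Lo[Axis] || Origin[Axis] > Hi[Axis])
                return std::nullopt;
        } else {
            float T1 = (Lo[Axis] - Origin[Axis]) / Dir[Axis];
            float T2 = (Hi[Axis] - Origin[Axis]) / Dir[Axis];
            if (T1 > T2) std::swap(T1, T2);
            TMin = std::max(TMin, T1);
            TMax = std::min(TMax, T2);
            if (TMin > TMax) return std::nullopt;  // slabs don't overlap
        }
    }
    return TMin;  // distance to the impact point along the ray
}
```

Sphere, capsule, and box traces swap the ray for a swept volume, but the hit-or-miss-plus-impact-distance result is the same shape of answer.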

Influence mapping

Influence mapping isn’t one fixed technique; it’s the idea that specific locations on the map contribute information that directly influences the player or the AI. One example of influence mapping with AI is presence falloff. Say we have a group of enemy AI. Their presence map creates a radial circle around the group, with an intensity based on the group’s size. This way, other AI elements know that on entering this area, they’re entering a zone occupied by enemy AI.

Practical information isn’t the only thing people use this for, so just understand that it’s meant to provide another level of input to help your bot make additional decisions.
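The presence-falloff example can be sketched as a grid that each enemy group stamps with a radial, linearly decaying influence; the structure and falloff choice here are illustrative assumptions:

```cpp
#include <cmath>
#include <vector>

// Hypothetical enemy group: position, size, and presence radius.
struct FGroup { float X, Y; int Size; float Radius; };

// Each group adds Size at its center, falling off linearly to zero
// at Radius; overlapping groups sum their influence per cell.
std::vector<std::vector<float>> BuildInfluenceMap(
    int Width, int Height, const std::vector<FGroup>& Groups)
{
    std::vector<std::vector<float>> Map(
        Height, std::vector<float>(Width, 0.f));
    for (const FGroup& G : Groups)
        for (int Y = 0; Y < Height; ++Y)
            for (int X = 0; X < Width; ++X) {
                const float Dist = std::sqrt((X - G.X) * (X - G.X) +
                                             (Y - G.Y) * (Y - G.Y));
                if (Dist < G.Radius)
                    Map[Y][X] += G.Size * (1.f - Dist / G.Radius);
            }
    return Map;
}
```

A wandering bot could then read this map when picking its next destination, preferring low-influence cells to stay out of occupied zones.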


Summary

In this article, we saw the fundamental techniques and practices of game AI. We saw how to implement navigation, achieve realistic movement of AI elements, and create characters with randomness in order to achieve a sense of realism.

We also looked at behavior trees and all their constituent elements. Further, we touched upon some aspects related to AI, such as machine learning and tracing.
