Procedural Weapon Animations
Responsive first person animations are important when developing a first person shooter. Visual feedback greatly impacts gunplay, the experience of using a weapon in the game. To make animations feel more lively and responsive, we want them to react to the game world and our player input, while avoiding looking too stiff or repetitive.
This system uses some baked animations for complex motions related to weapon handling (reload, equip, etc.).
Why procedural animations
The main purpose of this system was to better my understanding of gunplay while challenging myself to produce responsive animations through code and math. Procedural animations are also great for adding detail and secondary motion.
In a previous game I worked on I used procedural animation to reduce the authoring time and animations required for our character controller.
Benefits of procedural animation
Quick player response
Can respond dynamically to the game world.
Procedural animations can be used to replace or reduce the reliance on baked animations, greatly reducing the animation workload and authoring time.
Baked animations are a much quicker alternative when dealing with more complex or detailed movements. Combining these with procedural animation layers can make your weapon animations really pop.
When I use baked animations
Base weapon poses
Weapon handling (Reloads, Equip animations, Idle breaks etc.)
Hand grip poses
Pointers on creating a first person character setup.
Pre-requisites
A good character rig and setup will give you a much cleaner looking result, while helping you avoid a lot of work dealing with edge cases and correcting visual errors by modifying the animation system itself.
A good rig makes for easier work and better looking animations.
Note
This project is based on Unreal Engine 5. Unreal uses animation controllers called Animation Blueprints (ABPs), so some technical aspects may not translate 1:1 to older versions of the engine. The general concepts should still apply.
Since the release of UE5, ABPs have seen significant performance gains thanks to Blueprint Thread Safe Update Animation. Thread safe animation updates allow us to run many of the computationally heavy parts of the animation system on worker threads instead of the Game Thread. This lets us do more logical operations inside the Animation Blueprint with a greatly reduced performance footprint. Perfect for calculating procedural animations!
Driving motion through code
First person weapon animations are a great introduction to procedural animation, as many of the compound movements that make up both locomotion and weapon actions can be broken down into several smaller motions.
Using some basic trigonometry and activation functions we can start constructing movement cycles from the ground up.
Using input parameters from our game world and player controller, we can build animations that feel responsive and reactive to the game world.
Animating the weapon
Animations are generally made by calculating and applying translation offsets, rotations and scaling to the bones of our skeletal mesh at runtime.
Item Bone
When animating weapon movement, what we really want to do is animate the bone our weapon is attached to. The item bone acts like a socket that we can animate and apply rotations and offsets to at runtime.
We then make use of IK to keep our hands aligned with the weapon model.
Since we're dealing with First Person Animations the item bone should be part of your First Person viewmodel (arms).
Multiple items or Dual Wielding
If you deal with multiple items equipped at the same time, like dual wielding or offhand equipment, a double item bone setup can come in useful. This is not covered in this article (yet), but the same principles still apply.
Animating without an item bone
If you have no other choice than attaching the weapon to, let's say, the hand of your character, these concepts still apply.
This comes with the drawback of the item being glued to the hand of the character, making item interactions a lot more difficult to deal with, as the weapon follows the hand instead of the other way around.
I strongly recommend using a separate item bone.
I have an article on FPS Character Setup with pointers on how to establish a good base character for First Person Animations. This is the setup I've used when developing this animation system.
TIP: The Unreal default mannequin comes with a good bone setup per default, with a bone named "ik_hand_gun" serving as the item bone, and child IK targets for both the left and right hand already set up.
Base pose
The first step to creating an animation is choosing a default pose for your character.
For a first person character this usually means finding a pose that frames your weapon without obstructing other gameplay elements.
For optimal performance, having the left and right hand IK targets properly set up and aligned to the hand poses will make for much cleaner IK solves later on.
The first person pose and world person pose are not the same!
If we compare the first person rifle pose to the full body rifle pose on the full character mesh, we can clearly see that both hands of the character on the left are offset unnaturally, while the head and shoulders stay aligned. This better frames the weapon and hands through the first person camera perspective, while also avoiding seams or the inside of the mesh being visible to the player.
First Person pose (left): This is how the character is posed when viewed through the first person camera.
World person pose (right): This is what other players see when looking at your character.
Note the left and right hand IK bones and how they align with the locations of the character's hands in the base pose.
This is before applying any IK. Having the animation's base pose match up with the IK targets makes IK solves a lot smoother.
Tip: poses as variables
There are many different ways to resolve poses. Instead of using hard-set animations or being overly reliant on state machines, consider passing base poses as variables from the weapon class itself. This is one of Unreal's strengths and can make your system more modular.
The same goes for any other baked animation or pose you want to change based on the current weapon. Like different reloads, equips or even melee attacks.
Creating Basic Motion
Before we start animating, these are some of the core concepts used extensively throughout this animation system. They are the base of our procedural animation system.
Timers
This is the foundation and probably the most important part of this system. The most basic timer tracks elapsed game time. Learning to use timers and control the flow of time is instrumental in controlling the speed of animations.
Why not just use system time
Most engines have some kind of built-in time function that tracks time elapsed since the application started. The reason we create our own timers is to control the flow of time: by multiplying our input time we can change the speed at which a timer increments or decrements, giving us full control of when and how timers advance.
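As a minimal sketch in plain C++ (not Unreal's API), such a controllable timer only needs to accumulate scaled delta time:

```cpp
// A minimal controllable timer: instead of reading global time directly,
// we accumulate scaled delta time so animation speed can change on the fly.
struct AnimTimer {
    float Time = 0.0f;      // accumulated, scaled time
    float TimeScale = 1.0f; // 1 = real time, 2 = double speed, 0 = paused

    void Tick(float DeltaSeconds) {
        Time += DeltaSeconds * TimeScale;
    }
};
```

Setting TimeScale to zero pauses the timer entirely, which is how we later stop a motion layer without touching the rest of the system.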
Sine waves
Sine and cosine allow us to convert linearly increasing time into oscillating motion. This is used to create the fundamental motion patterns that move the weapon around the screen. Changing the timescale will not only affect the animation speed, it can also be used to generate completely new patterns.
These trigonometry functions can be used for other things than animation. They are extremely useful for any kind of procedural content generation.
Sine
Sin(x) Sin(y)
Sine waves are the fundamental part of creating oscillating motion. You will encounter these in all sorts of procedural workflows.
This is a cornerstone when working with procedural repetitive motion.
Cosine
Cos(x) Sin(y)
Combining Cosine and Sine gives us a rotating vector. This is very useful anytime we want to convert floating point numbers to directional vectors.
Double Time
Sin(x) Sin(y_doubleTime)
Adjusting the timescale creates interesting new patterns. Here, the Y component increments at twice the rate of X: "double timed".
This will be a recurring pattern throughout this article, as it is the base for creating the different motion patterns of the walk cycle.
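The three patterns above can be written out as tiny functions. This is a plain C++ sketch, not engine code:

```cpp
#include <cmath>

// (cos t, sin t) traces a unit circle: a single float "t" becomes a
// direction vector, useful whenever we need a rotating 2D direction.
void CirclePoint(float t, float& outX, float& outY) {
    outX = std::cos(t);
    outY = std::sin(t);
}

// "Double time": driving Y at twice the rate of X yields a figure-eight
// (Lissajous) pattern instead of a circle.
void DoubleTimePoint(float t, float& outX, float& outY) {
    outX = std::sin(t);
    outY = std::sin(2.0f * t);
}
```

Feeding a timer's value into these each frame produces the looping motion paths shown in the plots.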
Defining the step cycle
Step cycle
The step cycle is crucial: it defines the time it takes for a character to complete one full step.
The step cycle is foundational, creating the repetitive motion that drives many parameters of the procedural pipeline.
To model the swimming ∞ motion often seen in first person walk cycles, I use two timers, one incrementing at twice the speed of the other.
I refer to this as "double timed", as seen in the example above. Step Time and Step Double Time are both used extensively when building the walk cycle, as these variables control the location and rotation offset of the weapon on screen.
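A rough sketch of this setup in plain C++ (not Unreal's API; the amplitude values and axis mapping are invented for illustration):

```cpp
#include <cmath>

// Two step timers, one running at twice the rate of the other, mapped to
// a lateral (Y) and vertical (Z) weapon offset.
struct StepCycle {
    float StepTime = 0.0f;
    float StepDoubleTime = 0.0f;

    void Tick(float DeltaSeconds, float StepRate) {
        StepTime       += DeltaSeconds * StepRate;
        StepDoubleTime += DeltaSeconds * StepRate * 2.0f;
    }

    // Lateral sway follows the step while the vertical bob completes two
    // cycles per step: together they trace the "swimming" figure-eight.
    void GetOffset(float Amplitude, float& OutY, float& OutZ) const {
        OutY = std::sin(StepTime) * Amplitude;
        OutZ = std::sin(StepDoubleTime) * Amplitude * 0.5f;
    }
};
```

In practice the step rate would come from the movement state or baked curves, as described below.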
Baked Curves instead of Timers
The timer controlling the step cycle is sufficient to generate the visual pattern of the walk cycle.
When working with a visible lower body, you want the step cycle to synchronize with it. This means mapping the step cycle to the step frequency of your lower body animations.
One way of doing this is setting the time increments of the step cycle based on animation curves baked into the lower body locomotion animations.
Stepping at fixed rates
Sometimes you come across games that scale movement animations entirely based on movement speed. This means that when we walk slower or faster than the intended animation speed, instead of changing the step length, the animations start playing in slow motion or at super high speed.
While this can hold up in some instances, we should also consider what it models: the character adjusting their pace instead of their stride.
To better model a change of stride we can set fixed step rates based on our characters movement state.
A stepped cycle changes the frequency of the movement at set intervals instead of scaling it continuously. Consider having multiple step frequencies; I recommend at least one for each movement state:
Walking
Jogging
Sprinting
Crouching
You could also create a stepping function based on your movement speed instead of your movement states!
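A per-state table of step rates can be as simple as a switch. The rates here are purely illustrative tuning values, not taken from the article's project:

```cpp
// Fixed step rates per movement state, instead of continuously scaling
// animation speed with velocity. Values are invented for illustration.
enum class EMovementState { Crouching, Walking, Jogging, Sprinting };

float GetStepRate(EMovementState State) {
    switch (State) {
        case EMovementState::Crouching: return 1.4f; // slow, careful steps
        case EMovementState::Walking:   return 2.0f;
        case EMovementState::Jogging:   return 3.2f;
        case EMovementState::Sprinting: return 4.5f; // fast, long strides
    }
    return 2.0f; // fallback
}
```

The returned rate feeds directly into the step cycle timers, so switching state changes stride frequency at a discrete step rather than smearing the animation speed.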
Sway and Bobbing
Translation and Rotation
The bulk of the procedural workflow can be reduced to two main components:
Translation and rotation offsets, where play rate and animation scale are determined by parameters within the game.
Player input, character velocity, movement state and other actions feed into the model to alter these offsets, among other things.
How these parameters are processed should be based on your weapons, as they all have different weights, centers of mass and torque points.
Translational Movement
X, Y & Z location offsets. The offset, bobbing speed and height are based on player velocity and direction.
Additional animation layering is done by processing the player input.
Here the rifle is pushed into the shoulder during both forward and backward movement.
Rotational Movement
Pitch, Yaw & Roll rotation offsets. Sway and roll are based on planar velocity, as well as specific directional velocity to accentuate lateral movement.
Combined
While these values are exaggerated, they make up the bulk of the base movement animations.
Both make use of Step Time and Double Time to drive the base movement patterns.
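Putting the two components together, a frame's worth of offsets might be computed like this. This is a hedged sketch: the scale factors and the exact axis choices are made up for illustration, not the article's tuned values:

```cpp
#include <cmath>

// The two core outputs: a translation offset and a rotation offset, both
// driven by the step timers and planar velocity.
struct WeaponOffsets {
    float LocX, LocY, LocZ; // translation: push into shoulder, sway, bob
    float Pitch, Yaw, Roll; // rotation
};

WeaponOffsets ComputeOffsets(float StepTime, float StepDoubleTime,
                             float PlanarSpeed, float LateralVelocity) {
    const float Bob = PlanarSpeed * 0.01f; // bob amplitude scales with speed
    WeaponOffsets Out{};
    Out.LocX  = -std::fabs(std::sin(StepTime)) * Bob * 0.5f; // into the shoulder
    Out.LocY  =  std::sin(StepTime) * Bob;                   // lateral sway
    Out.LocZ  =  std::sin(StepDoubleTime) * Bob * 0.5f;      // vertical bob
    Out.Roll  =  LateralVelocity * 0.02f;  // lean into lateral movement
    Out.Yaw   =  std::sin(StepTime) * Bob * 0.3f;
    Out.Pitch =  std::sin(StepDoubleTime) * Bob * 0.2f;
    return Out;
}
```

Because everything scales with velocity, the offsets naturally fade to zero as the character stops moving.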
System Input
Look Input
When looking around, having the weapon follow the control input feels more responsive. This is just a basic setup where the look axis controls the weapon rotation offset.
Making the weapon rotate faster than the player makes the weapon look light and our character look more "in control".
If we reduce the rotation speed and introduce some lag, we get a weapon that looks heavy and harder to control.
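Both feels fall out of a single interpolation speed. This sketch (plain C++, with an invented maximum sway value) shows one yaw channel lagging toward a target driven by the look axis:

```cpp
// Look-input sway for one rotation channel. A fast interp speed makes the
// weapon track the camera tightly (light, "in control"); a slow one makes
// it lag behind (heavy, harder to control).
struct LookSway {
    float Offset = 0.0f; // current yaw offset, degrees

    void Tick(float LookAxis, float InterpSpeed, float DeltaSeconds) {
        const float Target = LookAxis * 5.0f; // max sway angle, invented value
        float Alpha = InterpSpeed * DeltaSeconds;
        if (Alpha > 1.0f) Alpha = 1.0f;       // never overshoot the target
        Offset += (Target - Offset) * Alpha;  // exponential approach
    }
};
```

Tuning InterpSpeed per weapon is what lets the same code express both a nimble SMG and a ponderous machine gun.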
Leading with the barrel not your head
This is a pet peeve of mine.
It's common to see games implement only one of these models, resulting in light weapons feeling too heavy, or heavy weapons feeling too light.
Doing so can create a disconnect between the player character and the game world.
You expect a special forces operative to lead with the barrel as they enter a room or round a corner, aiming ahead of where they intend to go, and ready to fire at any moment. That would mean giving them a faster weapon rotation speed that looks slightly ahead of the character rotation.
The opposite is true when you intentionally want to depict your character as lacking control, either from a lack of experience or as a way to convey weight. For example: carrying a heavy machine gun, shouldering a rocket launcher, or maybe just shooting a front-heavy revolver.
Shooting
Recoil offset
These animations are driven by the vertical and horizontal values generated by the dynamic recoil system. By and large, this represents the forces absorbed by the character's body.
These values are applied to the weapon offset to give immediate feedback on the weapon's recoil and spray pattern, also helping to better frame the bullets in ADS.
Recoil Impulses
Another layer is driven by recoil impulses. These use spring interpolation to simulate the force counteracted by the character's hand and arm muscles.
This is more prominent in less stable firearms like pistols and other weapons that tend to generate more twist and torque in the wrist.
The impulse system applies pitch, yaw and roll.
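One channel of such a spring can be sketched as a damped oscillator. This is a generic spring integrator, not Unreal's built-in spring nodes; stiffness and damping values would be tuned per weapon:

```cpp
// A damped spring for one recoil impulse channel. An impulse kicks the
// velocity; stiffness pulls the value back toward zero while damping
// bleeds energy off, like muscles counteracting the kick.
struct SpringChannel {
    float Value = 0.0f;    // current offset (e.g. degrees of pitch)
    float Velocity = 0.0f;

    void AddImpulse(float Impulse) { Velocity += Impulse; }

    // Semi-implicit Euler integration: stable for reasonable step sizes.
    void Tick(float DeltaSeconds, float Stiffness, float Damping) {
        const float Accel = -Stiffness * Value - Damping * Velocity;
        Velocity += Accel * DeltaSeconds;
        Value    += Velocity * DeltaSeconds;
    }
};
```

Each shot calls AddImpulse, and the spring produces the kick-and-settle motion on its own; twitchy weapons get low damping, controlled ones get high damping.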
Counter push
After shooting multiple rounds in a row, I want the body to start adapting to the recoil. This can be done by scaling weapon recoil or the strength of impulses based on the number of bullets in a sequence. This allows greater force to be applied to single shots and bursts, giving the appearance of greater control during prolonged fire.
This is something I first read about being used in Battlefield V.
With Counter Push
The rifle returns back into frame after a few rounds have been fired.
Without Counter Push
Using the same recoil values, the weapon continues to push into the character. Without counter push, the values would have to be tuned down a lot, making the whole system look less dynamic.
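The counter-push idea reduces to a scale factor that shrinks with consecutive shots. The falloff and floor values here are hypothetical; the point is the shape of the curve, not the numbers:

```cpp
// Scale applied to weapon recoil based on consecutive shots fired.
// Early shots hit at full strength; sustained fire is damped as the
// body "adapts", but never below a floor so recoil stays visible.
float CounterPushScale(int ShotsInSequence) {
    const float MinScale = 0.25f; // floor so recoil never disappears
    const float Falloff  = 0.15f; // scale lost per consecutive shot
    const float Scale = 1.0f - Falloff * static_cast<float>(ShotsInSequence);
    return Scale < MinScale ? MinScale : Scale;
}
```

Multiplying each shot's impulse by this value lets single shots and bursts stay punchy while long sprays settle back into frame.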
Detailing
Breathing
It's important to add something that breaks up the character idle. Breathing alone isn't going to stand out as anything special; this is only the first of many detailing layers.
However, this is probably a good time to talk about activation functions.
These may be familiar if you're interested in machine learning.
Activation functions
Activation functions enable us to activate or deactivate specific parts of an animation tree or playback process based on parameters, through the use of basic multiplication. It's a mathematical way to make certain parts of an equation relevant based on an activation state. An activation state can be anything from movement direction or speed to how many bullets have been fired in a sequence; it can be any value that you can convert to the 0-1 range.
For example: we don't want this breath cycle to play while we are moving, as it would be additive with our step sequence. To stop it, we can define a speed threshold above which we don't want the breath cycle to play. With some math we can then fade out our breathing offset as we start gaining speed.
Activation function based on movement speed threshold
We take our current planar velocity (the length of our velocity vector on the XY plane), divide it by our speed threshold (the speed at which we want the breathing to be completely removed), clamp the value (to keep it in the 0-1 range), and invert the activation value.
We now have a smoothed 0-1 float which we can multiply by our breathing offset or timer to act as a switch or "activation".
Velocity/Threshold
As the velocity approaches the threshold, our value approaches 1.0. Once we surpass the threshold, the value keeps increasing. Basic division tells us how much of the Threshold (divisor) whole our Velocity (dividend) makes up.
Clamp(0, 1, Velocity/SpeedThreshold)
We clamp the result to the 0-1 range to make sure values above the threshold are discarded, while preparing it for the next step: inverting the value.
One minus (1-x)
You might recognize this from the material editor; it is a very common tool when working with masks. The one minus node inverts any value in the 0-1 range. When working with 2D masks this inverts all black and white elements of an image. In this case it inverts our activation value.
The activation now looks something like this:
Activation Value = 1-Clamp(0, 1, Velocity/SpeedThreshold)
If we multiply our breath offset by our activation value, the closer we get to our threshold, the less it will be applied. And since we're using a float value, our acceleration determines the blending time of the breath animation.
Vertical Offset(vector3) += BreathOffset(Vector3) * ActivationValue(float)
Our breathing offset will now be blended into our animation when we stay below the velocity threshold.
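The whole activation function fits in one line of plain C++ (a sketch of the Blueprint math, not engine code):

```cpp
#include <algorithm>

// Activation Value = 1 - Clamp(0, 1, Velocity / SpeedThreshold).
// Multiply the breathing offset by this value to fade it out as the
// character speeds up.
float BreathActivation(float PlanarSpeed, float SpeedThreshold) {
    const float Ratio = PlanarSpeed / SpeedThreshold;
    return 1.0f - std::clamp(Ratio, 0.0f, 1.0f);
}
```

At a standstill the function returns 1.0 (full breathing), at half the threshold 0.5, and at or above the threshold 0.0.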
Micro Adjustments or Jitter
Micro adjustments, or jitter, are what I consider the VFX equivalent of dust particles in FPS animation. You often don't think about them, but without them your animation will look flat and uninteresting.
These motions represent the tremors and micro adjustments in our muscles.
This is a combination of several frequencies that together construct a fluid, random looking movement noise. Doing this procedurally allows us to change these parameters on the fly. We could scale the noise based on stamina to give the impression of fatigue, or turn it up to 11 to make it look like the player is under the influence. These details help the animation look more lively and natural.
Subtle Micro Adjustments
Gives the illusion of muscles working to correct and stabilize the weapon even when stationary.
Exaggerated Values
The same base noise with scaled up offset values. It looks like we drank every bottle on the practice range.
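Layering a few sines at unrelated frequencies is one simple way to get this kind of noise. The frequencies, phases and weights below are arbitrary placeholders; real noise functions (e.g. Perlin) work just as well:

```cpp
#include <cmath>

// Fluid, random-looking jitter from several stacked sine frequencies.
// Scale could be driven by stamina or other gameplay state at runtime.
float JitterNoise(float Time, float Scale) {
    const float N = std::sin(Time * 1.3f) * 0.50f          // slow drift
                  + std::sin(Time * 3.7f + 1.0f) * 0.30f   // mid tremor
                  + std::sin(Time * 9.1f + 2.0f) * 0.20f;  // fine shake
    return N * Scale;
}
```

Because the weights sum to one, the output stays within ±Scale, making the exaggeration knob predictable.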
Hand Adjustments
Hand adjustments are used to break up a static base pose. In this case, hand adjustment animations have a set chance to trigger when a character stops shooting, after jumping, or when completing a reload.
This is something I refer to as an action break, not to be confused with an idle break.
Action Breaks and Idle Breaks
Action breaks and idle breaks are both animations utilized to disrupt an otherwise static or repetitive base pose.
I think they deserve a distinction to get a better sense of where we can utilize and expand animations. Idle breaks exist to fill the space between player actions, while action breaks serve to add more flair and detail to a player's actions.
In my case, by ensuring all action breaks are additive, they can be reused across multiple weapon types as long as the hand movements remain relatively small.
Locomotion Actions
Start & Stop Impulses
One off impulses
Small spring impulses are added when starting to move or changing direction. These are one-offs used to emphasize when the player changes movement direction.
Exaggerated Values
Adjusting the strength of these impulses can give a very stylized and fluid look. I think this is similar to how The Finals does its animations.
Jumping & Landing
Pitch, yaw and vertical offset are affected by the player's vertical velocity.
On landing, impulses based on our vertical velocity are fed through the previously established recoil and movement impulse systems.
If you want even more weight added, consider making landing impulses that affect the vertical offset.
Don't forget the camera
The camera system also adds a slight recoil impulse upon landing. Together, these systems really add weight when controlling our character.
When taking fall damage, consider adding camera roll to give the impression of the player's legs not fully bracing the fall. The FPS setup is parented under the camera, meaning it follows any motion we add to the camera.
Unreal has a very easy to use camera shake system that works great with small animations like this.
Crouching
Usually this would be done by blending in a special crouch pose, possibly with its own 2, 4 or 8 way directional locomotion animations. The procedural approach is to just change some of the base offset values, blended with a crouch alpha value.
This example is just a slight roll and vertical offset. This can easily be customized on a per weapon basis.
Here is an example where the jump and crouch together almost give the impression of vaulting over the wall.
Sprinting
The sprinting animation can be modeled using the same principles. Sprinting is a linear action that can easily be managed as a separate pose state.
On each footstep a camera impulse is added and scaled based on the current sprint Alpha. The animations below are all driven by the exact same parameters and scaling as the base locomotion. Altering the parameters based on sprinting could yield even more interesting results, but I think this looks fine.
Two Handed Sprints
This is nothing more than an alpha blend between two poses.
The rest is driven by the same parameters as the base locomotion.
One Handed Sprints
These animations only occupy the right hand, leaving the offhand free for its own animations.
Tactical Sprint
Reusing the pistol sprint pose with other pistol-grip weapons creates a Warzone-like tactical sprint animation for free.
This could be incorporated as a mechanic that allows off hand equipment usage when sprinting.
To be added: Sliding
Weapon Actions
Weapon actions, or weapon handling, serve multiple purposes. From an animation perspective, they showcase the mechanics of a firearm, which is especially important when portraying fictional weapons.
From a gameplay perspective, reload and equip times are a means of balancing. The reload and equip time should be defined in code, separate from the animations, even if animation playback can be re-scaled. A weapon's form and function should always be considered.
Baked vs Procedural animations for weapon handling
Weapon handling often requires complex hand and arm movements. While these can be made with procedural animation, creating baked animations is significantly faster. All the base animations below are authored in third party animation software; procedural detailing is then added to break up and add variety to the limited animation set.
Reloading
Reload animations often require a lot of hand, arm and finger movement to look interesting. This animation is made in Blender, with additional procedural detailing added during playback.
A simple procedural reload could be made by moving the hand between pre-defined grip points on the handle and magazine. This can be useful when dealing with different sized magazines or grip points.
Equip
Equip animations often serve a balance purpose.
In some games where graphical fidelity is important, multiple equip animations can be used to distinguish weapons equipped for the first time from those already chambered.
You probably wouldn't pull the charging handle every time you equip your rifle.
Unequip Transition
Unequip animations aren't always necessary. It's fairly common to see games balance weapons by their equip time, often making the unequip instant. If you prioritize visuals, unequip animations are a given.
The big edge case is solving weapon switching before a weapon is fully equipped, i.e. when players are quickly scrolling through their inventory, as we don't want to wait for both the equip and unequip animations.
In this case, the unequip animation only triggers if a weapon has completed its equip time.
Impulse through custom Anim Notify
But wait a second, there is definitely room for procedural animation here as well!
The anim controller already contains functions to add recoil and movement impulses. Through custom anim notifies, we can easily incorporate these into baked animation tracks; we just have to call the same functions we use when adding the other impulses. Randomizing the impulse range can give more variety at a low cost.
No Impulse
The base animation without any impulses added.
Tuned Impulse
Toned down impulses that blend nicely with the rest of the weapon animations.
Exaggerated Impulse
Greatly exaggerated values to give a better view of what is actually happening.
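The randomized impulse a notify might feed into the system boils down to drawing from a tuned range. This is a plain C++ sketch (the range parameters are hypothetical, and in Unreal you would use the engine's random streams rather than std::mt19937):

```cpp
#include <random>

// Draw a randomized impulse strength from a tuned [min, max] range each
// time an anim notify fires, then pass it to the spring impulse system.
float RollImpulse(float MinImpulse, float MaxImpulse, std::mt19937& Rng) {
    std::uniform_real_distribution<float> Dist(MinImpulse, MaxImpulse);
    return Dist(Rng);
}
```

Even a small random range keeps repeated reloads from looking identical, at essentially no authoring cost.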
Aim Down Sight
Centering the aimpoint
When using ADS, what we really want is to use our weapon's physical aimpoint instead of our crosshair. This mainly comes down to keeping the physical aimpoint of our model at the center of the screen.
FOV shifting
I'm using a material that renders weapons at a different Field Of View than the in-game camera. In essence, this makes a weapon look more compact on screen, which is very useful when dealing with ADS framing.
I strongly recommend these FOV material functions: Weapon FOV - By Krystian Komisarek as a starting point.
Read more about how I set up my FPS character here: Setting up a first person character
Choose your model
How you frame your sight depends a whole lot on your project and your bullet trace model. Do you want to frame the recoil, or do you want to center the weapon's aim point?
Framing the recoil
My project is based on a camera recoil model similar to Counter-Strike, featuring a vertical and horizontal offset to bullet traces. When feeding these recoil values as rotations to the animation controller, the scope follows the recoil pattern, keeping the bullets framed.
Static aimpoint
To make sure our aimpoint remains at the center of the screen, we can instead reduce or completely eliminate translational and rotational offsets.
I still prefer keeping some rotational movement to maintain a more fluid look.
Faking it with scopes
Working with scopes is pretty forgiving. A red dot or holographic sight has its aim point seemingly floating in the air, which means we can fake the framing.
Replacing the crosshair with a red dot or holographic aimpoint, and making sure it stays almost centered, works just as well. In an action game this won't be easy to notice, and it might prove more predictable in the long run.
Physical Sights
When weapons have a physical sight, faking the aimpoint becomes a bit harder, as we rely on the weapon's screen position staying centered.
What this really means is using our weapons front sight as the animation pivot point.
Centered Front Sight
When using front sights, or really any static aimpoint, we want it to remain centered.
Here, recoil impulses are added after the ADS rotation adjustments. If the recoil impulses were applied first, the weapon would look like it sinks when shooting.
Misaligned Front Sight
Here is what happens if we don't properly account for weapon rotation in ADS on weapons using front sights.
This could still work if our recoil model features significant movement accuracy penalties.
ADS Anim Graph
Setting up ADS is really as simple as blending two anim states. When our rifle is at the hip, we have a base pose with additive transform and rotation offsets.
When we go into ADS, a separate state overwrites our transform values to center the weapon on screen. This allows us to be more selective about which rotations and offsets we reintroduce in our offset calculations.
Additive
When not using ADS the offsets are additive to retain our base pose.
Replace
In ADS we replace the base pose with hard-set values that center the weapon on screen, while layering in some of the rotation values generated by the model.
*Don't worry about the fast path warning. This comes from adding offsets together in the graph for easy editing; these values can be cached once we are happy with the system.
Simple ADS blend
Node(A): Hipfire Node(B): ADS
This is all you need for an ADS blend.
Continue Reading
Modern FPS games rely on dynamic weapon recoil to make different firearms feel unique and interesting. The recoil system greatly impacts the cadence of any shooter game.
Looking at visual and audio feedback elements
Feedback is half of what makes gunplay feel good