Procedural Weapon Animations

Procedural Animations

Responsive first person animations are key when making a first person shooter. Visual feedback greatly affects the experience of using a gun in the game. To make animations feel more lively and responsive, they should react to the game world and our player input without looking too stiff or repetitive.

Procedural animations are great for repetitive motion like locomotion, and for secondary motion like weapon sway and bobbing.

Baked animations

The main purpose of this system is to produce responsive animations through the use of math and code. It can also serve as a way to reduce the reliance on baked animations. However, base poses and baked animations are still a foundational part of this system. 

Baked animations are a much quicker alternative when dealing with more complex motion. When authoring first person weapon animations, the most complex movements are those involving the hands and weapon handling. Combining baked and procedural animation gets more mileage out of a limited animation set.

Where this system uses baked animations

Pointers on creating a first person character setup.

Pre-requisites

For best results, a good character rig and setup will give you a much cleaner looking result, while helping you avoid a lot of work dealing with edge cases and correcting visual errors by modifying the animation system itself.

A good rig makes for easier work and better looking animations. 

Note

This project is based on Unreal Engine 5. Unreal uses animation controllers called Animation Blueprints (ABPs); some technical aspects may not translate 1:1 to older versions of the engine, but the general concepts should still apply.

Since the release of UE5, ABPs have seen significant performance gains thanks to Blueprint Thread Safe Update Animation. Thread safe animation updates allow us to run many of the computationally heavy parts of the animation system on worker threads instead of the Game Thread. This lets us do more logical operations inside the Animation Blueprint with a greatly reduced performance footprint. Perfect for calculating procedural animations!
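As a rough illustration of where that work can live, a C++ anim instance can override the thread-safe update and expose its results to the AnimGraph. This is only a sketch: UWeaponAnimInstance and the offset variables are placeholder names, not part of any engine API.

#include "CoreMinimal.h"
#include "Animation/AnimInstance.h"
#include "WeaponAnimInstance.generated.h"

UCLASS()
class UWeaponAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

protected:
    // Runs on worker threads; the C++ counterpart of Blueprint Thread Safe Update Animation.
    virtual void NativeThreadSafeUpdateAnimation(float DeltaSeconds) override;

    // Results read by the AnimGraph (e.g. a Transform (Modify) Bone node on the item bone).
    UPROPERTY(BlueprintReadOnly, Category = "Procedural")
    FVector WeaponLocationOffset = FVector::ZeroVector;

    UPROPERTY(BlueprintReadOnly, Category = "Procedural")
    FRotator WeaponRotationOffset = FRotator::ZeroRotator;
};

The later snippets in this article assume they run inside this kind of thread-safe update and write into these two offset variables.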

Driving motion through code

First person weapon animations are a great starting point for procedural animation, as many of the compound movements that make up both locomotion and weapon actions can be broken down into several smaller movements.

By using some basic trigonometry and activation functions we can model these movement cycles from the ground up.

By combining math with the input parameters from our game world and player controller, we can build animations that feel responsive and dynamic.

Animating the weapon

Animating through code means calculating and applying offsets, rotations and scaling to the bones of our skeletal mesh at runtime.

Item Bone

When animating weapon movement, what we really want to do is animate the bone our weapon is attached to. The item bone acts like a socket that we can animate and apply rotations and offsets to at runtime.

We then make use of IK to keep our hands aligned with the weapon model.

Since we're dealing with First Person Animations the item bone should be part of your First Person viewmodel (arms).


Multiple items or Dual Wielding

If you deal with multiple items equipped at the same time, like dual wielding or offhand equipment, a double item bone setup can come in useful. This is not covered in this article (yet), but the same principles still apply.

Animating without an item bone

If you have no other choice than attaching the weapon to, say, the hand of your character, these concepts still apply.

This comes with the drawback of the item being glued to the hand of the character, making item interactions a lot more difficult to deal with, as the weapon follows the hand instead of the other way around.

I strongly recommend using a separate item bone.

I have an article on FPS Character Setup with pointers on how to establish a good base character for First Person Animations. This is the setup I used when developing this animation system.

TIP: The Unreal default mannequin comes with a good bone setup by default, with a bone named "ik_hand_gun" serving as the item bone, with child IK targets for both the left and right hand already set up.
Transform (Modify) Bone - translation and rotation are both additive
The weapon is attached and pivoted at the handle

Base pose

The first step to creating an animation is choosing a default pose for your character.

For a first person character this usually means finding a pose that frames your weapon without obstructing other gameplay elements.

For optimal performance, having the left and right hand IK targets properly set up and aligned to the hand poses will make for much cleaner IK solves later on.


The first person pose and world person pose are not the same!

If we compare the first person rifle pose to the full body rifle pose on the full character mesh, we can clearly see that both the right and left hand of the character on the left are offset unnaturally, while the head and shoulders stay aligned. This is done to better frame the weapon and hands through the first person camera perspective, while also avoiding the seams or inside of the mesh being visible to the player.

First Person pose (left): This is how the character is posed when viewed through the first person camera.

World person pose (right): This is what other players see when looking at your character.

Note the left and right hand IK bones and how they align with the locations of the character's hands in the base pose.

This is before applying any IK. Having the animation's base pose match up with the IK targets makes IK solves a lot smoother.

Tip: poses as variables

There are many different ways to solve poses. Instead of using hard set animations or relying heavily on state machines, consider passing base poses as variables from the weapon class itself. This is one of Unreal's strengths and can make your system more modular.

The same goes for any other baked animation or pose you want to change based on the current weapon, like different reloads, equips or even melee attacks.
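A rough sketch of the idea, assuming a weapon actor class of your own (AWeaponBase, GetCurrentWeapon and the property names are illustrative, not engine API):

// On the weapon actor: each weapon exposes the poses it wants the anim system to use.
UPROPERTY(EditDefaultsOnly, Category = "Animation")
UAnimSequence* BasePose = nullptr;

UPROPERTY(EditDefaultsOnly, Category = "Animation")
UAnimSequence* ReloadAnimation = nullptr;

// In the anim instance's game-thread update, copy the equipped weapon's poses into
// variables that Sequence Evaluator / Sequence Player nodes in the AnimGraph read.
void UWeaponAnimInstance::NativeUpdateAnimation(float DeltaSeconds)
{
    Super::NativeUpdateAnimation(DeltaSeconds);
    if (const AWeaponBase* Weapon = GetCurrentWeapon())   // hypothetical accessor
    {
        CurrentBasePose   = Weapon->BasePose;
        CurrentReloadAnim = Weapon->ReloadAnimation;
    }
}

Swapping a weapon then automatically swaps every pose and baked animation driven by it, without touching the Animation Blueprint.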

Creating Basic Motion

Before we start animating, here are some core concepts that are used extensively throughout this animation system. These form the base of our procedural animations.

Timers

This is the foundation and probably the most important part of this system. The most basic timer tracks elapsed game time. Learning to use timers and to control the flow of time is instrumental in controlling the speed of our animations.

Why not just use system time

Most engines have some kind of built-in time function that tracks time elapsed since the application started. The reason we create our own timers is to control the flow of time. By multiplying our input time we can change the speed at which a timer increments or decrements.
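As a minimal sketch (AnimTime and TimeScale are illustrative members on the anim instance):

// A timer is just a float we advance ourselves each frame.
// Scaling the increment changes the speed of everything driven by it.
void UWeaponAnimInstance::AdvanceTimers(float DeltaSeconds)
{
    AnimTime += DeltaSeconds * TimeScale;   // TimeScale = 1: real time, 0.5: half speed, 0: frozen
}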

Sine waves

Sine and cosine allow us to convert time into oscillating linear movement. This makes up a large portion of the fundamental motions that move the weapon around the screen. Changing the timescale will change our movement patterns.

Having a good understanding of basic trigonometry helps a lot if you intend to work with any kind of procedural generation.

Sine

Sin(x) Sin(y)

Sine waves are the fundamental part of creating oscillating linear motion. You will encounter these in all sorts of procedural workflows.

This is a cornerstone of working with procedural repetitive motion.

Cosine

Cos(x) Sin(y)

Combining Cosine and Sine gives us a rotating vector. This is very useful anytime we want to convert floating point numbers to directional vectors.

Double Time

Sin(x)  Sin(y_doubleTime)

The vertical component is double timed, creating a more interesting motion. This will be a recurring pattern throughout this article, as it is the base for creating the "swimming" motion often seen in walk cycles.
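A small sketch of that pattern, using the Step Time and Step Double Time timers introduced just below (the amplitude variables are tuning values):

// Horizontal sway at the step frequency, vertical bob at double the frequency.
// Together they trace the figure-eight "swimming" path of a walk cycle.
const float Horizontal = FMath::Sin(StepTime) * SwayAmplitude;
const float Vertical   = FMath::Sin(StepDoubleTime) * BobAmplitude;
WeaponLocationOffset = FVector(0.0f, Horizontal, Vertical);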

Defining the step cycle

Step cycle

The step cycle is crucial: it defines the time it takes for a character to complete one full step.

The step cycle drives many parameters of the procedural pipeline.

In the last example above, one timer is incremented at twice the speed of the other ("double timed"). This creates the "swimming" pattern that we often see in walk cycles in other games.

I like to track and store these as two separate variables:

Step Time and Step Double Time

They will be re-used any time we need to base something off of our step cycle.
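A minimal way to keep the two in sync, sketched inside the thread-safe update (StepFrequency is an assumed per-state value, see the stepping function further down):

// Accumulate both cycles separately so StepFrequency can change at runtime
// without the phase of either timer jumping.
StepTime       += DeltaSeconds * StepFrequency;
StepDoubleTime += DeltaSeconds * StepFrequency * 2.0f;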

Baked Curves instead of Timers

While this article is based on the use of timers, animation curves baked into animation assets could be an alternative way to help drive this system.

Stepping at fixed rates

Sometimes you come across games that scale movement animations entirely based on movement speed. This means that when we walk slower or faster than the intended animation speed, instead of the step length changing, the character starts moving in slow motion or at super speed.

This can work fine, though I think it looks sloppy.

A better way would be to have fixed step rates based on your movement state. 

A stepped cycle changes the frequency of the movement at set intervals instead of scaling it continuously. Consider having multiple step frequencies; I recommend having at least one for each movement state.

Or better yet create a stepping function based on movement speed instead of movement states! 

linear animation scaling (not good)
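One possible stepping function, quantizing step frequency by ground speed instead of scaling it continuously. The thresholds and frequencies are made-up values to illustrate the idea:

// Pick a fixed step frequency from the current ground speed.
float UWeaponAnimInstance::GetStepFrequency(float GroundSpeed) const
{
    if (GroundSpeed < 10.0f)  return 0.0f;   // standing still: no step cycle
    if (GroundSpeed < 200.0f) return 6.0f;   // walking
    if (GroundSpeed < 450.0f) return 8.0f;   // jogging
    return 10.0f;                            // sprinting
}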

Sway and Bobbing

Translation and Rotation

The bulk of the procedural workflow can be reduced to two main components:

Translational and rotational offsets. Their play rate, scale and speed are determined by a multitude of parameters within the game: player input, character velocity and actions, among other things, feed into the model to alter these offsets.

Weapons can have their own set of parameters that define how those values are interpreted, to model weight, balance and stability differences between weapons.

Translational Movement

X, Y & Z location offset

Offset, bobbing speed and height are based on player velocity and direction.

Additional layering is done by reading current movement input.
Here the rifle is pushed into the shoulder during forward and backward movement.

(you'll see this in CS too!)
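A sketch of that layering (MoveInput is assumed to be the normalized movement input vector passed into the anim instance, with X as forward/backward; the axis convention on the item bone is also an assumption):

// Pull the weapon back toward the shoulder when moving forward or backward.
const float ShoulderPush = FMath::Abs(MoveInput.X) * ShoulderPushAmount;  // ShoulderPushAmount: tuning value
WeaponLocationOffset.X -= ShoulderPush;   // assumes +X points away from the camera on the item bone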

Rotational Movement

Pitch, Yaw & Roll rotation offsets

Sway and roll are based on planar velocity, as well as specific directional velocity to accentuate lateral movement.
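As an illustration (axis conventions depend on your rig; LateralSpeed is assumed to be the velocity projected onto the character's right vector):

// Roll and yaw the weapon into lateral movement.
const float LateralAlpha = FMath::Clamp(LateralSpeed / MaxSpeed, -1.0f, 1.0f);
WeaponRotationOffset.Roll += LateralAlpha * MaxSwayRollDegrees;
WeaponRotationOffset.Yaw  += LateralAlpha * MaxSwayYawDegrees;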

Combined

While these values are exaggerated, they make up the bulk of the base movement animations.

Both make use of Step Time and Step Double Time to drive the base movement patterns.

System Input

The input axis values from the mouse (exaggerated) are added to the pitch and yaw of the weapon, giving it a more responsive feel.

Look Input

When looking around, having the weapon follow the control input enhances the responsiveness of animations. This can be achieved easily by directly controlling the weapon rotation with the look axis input fed into the animation controller.

Making the weapon rotate faster than the player makes the character look in control. This is how you would lead with the barrel of a rifle or pistol.

If we reduce the rotation we instead get a weapon that looks heavy and harder to control. This is something you would expect to see on a heavier weapon, like an LMG, Machine Gun or Bazooka. 

The input axis values from the mouse (exaggerated) are subtracted from the pitch and yaw, giving the weapon a heavier feel.
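Both behaviours come from the same couple of lines; the sign and size of the scale are what sell the weight. LookInput is assumed to be the raw look axis fed into the anim instance each frame:

// Positive scale: the weapon leads the camera (light, "in control").
// Negative or very small scale: the weapon lags behind (heavy, sluggish).
WeaponRotationOffset.Yaw   += LookInput.X * LookSwayScale;
WeaponRotationOffset.Pitch += LookInput.Y * LookSwayScale;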

Leading with the barrel not your head

I frequently see games implementing only one of these models, making weapons look either too light or too heavy, giving a stylized look. I sometimes find it immersion breaking when a trained special forces operative looks like they can't properly control their weapon.

A trained character should be leading with the barrel. By "leading" I mean pointing the barrel where they intend to go. When walking around with your weapon drawn and ready to fire, you naturally want the weapon pointed in the direction you're looking, ensuring you're prepared to fire as soon as you round a corner or turn your head.

This not only gives more responsive animation but also gives your character a sense of being "in control."

The exception to this rule is when you intentionally want to depict your character as lacking control. This can be useful when you want to convey weight, like carrying a heavy machine gun, rocket launcher, or maybe even a front-heavy revolver.

Shooting

Recoil offset

If you've read through my article about building a dynamic recoil system, you'll see that I'm tracking both the vertical and horizontal recoil offsets generated by the model. Just like the mouse axis inputs, these can be used to offset the weapon to help better visualize the spray pattern.

Vertical and Horizontal recoil parameters are used to affect Pitch, Yaw, and Vertical offset before layering in additional recoil detailing.

Recoil Impulses

Another recoil detail layer is driven by recoil impulses: one-off force events that use spring interpolation. These spring impulses are used to simulate muscle forces from the arms, while the regular recoil offset accounts for recoil absorbed by the entire body.

This is most visible in less stable firearms like pistols, which tend to have a more powerful twist and torque while keeping a somewhat stable offset overall.

The impulse system also controls weapon Roll.
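Unreal's built-in spring interpolation helpers work well for these one-off impulses. A rough sketch of the roll impulse, with illustrative stiffness and damping values and hypothetical member names:

#include "Kismet/KismetMathLibrary.h"

// Persistent spring state and current impulse target, stored on the anim instance.
FFloatSpringState RollSpringState;
float RollImpulseTarget  = 0.0f;
float CurrentRollImpulse = 0.0f;

// Called when a shot is fired: kick the target, the spring does the rest.
void UWeaponAnimInstance::AddRollImpulse(float Strength)
{
    RollImpulseTarget += FMath::RandRange(0.5f, 1.0f) * Strength;
}

// Every frame: relax the target back toward zero and spring toward it.
void UWeaponAnimInstance::UpdateImpulses(float DeltaSeconds)
{
    RollImpulseTarget = FMath::FInterpTo(RollImpulseTarget, 0.0f, DeltaSeconds, 8.0f);
    CurrentRollImpulse = UKismetMathLibrary::FloatSpringInterp(
        CurrentRollImpulse, RollImpulseTarget, RollSpringState,
        /*Stiffness*/ 250.0f, /*CriticalDampingFactor*/ 0.6f, DeltaSeconds, /*Mass*/ 1.0f);
    WeaponRotationOffset.Roll += CurrentRollImpulse;
}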

Counter push

After shooting multiple rounds in a row, I want the body to start adapting to recoil. This can be done by scaling the weapon recoil and the strength of impulses based on how long the player has fired in sequence. This allows me to apply greater force when shooting single shots or shorter bursts, without making prolonged bullet sprays look too twitchy.
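Sketched as a simple scale factor (ShotsInSequence, ShotsToFullAdaptation and AdaptedRecoilScale are placeholder values tracked by the recoil system):

// The longer the current burst, the more the body "absorbs" the recoil.
const float Adaptation  = FMath::Clamp(ShotsInSequence / ShotsToFullAdaptation, 0.0f, 1.0f);
const float RecoilScale = FMath::Lerp(1.0f, AdaptedRecoilScale, Adaptation);  // e.g. 1.0 down to 0.35
AppliedRecoilPitch = RawRecoilPitch * RecoilScale;   // the same scale is applied to yaw and impulse strength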

This is something I first read about being used in Battlefield V, according to a dev post on their forums. That post has since been lost; thank you to whoever wrote it!

With Counter Push

The rifle returns into frame after a few rounds have been fired.

Without Counter Push

Using the same recoil values, the weapon continues to push into the character. Without counter push, the values would have to be tuned down a lot, making the whole system look less dynamic.

Detailing

Breathing

It's important to break up the idle. A simple breathing pattern isn't anything special; it's just the first step of many.

This is probably a good time to talk about activation functions. If you're somewhat curious about machine learning you have probably heard the term. 

It's a lot simpler than it sounds. It's possibly something you've already been using, especially if you are familiar with the Unreal material editor.


Activation functions

Activation functions enable us to activate, deactivate, or scale specific parts of an animation based on input parameters. It's a mathematical way to make certain parts of an equation relevant based on an activation state, or as you would call it in art, masking. An activation state can be anything: movement direction, velocity, or maybe how many bullets have been fired in a sequence, as long as we can represent the value as a floating point number that we can normalize to the 0-1 range.

For example, we don't want this breath cycle to play while we are moving, as it would be additive with our step sequence, so we mask it out. To do this we define a speed threshold above which we don't want our breath cycle to play. With some math we can then fade out our breathing offset as we start gaining speed.

Activation function based on movement speed threshold:

Velocity - the length of our velocity vector on the XY plane

SpeedThreshold - the speed at which we want the breathing to be completely removed

Clamp(0, 1) - to keep the value in the 0-1 range

We now have a smoothed 0-1 float which we can multiply by our breathing offset or timer to act as a switch or "activation".

OneMinus node from the Material Graph
Blueprint example of the activation function

Velocity/Threshold

As the velocity approaches the threshold, our value approaches 1.0. Once we surpass the threshold, the value keeps increasing. You're probably familiar with basic division: it tells us how much of the Threshold (divisor) whole our Velocity (dividend) makes up.


Clamp(0, 1, Velocity/SpeedThreshold)

We clamp the result to the 0-1 range to make sure values above the threshold are discarded, while preparing it for the next step of inverting the value.


One minus (1-x)

You might recognize this from the material editor. It is a very common tool when working with masks. What the one minus node does is invert any value in the 0-1 range. When working with 2D masks this inverts the black and white elements of an image. In this case it inverts our activation value.
The activation now looks something like this:  


Activation Value = 1-Clamp(0, 1, Velocity/SpeedThreshold)

If we multiply our breath offset by our activation value, the closer we get to our threshold the less it will be applied. And since we're using a float value, our acceleration will determine the blending time of our breath animation.


Vertical Offset(vector3) += BreathOffset(Vector3) * ActivationValue(float)

Our breathing offset will now be blended into our animation when we stay below the velocity threshold.
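The same activation written out in code form, as a small sketch with illustrative names:

// 0 at or above SpeedThreshold, 1 when standing still.
const float PlanarSpeed = Velocity.Size2D();
const float Activation  = 1.0f - FMath::Clamp(PlanarSpeed / SpeedThreshold, 0.0f, 1.0f);

// Blend the breathing offset in only while (nearly) stationary.
WeaponLocationOffset.Z += FMath::Sin(BreathTime) * BreathAmplitude * Activation;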

Micro Adjustments and Jitter

Micro adjustments, or jitter, are something I consider the dust particles of FPS animation. You often don't think about them, but you'll feel that something is missing when they are absent.

Combining several small motions at different frequencies creates a pseudo-random movement pattern. This resembles muscles making micro adjustments to hold a weighted object still, while adding some natural tremor.

Doing this procedurally allows us to change these parameters on the fly. If we adjust the scale based on stamina we can give the impression of fatigue, and if we turn it all the way up to 11 it starts looking like we should be nowhere near a firearm. These details help add life to otherwise static animations.
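A sketch of that layered noise, summing a few sine waves at unrelated frequencies. All constants are illustrative; JitterScale is where stamina or "drunk" effects would plug in:

// Sum a few incommensurate frequencies so the pattern never visibly repeats.
const float T = AnimTime;
FVector Jitter;
Jitter.X = FMath::Sin(T * 17.3f) * 0.02f + FMath::Sin(T * 5.1f) * 0.05f;
Jitter.Y = FMath::Sin(T * 13.7f) * 0.02f + FMath::Sin(T * 4.3f) * 0.05f;
Jitter.Z = FMath::Sin(T * 19.1f) * 0.015f;
WeaponLocationOffset += Jitter * JitterScale;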

Subtle Micro Adjustments

Gives the illusion of muscles working to correct and stabilize the weapon even when stationary.

Exaggerated Values

The same base noise with scaled offset values. This is what it could look like if you drank every bottle on the practice range.

This example uses a "wait X time after action" timer. If re-triggered, the timer is reset and waits again. Once the timer completes, there is a chance of playing an animation.
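That logic is easy to express in code; a small sketch with made-up names:

// Re-arm the countdown whenever the player performs an action (shoot, jump, reload...).
void UWeaponAnimInstance::NotifyPlayerAction()
{
    IdleBreakCountdown = IdleBreakDelay;              // reset and wait again
}

// Tick the countdown; when it elapses, roll for a chance to play an idle break.
void UWeaponAnimInstance::UpdateIdleBreak(float DeltaSeconds)
{
    IdleBreakCountdown -= DeltaSeconds;
    if (IdleBreakCountdown <= 0.0f)
    {
        IdleBreakCountdown = IdleBreakDelay;
        if (FMath::FRand() < IdleBreakChance)         // e.g. 0.35
        {
            PlayIdleBreak();                          // hypothetical: plays a baked additive adjustment
        }
    }
}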

Hand Adjustments

Hand adjustment animations serve to break up a static base pose. I would like to categorize them into two types: idle breaks and action breaks. 

Idle breaks are used to interrupt static or repetitive movement.

Action breaks add variation to player actions.

Action Breaks and Idle Breaks

Action breaks and idle breaks are both baked animations utilized to disrupt the otherwise static base pose. In this case, hand adjustment animations have a set chance to trigger once when performing an action. These actions could be: when a character stops shooting, after jumping, or on completing a reload.

These animations are non-intrusive and directly linked to player actions, hence the label action break.

While idle breaks serve a similar purpose, they are used to break up inaction.

Additive subtle movement

Using additive animation is perfect for action breaks. If the movements are subtle enough we can re-use these adjustments across multiple weapon types, getting a lot more mileage out of a limited animation set.

Locomotion Actions

Start & Stop Impulses

One off impulses

Small spring impulses are added when starting to move or changing direction. These are one-offs used to emphasize when the player changes movement direction.

Exaggerated Values

Adjusting the strength of these impulses can give a very stylized and fluid look. I think this is similar to how The Finals do their animations.

Jumping & Landing

Pitch, Yaw and Vertical offset are affected by the player's vertical velocity.

On landing, impulses are fed through the previously established recoil and movement impulse systems, based on our vertical velocity at the moment of landing.

If you want even more weight added, consider making landing impulses that affect the vertical offset.
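Sketched by feeding the landing into the same hypothetical impulse helpers used earlier (AddRecoilImpulse, AddRollImpulse); LandingImpulseScale is a tuning value:

// Called from the character on landing; LandingVelocity.Z is negative on impact.
void UWeaponAnimInstance::OnLanded(const FVector& LandingVelocity)
{
    const float Impact = FMath::Abs(LandingVelocity.Z) * LandingImpulseScale;
    AddRecoilImpulse(-Impact);        // dip the weapon
    AddRollImpulse(Impact * 0.3f);    // a little twist for extra weight
}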

Don't forget the camera

The camera system also adds a slight recoil impulse upon landing. These systems together really add weight when controlling our character.

When taking fall damage, consider adding camera roll to give the impression of the player's legs not fully bracing the fall. The FPS setup is parented under the camera, meaning it follows any motion we add to the camera.

Unreal has a very easy to use camera shake system that works great with small animations like this.

Crouching

Usually this would require a full animation blend space authored for a specific crouch pose, possibly with its own 2, 4 or 8 way directional locomotion animations. With this procedural workflow, crouching is just a single offset blended in by a crouch alpha value.

This example is just a slight roll and vertical offset. This can easily be customized on a per weapon basis for better framing, without any need to leave the editor.

Here is an example where the jump and crouch together almost give the impression of vaulting over the wall.

Sprinting

The sprinting animation can be modeled using the same principles. Sprinting is a linear action that can easily be managed as a separate pose state.

On each footstep a camera impulse is added and scaled based on the current sprint Alpha. The animations below are all driven by the exact same parameters and scaling as the base locomotion. Altering the parameters based on sprinting could yield even more interesting results, but I think this looks fine.

Two Handed Sprints

This is nothing more than an alpha blend between two poses.

The rest is driven by the same parameters as the base locomotion.

One Handed Sprints

These animations only occupy the right hand, leaving the offhand free for its own animations.

Tactical Sprint

Re-using the pistol sprint pose with other pistol-grip weapons creates a Warzone-like tactical sprint animation for free.

This could be incorporated as a mechanic that allows off hand equipment usage when sprinting.

Weapon Actions

Reloading

Reloads aren't necessarily that difficult to do in a procedural way. It's more about keeping them interesting. This is one of the cases where I do prefer to use baked animations, since creating an interesting reload animation requires a lot of finger, hand and arm movements.

Equip

Equip is another essential animation. Equip time is often a mechanic used to balance a weapon, so the animation length should probably be adapted to the length of the equip mechanic before being finalized.

Unequip Transition

Unequip animations aren't always necessary. It's fairly common to see games balance weapons by their equip time, often making the unequip instant. If you prioritize visuals, unequip animations are a given.

The big edge case is solving weapon switching before a weapon is fully equipped, i.e. when players are quickly scrolling through their inventory, as we don't want to wait for both the equip and unequip animations.

In other words, the unequip animation is only played if a weapon has completed its equip time.

Impulse through custom Anim Notify

But wait a second, there is definitely room for procedural animation here as well!

The anim controller already contains functions to add recoil and movement impulses. Through custom anim notifies, we can easily incorporate these into baked animation tracks. We just have to call the same functions we do when adding the other impulses. Randomizing the impulse range can give more variety at a low cost.
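A minimal notify class along these lines, assuming the hypothetical UWeaponAnimInstance and AddRollImpulse helpers from the earlier sketches:

#include "CoreMinimal.h"
#include "Animation/AnimNotifies/AnimNotify.h"
#include "WeaponImpulseNotify.generated.h"

UCLASS()
class UWeaponImpulseNotify : public UAnimNotify
{
    GENERATED_BODY()

public:
    // Randomized range gives each playthrough of the baked animation some variety.
    UPROPERTY(EditAnywhere, Category = "Impulse")
    FVector2D ImpulseRange = FVector2D(0.5f, 1.0f);

    virtual void Notify(USkeletalMeshComponent* MeshComp, UAnimSequenceBase* Animation,
                        const FAnimNotifyEventReference& EventReference) override
    {
        Super::Notify(MeshComp, Animation, EventReference);
        // UWeaponAnimInstance and AddRollImpulse are the placeholder types/functions from earlier.
        if (UWeaponAnimInstance* Anim = Cast<UWeaponAnimInstance>(MeshComp->GetAnimInstance()))
        {
            Anim->AddRollImpulse(FMath::FRandRange((float)ImpulseRange.X, (float)ImpulseRange.Y));
        }
    }
};

The notify is then placed on the frames of the baked equip or unequip animation where the impulse should hit.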

No Impulse

The base animation without any impulses added.

Tuned Impulse

Toned down impulses that blend nicely with the rest of the weapon animations.

Exaggerated Impulse

Greatly exaggerated values to give a better view of what is actually happening.

Aim Down Sight

Centering the aimpoint

When doing ADS, what we really want is to use our weapon's diegetic aimpoint instead of our crosshair. This mainly comes down to trying to keep the diegetic aimpoint centered on screen.

Note:

I'm using a material that renders weapons at a different Field Of View. In essence, this makes them look more compact, and allows not only better visuals, but also a lot of clarity when looking down the sights of a weapon. 

I strongly recommend these FOV material functions: Weapon FOV - By Krystian Komisarek as a starting point.

The animations follow the recoil offset framing the spray pattern.
The red dot in this case is just an alternate crosshair in the HUD, blended in based on the ADS alpha.

Choose your model

How exactly you want to frame your sight depends a whole lot on your project and your bullet trace model. Do you want to frame the recoil, or do you want to center the weapon's aim point?

Framing the recoil

My project is based on a camera recoil model similar to Counter-Strike, featuring a vertical and horizontal offset to bullet traces. When these recoil values are fed as rotations to the animation controller, the bullets get neatly framed by the sights.

Static aimpoint

To make sure our aimpoint remains at the center of the screen, we can instead reduce or completely eliminate the translational and rotational offsets.

I still prefer keeping some rotational movement to maintain a more fluid look.

Faking it with scopes

Working with scopes is pretty forgiving. A red dot or holographic sight has its aim point seemingly floating in the air. This means we can fake the framing.

Replacing the crosshair with a red dot or holographic aimpoint and making sure it stays almost centered works just as well. In an action game this won't be easy to notice, and it might prove more predictable in the long run.

Physical Sights

When weapons have a physical sight, faking the aimpoint becomes a bit harder, as we rely on the weapon's screen position to stay centered.

What this really means is using our weapon's front sight as the animation pivot point.

Centered Front Sight

When using front sights, or really any static aimpoint, we want it to remain centered.

Here recoil impulses are added after the ADS rotation adjustments. If the recoil impulses were applied first, the weapon would look like it sinks when shooting.

Misaligned Front Sight

Here is what happens if we don't properly account for weapon rotation in ADS on weapons using front sights.

This could still work if our recoil model features significant movement accuracy penalties.

ADS Anim Graph

Setting up ADS is really as simple as blending two anim states. When our rifle is at the hip, we have a base pose with additive transform and rotation offsets.

When we go into ADS we have a separate state that overwrites our transform values to center our weapon on screen. This allows us to be more selective about what rotations and offsets we then reintroduce in our offset calculations.

Additive

When not using ADS the offsets are additive to retain our base pose.

Replace

In ADS we replace the base pose with hard set values that center the weapon on screen, while also layering in some of the rotation values generated by the model.

*Don't worry about the fast path warning. It comes from adding offsets together in the graph for easy editing; these values can be cached once we are happy with the system.

Simple ADS blend

Node(A): Hipfire Node(B): ADS
This is all you need for an ADS blend.

Continue Reading

Modern FPS games rely on dynamic weapon recoil to make different firearms feel unique and interesting. The recoil system greatly impacts the cadence of any shooter game.

Looking at visual and audio feedback elements
Feedback is half of what makes gunplay feel good

Planned Updates

Lower body animations