
Recipe: High- and low-level real-time pose input in Unity

Felix s1910238061 edited this page Jan 26, 2024 · 7 revisions

This project provides examples of real-time pose input inside a Unity project. We explore two state-of-the-art pose estimation systems from different domains within the same project. Our first implementation runs a lightweight real-time pose estimation algorithm that directly takes the user's webcam as input. Our second implementation uses the specialized multi-camera OptiTrack system together with a motion capture suit as the input feed.

We have developed a simple game where the player controls a whimsical creature with their own body. The movement of the creature is entirely physics-based. The game uses a locked side view that follows the player, and movement only occurs on a single plane. The goal is to navigate to the end of a level with some simple obstacles by walking, jumping, and ducking.

The full game with the webcam input can be downloaded here

Game Level Layout

Recipe: Single-view pose estimation input in Unity

This project provides an example for webcam pose estimation use with Unity and runs in real time even on mid- to low-tier PCs.

Using an existing webcam as input for pose estimation has many benefits: cost, availability, shareability, device independence, and low tracking setup effort. Unsurprisingly, these benefits come with drawbacks, mainly in tracking precision and stability. These will be discussed alongside the project setup.

The baseline for this project was the following repository: https://github.com/ganeshsar/UnityPythonMediaPipeBodyPose It provides a bare-bones Unity project that receives streaming data from the MediaPipe pose estimation algorithm, which has to be run in parallel.

Requirements

  • PC or laptop (we used Windows, but this should also work on other platforms)
  • Webcam
  • Unity (We will be using 2022.3.15)
  • Python (We tested 3.9.0, 3.9.18 and 3.11.6)
  • Space to move around. Your entire body should be visible to the webcam. Depending on your camera's field of view, you might have to stand a few meters back.

Hardware Setup

You just need a computer, a webcam, and enough space. However, for the tracking to work reliably you need ample lighting and a high contrast between the body and the background. The examples in this recipe were captured under sub-ideal conditions.

Note that depending on your camera you might have to stand back quite a bit for your entire body to fit the frame. This can especially be inconvenient for the fixed webcams in laptops.

Webcam Setup with a Laptop

Software Setup

Most of the core functionality is already provided by the repository. The pose estimation runs disconnected from the Unity project inside a Python environment.

The pose estimation code only needs a single library installed in addition to a standard Python installation:

  • Run pip install mediapipe in your Python environment. Then it should be good to go.

  • Run main.py

  • Wait until the camera feed window appears, indicating it's ready.

If no camera feed opens, the project will not work. This can happen if no webcam is found (indicated by very high fps) or the script is not executed in the correct Python environment.

If the camera feed is visible, the data stream should already be accessible to the Unity project.

Pose estimation capture

We want our character to mirror our movement. The simplest way to do this is to flip the image before the pose estimation. This is done by uncommenting line 105 in body.py.

# Image transformations and stuff
image = cv2.flip(image, 1)
image.flags.writeable = global_vars.DEBUG

# Detections
results = pose.process(image)

The pose estimation supports three complexity levels (0–2), the default being 1. Level 2 provides higher accuracy, but at a significant performance cost. We need our input to be as responsive as possible, so we stuck with 1, which runs fine even on a mid-tier laptop; we have observed level 2 running below 30 fps on a high-end desktop PC. The change can be made in global_vars.py.

Mapping Limb Rotations

MediaPipe Pose outputs 33 landmark positions in 3D space. The project already provides a dummy skeleton made of spheres at the landmark positions, connected by lines.

MediaPipe Pose Landmarks

The provided PipeServer Script reads from the data pipe and makes the landmark positions available in the Unity project.

Vector3 position = pipeServer.body.Position(mark); // mark is a value of the Landmark enum

Landmark is a provided enum that contains the indices and names of all 33 landmark positions.

For our game, we are really only interested in the shoulder, elbow, hip, and knee angles to control our character. Each angle is obtained by taking three landmark positions and calculating the angle at the center landmark. Alternatively, the world angle of a limb could be calculated from just two landmark positions. This would have been mostly sufficient for our game; however, using three landmarks compensates for body tilt.
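The three-landmark approach can be sketched as a standalone Python helper (hypothetical function, not part of the repository; the sign convention may differ from Unity's SignedAngle around Vector3.back):

```python
import math

def joint_angle(left, center, right):
    """Signed angle (degrees) at `center`, using only the camera-plane (x, y)
    components of three landmark positions given as (x, y, z) tuples."""
    # Direction vectors toward and away from the center joint, depth discarded
    d1 = (right[0] - center[0], right[1] - center[1])
    d2 = (center[0] - left[0], center[1] - left[1])
    # Signed angle from d2 to d1 around the axis pointing out of the screen
    angle = math.degrees(math.atan2(d1[1], d1[0]) - math.atan2(d2[1], d2[0]))
    # Normalize to [-180, 180]
    return (angle + 180.0) % 360.0 - 180.0
```

A straight limb yields roughly 0 degrees and a right-angle bend roughly ±90, regardless of how the whole body is tilted.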

Sampled Landmarks

The landmark positions are solved in 3D space; however, for our game we only need the angle in the XY plane, so the depth component is zeroed out. Debug lines help a lot with spotting mistakes early on.

Vector3 dir = pipeServer.body.Position(right) - pipeServer.body.Position(center);
dir.z = 0;
Vector3 dir2 = pipeServer.body.Position(center) - pipeServer.body.Position(left);
dir2.z = 0;

angle = Mathf.Clamp(Vector3.SignedAngle(dir2, dir, Vector3.back)*scale, angleMin, angleMax) + angleOffset;

Debug.DrawLine(transform.position, transform.position + dir.normalized, Color.red);
Debug.DrawLine(transform.position, transform.position + dir2.normalized, Color.green);
Debug.DrawLine(transform.position, transform.position + Vector3.back, Color.blue);

To make the limbs compatible with Unity physics and to make their movement look more natural, the angle is not applied directly but through the spring of the hinge joint connecting the limb. To change the target angle, the JointSpring has to be copied, modified, and then applied back to the joint, as data within the struct cannot be modified directly.

JointSpring hingeSpring = hinge.spring;
hingeSpring.targetPosition = angle;
hinge.spring = hingeSpring;

Tuning the Input

Mapping the calculated angles directly to the character results in very fast, jittery movement. Angle changes are therefore capped to a maximum speed, which filters out instabilities in the pose estimation. We still wanted fairly responsive, snappy movement, so maxSpeed was later increased to 1400 degrees per second.

float angleDelta = angle - prevAngle;
if (Mathf.Abs(angleDelta) > maxSpeed * Time.deltaTime)
{
    angle = prevAngle + Mathf.Sign(angleDelta) * maxSpeed * Time.deltaTime;
}

For each limb, unique values for rest position, angle range, and overdrive are defined. The calculated joint rotation can be multiplied (overdriven) to exaggerate the movement of the limb. We used this for the legs to give the character more expressive movement, even with limited leg movement from the user.
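The per-limb mapping can be sketched in Python as follows (function and parameter names are made up for illustration; the actual Unity script keeps these values as serialized fields):

```python
def map_limb_angle(raw_angle, rest_angle, angle_min, angle_max, overdrive=1.0):
    """Exaggerate the measured angle around the rest position by the overdrive
    factor, then clamp the result to the limb's allowed range.
    Hypothetical sketch; names do not match the actual Unity script."""
    exaggerated = rest_angle + (raw_angle - rest_angle) * overdrive
    return max(angle_min, min(angle_max, exaggerated))
```

With an overdrive of 3, a small 10-degree bend of the user's knee becomes a 30-degree bend of the creature's leg, while the clamp keeps extreme inputs inside the joint's range.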

The settings for the right forearm object:

Script as seen from the inspector

Inactive Rest Angle is the angle the limb assumes when it is deactivated.

Further tuning is done via the hinge joint's spring settings, namely spring force and damping:

Spring Joint Settings

The angular velocity of a Rigidbody in Unity is heavily capped by default. This cannot be changed in the Inspector but has to be done in code. The value is the maximum rotation in radians per second; the default is 7.

void Start()
{
    GetComponent<Rigidbody>().maxAngularVelocity = 100;
}

Movement

The leg movement is amplified by a factor of 3. Maybe a bit too much.

Movement Helpers

Making the character actually able to navigate the level was possibly the biggest challenge. Moving the character using just the limbs and physics in the desired direction is almost impossible. We recommend not just relying on pure physics for movement, but also adding some (more or less) invisible helper forces.

A push force can be applied to nudge the character into a particular direction based on the user's tilt. The force is not inherently known to the user, but it makes the character appear much more responsive and controllable. The average tilt of the body is calculated by averaging the directions of the shoulder-hip vectors. We only apply the force if the character has been touching the ground within the last second, to not accelerate the character while in the air. The force is also applied at the approximate position of the feet, as it is less noticeable and does not push the character over.

mainRigidbody.AddTorque(Vector3.forward * Mathf.Clamp(-deltaAngle, -30, 30) * turnUprightForce, ForceMode.Acceleration);

if (IsGrounded() && active)
{
    Vector3 force = new Vector3(moveForce * -Mathf.Sign(torsoTiltAngle), 0, 0);
    mainRigidbody.AddForce(force, ForceMode.Acceleration);
    mainRigidbody.AddForceAtPosition(force, transform.position + gameObject.transform.up * -3f, ForceMode.Acceleration);
}

A torque is also applied that always slightly pushes the character towards upright.
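The "touched the ground within the last second" rule used for the push force can be sketched as a small helper (hypothetical Python version; the project itself implements this in C#):

```python
class GroundedTimer:
    """Treat the character as grounded if a ground contact happened within the
    last `grace` seconds, so brief hops do not immediately cut the push force.
    Hypothetical sketch of the rule described above."""

    def __init__(self, grace=1.0):
        self.grace = grace
        self.last_contact = float("-inf")

    def on_ground_contact(self, now):
        # Call this from the physics collision/contact callback
        self.last_contact = now

    def is_grounded(self, now):
        return (now - self.last_contact) <= self.grace
```

In Unity the same effect can be had by storing Time.time in OnCollisionStay and comparing against it before applying the force.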

Gameplay

Shipping

The Unity project can simply be built and shipped as usual. However, setting up a Python environment for the pose estimation and running the script from the command line is not exactly user-friendly.

We did not manage to get the popular option PyInstaller to work with our project.

cx_Freeze worked out of the box with the default command.

cxfreeze -c main.py --target-dir dist --target-name="Pose Estimator"

Unlike PyInstaller, cx_Freeze cannot create a single executable; it creates a folder with the .exe and all dependencies. You could create a single .msi installer file, but we opted to just zip the folder and ship it with the project as is. We have not found a way to launch the executable from the Unity project, so it still has to be started separately.

Notes

  • Body Visibility: Ensure your entire body is visible and facing the webcam. Even though it is tempting to run around during the game, try to stay within the camera's bounds.

  • 'Remove Legs' Option: Even though you may intend for the user to use their entire body, this might not be possible in every setup. To accommodate those with more limited space, we have added a "remove legs" button in the start menu, which removes your legs from the game. This makes it possible to play the game with just the upper body.

  • Emojis in Folder Name: If the project path includes emojis, the Pose Estimator breaks down :')


Recipe Mocap Game using Optitrack

This tutorial is an alternative to the webcam version above. Many aspects are shared between the two versions; the difference is the movement input: instead of a webcam, we use a mocap suit and the OptiTrack system to capture motion data. This gives more precise and stable movement for the game, but also requires a more complex and expensive setup.

Requirements

  • A Unity version that supports the OptiTrack plugin. (We will be using 2022.3.15)
  • Visual Studio 2019 or the latest Visual C++ Redistributable
  • A computer with decent specs to run the game and process the mocap data.
  • An Optitrack System
  • A computer with Motive installed
  • A mocap suit

Setting up the Optitrack System

Before creating or playing a game, the Optitrack system has to be set up. Please note that setting up the Optitrack System may take a while. This guide will cover the software setup. For information on the whole physical Setup (placing the trackers and connecting everything), please refer to the official Optitrack documentation.

Installing the Optitrack Motive software

The first step for using the Optitrack system is to make sure that the Optitrack Motive software is installed, which is the case on the computer in the LRS Lab in the FH Hagenberg. Motive is used to calibrate the system and to stream the data to Unity. The software can be downloaded from the Optitrack website. Beware that the Software is not free and requires a license.

Calibrating the Optitrack System Software

The next step is to calibrate the whole Optitrack System, to make sure that the tracking is as precise as possible, and that the tracking area is set up correctly. Before starting the calibration process, make sure that there are no objects in the tracking area that could interfere with the tracking. Hide or cover those objects behind opaque objects. Objects that could interfere with the tracking are for example:

  • Mirrors
  • Windows
  • Shiny objects
  • Spare tracking markers
  • The mocap suit
  • The wand
  • The calibration square

The viewport of Motive

After clearing the tracking area, go to the Motive software and press the "New Calibration" button. This opens a window with all the camera previews. Any white dots in a preview are either trackable objects still in the tracking area, or reflections coming from the tracking cameras themselves or other unknown sources.

White dots to mask.

To handle those reflections, double-check that there are no objects in the tracking area that could reflect the IR light. If there are none, the Motive software has a "Mask" button for this case, which instantly tries to mask out all remaining reflections from the preview.

Calibrating the Optitrack system

Wanding

If there are no more white dots in the preview, the tracking area is ready to be calibrated. To start the calibration press the "Start Wanding" button in the Motive Software.

The Wand View Motive

Now grab the "wand" that was delivered with the Optitrack System and start walking around in the trackable area while waving it around. The system will automatically detect the wand and will use it to calibrate the system. The camera preview will show the path of the wand. Make sure to cover as much area as possible.

The Motive software and the LED rings around the cameras show the calibration level of each camera. The ring works like a progress bar: the more LEDs are lit, the better the calibration. If all LEDs are lit, the calibration is good enough; if only part of the ring is lit, it is not. Blue LEDs light up when the wand is currently inside that camera's coverage area.

The Camera View Motive The Start Calculating Button Motive

After every camera has reached a calibration level of 100%, go back to the Motive software and press the "Start Calculating" button on the calibration page. This starts the calculation of the calibration data and can take a while. The software then shows how good the calibration is for each camera, as well as an overall score.

The Score View Motive

Floor Calibration

After the Calibration process, the ground plane has to be set. This is done by laying the special triangular-shaped tracking device (the Calibration Square) on the floor in the tracking area.

The Calibration Square

Place it where the global origin of the tracking area should be, and rotate it so that its longer side points in the direction of the positive Z axis and its shorter side in the direction of the positive X axis. In Motive, the Y axis is the up axis, as it uses a right-handed coordinate system. Make sure that the calibration square is the only trackable object in the tracking area. Motive will automatically detect it and enable the "Set Ground Plane" button. Press this button to set the ground plane.

The Set Ground Plane Button Motive

Now the Optitrack System is ready to be used.

Adding A Skeleton by using a Mocap Suit

There are two Mocap Suits available in the LRS Lab. Both of the suits are size medium and consist of several parts:

  • Mocap Hat
  • Mocap Shirt
  • Mocap Pants
  • Left Mocap Shoe
  • Right Mocap Shoe
  • Left Thumb Tracker
  • Right Thumb Tracker

To get a skeleton in the Motive scene, select all the dots from the mocap suit and open the "Builder" panel, then select "Skeleton". Templates are provided for the correct tracking marker positions. Usually the mocap suits already carry all the markers needed for the skeleton rig, so if everything is set up correctly, the markers should already be in the correct positions. If not, the software shows where body markers are missing and which ones are misplaced. Once all markers are detected, assume a T-pose to calibrate the skeleton, then select all the marker dots in the 3D view and press the "Create" button in the Builder panel. A 3D character should appear in the 3D viewport. Give it a name such as "Skeleton"; this name will be used in Unity later on.

The Build Skeleton Button Motive

Setting up Unity Project

Make sure a Unity version is installed that supports the OptiTrack plugin: according to the OptiTrack documentation, the plugin supports Unity 2017.1/2017.2 or above, but 2020.3 or newer is recommended. In this tutorial, 2022.3.15 is used. Unity can be downloaded from the Unity website via the Unity Hub, which is used to manage Unity projects and versions; the desired Unity version can be selected and installed there.

Installing the Optitrack Plugin

This step works both for new projects and existing ones. The installation of the OptiTrack plugin is very straightforward: download the plugin from the official OptiTrack website, make sure the Unity editor with the correct project is open, and double-click the downloaded file. If multiple Unity versions are installed, a window will ask which one should be used to import the plugin; select the desired version and press "OK". This opens the Unity import package window. Make sure all files are selected and press the "Import" button. After the import is finished, the OptiTrack plugin is ready to be used in the Unity project.

The Import Window Unity

Setting up the Optitrack Plugin

Now that the plugin is installed, its scripts have to be added to the scene. The plugin adds a new folder to the Project window called "Optitrack", which includes a prefab called Client - OptiTrack. Drag this prefab into the scene. The OptiTrack client is the main component of the plugin: it is responsible for connecting to the OptiTrack system and receiving the mocap data. To configure it, click "Client - OptiTrack" in the Hierarchy; this opens the client in the Inspector window, where a script called "Optitrack Streaming Client" offers several settings.

The Optitrack Settings with IP

  • Server Address: The IP address of the OptiTrack server, typically the computer running the Motive software. For instance, in the lab it is "10.21.3.161". To find it, open the Motive software and click on the "Streaming" tab, where the IP address is displayed under "Local Interface".

  • Local Address: The IP address of the computer running the Unity project. It can be obtained by opening the Windows terminal and entering ipconfig.

  • Connection Type: The mode of connection to the Optitrack Server needs to match the connection type set in the Motive Software from the "Streaming" Tab. As per the Optitrack Documentation, the "Unicast" connection type is recommended.

  • Skeleton Coordinates: The coordinate system for the mocap data. Choose "Local" to align the mocap data with the Unity Scene's coordinate system, or "Global" to align with the Optitrack System's coordinate system.

Some checkboxes can be set:

  • Draw Markers: If this is checked, the tracking markers will be drawn in the scene. This is useful for debugging.
  • Draw Cameras: If this is checked, the cameras will be drawn in the scene. This is useful for debugging.

Skeleton in the Scene

Now that the OptiTrack plugin is set up, the skeleton can be added to the scene to check that everything works. The plugin provides a prefab for a 3D character, found under Optitrack/Prefabs/Retargeted Skeleton - OptiTrack. Drag this prefab into the scene to add the 3D character. Then put a plane into the scene and move it up a bit so that the character stands on it. Click "Retargeted Skeleton - OptiTrack" in the Hierarchy to open the Inspector window with the settings for the skeleton.

The Skeleton Settings

The character also carries a script called "Optitrack Skeleton Animator". Select the streaming client here; this is the OptiTrack client that was added to the scene earlier. Also enter the name of the skeleton from the Motive scene under "Skeleton Asset Name", i.e. the name given when the skeleton was created in the Motive software, in this case "Skeleton". Now press the play button in the Unity editor. If the IPs are set correctly and the OptiTrack system is running, the 3D character should move according to the mocap data.

Troubleshooting Issues in Unity

Connecting Unity and the OptiTrack system can lead to connection errors or warnings. To ensure a successful connection, please check:

  1. IP Address Verification: Verify that both the local IP address of the computer running Unity and the IP address of the Motive software are correctly configured in the Unity scene.

  2. Optitrack System and Motive Software: Ensure that the Optitrack system is powered on and that the Motive Software is actively running and did not crash. The Motive Software is important as it has to feed real-time data to Unity. Without it being active, Unity will not receive any data from the Optitrack system, and this leads to an error.

  3. Firewall Considerations: A frequent problem in connecting to Motive is the firewall settings on the Unity computer. The official Optitrack documentation suggests disabling the firewall for public networks to get a connection. However, disabling the firewall, although effective, can expose the system to security risks. It is highly recommended to re-enable the firewall instantly after motion capture sessions to maintain security integrity.

Another problem we encountered: the Draw Markers checkbox caused Unity and Motive to crash simultaneously. If that happens, make sure to disable Draw Markers, as it is enabled by default.

If the character's color is pink, that means that the character's material is not compatible with URP. To fix this, just create a new material and drag it onto the character. The character should now be colored correctly. Another solution is to convert the material to URP by going to Window > Rendering > Render Pipeline Converter and then selecting "Render Settings", "Material Upgrade" and "Readonly Material Converter". Then press "Initialize And Convert". Now the character should be colored correctly.

The Material Converter Window

Mirroring the Skeleton's Movement to Certain Limbs

Now there is a working character in the scene. However, the non-humanoid creature from the game has neither the same proportions as the 3D skeleton nor the same number of bones: it just has a head with two arms and two legs, so only certain parts of the skeleton are needed. Unity lets us read the position and current rotation of every bone in the skeleton; this information will later be used to control the creature's limbs. To see the name and position of every important bone, press the small arrow next to the "Retargeted Skeleton - OptiTrack" GameObject in the Hierarchy to expand the prefab. A hierarchy of all the character's bones should now be listed. The required bones are:

  • EthanLeftArm
  • EthanLeftForeArm
  • EthanRightArm
  • EthanRightForeArm
  • EthanLeftUpLeg
  • EthanLeftLeg
  • EthanRightUpLeg
  • EthanRightLeg
  • EthanRightShoulder
  • EthanLeftShoulder
  • EthanRightHand
  • EthanLeftHand
  • EthanRightFoot
  • EthanLeftFoot

The Ethan Bones

To see the effect of each bone's rotation, press the E key (rotate tool) and rotate the bone. Undo every such rotation with Ctrl + Z, as changing the rotation of the T-pose can lead to problems later on.

The same whimsical creature from the webcam version will be used. The difference is that there is no need for a PipeServer as the Optitrack Plugin already provides the mocap data.

Many of the scripts and functions are the same as in the webcam version, but the movement scripts, MainBodyMovement.cs and MapJointAngleMotor.cs, are different. They still share some logic, since the movement works similarly to the webcam version, but the way the movement data is obtained differs. The MainBodyMovement script handles the movement of the body and is attached to the main body of the creature.

The important snippet which is different from the Webcam Version is this one:

public Transform rightShoulder;
public Transform leftShoulder;

public Transform rightUpLeg;
public Transform leftUpLeg;

Vector3 torsoUpDir1 = rightShoulder.position - rightUpLeg.position;
Vector3 torsoUpDir2 = leftShoulder.position - leftUpLeg.position;
Vector3 torsoUpDir = (torsoUpDir1.normalized + torsoUpDir2.normalized);
torsoUpDir.z = 0;
torsoUpDir.Normalize();

The Main Body Movement Script

For the script to work correctly, drag EthanRightShoulder, EthanLeftShoulder, EthanRightUpLeg, and EthanLeftUpLeg into the corresponding fields of the MainBodyMovement script in the Inspector window. These skeleton bones serve as reference points for calculating how the creature's torso moves and tilts. From the positions and orientations of these bones, the script determines the directional alignment and degree of tilt of the torso, which is fundamental for moving the creature in the desired direction.
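In 2D, the tilt calculation reduces to averaging the two normalized shoulder-hip directions and measuring the deviation from vertical. A hypothetical Python sketch of that logic (points as (x, y) tuples; not part of the project):

```python
import math

def torso_tilt_angle(right_shoulder, left_shoulder, right_up_leg, left_up_leg):
    """Average the two shoulder-hip directions and return the torso tilt from
    vertical in degrees (positive = leaning toward +x)."""
    def normalized(v):
        length = math.hypot(v[0], v[1])
        return (v[0] / length, v[1] / length)

    # Direction from each hip up to the shoulder on the same side
    d1 = normalized((right_shoulder[0] - right_up_leg[0],
                     right_shoulder[1] - right_up_leg[1]))
    d2 = normalized((left_shoulder[0] - left_up_leg[0],
                     left_shoulder[1] - left_up_leg[1]))
    # Averaged "up" direction of the torso; angle measured against vertical
    up = (d1[0] + d2[0], d1[1] + d2[1])
    return math.degrees(math.atan2(up[0], up[1]))
```

An upright torso gives roughly 0 degrees; the sign of the result decides the direction of the push force.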

The second script that differs from the webcam version is MapJointAngleMotor.cs. This script is attached to every single limb of the creature, so it is used exactly 8 times:

  • for the left arm
  • for the right arm
  • for the left forearm
  • for the right forearm
  • for the left leg
  • for the right leg
  • for the left up leg (left thigh)
  • for the right up leg (right thigh)

This is the important snippet of the script:

public Transform left;
public Transform center;
public Transform right;


void Update()
{
    float angle = inactiveRestAngle;
    if (active)
    {
        Vector3 dir = right.position - center.position;
        dir.z = 0;
        Vector3 dir2 = center.position - left.position;
        dir2.z = 0;

        angle = Mathf.Clamp(Vector3.SignedAngle(dir2, dir, Vector3.back)*scale, angleMin, angleMax) + angleOffset;
    }

    float angleDelta = angle - prevAngle;
    if (Mathf.Abs(angleDelta) > maxSpeed * Time.deltaTime)
        angle = prevAngle + Mathf.Sign(angleDelta) * maxSpeed * Time.deltaTime;

    prevAngle = angle;
    JointSpring hingeSpring = hinge.spring;
    hingeSpring.targetPosition = angle;
    hinge.spring = hingeSpring;
}
The MapJointAngleMotor.cs script controls the individual limbs of the creature. For it to function properly, the script must be attached to each limb. Next, assign the relevant Transform objects to the left, center, and right fields of the script, as depicted in the screenshot and the tables below. It is crucial to note that the right Ethan limbs are placed in the left limbs' fields, and vice versa, so that the limbs mirror the player's movements.

The Left Forearm

Left Limbs

|        | Left Arm           | Left Forearm       | Left Upleg         | Left Leg        |
|--------|--------------------|--------------------|--------------------|-----------------|
| LEFT   | EthanLeftShoulder  | EthanRightShoulder | EthanRightShoulder | EthanRightUpLeg |
| CENTER | EthanRightShoulder | EthanRightForeArm  | EthanRightUpLeg    | EthanRightLeg   |
| RIGHT  | EthanRightForeArm  | EthanRightHand     | EthanRightLeg      | EthanRightFoot  |

Right Limbs

|        | Right Arm          | Right Forearm     | Right Upleg       | Right Leg      |
|--------|--------------------|-------------------|-------------------|----------------|
| LEFT   | EthanRightShoulder | EthanLeftShoulder | EthanLeftShoulder | EthanLeftUpLeg |
| CENTER | EthanLeftShoulder  | EthanLeftForeArm  | EthanLeftUpLeg    | EthanLeftLeg   |
| RIGHT  | EthanLeftForeArm   | EthanLeftHand     | EthanLeftLeg      | EthanLeftFoot  |

These Transforms represent key points in the limb, such as joints, and are essential for calculating the limb's current direction and angle. The script continuously adjusts the limb's angle within a certain range by using a HingeJoint for smooth joint rotation. This arrangement enables each limb to move and react dynamically, according to its calculated orientation and the creature's overall movement and actions. More info about how the hinges work can be found in the Webcam Cookbook part.

To test if everything is set up correctly, press the play button in the Unity Editor. If all steps were followed accurately, the creature should now be moving in sync with the Optitrack mocap data.

A player playing with the game

Getting Character Movement without Optitrack

If there is not always access to an OptiTrack system, the character movement can still be tested and debugged to make sure the limbs receive the correct movement data. For this, a debug script can be created. It first searches for the "Retargeted Skeleton - OptiTrack" GameObject and then for specific bones in the skeleton. Those bones can then be rotated by pressing the arrow keys; if the mirroring of the limbs is set up correctly, the limbs of the creature should move as well. The script looks like this:

void Start() {
    // Automatically find the Source Character in the Scene
    sourceCharacter = GameObject.Find("Retargeted Skeleton - OptiTrack").transform;
    // Arm Limbs
    sourceLeftArm = FindBoneByName(sourceCharacter, "EthanLeftArm");
    sourceLeftForeArm = FindBoneByName(sourceCharacter, "EthanLeftForeArm");

    // Leg Limbs
    sourceLeftUpLeg = FindBoneByName(sourceCharacter, "EthanLeftUpLeg");
    sourceLeftLeg = FindBoneByName(sourceCharacter, "EthanLeftLeg");
    
}

void Update()
{
    // Rotation amount in degrees
    float rotationDegrees = 18.0f;
    // Convert degrees to radians
    float rotationRadians = rotationDegrees * Mathf.Deg2Rad;

    if (Input.GetKey(KeyCode.DownArrow))
    {
        RotateLimb(sourceLeftArm, 0, -rotationRadians, 0);
        RotateLimb(sourceRightArm, 0, rotationRadians, 0);

        RotateLimb(sourceLeftForeArm, 0, 2 * rotationRadians, 0);
        RotateLimb(sourceRightForeArm, 0, -2 * rotationRadians, 0);

        RotateLimb(sourceLeftUpLeg, 0, rotationRadians, 0);
        RotateLimb(sourceRightUpLeg, 0, -rotationRadians, 0);

        RotateLimb(sourceLeftLeg, 0, -rotationRadians, 0);
        RotateLimb(sourceRightLeg, 0, rotationRadians, 0);

        Debug.Log("Left arm rotation: " + sourceLeftArm.rotation);
    }
    Vector3 sourceRotation = sourceLeftArm.rotation.eulerAngles;
    testLeftArm.transform.rotation = Quaternion.Euler(0, 0 , sourceRotation.y);
}

void RotateLimb(Transform limb, float xDeg, float yDeg, float zDeg)
{
    limb.Rotate(xDeg, yDeg, zDeg);
}

private Transform FindBoneByName(Transform character, string boneName) {
    Transform[] bones = character.GetComponentsInChildren<Transform>();

    foreach (Transform bone in bones)
    {
        if (bone.name == boneName)
        {
            return bone;
        }
    }
    Debug.LogError("Bone not found: " + boneName);

    return null; 
}