
makehuman.js

A MakeHuman variant with different features.

Currently Working On

Animation!

This needs the makehuman.js backend to be running locally.

Once that is working, the next step will be to record animations and export them to Blender.

Current Status

  • Have a look at build 2023-10-31
  • Edit, load and save morph
  • Pose and load pose
  • Select one of 32 pre-defined facial expressions and/or edit facial pose units
  • Render various proxy meshes instead of the basemesh
  • Export the mesh with rig and texture coordinates as Collada for Blender
  • Nothing else... 😅

How to build

  • npm install
  • npm run dev:prepare
  • npm run dev:build
  • npm run dev:serve (in another terminal)

Why?

  • I've been using MakeHuman for more than a decade but often struggled with the UI and the source code.
  • I'm up to something with Blender and Chordata and in need of full artistic control of the toolchain. 😎

Animation

Capturing

  • MediaPipe

    Contains image recognition for facial landmarks, fingers and body.

  • Chordata

    Body pose capture using 15 or 17 MARG sensors (accelerometer + gyroscope + magnetometer) attached to the body.

  • The FreeMoCap Project

    Not used in makehuman.js (yet): a Python project which bundles and extends various free motion capture tools.

Face

Body

TBD

How does Makehuman work?

Morph

Data

  • data/3dobjs/base.obj contains a 3d model of a human body, called the base mesh.

    It consists entirely of quads, which give good results when Catmull-Clark subdivision is applied.

    Further reading: Mesh Topology.

  • data/target/ contains 1258 morph targets, which deform the base mesh to vary its shape, gender, age and ethnicity.

    The morph targets are handmade by editing the base mesh in a 3d editor and extracting the changes with MakeTarget.

  • data/modifiers/ bundles those morph targets into 249 more user-friendly modifiers.

Code

// renders the morphed base mesh
function render(canvas: HTMLCanvasElement, scene: HumanMesh)

// the morphed base mesh
class HumanMesh {
    // input
    obj: Mesh        // the base mesh from the Wavefront Object
    human: Human     // the morph targets

    // processing
    update()         // calculate vertex from obj and human

    // output
    vertex: number[] // the morphed obj.vertex
}

// aggregates all the modifiers and creates a list of morph targets
class Human {
    // input
    modifiers: Map<string, Modifier>
    modifierGroups: Map<string, Modifier[]>

    // output 
    targetsDetailStack: Map<string, number> // morph targets

    // for posing and skinning (see below)
    meshData!: WavefrontObj
    __skeleton!: Skeleton
}

// creates a list of ui elements (sliders, text fields) to edit the modifier values
function loadSliders(filename: string)
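
The heart of this pipeline is the morph itself. A minimal sketch of what a step like HumanMesh.update() computes, assuming each morph target stores per-vertex deltas and each entry in targetsDetailStack scales one target (the names applyTargets and MorphTarget below are illustrative, not the actual API):

```typescript
// A morph target maps a vertex index to its displacement (dx, dy, dz).
type MorphTarget = Map<number, [number, number, number]>

// Apply weighted morph targets to a flat xyz vertex array.
function applyTargets(
    base: number[],                    // flat xyz array of the base mesh
    targets: Map<MorphTarget, number>  // target -> weight (cf. targetsDetailStack)
): number[] {
    const out = base.slice()
    for (const [target, weight] of targets) {
        for (const [index, [dx, dy, dz]] of target) {
            out[3 * index] += weight * dx
            out[3 * index + 1] += weight * dy
            out[3 * index + 2] += weight * dz
        }
    }
    return out
}
```

With a weight of 0.5, only half of each target's displacement is added, which is how sliders can blend smoothly between e.g. young and old.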

Pose

The skeleton aggregates bones and weights. Bones can be rotated.

Posing the skeleton directly, especially those in the face, can be a bit tedious. Hence there are pre-defined pose units, e.g. "LipsKiss" or "HandBendOutLeft", which can control multiple bones at once while also restricting bone movement.

Data

  • Skeleton

    • data/rigs/default.mhskel the bones making up the skeleton

      The actual bone positions reference little cubes within the mesh, so when the mesh is morphed, the skeleton morphs along with it.

      Further reading: Base Mesh and Rig.

    • data/rigs/default_weights.mhw the weights each bone has on each vertex

  • Pose Units

    • data/poseunits/body-poseunits.json defines 63 pose units for the body.

    • data/poseunits/face-poseunits.bvh and face-poseunits.json define 60 pose units for the face.

    • data/expressions/*.mhpose defines 32 face expressions based upon pose units, e.g. "laugh01" or "fear02".
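
How an expression combines pose units can be sketched as a weighted blend: each pose unit contributes rotations for several bones, scaled by the unit's weight. This is an illustrative simplification (Euler angles instead of the matPose matrices used above; the names PoseUnit and blendPoseUnits are assumptions, not the actual API):

```typescript
type Euler = [number, number, number]  // x, y, z rotation in radians
type PoseUnit = Map<string, Euler>     // bone name -> rotation

// Sum the weighted per-bone rotations of all active pose units.
function blendPoseUnits(units: Map<PoseUnit, number>): Map<string, Euler> {
    const pose = new Map<string, Euler>()
    for (const [unit, weight] of units) {
        for (const [bone, [x, y, z]] of unit) {
            const acc = pose.get(bone) ?? [0, 0, 0]
            pose.set(bone, [acc[0] + weight * x, acc[1] + weight * y, acc[2] + weight * z])
        }
    }
    return pose
}
```

Because a single pose unit already drives several bones with sensible limits, an expression like "laugh01" only needs to store a handful of unit weights.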

Code

// aggregates the bone tree and weight list
class Skeleton {
}

// a single bone
class Bone {
    parent?: Bone
    children: Bone[] = []

    name: string

    yvector4?: vec4 // direction vector of this bone (along y-axis)
    matRestGlobal?: mat4 // bone relative to world
    ...

    // user defined rotation
    matPose: mat4
}

// weights
class VertexBoneWeights {
    // bone name -> [[vertex numbers, ...], [weight for vertex, ...]]
    _data: Map<string, Array<Array<number>>>
}
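
Skinning then deforms the mesh from these weights: each vertex becomes the weighted sum of itself transformed by every bone that influences it (linear blend skinning). A sketch over the _data layout above, with bone transforms reduced to plain translations for brevity (the real code uses the 4x4 pose matrices; skin and boneOffsets are illustrative names):

```typescript
type Vec3 = [number, number, number]

// Linear blend skinning over the VertexBoneWeights layout:
// bone name -> [[vertex indices, ...], [weight per vertex, ...]]
function skin(
    vertices: Vec3[],
    weights: Map<string, number[][]>,
    boneOffsets: Map<string, Vec3>  // translation standing in for matPose
): Vec3[] {
    const out: Vec3[] = vertices.map(() => [0, 0, 0])
    for (const [bone, [indices, w]] of weights) {
        const [ox, oy, oz] = boneOffsets.get(bone)!
        indices.forEach((vi, i) => {
            const [x, y, z] = vertices[vi]
            out[vi][0] += w[i] * (x + ox)
            out[vi][1] += w[i] * (y + oy)
            out[vi][2] += w[i] * (z + oz)
        })
    }
    return out
}
```

A vertex weighted 0.5/0.5 between two bones ends up halfway between where each bone alone would place it, which is what makes joints bend smoothly.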

Proxy

Proxies provide additional meshes, e.g. teeth, tongue, eyes, alternative body meshes and clothes.

The proxy files contain data which is used to transform the morphed/posed base mesh into a proxy mesh. These files are created with MakeClothes.

class Proxy {
    // return proxy mesh vertices, adjusted to basemesh morph/pose
    getCoords(baseMeshVertices: number[]): number[]
}
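
The core of getCoords() can be sketched as follows, under the assumption that each proxy vertex references three base mesh vertices with barycentric weights plus an offset, which is the scheme MakeClothes-style proxy files use (ProxyRef and proxyCoords are illustrative names, not the actual API):

```typescript
// One proxy vertex is anchored to three base mesh vertices.
type ProxyRef = {
    vertexIndices: [number, number, number]  // base mesh vertex indices
    weights: [number, number, number]        // barycentric weights
    offset: [number, number, number]         // displacement from the weighted point
}

// Recompute proxy vertices from the (already morphed/posed) base mesh.
function proxyCoords(base: number[], refs: ProxyRef[]): number[] {
    const out: number[] = []
    for (const ref of refs) {
        for (let axis = 0; axis < 3; ++axis) {
            let p = ref.offset[axis]
            for (let k = 0; k < 3; ++k) {
                p += ref.weights[k] * base[3 * ref.vertexIndices[k] + axis]
            }
            out.push(p)
        }
    }
    return out
}
```

Because the proxy only stores references into the base mesh, any morph or pose applied to the base mesh carries over to teeth, eyes and clothes for free.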

Development

# install packages
npm install

# populate data directory
# (I run this outside the Visual Studio Code terminal to get access to my
# local MakeHuman installation and avoid downloading all assets from the net.)
npm run dev:prepare

# run (in separate terminals)
npm run dev:build
npm run dev:serve

Run single test

npm run dev:test --file=build/test/skeleton.spec.js
