Blender API

From Hanson Robotics Wiki

The Blender API defines the software interface between the high-level system control software and the low-level animated Blender head. All software above the API is common and generic to all heads (and bodies); all software below the API is specific to a single, particular Blender head/body.

The Blender API gives the artist complete freedom to design the head/body Blender rig in any way desired. Bones, rigging, and coordinate systems can be anything. The way that animations are specified, controlled, and tuned can be anything at all. Eye-tracking can be done in any way.

The software system above the Blender API is generic: if a given rig implements the API, then the vision, behavior, motion planning, emotional response and audio subsystems will work for that rig. Thus, completely different rigs can be controlled by the software system.

The API

The Blender API consists of the following:

  • A way to query the list of supported animations, which is ...
  • A way to control where the eyes are looking, which is ...
  • A way to make the head face in a certain direction, which is ...

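To make the idea concrete, the three operations above can be sketched as an abstract interface that any rig must implement. This is only an illustrative sketch: the class, method names, and signatures are assumptions for this example, not the actual Hanson Robotics interface.

```python
from abc import ABC, abstractmethod


class BlenderRigAPI(ABC):
    """Hypothetical sketch of the API surface described above.

    Method names and signatures are illustrative assumptions.
    """

    @abstractmethod
    def get_animations(self) -> list:
        """Query the list of animations this rig supports."""

    @abstractmethod
    def set_gaze(self, x: float, y: float, z: float) -> None:
        """Point the eyes at a target in some agreed 3D frame."""

    @abstractmethod
    def set_face_target(self, x: float, y: float, z: float) -> None:
        """Turn the head to face toward a target direction."""


class DemoRig(BlenderRigAPI):
    """Toy rig, used only to show that any implementation plugs in."""

    def __init__(self):
        self.gaze = (0.0, 0.0, 0.0)

    def get_animations(self):
        return ["smile", "blink"]

    def set_gaze(self, x, y, z):
        self.gaze = (x, y, z)

    def set_face_target(self, x, y, z):
        pass


def look_at_person(rig: BlenderRigAPI):
    # Generic behavior code: works with ANY rig implementing the API,
    # without knowing anything about its bones or coordinate conventions.
    if "smile" in rig.get_animations():
        rig.set_gaze(0.0, 0.0, 1.0)
```

Because behavior code like `look_at_person` depends only on the abstract interface, swapping in a completely different rig requires no changes above the API.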
In addition, the Blender API must issue PAU messages, which are used to drive motors on a physical head/body. These are ...