Hanson Robotics wiki
This site describes the open source software and hardware created by Hanson Robotics. Created in December 2014, it is a work in progress and is still being populated with useful content.
We aim to create (a strong illusion of, and eventually the actual reality of) human life in interactive characters, both virtual and robotic. In pursuit of this goal, our software integrates many features and platforms, including artistic animation, robot control, machine perception, natural language dialogue, machine learning, and AI. The software and its components may serve many purposes, including research, entertainment, and medical applications. Feel free to add descriptions of features, uses, and ideas, as well as opinions and feedback from field deployments and tests.
We keenly appreciate that many groups and people are working in related areas, and we thank our many collaborators, including OpenCog, Cogbot, iCog, and the NextGen Systems Group.
Please participate, contribute and help bring intelligent characters to life!
Strategy and Planning
Software System Overview
PUMA - Perception Understanding Motivation Action
See also the diagram source.
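The PUMA pipeline above can be sketched as a simple sense-think-act loop. This is a minimal illustration only, with hypothetical names (`Percept`, `understand`, `motivate`, `act`, the drive dictionary); the actual Hanson Robotics software is a far richer integration across ROS nodes and the components listed below.

```python
from dataclasses import dataclass

@dataclass
class Percept:
    """A raw perception event, e.g. from a camera or microphone."""
    kind: str      # e.g. "face" or "speech"
    payload: str   # sensor-specific data

def understand(percept: Percept) -> str:
    # Understanding: map a raw percept to a symbolic interpretation.
    if percept.kind == "face":
        return "person_present"
    if percept.kind == "speech":
        return "utterance:" + percept.payload
    return "unknown"

def motivate(interpretation: str, drives: dict) -> str:
    # Motivation: choose a goal from the interpretation and current drives.
    if interpretation == "person_present" and drives.get("social", 0.0) > 0.5:
        return "greet"
    return "idle"

def act(goal: str) -> str:
    # Action: translate a goal into a concrete animation command.
    commands = {
        "greet": "play_animation:smile_and_wave",
        "idle": "play_animation:breathing",
    }
    return commands[goal]

def puma_step(percept: Percept, drives: dict) -> str:
    """One Perception -> Understanding -> Motivation -> Action cycle."""
    return act(motivate(understand(percept), drives))
```

For example, seeing a face while the (hypothetical) social drive is high yields a greeting animation, while other input falls through to an idle behavior.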
Art, Character, Narrative
- June-July dev sprint
- 2015 reqs
- Combined performance-animation requirements
- Expression Terminology
- Sophia Character Definition
- Alice Robot Character
- Scripts and Demo Descriptions
- Blender API
- Animation data flow
- Action Orchestration
- Face tracking requirements
- Perception tasks
- Perception synthesizer
- Physiological Action Units (PAU)
- Relevant software
- ROS node setup
- Robot web dashboard
- Docker setup
- Voice Expression and Authoring Overview
- Voice Expression Authoring Notes
- Voice Expression Training and Prosody Automation
- Behavior Authoring Interface
- Procedural Animation and Face Tracking Notes
- Tiered pyAIML Chat System