Sorry, but this will be a rather long post. Hopefully I will post more regularly in the future...
Most of the work done so far has been on procedural animation: first on physical simulation using the Bullet physics library, then on skeletal animation through improvements to the new animesh. All this work will be available in the upcoming 2.0 release of Crystal Space.
The Bullet physics plugin
The Bullet plugin of CS was only partially implemented and suffered from several bugs and problems. The plugin has now been completely overhauled and almost all the features of the initial iDynamicSystem interface have been implemented. As a result, the Bullet plugin is now fully usable and is clearly preferable to the ODE plugin, simply because it works much better and has many more features. The 'phystut' application demonstrates how to use most of these functionalities.
In addition, several new features have been added. The most noticeable are:
- Kinematic objects, for objects that act on the physical simulation but are controlled by the user. You can attach a mesh to a kinematic object so that the body is updated automatically whenever the mesh is moved.
- Soft bodies to simulate deformable objects such as flags, clothes, ropes, etc. A new animation controller (CS::Animation::iSoftBodyAnimationControl) and some utilities (CS::Physics::Bullet::SoftBodyHelper) have been written to easily create a soft body from a genmesh and to animate the genmesh according to the results of the simulation. The controller is also able to maintain the correct anchoring of soft bodies when they are attached to an animesh with a deformable topology (see below).
- Terrain colliders integrated with the terrain2 mesh. They automatically load and unload the colliders of the terrain cells on demand.
- A debug plugin to display information on the colliders and objects of a physical scene. This plugin works with both the ODE and Bullet plugins.
- Physical hit beams and grab joints.
The main missing functionalities are the configuration of collision information and the filtering of collisions.
The 'phystut' tutorial application on physics simulation. The scene contains boxes, spheres, capsules, cylinders, convex and concave meshes, joints, motors, ragdolls, ropes, clothes and soft bodies.
The animated mesh system
The animesh (namely the class CS::Mesh::iAnimatedMesh), which was introduced during the YoFrankie project, has been greatly improved. Many existing features have been fixed and refined, such as the sockets, the blending of animations, the import with blender2crystal, the Finite State Machine animation node, the computation of the bitangents, the display of animeshes in viewmesh, etc.
Both the skeletal and the morphing animations have also been optimized and now run at a decent speed (although all computations are still performed in software).
Many new features have also been implemented; most of them can be seen in the 'avatartest' application:
- A new high detail test model has been made specifically for Crystal Space: Krystal. This model has been made thanks to the great help of a friend nicknamed Korbak. It is made in Blender and is based on the Gothic Woman model by Tiziana. A Blender script was first written to import motion captured animations into this model, but a much better solution has since been implemented (see below).
- Definition of the "bodymesh" (class CS::Animation::iBodySkeleton), holding the geometric and physical description of the skeleton of an animesh. Bone chains can also be defined to represent sub-parts of the skeleton (to be used e.g. by the Ragdoll or Inverse Kinematics nodes).
- New "Ragdoll" animation node. This node can automatically manage the physical model of the animesh, and put parts or all of the skeleton in a dynamic or kinematic state. The dynamic state lets parts of the skeleton be animated by the physical simulation, e.g. to animate props or dying avatars. The kinematic state is useful for making the animesh interact with the physical scene, and for attaching soft bodies to it.
- New "Lookat" animation node. This node can control a bone of the skeleton in order to make it look at a given target. It is useful to bring an avatar to life and make it more responsive to what happens in the scene. The node automatically handles the transitions when the target moves in and out of visibility, so there is no need to manage the transitions at a higher level (the exact behaviour is configurable).
- New "Speed" animation node. It takes as input a set of animations of the mesh moving at different speeds (e.g. idle, walking, running) and blends them together to achieve any custom speed in between.
- New "Inverse Kinematics" animation nodes. These nodes control the position of an end effector placed on a sub-chain of the skeleton, which is useful to precisely position bones such as the hands or the feet. There are two different implementations of this node: one is a quaternion version of the Cyclic Coordinate Descent algorithm, the other uses the physical simulation to achieve the effect.
- New "Retarget" animation node. This node can retarget an animation from one skeleton to another. It is useful, for example, to re-use animations or to feed motion captured data into your animesh. The two skeletons must have a similar topology and limb proportions, otherwise the result won't look good. The retargeting behavior can be tweaked to compensate, but a more advanced solution will be needed in the future.
- New "Debug" animation node. It currently simply displays the state of the skeleton.
- All animations and nodes can be loaded from CS's XML scene description file format. A facility has also been set up to easily manage splitting the definition of objects across several different resource files.
The Krystal model in the 'avatartest' demo application. The animations are imported from motion captured data and retargeted onto the skeleton of Krystal. The skirt is animated through soft body physical simulation and the hair is made with the dedicated fur mesh. You can also control the position of Krystal's hands through Inverse Kinematics. Finally, the model has a faked sub-surface scattering shader to improve the visual aspect of her skin.
This is not the end of the list of new functionalities:
- Decals can now be defined on animeshes, allowing easy personalization of the meshes. The decals are updated automatically to match the deformations of the animesh.
- Clothes (i.e. physical soft bodies) can be easily and precisely attached to an animesh. They will be animated by the physical simulation, and their attachment points will also be updated according to the deformations of the animesh.
- I mentored Alexandru Voicu during his Google Summer of Code 2010 project on hair simulation and rendering. Thanks to his work, there is now a dedicated system to simulate hair and fur. The motion of the hair strands is animated by the physical simulation, and the system has many properties to personalize the appearance of the hair.
- A facial animation demo has been written, showing how to define facial expressions and control the animation of the morph targets to achieve the desired effect. This system will however need to be generalized into a more advanced solution, hopefully soon.
- A motion capture facility has been written. The system is able to load animations from the BVH (Biovision Hierarchical data) file format. A viewer (csmocapviewer) makes it easy to visualize the content of a BVH file and its retargeting onto a given animesh. The viewer also ships with a Point Light Display system, designed to be used in psychology studies on the human perception of biological motion. It uses Perlin noise to add points with random motion.
- Animeshes have been integrated in "Project Bias" by SueastSide, a demo of a medieval First Person Shooter, accessible as a demonstrator in CS (the 'csbias' application).
- The libnoise library has been integrated into CS (namespace CS::Noise). It is useful to generate coherent noise functions such as Perlin noise, and can be used to generate and animate e.g. clouds or fire; it is also helpful for the procedural animation of avatars (see the work on this by the master himself).
We will keep working on CS, CEL, and particularly on the animeshes as part of the developments needed for the Immersive Virtual Environment. The main planned topics are:
- Hardware skinning pipeline and Level of Detail. These are the main functionalities missing from the current animeshes, but they should hopefully be handled by our incoming developer.
- Custom animation channels, e.g. for morph or shader variable animation
- Artificial Intelligence: inclusion in CEL and interaction with CEL's behavior trees and path finding systems
- Assets management and edition tools
- Animation events system
- More advanced Finite State Machine animation node
- Playing with libnoise's inclusion in CS and Ken Perlin's ideas on using noise to add details in the animations
- Locomotion system and walking on uneven terrains
- More advanced animation retargeting
- Assets pipeline from the MakeHuman editor into the CS animeshes
- More clothes, props, and personalization of the animeshes