- Ruchi Hendre

# Motion Editing and Filtering

Updated: May 28, 2020

This week we learnt about mathematical concepts and saw some approaches for Motion Editing and Filtering.

To transform a motion, the following mathematical constructs can be used:

1. Rotation matrices

2. Euler angles

3. Quaternions

4. Exponential map
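These representations can be sketched in code. The snippet below is my own toy illustration (not anything from the lecture): it builds the same 90-degree rotation about Z from Euler angles and from the exponential map (Rodrigues' formula), plus a quaternion product for composing rotations.

```python
import numpy as np

def euler_to_matrix(yaw, pitch, roll):
    """Compose a rotation matrix from Z-Y-X Euler angles (radians)."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

def axis_angle_to_matrix(axis, angle):
    """Exponential map: Rodrigues' formula turns an axis-angle pair
    into a rotation matrix."""
    a = np.asarray(axis, dtype=float)
    a = a / np.linalg.norm(a)
    K = np.array([[0, -a[2], a[1]],
                  [a[2], 0, -a[0]],
                  [-a[1], a[0], 0]])  # skew-symmetric cross-product matrix
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

def quat_multiply(q1, q2):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

# The same 90-degree rotation about Z, expressed two ways:
R_euler = euler_to_matrix(np.pi / 2, 0.0, 0.0)
R_exp = axis_angle_to_matrix([0, 0, 1], np.pi / 2)
print(np.allclose(R_euler, R_exp))  # True
```

Each form has trade-offs: Euler angles are intuitive but suffer gimbal lock, while quaternions and the exponential map interpolate smoothly.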

Some others approached the problem using a shortest-path or optimization approach.

On Wednesday we saw different papers and techniques for intuitive motion editing and filtering.

B-spline editors have been used for motion editing since 1999. We saw the evolution by comparing a B-spline editor from then with one from 2016.

1. The first paper presented a technique by which the movement of the character can be controlled by sketching new trajectories on the interface.

The video presenting the paper:

This is interesting because it has very practical uses during the previsualization phase and has the potential to make that phase considerably faster.

Keywords: window of time, motion capture data, sketch-based inverse kinematics, quasi-Newton BFGS optimizer.
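To get a feel for how a BFGS optimizer fits into sketch-based posing, here is a minimal sketch of my own (a toy two-link arm and a made-up target point, not the paper's formulation): SciPy's BFGS minimizes the squared distance between the end effector and a sketched target, which is inverse kinematics phrased as optimization.

```python
import numpy as np
from scipy.optimize import minimize

def end_effector(theta):
    """Forward kinematics of a planar two-link arm with unit-length segments."""
    t1, t2 = theta
    x = np.cos(t1) + np.cos(t1 + t2)
    y = np.sin(t1) + np.sin(t1 + t2)
    return np.array([x, y])

# Hypothetical target point "sketched" by the user (within reach of the arm).
target = np.array([1.2, 0.8])

def objective(theta):
    # Squared distance from the end effector to the sketched target.
    return np.sum((end_effector(theta) - target) ** 2)

# Quasi-Newton BFGS finds joint angles that place the end effector on target.
result = minimize(objective, x0=np.zeros(2), method="BFGS")
```

The paper optimizes over whole trajectories within a window of time, but the core idea is the same: pose constraints become terms in an objective, and a quasi-Newton method solves for them.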

2. Then we saw papers on motion filters. The first was the squash-and-stretch cartoon filter paper, which enhances an animation using a Gaussian-based filter; the only input the user needs to give is the amplitude of the filter.

They found that this motion filter can be applied to video cutouts, 3D and 2D animation, and mocap data.

The kernel is basically the negative of the second derivative of the Gaussian function, multiplied by the given amplitude. The data is discretized before the filter is applied.
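A minimal sketch of that idea, assuming a 1-D motion signal and my own discretization choices (kernel radius, sigma): convolving with the negated second derivative of a Gaussian and adding the result back makes a step-like motion dip before the step and overshoot after it, which reads as anticipation and follow-through.

```python
import numpy as np

def cartoon_filter(signal, amplitude, sigma=3.0):
    """Exaggerate a 1-D motion signal by adding its convolution with
    the negated second derivative of a Gaussian, scaled by the
    user-chosen amplitude."""
    radius = int(4 * sigma)
    t = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-t**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    # Negative of the second derivative of the Gaussian.
    kernel = -((t**2 - sigma**2) / sigma**4) * g
    return signal + amplitude * np.convolve(signal, kernel, mode="same")

# A step-like motion gains anticipation (dip) and follow-through (overshoot):
x = np.concatenate([np.zeros(50), np.ones(50)])
y = cartoon_filter(x, amplitude=2.0)
```

Because the only free parameter the user touches is the amplitude, the filter is simple enough to run interactively on mocap channels.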

3. Another paper on motion filtering was Sketching Dynamic Illustrations Using the Principles of 2D Animation. This could be a useful educational tool for beginner animators learning the fundamentals of animation. The paper uses a grid and vector graphics, where every grid element has a key stroke. It also uses a standard particle simulator for the staging principle of animation, as well as a shape-matching algorithm.
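The shape-matching step can be sketched as follows; this is my own minimal version of generic least-squares shape matching (best-fit rotation via SVD), under the assumption that the paper uses something in this family rather than this exact code. Given rest positions and deformed positions, it finds the rigid transform that best maps one onto the other and returns goal positions the simulator can pull points toward.

```python
import numpy as np

def shape_match_goals(rest, current):
    """Least-squares shape matching: fit a rotation + translation mapping
    the rest shape onto the current points, and return the rigid goal
    positions to pull the particles toward."""
    c_rest = rest.mean(axis=0)
    c_cur = current.mean(axis=0)
    P = rest - c_rest       # centered rest shape
    Q = current - c_cur     # centered deformed shape
    # Optimal rotation from the cross-covariance matrix (Kabsch algorithm).
    H = P.T @ Q
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0] * (P.shape[1] - 1) + [d])  # avoid reflections
    R = Vt.T @ D @ U.T
    return (R @ P.T).T + c_cur

# Example: a unit square nudged out of shape is pulled back toward rigidity.
rest = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])
deformed = rest + np.array([[0.05, 0.], [0., -0.03], [0.02, 0.], [0., 0.]])
goals = shape_match_goals(rest, deformed)
```

The goal positions are always a rigid copy of the rest shape, so blending particles toward them keeps sketched strokes from shearing apart while they move.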

4. Next we looked towards neural networks and learning for character-scene interactions, which has tremendous use in games. This paper used a voxelised representation of the environment and had an interaction sensor and an environment sensor. They used a motion editing algorithm to satisfy constraints and meet new contacts, which is a fancy way of saying IK.

They also had a gating network and a motion prediction network in their setup.
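The gating idea can be illustrated with a toy mixture-of-experts sketch; everything here (layer sizes, single tanh layer, random weights) is my own invention to show the mechanism, not the paper's architecture. The gating network turns the current features into softmax blend weights, which mix several expert weight matrices into the one network that predicts the next pose.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy mixture-of-experts (hypothetical sizes): a gating network blends
# several expert weight matrices into one prediction network.
n_experts, d_in, d_out = 4, 8, 8
gate_W = rng.normal(size=(n_experts, d_in))
expert_W = rng.normal(size=(n_experts, d_out, d_in))

def predict(x):
    alpha = softmax(gate_W @ x)                # blend weights, sum to 1
    W = np.tensordot(alpha, expert_W, axes=1)  # blended weight matrix
    return np.tanh(W @ x)                      # one-layer "motion prediction"

x = rng.normal(size=d_in)
y = predict(x)
```

The point of the gating is specialization: different experts can end up handling different interaction phases (approaching, sitting, carrying) while the blend keeps transitions smooth.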

The video demonstrating this work can be found below:

At the end we had a discussion on motion capture data and its pros and cons. Motion capture animation still looks very rigid and needs an artist's touch, but with increasingly efficient learning algorithms we may be able to reach a more realistic simulation of character movement.