My goal is to make humanoid robots walk. Anywhere.

Walking over non-flat terrains

I'm interested in the walking pattern generation problem. The goal is to compute, in real time, the immediate future trajectory of the robot. The difficulty is to traverse dynamically-diverging phases (the "falling" part of walking) between footsteps. On horizontal floors, this problem was addressed by Kajita et al. (2003), who introduced model predictive control for this task, and Wieber (2006), who framed it as a quadratic program (QP). In the following work, I found a way to extend the QP formulation to multi-contact:
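To give a flavor of how the pendulum-mode problem condenses into a QP, here is a minimal toy sketch of my own (not the formulation from these papers): ZMP-tracking model predictive control for the linear inverted pendulum with jerk inputs, in the spirit of Kajita et al. (2003). The ZMP support-area inequality constraints, which make this a proper QP, are omitted so that the problem solves in closed form; all function and parameter names below are mine.

```python
import numpy as np

def lipm_mpc_jerks(x0, zmp_ref, T=0.1, omega=3.13, jerk_weight=1e-6):
    """Compute CoM jerks over the horizon for the linear inverted pendulum.

    x0: initial state [c, cdot, cddot] along one horizontal axis.
    zmp_ref: desired ZMP at each of the N future time steps.
    Unconstrained least-squares condensation of ZMP-tracking MPC;
    support-area inequality constraints are omitted for brevity.
    """
    N = len(zmp_ref)
    A = np.array([[1.0, T, T**2 / 2], [0.0, 1.0, T], [0.0, 0.0, 1.0]])
    B = np.array([T**3 / 6, T**2 / 2, T])
    C = np.array([1.0, 0.0, -1.0 / omega**2])  # ZMP = c - cddot / omega^2

    # Powers of A: A_pows[m] = A^m
    A_pows = [np.eye(3)]
    for _ in range(N):
        A_pows.append(A @ A_pows[-1])

    # Stack the predicted ZMP over the horizon: z = Phi @ x0 + Psi @ u
    Phi = np.array([C @ A_pows[k + 1] for k in range(N)])
    Psi = np.zeros((N, N))
    for k in range(N):
        for j in range(k + 1):
            Psi[k, j] = C @ A_pows[k - j] @ B

    # Unconstrained QP: min ||Psi u + Phi x0 - z_ref||^2 + w ||u||^2
    H = Psi.T @ Psi + jerk_weight * np.eye(N)
    g = Psi.T @ (np.asarray(zmp_ref, dtype=float) - Phi @ x0)
    return np.linalg.solve(H, g)
```

With a zero initial state and a zero ZMP reference, the optimal jerks are exactly zero; re-introducing the support-area inequalities turns the closed-form solve into a call to a QP solver.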

The price to pay for keeping a QP is that the robot only uses a subset of its feasible accelerations. To discover more dynamic motions that use all feasible accelerations, I'm also exploring nonlinear optimization:
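To illustrate what a nonlinear trajectory optimization looks like (on a toy pendulum, not the walking formulation above; all names here are hypothetical), here is a direct single-shooting sketch solved with SciPy's SLSQP: find the minimum-effort torques that swing a pendulum to a target angle, where the nonlinear dynamics enter through the rollout.

```python
import numpy as np
from scipy.optimize import minimize

def rollout(u, dt=0.05, g_over_l=9.81):
    """Semi-implicit Euler rollout of a torque-controlled pendulum from rest."""
    theta, omega = 0.0, 0.0
    for uk in u:
        omega += dt * (uk - g_over_l * np.sin(theta))  # nonlinear gravity term
        theta += dt * omega
    return theta, omega

def min_effort_swing(theta_goal=0.5, N=20, dt=0.05):
    """Direct single shooting: minimum-effort torques u[0..N-1] bringing the
    pendulum from rest at theta = 0 to (theta_goal, 0) at time N * dt."""

    def terminal_error(u):
        theta, omega = rollout(u, dt)
        return [theta - theta_goal, omega]

    return minimize(
        lambda u: dt * np.sum(np.square(u)),  # control effort
        x0=np.zeros(N),
        method="SLSQP",
        constraints={"type": "eq", "fun": terminal_error},
    )
```

Unlike the QP above, nothing guarantees a global optimum here: the solver only refines the trajectory locally from its initial guess, which is the usual trade-off of nonlinear formulations.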

These solutions work in simulations with sensor noise and control delays. The next big step is to demonstrate them on the actual robot.

Contact conditions

The gap between flat-floor and multi-contact locomotion lies in the contact condition. Over flat floors, it was common to use a ZMP support area defined as the convex hull of the contact points, but this construction does not hold in general. We found the general construction in:
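To see why the convex-hull construction works on a flat floor (and only there), note that the ZMP of coplanar contacts is the pressure-weighted average of the contact points, hence a convex combination of them whenever normal forces are non-negative. A small sketch of my own (helper names are mine, not from the papers):

```python
def zmp_flat_floor(contacts):
    """ZMP of coplanar point contacts on a horizontal floor.

    contacts: list of ((x, y), f_z) pairs, with f_z >= 0 the normal force.
    The result is a convex combination of the contact points, which is
    exactly why, on a flat floor only, the support area is their convex hull.
    """
    total = sum(fz for _, fz in contacts)
    x = sum(px * fz for (px, _), fz in contacts) / total
    y = sum(py * fz for (_, py), fz in contacts) / total
    return (x, y)

def in_convex_polygon(point, vertices):
    """Check a point against a convex polygon given in counterclockwise order."""
    x, y = point
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        # Point must be on the left of every edge (cross product >= 0)
        if (x2 - x1) * (y - y1) - (y2 - y1) * (x - x1) < 0.0:
            return False
    return True
```

With non-coplanar contacts, the normals are no longer parallel and this pressure-average argument breaks down, which is where the general construction is needed.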

The (2D) ZMP is tied to the assumptions of the pendulum mode, which may be too restrictive for multi-contact locomotion. The most general contact condition is actually the (6D) contact wrench cone:
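For concreteness, here is a small sketch of my own (helper names are hypothetical) computing the net 6D wrench of a set of point contact forces, together with a linearized per-contact friction check; the contact wrench cone is then the set of all net wrenches achievable while every individual contact force stays inside its friction cone.

```python
def cross(a, b):
    """Cross product of two 3D vectors given as tuples."""
    return (
        a[1] * b[2] - a[2] * b[1],
        a[2] * b[0] - a[0] * b[2],
        a[0] * b[1] - a[1] * b[0],
    )

def net_contact_wrench(contacts, origin=(0.0, 0.0, 0.0)):
    """Net 6D wrench (force, moment at origin) of point contact forces.

    contacts: list of (position, force) pairs of 3D tuples in a common frame.
    """
    force = [0.0, 0.0, 0.0]
    moment = [0.0, 0.0, 0.0]
    for p, f in contacts:
        r = tuple(pi - oi for pi, oi in zip(p, origin))
        tau = cross(r, f)  # moment contribution of this contact
        force = [a + b for a, b in zip(force, f)]
        moment = [a + b for a, b in zip(moment, tau)]
    return tuple(force), tuple(moment)

def in_friction_pyramid(f, mu=0.5):
    """Linearized (pyramidal) friction check for one contact force given in
    its local contact frame, with z along the contact normal."""
    fx, fy, fz = f
    return fz >= 0.0 and abs(fx) <= mu * fz and abs(fy) <= mu * fz
```

Note that the two functions live in different frames: the wrench is summed in a common world frame, while the friction check applies per contact in its local frame, which is what makes aggregating the individual cones into one wrench cone non-trivial.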

Other works

Force estimation from motion capture

I greatly enjoyed working with Tu-Hoa Pham on force estimation. The idea is that, using priors on how humans make contacts, one can accurately estimate contact forces without force sensors. The priors were implemented by an LSTM recurrent neural network trained on a dataset of human motions:

All publications

You can check out my full list of publications for all my previous works.

Content on this website is under the CC-BY 4.0 license.