NIPS 2006


EHuM: Evaluation of Articulated Human Motion and Pose Estimation

Leonid Sigal · Ming-Hsuan Yang · Michael J Black


A large body of work on human pose estimation and tracking from video has developed over the last 10 years. Many of these methods are based on well-founded statistical models and machine learning techniques. Progress, however, has been limited by the lack of common datasets and error metrics for quantitative comparison. The goals of this workshop are to (1) establish the current state of the art in human pose estimation and tracking from single and multiple camera views, (2) discuss future directions in the field, and (3) introduce a benchmark database and error metrics for comparing current and future methods. To this end, a new dataset (HumanEva) for the evaluation of articulated human motion will be introduced.
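To illustrate the kind of quantitative error metric such a benchmark enables, the sketch below computes the mean Euclidean distance between estimated and ground-truth 3D joint positions, a common choice for articulated pose evaluation. This is a hedged illustration, not the workshop's official metric; the function name and array layout are assumptions.

```python
import numpy as np

def mean_joint_error(pred, gt):
    """Mean Euclidean distance between predicted and ground-truth
    3D joint positions (arrays of shape [num_joints, 3]), in the
    same units as the inputs (e.g., millimeters).

    Illustrative metric only; name and interface are hypothetical.
    """
    pred = np.asarray(pred, dtype=float)
    gt = np.asarray(gt, dtype=float)
    # Per-joint Euclidean distances, then averaged over joints.
    return float(np.linalg.norm(pred - gt, axis=1).mean())

# Toy example: 3 joints, each displaced by 10 units along one axis.
gt = np.zeros((3, 3))
pred = gt + np.array([[10.0, 0, 0], [0, 10.0, 0], [0, 0, 10.0]])
print(mean_joint_error(pred, gt))  # 10.0
```

Averaging over joints (and, in practice, over frames of a sequence) yields a single number that allows direct comparison between competing methods on the same data.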
