Motion Retrieval

Cartwheel animation

Current motion-capture technologies (e.g., Microsoft Kinect or ASUS Xtion) make it possible to record 3D data of a moving human. The recorded data can be utilized in many kinds of applications: in sports for comparing performance aspects of athletes, in security research for identifying special-interest persons, in medicine for determining the success of rehabilitative treatments, and in computer animation for synthesizing realistic motions. Such applications usually require solving the following problem: given a short query motion and a database of long recorded motions, identify all database sub-motions that are similar to the query. This functionality is integrated within two web applications: MotionMatch and Multi-modal Person Identification. A new prototype version utilizing neural-network features is available here.

A short query motion is specified by selecting its beginning and ending frames. After the query sub-motion is selected and the “Retrieve sub-motions similar to the selection” button is clicked, the technology retrieves query-similar sub-motions, which are then displayed on a web page and ordered from left to right by their similarity score. The number of retrieved sub-motions is limited to the 50 most similar ones. A new query can also be defined by selecting a motion from the retrieved results.
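The retrieval step described above can be illustrated with a minimal sketch. This is not the similarity model from the publications below (which use key poses and more elaborate measures); it only assumes that each motion is a sequence of per-frame pose descriptors (e.g., flattened 3D joint coordinates) and slides a query-length window over a database motion, ranking windows by a simple frame-wise distance and keeping the 50 best matches:

```python
import numpy as np

def motion_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Mean per-frame Euclidean distance between two equal-length motions.

    Each motion is a (frames, features) array of pose descriptors.
    This simple measure is only a stand-in for the similarity models
    discussed in the related publications.
    """
    return float(np.linalg.norm(a - b, axis=1).mean())

def retrieve_similar(query: np.ndarray, database: np.ndarray,
                     step: int = 5, k: int = 50):
    """Slide a query-length window over the database motion and return
    the k most similar sub-motions as (start_frame, distance) pairs,
    ordered by ascending distance (best match first)."""
    q_len = len(query)
    hits = []
    for start in range(0, len(database) - q_len + 1, step):
        window = database[start:start + q_len]
        hits.append((start, motion_distance(query, window)))
    hits.sort(key=lambda h: h[1])
    return hits[:k]

# Illustrative usage on synthetic data: plant the query at frame 40
# of a random database motion and retrieve it back.
rng = np.random.default_rng(0)
database = rng.normal(size=(200, 6))
query = database[40:60].copy()
hits = retrieve_similar(query, database)
```

In a real system, the exhaustive sliding window would be replaced by an index structure over motion segments, and the frame-wise distance by a model that tolerates differences in speed and actor body proportions.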

Related Publications

  • J. Sedmidubsky, J. Valcik: Retrieving Similar Movements in Motion Capture Data. In 6th International Conference on Similarity Search and Applications (SISAP 2013). Springer-Verlag, 2013.
  • J. Sedmidubsky, J. Valcik, P. Zezula: A Key-Pose Similarity Algorithm for Motion Data Retrieval. In 12th International Conference on Advanced Concepts for Intelligent Vision Systems (ACIVS 2013). Springer-Verlag, 2013.
  • J. Valcik, J. Sedmidubsky, P. Zezula: Assessing Similarity Models for Human-Motion Retrieval Applications. Computer Animation and Virtual Worlds, John Wiley & Sons Ltd. ISSN 1546-427X, 2015.