Animating Avatars Using Motion Capture Data Extracted from Video Stream Data

Abstract

Motion capture provides extremely realistic, real-time results. Research and advances in the field of motion capture have concentrated on the motion of human characters. Current motion capture techniques require very specialised equipment, sensory markers, and tracking devices. It is not always suitable or practical to use such equipment for capturing all moving objects, and motion capture for animals and other living creatures poses many complications.

This paper presents a different motion capture technique that does not require any markers, body suits, or other devices attached to the performer; the only equipment needed is a hand-held video camera. This technique enables motion capture of the complex motion of animals and other living creatures. The paper investigates current methods of motion capture and their feasibility for capturing complex animal motion, and examines an efficient method of extracting 3D motion using a single hand-held video camera. A motion capture application is implemented to view video data and to extract skeleton and motion data from the film footage captured with the camera. A model viewer application is created which models and views the data extracted from motion capture files in a three-dimensional environment.

This paper provides the reader with a solution to motion capture for animals and other moving objects that have very complex motion and to which it is not possible to attach markers or other equipment. Various motion capture techniques and technologies are compared and discussed in conjunction with the technique proposed by this paper.

Participants

Technical Reports

[1] Mark Whitfield. User-defined path following using motion capture data. Technical Report Literature Review, Virtual Reality Special Interest Group, Computer Science Department, Rhodes University, Grahamstown, South Africa, June 2004. [PDF] [BibTeX]

[2] Mark Whitfield. Adapting motion capture data to follow an arbitrary path. Technical Report Honours Project Report, Virtual Reality Special Interest Group, Computer Science Department, Rhodes University, Grahamstown, South Africa, November 2004. [PDF] [BibTeX]

[3] Mark Whitfield. Adapting motion capture data to follow an arbitrary path. Technical report, Virtual Reality Special Interest Group, Computer Science Department, Rhodes University, Grahamstown, South Africa, October 2004. [PDF] [BibTeX]

[4] Mark Whitfield. Adapting motion capture data to follow a user-defined path. Technical Report Poster, Virtual Reality Special Interest Group, Computer Science Department, Rhodes University, Grahamstown, South Africa, August 2004. [PDF] [PNG] [BibTeX]

[5] Andrew Peirson. Interactive synthesis of avatar motion from preprocessed motion data. Technical report, Virtual Reality Special Interest Group, Computer Science Department, Rhodes University, Grahamstown, South Africa, November 2003. [PDF] [BibTeX]

[6] Andrew Peirson. Interactive synthesis of avatar motion from preprocessed motion data. Technical Report Honours Project Report, Virtual Reality Special Interest Group, Computer Science Department, Rhodes University, Grahamstown, South Africa, November 2003. [PDF] [BibTeX]

[7] Jennifer Matlock. An automatic lip synchronisation system. Technical Report Honours Project Report, Virtual Reality Special Interest Group, Computer Science Department, Rhodes University, Grahamstown, South Africa, November 2002. [DOC] [PDF] [BibTeX]

[8] Melissa Palmer. Animating avatars using motion capture data extracted from video stream data. Technical Report Honours Project Report, Virtual Reality Special Interest Group, Computer Science Department, Rhodes University, Grahamstown, South Africa, November 2001. [PDF] [WPD] [BibTeX]

Images