Motion capture: Building a 2D virtual character to mimic human performance using a computer vision-based system
Abstract
With the rapid advancement of technology, virtual reality experiences have gained
immense popularity, leading to increased interest in understanding the creation and potential
applications of virtual characters. This thesis project introduces an innovative approach to
motion capture using computer vision-based techniques to generate a two-dimensional
virtual character that can simulate human performance in real time. Using a standard computer
webcam, the proposed method tracks and captures human movements and emotions, which
are then translated onto a two-dimensional model using Python and C# algorithms.
Complementary tools such as Clip Studio, Live2D, and Unity are used to generate the virtual
character data and create the virtual environment for the character.
The proposed method is divided into three key phases to achieve a fully
functional result: (1) collecting virtual character data, (2) constructing systems that detect
and track body movements, and (3) integrating these processes into the virtual environment
in Unity. Two systems are established to give the virtual character the capability to imitate
human performance: the Half-body Motion Tracking system and the Facial Expression
Detection system.
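The abstract does not name the specific computer-vision libraries behind these two systems; as a minimal illustrative sketch only, the Python snippet below shows how a webcam feed could be processed with OpenCV and MediaPipe (an assumed, not confirmed, choice) to obtain the body and face landmarks that a Half-body Motion Tracking and a Facial Expression Detection system would forward to a Live2D/Unity character rig.

# Minimal sketch: webcam-based upper-body and face landmark tracking.
# MediaPipe and OpenCV are assumed backends; the thesis does not name them.
import cv2
import mediapipe as mp

mp_holistic = mp.solutions.holistic

cap = cv2.VideoCapture(0)  # default computer webcam
with mp_holistic.Holistic(min_detection_confidence=0.5,
                          min_tracking_confidence=0.5) as holistic:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV delivers BGR frames.
        results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

        if results.pose_landmarks:
            # Example: normalized shoulder positions that could drive the
            # upper-body parameters of a Live2D/Unity rig.
            lm = results.pose_landmarks.landmark
            ls = lm[mp_holistic.PoseLandmark.LEFT_SHOULDER]
            rs = lm[mp_holistic.PoseLandmark.RIGHT_SHOULDER]
            print(f"shoulders: ({ls.x:.2f}, {ls.y:.2f}) ({rs.x:.2f}, {rs.y:.2f})")

        if results.face_landmarks:
            # Face-mesh points; distances between selected points can be
            # mapped to expression parameters (e.g. mouth open, eye blink).
            print("face landmarks:", len(results.face_landmarks.landmark))

        if cv2.waitKey(1) & 0xFF == 27:  # press Esc to stop
            break

cap.release()
cv2.destroyAllWindows()

In such a pipeline, the printed landmark values would instead be streamed (for example over a local socket) to the Unity/C# side, where they drive the character's Live2D parameters; the exact transport and parameter mapping are left unspecified here.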
Overall, this thesis project introduces a novel and practical method for motion
capture, enabling the development of virtual experiences in the digital realm through a simpler,
more cost-effective, and more efficient approach. It also emphasizes the potential applications of
virtual humans in areas such as identity security, the entertainment industry, education, and
other fields involving human-computer interaction. With its real-time capabilities and
adaptability, this approach demonstrates the significant influence of virtual humans, offering
new opportunities and broadening the horizons of human-computer interaction.