Gait analysis systems measure parameters such as walking speed, stride length, and joint angles, and these measurements drive clinical treatment for gait correction. However, detailed gait analysis requires expensive equipment, a large space, reflective markers, and considerable time. Markerless, video-based gait analysis systems, on the other hand, are less accurate. To improve upon existing systems, researchers have now combined RGB camera-based pose estimation with an inertial measurement unit sensor for gait analysis, significantly reducing measurement errors.

In people with gait disabilities (i.e., an abnormal pattern of walking, or gait), assessing gait speed, stride length, and joint kinematics is essential. Measurement of gait parameters over a period of time is critical to determine treatment effects, predict fall risk in elderly individuals, and plan physiotherapy treatments. In this regard, optoelectronic marker-based three-dimensional motion capture (3DMC), a gait analysis tool, can accurately measure gait metrics. However, economic and time constraints, coupled with requirements for a large space, extensive equipment, and technical expertise, make 3DMC impractical in clinical settings.

Alternate methods include inertial measurement unit (IMU)-based motion capture systems and RGB camera-based methods, which can measure gait without reflective markers when equipped with depth sensors. But these have their own drawbacks. IMU-based systems require many IMU sensors to be attached to human body segments, reducing their feasibility, and compared to optoelectronic 3DMC systems, RGB camera-based methods are less accurate in their measurement of kinematic parameters such as lower limb joint angles. 

Hence, improved gait analysis systems are needed.

To this end, a team of researchers comprising Dr. Masataka Yamamoto, Mr. Yuto Ishige, and Professor Hiroshi Takemura from the Faculty of Science and Technology, Tokyo University of Science, and Professor Koji Shimatani from the Prefectural University of Hiroshima, Japan, has developed a simple and accurate sensor-fusion method for gait analysis.

“We combined information from a small IMU sensor attached to the shoe with estimated information on the bones and joints of the lower limb, obtained by capturing the gait from a single RGB camera,” explains Dr. Yamamoto, the lead author of the study. In a recent article published in Scientific Reports, the researchers have detailed this method and the results they achieved with it. 

The team used single RGB camera-based pose estimation by OpenPose (OP) and an IMU sensor on the foot to measure ankle joint kinematics under various gait conditions for 16 healthy adult men between 21 and 23 years of age who did not have any limitation of physical activity. The participants’ gait parameters and lower limb joint angles during four gait conditions with varying gait speeds and foot progression angles were recorded using OP alone as well as combined measurements from OP and the IMU. The latter was the team’s novel proposed method. Results from these techniques were compared to gait analysis using 3DMC, the current gold standard.
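The press release does not spell out the fusion equations, but the underlying idea, taking the shank orientation from camera keypoints and the foot orientation from the shoe-mounted IMU, can be illustrated with a short Python sketch. The function name, keypoint choice, and angle conventions below are assumptions made for illustration only, not the authors' published algorithm.

    import numpy as np

    # Illustrative sketch only (not the study's algorithm): combine a camera-based
    # shank orientation with an IMU-based foot orientation to obtain a relative
    # sagittal-plane ankle angle. Sign conventions and offsets are assumed.
    def sagittal_ankle_angle(knee_xy, ankle_xy, foot_pitch_deg):
        # Shank direction from the ankle keypoint to the knee keypoint, measured
        # from the vertical; image coordinates have y pointing down, hence -dy.
        dx = knee_xy[0] - ankle_xy[0]
        dy = knee_xy[1] - ankle_xy[1]
        shank_pitch_deg = np.degrees(np.arctan2(dx, -dy))

        # Ankle angle as the relative orientation of the foot segment (from the
        # shoe-mounted IMU) with respect to the shank segment (from the keypoints).
        return foot_pitch_deg - shank_pitch_deg

    # Example: knee nearly above the ankle, foot pitched up by 10 degrees.
    print(sagittal_ankle_angle((320.0, 200.0), (315.0, 400.0), 10.0))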

The proposed combination method could measure gait parameters and lower limb joint angles in the sagittal plane (which divides the body into right and left halves). Moreover, the mean absolute errors of the peak ankle joint angles calculated by the combination method were significantly smaller than those obtained with OP alone in all four gait conditions. This is a significant development in gait analysis.
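As a rough illustration of the error metric involved, a mean absolute error of peak joint angles can be computed stride by stride against the 3DMC reference, as in the sketch below; the function name and data layout are assumptions rather than code from the study.

    import numpy as np

    # Illustrative only: mean absolute error (MAE) of per-stride peak joint angles
    # for a candidate method against the marker-based 3DMC reference.
    def peak_angle_mae(method_strides, reference_strides):
        # Each argument is a list of 1-D arrays, one joint-angle curve per stride.
        method_peaks = np.array([curve.max() for curve in method_strides])
        reference_peaks = np.array([curve.max() for curve in reference_strides])
        return np.abs(method_peaks - reference_peaks).mean()

    # Example with made-up angle curves (in degrees) for two strides.
    est = [np.array([0.0, 5.0, 12.0, 7.0]), np.array([1.0, 6.0, 11.0, 6.0])]
    ref = [np.array([0.0, 6.0, 13.0, 8.0]), np.array([0.5, 6.5, 12.5, 7.0])]
    print(peak_angle_mae(est, ref))  # prints 1.25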

“Our method has the potential to be used not only in medicine and welfare, but also to predict the decline of gait function in healthcare, for training and skill evaluation in gyms and sports facilities, and for accurate projection of human movements onto an avatar by integrating with virtual reality systems,” notes Dr. Yamamoto.

With further research, this method can be adapted to clinical settings and a larger demographic.

[Source: Tokyo University of Science, Japan]

CAPTION: TUS researchers develop a method that enables accurate gait analysis by combining information from a small inertial measurement unit (IMU) attached to the shoe with estimated information on the bones and joints of the lower limbs, obtained by capturing the gait from a single RGB camera. (Image Credit: Masataka Yamamoto from Tokyo University of Science, License Type: CC BY 4.0)