AI and the human body: Hidden assumptions in motion capture can have serious impact

February 1, 2024
Written By:
Noor Hindi, School of Information
Concept illustration of AI and motion capture. Image credit: Nicole Smith, made with Midjourney

When designers rely on inaccurate depictions of the human body, applications that use artificial intelligence may be less safe for people who don’t fit that body type, according to a new study.

These flawed assumptions define what is considered the norm for human bodies, and have made their way into AI through motion capture, said study co-author Abigail Jacobs, assistant professor at the U-M School of Information and Center for the Study of Complex Systems.

The study shows how AI plays an important role in the design, development and implementation of motion capture systems, which infer the movement of people, animals and objects in space.

Using sensors and/or cameras, these systems collect data that can then be modeled on a computer to create “digital skeletons” that can be used to animate video games, diagnose health conditions or simulate workplace ergonomics.
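At its simplest, this kind of pipeline can be pictured as fitting observed marker positions to a reference body template. The short Python sketch below is purely illustrative and not from the study; the joint names, segment lengths and "misfit" check are hypothetical, meant only to show how a single hard-coded template quietly assumes one body's proportions.

import math

# Hypothetical reference template: segment lengths in meters, implicitly
# modeled on a single "standard" body. This is the kind of hidden
# assumption the researchers describe, not a value from the study.
TEMPLATE_SEGMENTS = {
    ("hip", "knee"): 0.45,
    ("knee", "ankle"): 0.43,
    ("shoulder", "elbow"): 0.33,
    ("elbow", "wrist"): 0.27,
}

def template_misfit(markers):
    """Compare observed segment lengths against the reference template.

    markers maps a joint name to its captured (x, y, z) position.
    Returns, per segment, the ratio of observed length to template length.
    A pipeline that forces every segment to one uniform scale factor will
    distort any body whose proportions differ from the template.
    """
    ratios = {}
    for (parent, child), ref_len in TEMPLATE_SEGMENTS.items():
        if parent in markers and child in markers:
            observed = math.dist(markers[parent], markers[child])
            ratios[(parent, child)] = observed / ref_len
    return ratios

# Toy example: this body's thigh is proportionally shorter than the
# template's, so no single scale factor fits both segments at once.
markers = {
    "hip": (0.0, 0.95, 0.0),
    "knee": (0.02, 0.57, 0.01),
    "ankle": (0.03, 0.14, 0.02),
}
print(template_misfit(markers))

Real systems use far more sophisticated body models and fitting procedures, but the researchers' point is the same: whatever body the template encodes becomes the default against which everyone else is measured.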

“These systems are used in applications ranging from designing safe manufacturing floors to augmented reality and autonomous vehicles,” Jacobs said. “They depend on stylized and flawed assumptions about, for instance, whose human bodies are ‘standard’ or ‘representative.’”

In the same study, Jacobs and colleagues also delved into historical practices dating back to the 1930s, revealing a concerning trend: an overreliance on healthy, adult men to represent “typical” bodies and movements. Other baselines still part of modern, state-of-the-art systems rely on the bodies of deceased men, where the simulated movement of frozen cadavers stands in for live movement. This leads to distorted representations and assumptions, the researchers said.

Over time, these assumptions become baked into modern software and can distort how motion capture systems represent bodies. This is analogous to how color photography was originally calibrated for light skin tones, a legacy that still harms how modern cameras represent darker-skinned bodies.

“Consider the historical practice of modeling crash test dummies on normative male bodies, which led to higher injury rates for women and children,” Jacobs said. “Given the range of applications of motion capture systems, it is potentially harmful when bodies are assumed to move and look like, for instance, athletic young men or frozen cadavers.”

The study lays out an analytical framework that can be applied to other technologies: paying attention to how assumptions are built into hardware and AI, how bodies are represented in AI systems, and how hidden assumptions, often old and baseless ones, shape present-day technologies, Jacobs said.

The study was posted online in January. Co-authors include Emma Harvey and Hauke Sandhaus of Cornell University, Emanuel Moss of Intel Corp. and Mona Sloane of the University of Virginia.