Understanding human behavior in manufacturing

Hannah Diorio-Toth

Jan 8, 2026

In manufacturing, safety and efficiency are two fundamental objectives for researchers trying to improve a working environment or process. You might think this means exploring ways to change how machines operate; however, there is an equally important need to examine the behavior of the humans who operate those machines.

Researchers from Carnegie Mellon University are investigating patterns of human behavior in advanced manufacturing environments in order to improve human-machine collaboration. Their paper, “Characterizing Sequential Patterns of Human Behavior in Advanced Manufacturing,” recently published in the Journal of Computing and Information Science in Engineering, explores this interaction in the context of a Wire Arc Additive Manufacturing (WAAM) machine.


Direct and indirect contact through exterior interfaces

“Machines designed by engineers are usually easy to predict. Normally, they'll do what you tell them to, in the order that you tell them to do it. Human beings are not that consistent,” explained Christopher McComb, associate professor of mechanical engineering. “But, we can still use computational techniques from engineering to try to predict these very messy, stochastic, complex human systems.”

Although manufacturing work is made up of step-by-step processes, little was known about how human operators actually behave while doing these tasks, particularly in the WAAM environment studied in this work. Because humans naturally adapt to situations in unpredictable ways, it can be difficult to find and understand patterns of use that lead to success or failure. “We have rich machine logs that tell us what happened—when a defect occurred or when performance degraded—but not why it happened. As human involvement increases in design and operation, that missing piece is often the human,” said Katherine Flanigan, assistant professor of civil and environmental engineering.

The paper applied two different sequential models to a large-scale dataset from Carnegie Mellon’s Manufacturing Futures Initiative in order to understand the human behavior patterns that appear in an advanced manufacturing work environment.
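The article does not identify the specific sequential models used, but the general idea can be sketched with a toy first-order Markov chain, one common way to model sequences of discrete events. The action labels and example sequences below are invented for illustration and are not drawn from the paper's dataset.

```python
from collections import defaultdict

# Hypothetical operator action sequences (illustrative only).
sequences = [
    ["load_plate", "calibrate", "start_weld", "monitor", "grind"],
    ["load_plate", "start_weld", "monitor", "monitor", "grind"],
    ["load_plate", "calibrate", "start_weld", "monitor", "monitor"],
]

def transition_probs(seqs):
    """Estimate P(next action | current action) from observed sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in seqs:
        # Count each consecutive pair of actions.
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    # Normalize counts into conditional probabilities.
    return {
        a: {b: c / sum(nxt.values()) for b, c in nxt.items()}
        for a, nxt in counts.items()
    }

probs = transition_probs(sequences)
# In this toy data, operators calibrated after loading the plate
# in 2 of 3 runs, so P(calibrate | load_plate) = 2/3.
print(probs["load_plate"])
```

A model like this turns raw event logs into transition probabilities, which makes common (and uncommon) action patterns directly visible; richer sequential models extend the same idea with longer histories or learned representations.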


Indirect calibration of the welder arm using remote control

“By explicitly modeling how people interact with these machines, we can begin to understand how human integration supports—or sometimes undermines—design outcomes,” explained Flanigan. “That understanding doesn’t just improve efficiency and safety; it opens the door to new AI-driven training, decision-support, and educational tools that help operators work more effectively with complex manufacturing systems.”


Direct build plate grinding

The project included collaboration across the College of Engineering and across continents. The research team included McComb, Flanigan, and Melinda Mudzurandende, a recent graduate from Carnegie Mellon University Africa. Mudzurandende, first author on the paper, earned her bachelor’s degree in mechanical engineering from Academic City University in Ghana before coming to CMU-Africa to earn her master’s in engineering artificial intelligence (MS EAI). She is interested in pursuing research in artificial intelligence accountability and is currently applying to Ph.D. programs.


Direct build plate refitting

“Melinda has an amazing skill set as an MS EAI graduate. She's picked up so many different areas of both conventional machine learning and modern large language model techniques,” said McComb.

The findings, although focused on WAAM, could in the future be applied across manufacturing environments to create data-driven standard operating procedures that improve safety and efficiency. The research team plans to continue the work, focusing next on more complex deep learning models.