New machine-learning technique can distinguish living bodies from deceased ones
In the aftermath of disasters, drones have already been used to map the destruction and help rescuers find possible survivors. Now a new system could take this to the next level, automatically analyzing drone footage to determine whether the people spotted are still alive.
“We’re using computer vision, and what we’re looking for are very small changes that are associated with movement—that rhythmic movement of breathing,” says Javaan Chahl, a sensor systems researcher at the University of South Australia and senior author of a study describing the process, published last October in Remote Sensing.
The system uses machine learning to analyze a 30-second video clip of a human body, measuring changes in light reflected from the area of the chest where breathing motion would be most apparent. Then it determines whether shifts in intensity are consistent with a live, breathing person. The researchers tested the system on footage of nine subjects: eight living humans and one mannequin with a wig and makeup.
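The core idea can be illustrated with a minimal sketch: track the mean brightness of a chest region frame by frame, then check whether that signal carries a rhythmic component at plausible breathing rates. Everything here is an assumption for illustration — the function name, the frequency band, and the spectral-power score are not from the paper, which does not disclose its actual features or classifier.

```python
import numpy as np

def breathing_score(intensity, fps=30.0, band=(0.1, 0.7)):
    """Fraction of spectral power in a plausible breathing band (~6-42 breaths/min).

    `intensity` is a hypothetical 1-D series of mean pixel brightness over the
    chest region, one sample per video frame. A high score suggests a rhythmic,
    breathing-like oscillation; a low score suggests a motionless body.
    """
    x = np.asarray(intensity, dtype=float)
    x = x - x.mean()                                 # remove the DC offset
    spectrum = np.abs(np.fft.rfft(x)) ** 2           # power spectrum
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fps)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    total = spectrum[1:].sum()                       # skip the zero-frequency bin
    return spectrum[in_band].sum() / total if total > 0 else 0.0

# Simulated 30-second clip at 30 fps: a breathing subject (~0.25 Hz chest
# oscillation plus sensor noise) versus a motionless mannequin (noise only).
fps = 30.0
t = np.arange(int(30 * fps)) / fps
rng = np.random.default_rng(0)
alive = np.sin(2 * np.pi * 0.25 * t) + 0.3 * rng.standard_normal(t.size)
still = 0.3 * rng.standard_normal(t.size)

print(round(breathing_score(alive, fps), 2), round(breathing_score(still, fps), 2))
```

In this toy version the breathing subject scores far higher than the mannequin; a real pipeline would first have to detect the body and stabilize the chest region across frames before any such signal could be extracted.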
Test subjects were unobscured, but Chahl says the system could also work on people partially covered by rubble—as long as their torsos are visible. Past attempts at identifying vital signs using drones measured subtle changes in skin color, which can indicate blood flow. But those systems have to view exposed skin over pulse points, meaning the drones must hover much closer.
The researchers have yet to test their system in the field. “This experiment seems to work in very controlled conditions, where bodies are lying in static poses on the ground and drones four to eight meters up in the air are performing these visible-light video captures,” says Lisa Parks, a media researcher at the Massachusetts Institute of Technology, who studies drones and surveillance but was not involved in the new study. In real disaster-recovery situations, Parks notes, conditions such as wind, rain, temperature fluctuations and running water could interfere with reflected light. Without a more realistic test scenario, she says, “I wonder how feasible this really would be if rolled out into an actual postdisaster context.”
Chahl agrees that the system’s current version has limits. “At the moment, the drone is … looking for people on the ground, and then it looks to see whether they’re alive,” he says. “That’s not quite the Star Trek life-sign scanners that I’ve always wondered about.” But now that the basic concept has been proved, Chahl hopes to develop it further. “What we’d like to do is actually use the life signs to detect the people,” he says, “so you can make a map of where there’s likely to be people and not.”