
World-First Study Separates the Living from the Dead

Image Credits: Roschetzky Photography/Shutterstock.com

Autonomous drone cameras have been trialled for several years to detect signs of life in disaster zones. Now, in a world-first study, researchers from Adelaide and Iraq have taken this a step further.

Using a new technique to monitor vital signs remotely, engineers from the University of South Australia and Middle Technical University in Baghdad have designed a computer vision system that can distinguish survivors from deceased bodies at distances of 4–8 metres.

As long as the upper torso of a human body is visible, the cameras can pick up the tiny movements in the chest cavity that indicate a heartbeat and breathing rate. Unlike previous studies, the system doesn’t rely on skin colour changes or body temperature.
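The published pipeline is more sophisticated, but the general idea of spotting periodic cardiopulmonary motion in video can be sketched in a few lines of Python. The snippet below is an illustration rather than the authors' method: it treats the mean frame-to-frame intensity change in a chest region of interest as a crude motion signal, then looks for a dominant spectral peak in a combined breathing and heart-rate band (roughly 0.1–2 Hz). The file path, region of interest, frequency band and signal-to-noise threshold are all placeholder assumptions.

```python
# Minimal sketch (not the published pipeline): test whether a video of a
# chest region contains periodic cardiopulmonary motion. Requires OpenCV
# and NumPy; paths, ROI and thresholds are illustrative only.

import cv2
import numpy as np


def motion_signal(video_path, roi=None):
    """Return a 1-D time series of frame-to-frame motion in the ROI, plus the frame rate."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    prev, signal = None, []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
        if roi is not None:
            x, y, w, h = roi
            gray = gray[y:y + h, x:x + w]
        if prev is not None:
            # Mean absolute intensity change is a crude proxy for chest motion.
            signal.append(np.mean(np.abs(gray - prev)))
        prev = gray
    cap.release()
    return np.array(signal), fps


def has_vital_periodicity(signal, fps, band=(0.1, 2.0), snr_threshold=3.0):
    """Check for a dominant spectral peak in the breathing/heart-rate band."""
    sig = signal - signal.mean()
    spectrum = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fps)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    if not in_band.any():
        return False
    peak = spectrum[in_band].max()
    noise = np.median(spectrum[1:]) + 1e-9   # skip the DC bin
    return peak / noise > snr_threshold      # strong periodic motion -> likely alive


# Example usage (hypothetical file and chest ROI):
# sig, fps = motion_signal("scene.mp4", roi=(300, 200, 160, 120))
# print("signs of life" if has_vital_periodicity(sig, fps) else "no periodic motion")
```

Under these assumptions, a mannequin with no periodic chest motion would produce a flat spectrum and fail the peak test, which is essentially the distinction the study's cameras were able to make at a distance.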

The breakthrough is a more accurate means of detecting signs of life, the researchers say.

UniSA Professor Javaan Chahl and Dr Ali Al-Naji, the study leaders, made global headlines in 2017 when they showed for the first time that a camera on a drone could measure heart and respiratory rates.

At the time, their technique was based on detecting changes in human skin tone, and the camera needed to be within three metres of the person. The technique was also limited to a single pose, with the subject standing in front of the drone rather than lying prone as they would be in a disaster zone.

Other techniques using thermal cameras can only detect signs of life where there is a contrast between body temperature and the background, which makes detection difficult in warm environments. Thermal cameras are also unreliable when people are wearing insulated clothing.

“This study, based on cardiopulmonary motion, is the first of its type and was performed using eight people (four of each gender) and a mannequin, all lying on the ground in different poses. Videos were taken of the subjects in daylight, up to eight metres away, and in relatively low wind conditions for one minute at a time, with the cameras successfully distinguishing between the live bodies and the mannequin.”

Professor Javaan Chahl, University of South Australia

Prof Chahl says the technology could be used to monitor for signs of life where time is critical, helping first responders in their search to find survivors in disaster zones.

“This system would be ideal for many situations, including earthquakes and floods, nuclear disasters such as Fukushima, chemical explosions, bio attacks, mass shootings, combat search and rescue, or where a plane has crashed in a remote area.”

Professor Javaan Chahl, University of South Australia

Current ground-based operations for rescuing survivors in disaster zones include the use of rescue robots and rescue dogs, which are expensive and hampered by restricted access.

He says the motion-based system needs additional testing in adverse weather conditions and with partially obscured bodies to ensure accurate readings.

The findings are published in Remote Sensing and are available at: https://www.mdpi.com/2072-4292/11/20/2441/pdf.