Cameras and machine learning take pulse of remote patients
Telemedicine could be improved by a method that uses the camera on an electronic device to capture pulse and respiration signals from real-time video of a patient’s face.
The advance by a University of Washington-led team was presented in December 2020 at the Neural Information Processing Systems conference. The team is now proposing an improved method for measuring these physiological signals, with a system less likely to be hindered by different cameras, lighting conditions or facial features. The researchers will present these findings April 8 at the ACM Conference on Health, Inference, and Learning.
"Machine learning is pretty good at classifying images. If you give it a series of photos of cats and then tell it to find cats in other images, it can do it. But for machine learning to be helpful in remote health sensing, we need a system that can identify the region of interest in a video that holds the strongest source of physiological information - pulse, for example - and then measure that over time," said lead author Xin Liu, a UW doctoral student in the Paul G. Allen School of Computer Science & Engineering.
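The general idea behind this kind of remote health sensing, known as remote photoplethysmography, is that the blood volume under the skin changes slightly with each heartbeat, subtly modulating the colour of the face in video. The sketch below is not the UW team's method; it is a minimal, illustrative version of the classic approach: average the green channel over a face region in each frame, then find the dominant frequency of that signal within the human heart-rate band. The function name and synthetic demo data are assumptions for illustration.

```python
import numpy as np

def estimate_pulse_bpm(frames, fps):
    """Estimate pulse rate from video frames.

    frames: array of shape (n_frames, height, width, 3), RGB.
    fps: frames per second of the video.
    Returns the estimated heart rate in beats per minute.
    """
    # 1. Spatially average the green channel, which carries the
    #    strongest blood-volume signal in skin video.
    signal = frames[:, :, :, 1].mean(axis=(1, 2))
    # 2. Remove the constant baseline (mean skin tone / lighting level).
    signal = signal - signal.mean()
    # 3. Find the dominant frequency via FFT, restricted to the
    #    plausible heart-rate band of 0.7-4 Hz (42-240 BPM).
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= 0.7) & (freqs <= 4.0)
    return freqs[band][np.argmax(power[band])] * 60.0

# Synthetic demo: 10 s of "video" whose green channel pulses at 1.2 Hz (72 BPM).
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
pulse = 0.5 * np.sin(2 * np.pi * 1.2 * t)      # simulated 72 BPM heartbeat
frames = np.full((len(t), 8, 8, 3), 128.0)
frames[:, :, :, 1] += pulse[:, None, None]     # modulate the green channel
print(round(estimate_pulse_bpm(frames, fps)))  # recovers ~72 BPM
```

A real system, as Liu notes, must first solve the harder problem this toy example skips: locating the region of interest that actually carries the physiological signal, and doing so robustly across skin tones, cameras and lighting.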