MIT algorithm measures your pulse by looking at your face

How can you measure someone's pulse, just by pointing a camera at their face? MIT computer scientists have created an algorithm that can do just that, by amplifying the subtle changes in motion and colour in a video.

The algorithm analyses a video and looks for temporal variations -- elements of the video that change over time -- and then dramatically amplifies those variations, whether they are shifts in colour or tiny motions.

So, as the heart pumps blood into someone's face, their skin gets very slightly redder. The change is far too small to be perceived by the human eye, but by turning up those changes in tone, the algorithm makes the face visibly flush a deep shade of maroon with every beat, whatever the person's skin tone.
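The core idea can be sketched in a few lines of code. This is a simplified illustration, not the researchers' actual implementation: treat each pixel's intensity as a signal over time, band-pass filter it to isolate frequencies where a heartbeat could plausibly live, then amplify that filtered variation and add it back to the original video. The filter band and amplification factor below are illustrative choices.

```python
import numpy as np

def magnify_colour(frames, fps, low_hz=0.8, high_hz=3.0, alpha=50.0):
    """Amplify subtle temporal colour variation in a video.

    A toy sketch of Eulerian-style magnification, not the MIT code:
    frames is a float array of shape (T, H, W) -- pixel intensities
    over time -- and fps is the video's frame rate.
    """
    T = frames.shape[0]
    freqs = np.fft.rfftfreq(T, d=1.0 / fps)
    # Keep only temporal frequencies in the band of interest; a resting
    # heart rate of roughly 50-180 beats per minute is about 0.8-3 Hz.
    band = (freqs >= low_hz) & (freqs <= high_hz)
    spectrum = np.fft.rfft(frames, axis=0)
    spectrum[~band] = 0.0
    # The band-passed signal is the imperceptible variation; amplify it
    # and add it back so the change becomes visible.
    variation = np.fft.irfft(spectrum, n=T, axis=0)
    return frames + alpha * variation
```

On a synthetic "video" containing a faint 1.5 Hz flicker, this turns a variation of a hundredth of an intensity level into one spanning roughly half a level.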

A video of the process, called "Eulerian video magnification", shows that by measuring the change in colour, the researchers could accurately estimate a baby's heart rate. When they linked the infant up to a monitor, the estimate matched the monitor's reading.
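The measurement step can be illustrated with a similarly simple sketch (again, an assumption about the general approach, not the researchers' method): average each frame down to a single colour value, then find the dominant frequency of that signal within the plausible heart-rate band.

```python
import numpy as np

def estimate_pulse_bpm(frames, fps, low_hz=0.8, high_hz=3.0):
    """Estimate a pulse from a video's average colour signal.

    A toy illustration: frames is a float array of shape (T, H, W).
    Returns the dominant frequency in the heart-rate band, in beats
    per minute.
    """
    # Collapse each frame to one number: the mean pixel intensity.
    signal = frames.reshape(frames.shape[0], -1).mean(axis=1)
    signal = signal - signal.mean()  # remove the constant (DC) offset
    # Find the strongest frequency between low_hz and high_hz.
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= low_hz) & (freqs <= high_hz)
    peak_hz = freqs[band][np.argmax(power[band])]
    return peak_hz * 60.0  # convert Hz to beats per minute
```

Fed a synthetic signal flickering at 1.5 Hz, this reports about 90 beats per minute.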

The algorithm can find subtle changes in motion and amplify those, too. By pointing a camera at an artery, we see the imperceptible beat of a wrist become a pounding pulsation that looks ready to burst out of the arm. The rise and fall of a breathing chest can also be amplified.

MIT computer scientist Fredo Durand predicts that his algorithm will be used primarily for remote medical diagnostics. He also imagines that structural engineers could borrow the tech to measure the way wind makes a building sway or deform slightly.

Durand and his colleagues plan to make their code available later in 2012.

This article was originally published by WIRED UK