Seeing Better Than Us
Machines will be able to tell more just by looking at us.
I’ve become fascinated with the ability to augment video to reveal details that we wouldn’t normally be able to see. The method involves accentuating small changes to make them much more visible. With this technique, you can take subtle changes in skin colour between frames and calculate a person’s heart rate. But it goes even further — eye movement, blood flow, twitches. If micro-expressions are a real thing, then we can make them as visible as ordinary expressions. Combine this with the latest in machine learning, and machines will be far better than we are at telling how we’re feeling.
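To make the heart-rate idea concrete, here is a minimal sketch of the signal-processing half: average the green channel of each video frame (blood absorbs green light strongly, so it pulses faintly with each heartbeat), then look for the dominant frequency in a plausible heart-rate band. This is my own illustrative code, not the actual magnification pipeline described above; the function name and the synthetic demo data are invented for the example.

```python
import numpy as np

def estimate_heart_rate(frames, fps, low_hz=0.7, high_hz=4.0):
    """Estimate pulse (in BPM) from subtle frame-to-frame colour changes.

    frames: array of shape (n_frames, height, width, 3), RGB.
    Hypothetical sketch: mean green intensity per frame, then the
    strongest frequency between low_hz and high_hz (42-240 BPM).
    """
    signal = frames[:, :, :, 1].mean(axis=(1, 2))   # mean green per frame
    signal = signal - signal.mean()                 # drop the DC component
    spectrum = np.abs(np.fft.rfft(signal))          # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= low_hz) & (freqs <= high_hz)   # plausible pulse range
    peak = freqs[band][np.argmax(spectrum[band])]   # dominant frequency
    return peak * 60.0                              # Hz -> beats per minute

# Synthetic demo: a 1.2 Hz (72 BPM) "pulse" buried in pixel noise.
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
pulse = 0.5 * np.sin(2 * np.pi * 1.2 * t)
rng = np.random.default_rng(0)
frames = 128 + pulse[:, None, None, None] + rng.normal(0, 2, (len(t), 8, 8, 3))
print(round(estimate_heart_rate(frames, fps)))  # ~72
```

A real system would first amplify the colour variation (the magnification step) and track a stable patch of skin, but the frequency analysis above is the core of turning colour flicker into a number.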
Such technology might make lying more difficult. Does saying or hearing something change your heart rate? How does blood flow change? Do you become flushed in the face? We might each have a tell, and if we do, vision processing might be able to find it.
Eye tracking might be taken to a new level to better understand where we are spending our focus. Could our laptop cameras notice that our eyes are wandering across the screen and nudge us back on task? Could we scientifically determine whether looking up, down, to the left, or to the right after saying something actually means anything? This might be possible soon.