On a recent project our researchers have been given the rare freedom of becoming both guinea pigs and experimenters – strapping various cameras first to themselves, and subsequently to our amenable respondents, to create timelapse videos of their lives. We tested cameras that can be set to capture images at prescribed time intervals, and a camera that automatically grabs images of significant events in response to input from an array of inbuilt sensors. It takes a photo if you start to move, if you change direction, or if you stray from light into shade.
In analysing the data, the problem we found was that these aren’t generally the events in a person’s day that can really be considered ‘significant’. Significance and meaning come with emotional context and complex interaction, and to understand that we still need human interpretation. Our cameras produced some truly fantastic footage; we were able to see everywhere that our respondents had been and everything they had done, including exactly how many times one particular individual checked his phone before settling down to work in the morning. What we couldn’t see so easily was how these fragments of life could be useful on more than a superficial or illustrative level.
This touches on a trend that currently seems to define the world of wearable technology: technological innovation outstripping practical application. At a recent ‘Wearable Futures’ conference we encountered all sorts of devices, from super-sophisticated pedometers to chameleon fabrics that respond to wearers’ moods. Almost every one seemed capable of changing our future, but nobody knew exactly how.
Here at Revealing Reality we have played with biometric monitors that record the smallest of body movements, and with sensors that pick up the subtlest of emotional responses. We’ve been able to generate staggering quantities of data on human experience. Our challenge now is to explore how we can use that data to reveal the reality of human lives.