There’s a huge amount of data generated by all the quantified-self apps and devices, from basic activity and health metrics such as heart rate and breathing rate to the height and duration of jumps while kiteboarding. But most of these devices and apps just display the data as simple stats and diagrams, for example your sleeping or activity patterns over time.
Of course that’s interesting and helps to understand one’s own sleeping behavior. But it’s the same kind of ineffective data puking that happens in so many companies: it doesn’t lead to really valuable insights and actions. As data scientist DJ Patil argued in his talk at LeWeb, we are really good at gathering data through ever smaller, faster and more precise sensors. We are also quite good at processing these massive amounts of data with things like MapReduce, real-time stream processing etc. But we still really suck at generating insights and actions.
Why not use established statistical modelling techniques to generate these insights and recommendations for actions? If I were wearing a quantified-self device, I would like to be warned that I will get sick if I don’t start running more regularly, depending on the weather, or that I should go to bed before 10pm tonight to avoid catching the flu.
I would like to be reminded early enough that I should start running more at the beginning of winter, since I tend to get depressed due to the lack of light. Or how about learning which activities and meals during the day lead to better sleeping patterns the following night? If I have just started wearing the device and there isn’t enough data on my activity and sleeping behavior to generate significant patterns and insights, why not use the aggregated data of all the other users who wear the same device?
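To make the modelling idea concrete, here is a minimal sketch of what such an insight engine could look like: a plain logistic regression, fitted from scratch, that estimates the risk of getting sick from weekly running minutes and average sleep. The data, feature choices and the underlying "rule" are entirely invented for illustration; a real device vendor would train on actual user histories.

```python
import math
import random

# Hypothetical weekly records: (running minutes, avg hours of sleep, got_sick).
# The data-generating rule below is made up purely for this sketch:
# less running and less sleep -> higher chance of getting sick.
random.seed(0)
records = []
for _ in range(200):
    run = random.uniform(0, 300)
    sleep = random.uniform(5, 9)
    risk = 1 / (1 + math.exp(0.02 * run + 1.5 * (sleep - 7)))
    records.append((run, sleep, 1 if random.random() < risk else 0))

# Logistic regression fitted by batch gradient descent (no external libraries).
w_run, w_sleep, bias = 0.0, 0.0, 0.0
lr = 0.01
for _ in range(2000):
    g_run = g_sleep = g_bias = 0.0
    for run, sleep, sick in records:
        p = 1 / (1 + math.exp(-(w_run * run / 100 + w_sleep * sleep + bias)))
        err = p - sick
        g_run += err * run / 100
        g_sleep += err * sleep
        g_bias += err
    n = len(records)
    w_run -= lr * g_run / n
    w_sleep -= lr * g_sleep / n
    bias -= lr * g_bias / n

def sick_risk(run_minutes, sleep_hours):
    """Estimated probability of getting sick, per the fitted model."""
    z = w_run * run_minutes / 100 + w_sleep * sleep_hours + bias
    return 1 / (1 + math.exp(-z))

# A couch potato on short sleep should score a higher risk than a rested runner.
print(sick_risk(10, 5.5) > sick_risk(240, 8))
```

The point is not the particular model: once a probability like this is available, the device can turn it into an action ("go running today", "be in bed by 10pm") instead of just charting last week’s step counts.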
To take this further and ultimately find the patterns that lead to the highest level of happiness, it would be interesting to integrate data about how a person is feeling (happy, depressed, anxious, relaxed etc.). Other interesting data points could include how much time a person spends being mindful or has invested into meditating, yoga, praying etc. These types of data would be highly subjective, but I could still see how they could be standardized and lead to more meaningful insights.
It will be interesting to see if there is going to be some standard representation of quantified-self data and whether APIs will be provided for retrieving data from different devices and synchronizing it. Or whether the device vendors will try to lock users in, owning this valuable data and all the insights and recommendations for actions that could be derived from it. The former could lead to a whole range of new enterprises that integrate data from all kinds of different devices on their quest to solve an individual’s happiness formula, including even seemingly unrelated data such as which websites have been visited or which TV shows have been watched.
Up until now it seems as if only engineers and data scientists have managed to pull the data out of the quantified-self devices to do some advanced analysis on it. For example, Dan Goldin imported his RunKeeper data into R and correlated the distance of his runs with his speed.
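A similar analysis can be sketched in a few lines of Python. The column names below mimic what a RunKeeper CSV export might look like, but they are assumptions for illustration, not a verified description of the export format, and the sample rows are invented:

```python
import csv
import io
import math

# A tiny stand-in for a RunKeeper CSV export; column names and values are
# invented for this sketch.
SAMPLE = """Type,Distance (km),Average Speed (km/h)
Running,5.0,10.2
Running,8.1,9.8
Running,12.0,9.1
Running,3.2,10.9
Running,15.5,8.7
"""

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

rows = [r for r in csv.DictReader(io.StringIO(SAMPLE)) if r["Type"] == "Running"]
dist = [float(r["Distance (km)"]) for r in rows]
speed = [float(r["Average Speed (km/h)"]) for r in rows]

r = pearson(dist, speed)
print(round(r, 2))  # → -0.98 for this sample: longer runs, lower average speed
```

That a hobbyist needs to write this kind of script at all is exactly the gap: the devices collect the data, but the correlations are left as an exercise for the user.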
Last year Jawbone hired ex-LinkedIn data scientist Monica Rogati (@mrogati). It will be interesting to see how much further she will move Jawbone from pure reporting of existing numbers and stats towards insights and actions.
Here are some more interesting links related to the quantified-self movement, devices etc.:
- Stephane Marceau on the OMsignal Sensing Shirt: http://vimeo.com/82215891
- Getting your data out of the Jawbone: http://eric-blue.com/2011/11/28/jawbone-up-api-discovery/