Nearly every service, app, website, or provider we use records a bunch of data about us, for good and bad reasons. Not all of them let you access that data, but some make it quite easy. This is my attempt to get an entertaining (and not too serious) picture of the motion profile Google recorded of me in 2016.
I suspect most readers use an Android smartphone and have not wiped their Google Location Service history, so if you are keen to know what you did last year, read the rest of this post to get an idea of how easy it is to analyse this particular data set.
The JSON files contain not only the various timestamps, coordinates, and accuracy estimates but also the activities that Google’s AI thinks I was doing. I have limited the data to
onFoot (violet plus signs),
inVehicle (light blue squares),
onBicycle (greenish, downward-facing triangles; really rare, and most of them are false positives), and
exitingVehicle (yellow X). The filtered data set contains around 12k data points.
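Filtering by activity is mostly a matter of walking the nested JSON. The sketch below assumes the typical shape of a Takeout Location History export (E7-scaled integer coordinates, millisecond timestamps, a list of activity guesses with confidences); field names have varied between export versions, so treat them as assumptions:

```python
import json

# Hypothetical two-record sample in the shape of a Google Takeout
# Location History export (field names vary between export versions).
sample = json.loads("""
{"locations": [
  {"timestampMs": "1451649600000", "latitudeE7": 520520000,
   "longitudeE7": 133690000, "accuracy": 20,
   "activity": [{"timestampMs": "1451649600000",
                 "activity": [{"type": "onFoot", "confidence": 80},
                              {"type": "still", "confidence": 20}]}]},
  {"timestampMs": "1451653200000", "latitudeE7": 520530000,
   "longitudeE7": 133700000, "accuracy": 50,
   "activity": [{"timestampMs": "1451653200000",
                 "activity": [{"type": "inVehicle", "confidence": 95}]}]}
]}
""")

KEEP = {"onFoot", "inVehicle", "onBicycle", "exitingVehicle"}

def filtered_points(data):
    """Yield (timestamp_s, lat, lon, accuracy, activity) for kept activities."""
    for loc in data["locations"]:
        acts = loc.get("activity", [])
        if not acts:
            continue
        # keep only the most confident guess of the first activity record
        best = max(acts[0]["activity"], key=lambda a: a["confidence"])
        if best["type"] in KEEP:
            yield (int(loc["timestampMs"]) / 1000,
                   loc["latitudeE7"] / 1e7,
                   loc["longitudeE7"] / 1e7,
                   loc["accuracy"],
                   best["type"])

points = list(filtered_points(sample))
```

Records without any activity guess are simply skipped, which is one reason the filtered set ends up much smaller than the raw export.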
I have projected the coordinates using UTM so that the spatial dimensions resemble the real distances more closely. The third dimension is time. All dimensions were normalized to [0,1]. Since 3 dimensions are either difficult to show or boring, I reduced the data to 2 dimensions using t-SNE, which is conveniently implemented in sklearn. Just be careful about the CPU and memory consumption.
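The normalization and reduction step can be sketched as follows; the random array stands in for the UTM-projected (easting, northing, time) triples, which is an assumption for the sake of a self-contained example:

```python
import numpy as np
from sklearn.manifold import TSNE

# Synthetic stand-in for projected (easting, northing, time) triples;
# in the real pipeline these come from the UTM-projected Takeout points.
rng = np.random.default_rng(0)
xyz = rng.random((50, 3)) * [100_000, 100_000, 3.15e7]  # metres, metres, seconds

# Normalize each dimension to [0, 1] so no axis dominates the distances.
lo, hi = xyz.min(axis=0), xyz.max(axis=0)
xyz_norm = (xyz - lo) / (hi - lo)

# Reduce 3D -> 2D; note that perplexity must stay below the sample count.
emb = TSNE(n_components=2, perplexity=10, init="pca",
           random_state=0).fit_transform(xyz_norm)
```

With ~12k points this takes noticeably longer and more memory than this toy example, hence the warning above.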
The actual visualization is a short matplotlib script that plots the two resulting dimensions with the activity colors and markers described above, and encodes the accuracy information as transparency and size of the markers.
I am quite happy with the result, considering that the effort was rather low. It would be interesting to combine this data with other sources, so this might not be the last time I visit the Takeout.