Doing more with your Google Location History
I love data, and while I am well aware that Google uses my location history for its own purposes, I have had Location History activated for many years now, so I have a lot of data to play with. And while the visualizations on Google Maps itself are nice, I like to be more flexible. In this article I want to show you how to get your full Google Location History, transform it into something useful, and give you some ideas of what you can do with this data.
Getting your data
Your first stop is at Google Takeout to download your Location History. You can use this direct link to get started.
Once the archive is done, you can download and unzip it to get a Location History.json with all your data.
Converting your data
While the JSON file includes a lot of useful data, it's not a very usable format for many purposes, and its sheer size will also cause problems for some applications. For that reason I created a Python script several years ago which takes the JSON file and converts it to more useful formats.
This Python script takes the JSON file of your location history which you can get via Google Takeout and converts it…
The script has been improved a lot by the open source community, and I recently started refactoring it because it was getting hard to maintain, adding some more important features while doing so.
To use the script you need Python installed, and if your JSON file is very big you also need the ijson library (pip install ijson) so the script can read your data iteratively.
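To give you an idea of what the script has to work with, here is a minimal sketch of the structure of Location History.json; the field names (timestampMs, latitudeE7, longitudeE7, accuracy) are assumptions based on the Takeout export format at the time of writing, so check your own file. Coordinates are stored as integers scaled by 1e7:

```python
import json

# Minimal sample mirroring the structure of Location History.json.
# Field names are assumptions based on the Takeout format; verify
# against your own export.
sample = json.loads("""
{
  "locations": [
    {
      "timestampMs": "1484300000000",
      "latitudeE7": 524520000,
      "longitudeE7": 133800000,
      "accuracy": 25
    }
  ]
}
""")

for loc in sample["locations"]:
    # Coordinates are stored as integers scaled by 1e7
    lat = loc["latitudeE7"] / 1e7
    lon = loc["longitudeE7"] / 1e7
    print(lat, lon, loc["accuracy"])
```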
Check out the README of the repository above for all the information about the script. Below I will show some examples of how to use it.
Using your data
Google Earth (KML)
The first obvious thing you might try is to just convert to KML and import the data to Google Earth.
python location_history_json_converter.py "Location History.json" locations.kml -f kml -i
With the number of points this would produce, it will most likely crash Google Earth. You can use the -s and -e options to define start and end dates, limiting the number of locations to be displayed. It's also usually a good idea to filter out the less accurate points with the -a option; around 500 (i.e. an uncertainty of about 500 meters) is a good value to start with.
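The filtering idea behind those options can be sketched as a standalone function; this is an illustration, not the script's actual implementation, and the field names are assumptions based on the Takeout format:

```python
from datetime import datetime, timezone

# Hypothetical sketch of the -s/-e/-a filtering idea: keep a point only
# if it falls inside the date range and its reported uncertainty in
# meters is at most the threshold.
def keep(location, start, end, max_accuracy=500):
    ts = datetime.fromtimestamp(int(location["timestampMs"]) / 1000, tz=timezone.utc)
    if not (start <= ts <= end):
        return False
    # Points without an accuracy value are treated as inaccurate here
    return location.get("accuracy", max_accuracy + 1) <= max_accuracy

start = datetime(2017, 1, 1, tzinfo=timezone.utc)
end = datetime(2017, 1, 31, 23, 59, 59, tzinfo=timezone.utc)
print(keep({"timestampMs": "1484300000000", "accuracy": 25}, start, end))
```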
python location_history_json_converter.py "Location History.json" locations.kml -f kml -s 2017-01-01 -e 2017-01-31 -a 500 -i
Once the conversion is done (which might take several minutes) you can drag your newly generated
locations.kml into Google Earth.
Of course having tons of pins on Google Earth isn’t all that good looking, but it’s a start.
Google My Maps (gpxtracks)
When I put together our travel reports I usually try to create a map with routes and places we visited. As a starting point I can use my location history, filtered to the dates of the vacation, converted to gpxtracks to get the routes.
python location_history_json_converter.py "Location History.json" vacation.gpx -f gpxtracks -s 2017-01-12 -e 2017-01-29 -a 500 -i
After creating a new map in Google My Maps you can then import the gpx file.
There’s still a lot of work involved in editing and cleaning up the map before reaching the final version, but it’s a great point to start from. And you get some statistics for free.
The most annoying part of the import is the "stars" created by the GPS signal shifting around when it should actually be stable. There might be enough information in the Location History file to filter out those locations and get a cleaner result (e.g. the estimated activity type could be STILL). PRs to my repository are always welcome if you have an idea how to clean up the data :)
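As a starting point for such a cleanup, here is a hypothetical sketch that flags points whose most confident activity guess is STILL; the nested activity structure is an assumption based on the Takeout format, so verify it against your own file before relying on it:

```python
# Hypothetical cleanup sketch: flag points whose most confident activity
# guess is STILL (device stationary, GPS drifting). The nested "activity"
# structure is an assumption based on the Takeout format.
def is_stationary(location):
    activity_records = location.get("activity", [])
    if not activity_records:
        return False
    # Each record holds a list of guesses like {"type": ..., "confidence": ...}
    guesses = activity_records[0].get("activity", [])
    if not guesses:
        return False
    best = max(guesses, key=lambda g: g.get("confidence", 0))
    return best.get("type") == "STILL"

point = {"activity": [{"activity": [
    {"type": "STILL", "confidence": 92},
    {"type": "ON_FOOT", "confidence": 8},
]}]}
print(is_stationary(point))
```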
Power BI Desktop (csv)
Power BI Desktop is a free (Windows) tool that anyone can use to create visualizations based on data in table format. To prepare the data for this tool, my script can create a very simple CSV file which includes only the timestamp, latitude and longitude. For this conversion I also used a stricter accuracy setting to reduce the number of locations further.
python location_history_json_converter.py "Location History.json" locations.csv -f csv -a 100 -i
Before importing your data, open the options and make sure that the regional settings for import are set to English (United States) or the geo-coordinates might not be imported correctly.
You can then import your data via “Get Data” > “Text/CSV”.
My goal is to visualize the data on a map. Unfortunately, the ArcGIS Maps visualization only works with a maximum of 30,000 points (at least in the free version), so we are going to transform and aggregate the data before loading it, which can be done directly in Power BI.
First we are going to add calculations to round the Latitude and Longitude values to 3 digits after the decimal point, via "Add Column" > "Custom Column", calling the new columns "Lat" / "Lon" with the formulas
=Number.Round([Latitude], 3) and
=Number.Round([Longitude], 3)
Next we are going to group the data by the rounded locations and add a count for how often the location has been visited, via “Transform” > “Group By” using the settings as shown:
With those changes applied we can now use this data to create our map.
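Outside Power BI, the same round-and-group aggregation can be sketched in a few lines of Python; this is a toy illustration with made-up points, not part of the script:

```python
from collections import Counter

# Toy illustration of the aggregation: round coordinates to 3 decimals
# (roughly 100 m cells) and count how often each cell was visited.
points = [
    (52.45201, 13.38002),
    (52.45198, 13.38005),
    (52.52000, 13.4053),
]
counts = Counter((round(lat, 3), round(lon, 3)) for lat, lon in points)
print(counts[(52.452, 13.38)])  # 2
```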
Pick the ArcGIS Maps for Power BI visualization and apply the Lat, Lon and Count values accordingly, and you will get a first overview of all your locations.
Using the three dots in the top right corner of the map you can switch to edit mode to make the map more interesting.
By far my favorite option is to use the dark gray base map with a heat map overlay.
You can then also zoom into the heat map and get a more detailed heat distribution in the visible area, and if you zoom in too much you will see the effect of our rounding calculation with evenly distributed heat spots.
You can always go back to edit your data transformation and maybe apply some filters to get a more detailed report of an area you are interested in.
I've seen some examples of my script used out in the wild already, but I'm curious to hear your ideas or examples of putting your location history to better use (with or without my script). So don't be shy, go explore your history.