In August 2023, Google announced the addition of an air quality service to its list of mapping APIs. You can read more about that here. It appears this information is now also available from within the Google Maps app, though the information available via the APIs is much richer.
According to the announcement, Google is combining information from many sources at different resolutions — ground-based pollution sensors, satellite data, live traffic information and predictions from numerical models — to provide a dynamically updated dataset of air quality in 100 countries at up to 500m resolution. This sounds like a really interesting and potentially useful dataset for all sorts of mapping, healthcare and planning applications!
When I first read about this I was planning to try it out in a "chat with your data" application, using some of the things I learned from building this travel mapper tool. Perhaps a system that could plot a time series of air pollution concentrations in your favorite city, or maybe a tool to help people plan hikes in their local area so as to avoid bad air?
There are three API tools that will help here — a "current conditions" service, which provides current air quality index values and pollutant concentrations at a given location; a "historical conditions" service, which does the same but at hourly intervals for up to 30 days in the past; and a "heatmap" service, which provides current conditions over a given area as an image.
Previously, I had used the excellent googlemaps package to call Google Maps APIs in Python, but these new APIs are not yet supported. Surprisingly, beyond the official documentation I could find few examples of people using these new tools and no pre-existing Python packages designed to call them. I'd be happy to be corrected though if someone knows otherwise!
I therefore built some quick tools of my own, and in this post we walk through how they work and how to use them. I hope this will be useful to anyone wanting to experiment with these new APIs in Python and looking for a place to start. All the code for this project can be found here, and I'll likely be expanding this repo over time as I add more functionality and build some sort of mapping application with the air quality data.
Let's start! In this section we'll go over how to fetch air quality data at a given location with Google Maps. You'll first need an API key, which you can generate via your Google Cloud account. They have a 90-day free trial period, after which you'll pay for the API services you use. Make sure you enable the "Air Quality API", and be aware of the pricing policies before you start making a lot of calls!
I usually store my API key in a .env file and load it with dotenv using a function like this:
import os
from pathlib import Path

from dotenv import load_dotenv

def load_secrets():
    load_dotenv()
    env_path = Path(".") / ".env"
    load_dotenv(dotenv_path=env_path)
    google_maps_key = os.getenv("GOOGLE_MAPS_API_KEY")
    return {
        "GOOGLE_MAPS_API_KEY": google_maps_key,
    }
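For reference, the parsing that dotenv does can be sketched in plain Python — a minimal stand-in for illustration only, not a replacement for the package (the throwaway file and the abc123 key are made up for the demo):

```python
import os
import tempfile

# minimal sketch of what loading a .env file does, without the dotenv
# dependency: parse KEY=VALUE lines from a file into a dict
def parse_env_file(path):
    secrets = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                secrets[key.strip()] = value.strip()
    return secrets

# demonstrate with a throwaway file standing in for a real .env
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("GOOGLE_MAPS_API_KEY=abc123\n")
    env_path = f.name

secrets = parse_env_file(env_path)
print(secrets["GOOGLE_MAPS_API_KEY"])  # abc123
os.remove(env_path)
```

Either way, the result is a dictionary from which we can pull the key into a variable like GOOGLE_MAPS_API_KEY for the calls below.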
Getting current conditions requires a POST request as detailed here. We're going to take inspiration from the googlemaps package to do this in a way that can be generalized. First, we build a client class that uses requests to make the call. The goal is quite straightforward — we want to construct a URL like the one below, and include all the request options specific to the user's query.
https://airquality.googleapis.com/v1/currentConditions:lookup?key=YOUR_API_KEY
The Client class takes in our API key as key and then builds the request_url for the query. It accepts request options as a params dictionary and then puts them in the JSON request body, which is handled by the self.session.post() call.
import io

import requests

class Client(object):
    DEFAULT_BASE_URL = "https://airquality.googleapis.com"

    def __init__(self, key):
        self.session = requests.Session()
        self.key = key

    def request_post(self, url, params):
        request_url = self.compose_url(url)
        request_header = self.compose_header()
        request_body = params
        response = self.session.post(
            request_url,
            headers=request_header,
            json=request_body,
        )
        return self.get_body(response)

    def compose_url(self, path):
        return self.DEFAULT_BASE_URL + path + "?" + "key=" + self.key

    @staticmethod
    def get_body(response):
        body = response.json()
        if "error" in body:
            return body["error"]
        return body

    @staticmethod
    def compose_header():
        return {
            "Content-Type": "application/json",
        }
Now we can make a function that helps the user assemble valid request options for the current conditions API and then uses this Client class to make the request. Again, this is inspired by the design of the googlemaps package.
def current_conditions(
    client,
    location,
    include_local_AQI=True,
    include_health_suggestion=False,
    include_all_pollutants=True,
    include_additional_pollutant_info=False,
    include_dominant_pollutant_conc=True,
    language=None,
):
    """
    See documentation for this API here
    https://developers.google.com/maps/documentation/air-quality/reference/rest/v1/currentConditions/lookup
    """
    params = {}
    if isinstance(location, dict):
        params["location"] = location
    else:
        raise ValueError(
            "Location argument must be a dictionary containing latitude and longitude"
        )
    extra_computations = []
    if include_local_AQI:
        extra_computations.append("LOCAL_AQI")
    if include_health_suggestion:
        extra_computations.append("HEALTH_RECOMMENDATIONS")
    if include_additional_pollutant_info:
        extra_computations.append("POLLUTANT_ADDITIONAL_INFO")
    if include_all_pollutants:
        extra_computations.append("POLLUTANT_CONCENTRATION")
    if include_dominant_pollutant_conc:
        extra_computations.append("DOMINANT_POLLUTANT_CONCENTRATION")
    if language:
        params["language"] = language
    params["extraComputations"] = extra_computations
    return client.request_post("/v1/currentConditions:lookup", params)
The options for this API are relatively straightforward. It needs a dictionary with the longitude and latitude of the point you want to investigate, and can optionally take in various other arguments that control how much information is returned. Let's see it in action with all the arguments set to True.
# set up client
client = Client(key=GOOGLE_MAPS_API_KEY)
# a location in Los Angeles, CA
location = {"longitude": -118.3, "latitude": 34.1}
# a JSON response
current_conditions_data = current_conditions(
    client,
    location,
    include_health_suggestion=True,
    include_additional_pollutant_info=True,
)
Lots of interesting information is returned! Not only do we have the air quality index values from the Universal and US-based AQI indices, but we also have concentrations of the major pollutants, a description of each one and an overall set of health recommendations for the current air quality.
{'dateTime': '2023-10-12T05:00:00Z',
'regionCode': 'us',
'indexes': [{'code': 'uaqi',
'displayName': 'Universal AQI',
'aqi': 60,
'aqiDisplay': '60',
'color': {'red': 0.75686276, 'green': 0.90588236, 'blue': 0.09803922},
'category': 'Good air quality',
'dominantPollutant': 'pm10'},
{'code': 'usa_epa',
'displayName': 'AQI (US)',
'aqi': 39,
'aqiDisplay': '39',
'color': {'green': 0.89411765},
'category': 'Good air quality',
'dominantPollutant': 'pm10'}],
'pollutants': [{'code': 'co',
'displayName': 'CO',
'fullName': 'Carbon monoxide',
'concentration': {'value': 292.61, 'units': 'PARTS_PER_BILLION'},
'additionalInfo': {'sources': 'Typically originates from incomplete combustion of carbon fuels, such as that which occurs in car engines and power plants.',
'effects': 'When inhaled, carbon monoxide can prevent the blood from carrying oxygen. Exposure may cause dizziness, nausea and headaches. Exposure to extreme concentrations can lead to loss of consciousness.'}},
{'code': 'no2',
'displayName': 'NO2',
'fullName': 'Nitrogen dioxide',
'concentration': {'value': 22.3, 'units': 'PARTS_PER_BILLION'},
'additionalInfo': {'sources': 'Main sources are fuel burning processes, such as those used in industry and transportation.',
'effects': 'Exposure may cause increased bronchial reactivity in patients with asthma, lung function decline in patients with Chronic Obstructive Pulmonary Disease (COPD), and increased risk of respiratory infections, especially in young children.'}},
{'code': 'o3',
'displayName': 'O3',
'fullName': 'Ozone',
'concentration': {'value': 24.17, 'units': 'PARTS_PER_BILLION'},
'additionalInfo': {'sources': 'Ozone is created in a chemical reaction between atmospheric oxygen, nitrogen oxides, carbon monoxide and organic compounds, in the presence of sunlight.',
'effects': 'Ozone can irritate the airways and cause coughing, a burning sensation, wheezing and shortness of breath. Additionally, ozone is one of the major components of photochemical smog.'}},
{'code': 'pm10',
'displayName': 'PM10',
'fullName': 'Inhalable particulate matter (<10µm)',
'concentration': {'value': 44.48, 'units': 'MICROGRAMS_PER_CUBIC_METER'},
'additionalInfo': {'sources': 'Main sources are combustion processes (e.g. indoor heating, wildfires), mechanical processes (e.g. construction, mineral dust, agriculture) and biological particles (e.g. pollen, bacteria, mold).',
'effects': 'Inhalable particles can penetrate into the lungs. Short term exposure can cause irritation of the airways, coughing, and aggravation of heart and lung diseases, expressed as difficulty breathing, heart attacks and even premature death.'}},
{'code': 'pm25',
'displayName': 'PM2.5',
'fullName': 'Fine particulate matter (<2.5µm)',
'concentration': {'value': 11.38, 'units': 'MICROGRAMS_PER_CUBIC_METER'},
'additionalInfo': {'sources': 'Main sources are combustion processes (e.g. power plants, indoor heating, car exhausts, wildfires), mechanical processes (e.g. construction, mineral dust) and biological particles (e.g. bacteria, viruses).',
'effects': 'Fine particles can penetrate into the lungs and bloodstream. Short term exposure can cause irritation of the airways, coughing and aggravation of heart and lung diseases, expressed as difficulty breathing, heart attacks and even premature death.'}},
{'code': 'so2',
'displayName': 'SO2',
'fullName': 'Sulfur dioxide',
'concentration': {'value': 0, 'units': 'PARTS_PER_BILLION'},
'additionalInfo': {'sources': 'Main sources are burning processes of sulfur-containing fuel in industry, transportation and power plants.',
'effects': 'Exposure causes irritation of the respiratory tract, coughing and generates local inflammatory reactions. These in turn, may cause aggravation of lung diseases, even with short term exposure.'}}],
'healthRecommendations': {'generalPopulation': 'With this level of air quality, you have no limitations. Enjoy the outdoors!',
 'elderly': 'If you begin to feel respiratory discomfort such as coughing or breathing difficulties, consider reducing the intensity of your outdoor activities. Try to limit the time you spend near busy roads, construction sites, open fires and other sources of smoke.',
 'lungDiseasePopulation': 'If you begin to feel respiratory discomfort such as coughing or breathing difficulties, consider reducing the intensity of your outdoor activities. Try to limit the time you spend near busy roads, industrial emission stacks, open fires and other sources of smoke.',
 'heartDiseasePopulation': 'If you begin to feel respiratory discomfort such as coughing or breathing difficulties, consider reducing the intensity of your outdoor activities. Try to limit the time you spend near busy roads, construction sites, industrial emission stacks, open fires and other sources of smoke.',
 'athletes': 'If you begin to feel respiratory discomfort such as coughing or breathing difficulties, consider reducing the intensity of your outdoor activities. Try to limit the time you spend near busy roads, construction sites, industrial emission stacks, open fires and other sources of smoke.',
 'pregnantWomen': 'To keep you and your baby healthy, consider reducing the intensity of your outdoor activities. Try to limit the time you spend near busy roads, construction sites, open fires and other sources of smoke.',
 'children': 'If you begin to feel respiratory discomfort such as coughing or breathing difficulties, consider reducing the intensity of your outdoor activities. Try to limit the time you spend near busy roads, construction sites, open fires and other sources of smoke.'}}
Wouldn't it be nice to be able to fetch a time series of these AQI and pollutant values for a given location? That could reveal interesting patterns such as correlations between the pollutants or daily fluctuations caused by traffic or weather-related factors.
We can do this with another POST request to the historical conditions API, which will give us an hourly history. This works in much the same way as current conditions, the only major difference being that since the results can be quite long they are returned as several pages, which requires a little extra logic to handle.
Let's modify the request_post method of Client to handle this.
def request_post(self, url, params):
    request_url = self.compose_url(url)
    request_header = self.compose_header()
    request_body = params
    response = self.session.post(
        request_url,
        headers=request_header,
        json=request_body,
    )
    response_body = self.get_body(response)
    # put the first page in the response dictionary
    page = 1
    final_response = {
        "page_{}".format(page): response_body
    }
    # fetch all the pages if needed
    while "nextPageToken" in response_body:
        # call again with the next page's token
        request_body.update({
            "pageToken": response_body["nextPageToken"]
        })
        response = self.session.post(
            request_url,
            headers=request_header,
            json=request_body,
        )
        response_body = self.get_body(response)
        page += 1
        final_response["page_{}".format(page)] = response_body
    return final_response
This handles the case where response_body contains a field called nextPageToken, which is the id of the next page of data that's been generated and is ready to fetch. Where that information exists, we just need to call the API again with a new param called pageToken, which directs it to the relevant page. We do this repeatedly in a while loop until there are no more pages left. Our final_response dictionary therefore now contains another layer denoted by page number. For calls to current_conditions there will only ever be one page, but for calls to historical_conditions there may be several.
With that taken care of, we can write a historical_conditions function in a very similar style to current_conditions.
def historical_conditions(
    client,
    location,
    specific_time=None,
    lag_time=None,
    specific_period=None,
    include_local_AQI=True,
    include_health_suggestion=False,
    include_all_pollutants=True,
    include_additional_pollutant_info=False,
    include_dominant_pollutant_conc=True,
    language=None,
):
    """
    See documentation for this API here
    https://developers.google.com/maps/documentation/air-quality/reference/rest/v1/history/lookup
    """
    params = {}
    if isinstance(location, dict):
        params["location"] = location
    else:
        raise ValueError(
            "Location argument must be a dictionary containing latitude and longitude"
        )
    if isinstance(specific_period, dict) and not specific_time and not lag_time:
        assert "startTime" in specific_period
        assert "endTime" in specific_period
        params["period"] = specific_period
    elif specific_time and not lag_time and not isinstance(specific_period, dict):
        # note that time must be in the "Zulu" format
        # e.g. datetime.datetime.strftime(datetime.datetime.now(),"%Y-%m-%dT%H:%M:%SZ")
        params["dateTime"] = specific_time
    # lag periods in hours
    elif lag_time and not specific_time and not isinstance(specific_period, dict):
        params["hours"] = lag_time
    else:
        raise ValueError(
            "Must provide specific_time, specific_period or lag_time arguments"
        )
    extra_computations = []
    if include_local_AQI:
        extra_computations.append("LOCAL_AQI")
    if include_health_suggestion:
        extra_computations.append("HEALTH_RECOMMENDATIONS")
    if include_additional_pollutant_info:
        extra_computations.append("POLLUTANT_ADDITIONAL_INFO")
    if include_all_pollutants:
        extra_computations.append("POLLUTANT_CONCENTRATION")
    if include_dominant_pollutant_conc:
        extra_computations.append("DOMINANT_POLLUTANT_CONCENTRATION")
    if language:
        params["language"] = language
    params["extraComputations"] = extra_computations
    # page size default set to 100 here
    params["pageSize"] = 100
    # page token will get filled in if needed by the request_post method
    params["pageToken"] = ""
    return client.request_post("/v1/history:lookup", params)
To define the historical period, the API can accept a lag_time in hours, up to 720 (30 days). It can also accept a specific_period dictionary, which defines start and end times in the format described in the comments above. Finally, to fetch a single hour of data, it can accept just one timestamp, provided by specific_time. Also note the use of the pageSize parameter, which controls how many time points are returned in each call to the API. The default here is 100.
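To make the three mutually exclusive options concrete, here is a quick sketch of what each argument might look like (the timestamp values are illustrative):

```python
import datetime

# helper to format a datetime in the "Zulu" form the API expects
def to_zulu(dt):
    return dt.strftime("%Y-%m-%dT%H:%M:%SZ")

now = datetime.datetime(2023, 10, 12, 5, 0, 0)

# option 1: a single hour of data
specific_time = to_zulu(now)

# option 2: an explicit start/end window
specific_period = {
    "startTime": to_zulu(now - datetime.timedelta(days=2)),
    "endTime": to_zulu(now),
}

# option 3: a lag in hours relative to now (max 720)
lag_time = 720

print(specific_time)                 # 2023-10-12T05:00:00Z
print(specific_period["startTime"])  # 2023-10-10T05:00:00Z
```

Only one of the three should be passed per call; the function raises a ValueError otherwise.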
Let’s try it out.
# set up client
client = Client(key=GOOGLE_MAPS_API_KEY)
# a location in Los Angeles, CA
location = {"longitude": -118.3, "latitude": 34.1}
# a JSON response
history_conditions_data = historical_conditions(
    client,
    location,
    lag_time=720,
)
We should get a long, nested JSON response that contains the AQI index values and specific pollutant values at 1 hour increments over the last 720 hours. There are many ways to format this into a structure that's more amenable to visualization and analysis, and the function below will convert it into a pandas dataframe in "long" format, which works well with seaborn for plotting.
from itertools import chain

import pandas as pd

def historical_conditions_to_df(response_dict):
    chained_pages = list(chain(*[response_dict[p]["hoursInfo"] for p in [*response_dict]]))
    all_indexes = []
    all_pollutants = []
    for i in range(len(chained_pages)):
        # need this check in case one of the timestamps is missing data, which can sometimes occur
        if "indexes" in chained_pages[i]:
            this_element = chained_pages[i]
            # fetch the time
            time = this_element["dateTime"]
            # fetch all the index values and add metadata
            all_indexes += [
                (time, x["code"], x["displayName"], "index", x["aqi"], None)
                for x in this_element["indexes"]
            ]
            # fetch all the pollutant values and add metadata
            all_pollutants += [
                (time, x["code"], x["fullName"], "pollutant", x["concentration"]["value"], x["concentration"]["units"])
                for x in this_element["pollutants"]
            ]
    all_results = all_indexes + all_pollutants
    # generate "long format" dataframe
    res = pd.DataFrame(all_results, columns=["time", "code", "name", "type", "value", "unit"])
    res["time"] = pd.to_datetime(res["time"])
    return res
Running this on the output of historical_conditions will produce a dataframe whose columns are formatted for easy analysis.
df = historical_conditions_to_df(history_conditions_data)
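If you'd rather work with one column per series — say, to compute correlations between the indexes and pollutants — a pivot to wide format is one line. Sketched here on a tiny synthetic dataframe with the same columns as the real output:

```python
import pandas as pd

# synthetic stand-in for the long-format output of historical_conditions_to_df
df = pd.DataFrame(
    [
        ("2023-10-12T05:00:00Z", "usa_epa", "AQI (US)", "index", 39.0, None),
        ("2023-10-12T05:00:00Z", "pm25", "Fine particulate matter", "pollutant", 11.38, "MICROGRAMS_PER_CUBIC_METER"),
        ("2023-10-12T06:00:00Z", "usa_epa", "AQI (US)", "index", 42.0, None),
        ("2023-10-12T06:00:00Z", "pm25", "Fine particulate matter", "pollutant", 12.01, "MICROGRAMS_PER_CUBIC_METER"),
    ],
    columns=["time", "code", "name", "type", "value", "unit"],
)
df["time"] = pd.to_datetime(df["time"])

# one column per series code, one row per timestamp
wide = df.pivot(index="time", columns="code", values="value")
print(wide.columns.tolist())  # ['pm25', 'usa_epa']

# with real data, wide.corr() would then give the correlation matrix
```

This complements rather than replaces the long format, which is what seaborn's relplot wants below.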
And we can now plot the result in seaborn or another visualization tool.
import seaborn as sns

g = sns.relplot(
    x="time",
    y="value",
    data=df[df["code"].isin(["uaqi", "usa_epa", "pm25", "pm10"])],
    kind="line",
    col="name",
    col_wrap=4,
    hue="type",
    height=4,
    facet_kws={"sharey": False, "sharex": False},
)
g.set_xticklabels(rotation=90)
This is already very interesting! There are clearly several periodicities in the pollutant time series, and it's notable that the US AQI is closely correlated with the pm25 and pm10 concentrations, as expected. I'm much less familiar with the Universal AQI that Google is providing here, so can't explain why it appears anti-correlated with pm25 and pm10. Does a smaller UAQI mean better air quality? Despite some searching around I've been unable to find a good answer.
Now for the final use case of the Google Maps Air Quality API — generating heatmap tiles. The documentation on this is sparse, which is a shame because these tiles are a powerful tool for visualizing current air quality, especially when combined with a Folium map.
We fetch them with a GET request, which involves constructing a URL in the following format, where the location of the tile is specified by zoom, x and y.
GET https://airquality.googleapis.com/v1/mapTypes/{mapType}/heatmapTiles/{zoom}/{x}/{y}
What do zoom, x and y mean? We can answer this by learning about how Google Maps converts coordinates in latitude and longitude into "tile coordinates", which is described in detail here. Essentially, Google Maps stores imagery in grids where each cell measures 256 x 256 pixels, and the real-world dimensions of a cell are a function of the zoom level. When we make a call to the API, we need to specify which grid to draw from — which is determined by the zoom level — and where on the grid to draw from — which is determined by the x and y tile coordinates. What comes back is a bytes array that can be read by the Python Imaging Library (PIL) or a similar image processing package.
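This conversion is compact enough to sketch end to end. The function below follows the Web Mercator formulas from that documentation, applied to the Los Angeles coordinates used elsewhere in this post (a standalone illustration, separate from the TileHelper class we build below):

```python
import math

TILE_SIZE = 256  # pixels per tile edge

def latlon_to_tile(lat, lon, zoom):
    # project to "world coordinates" in the range [0, 256)
    siny = min(max(math.sin(lat * math.pi / 180.0), -0.9999), 0.9999)
    world_x = TILE_SIZE * (0.5 + lon / 360.0)
    world_y = TILE_SIZE * (0.5 - math.log((1 + siny) / (1 - siny)) / (4 * math.pi))
    # scale by 2^zoom to get pixel coordinates, then divide by the
    # tile size to get tile coordinates
    scale = 1 << zoom
    tile_x = math.floor(world_x * scale / TILE_SIZE)
    tile_y = math.floor(world_y * scale / TILE_SIZE)
    return tile_x, tile_y

# Los Angeles at zoom level 7
print(latlon_to_tile(34.1, -118.3, 7))  # (21, 51)
```

Note that tile y grows southward: (0, 0) is the northwest corner of the map, so points in the southern hemisphere have larger y values.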
Having formed our url in the above format, we can add a few methods to the Client class that will allow us to fetch the corresponding image.
def request_get(self, url):
    request_url = self.compose_url(url)
    response = self.session.get(request_url)
    # for images coming from the heatmap tiles service
    return self.get_image(response)

@staticmethod
def get_image(response):
    if response.status_code == 200:
        image_content = response.content
        # note use of Image from PIL here
        # needs from PIL import Image
        image = Image.open(io.BytesIO(image_content))
        return image
    else:
        print("GET request for image returned an error")
        return None
This is nice, but what we actually need is the ability to convert a set of coordinates in longitude and latitude into tile coordinates. The documentation explains how — we first convert the coordinates into the Mercator projection, from which we convert to "pixel coordinates" using the specified zoom level. Finally we translate those into tile coordinates. To handle all these transformations, we can use the TileHelper class below.
import math

import numpy as np

class TileHelper(object):
    def __init__(self, tile_size=256):
        self.tile_size = tile_size

    def location_to_tile_xy(self, location, zoom_level=4):
        # Based on the function here
        # https://developers.google.com/maps/documentation/javascript/examples/map-coordinates#maps_map_coordinates-javascript
        lat = location["latitude"]
        lon = location["longitude"]
        world_coordinate = self._project(lat, lon)
        scale = 1 << zoom_level
        pixel_coord = (
            math.floor(world_coordinate[0] * scale),
            math.floor(world_coordinate[1] * scale),
        )
        tile_coord = (
            math.floor(world_coordinate[0] * scale / self.tile_size),
            math.floor(world_coordinate[1] * scale / self.tile_size),
        )
        return world_coordinate, pixel_coord, tile_coord

    def tile_to_bounding_box(self, tx, ty, zoom_level):
        # see https://developers.google.com/maps/documentation/javascript/coordinates
        # for details
        box_north = self._tiletolat(ty, zoom_level)
        # tile numbers advance towards the south
        box_south = self._tiletolat(ty + 1, zoom_level)
        box_west = self._tiletolon(tx, zoom_level)
        # tile numbers advance towards the east
        box_east = self._tiletolon(tx + 1, zoom_level)
        # (latmin, latmax, lonmin, lonmax)
        return (box_south, box_north, box_west, box_east)

    @staticmethod
    def _tiletolon(x, zoom):
        return x / math.pow(2.0, zoom) * 360.0 - 180.0

    @staticmethod
    def _tiletolat(y, zoom):
        n = math.pi - (2.0 * math.pi * y) / math.pow(2.0, zoom)
        return math.atan(math.sinh(n)) * (180.0 / math.pi)

    def _project(self, lat, lon):
        siny = math.sin(lat * math.pi / 180.0)
        siny = min(max(siny, -0.9999), 0.9999)
        return (
            self.tile_size * (0.5 + lon / 360),
            self.tile_size * (0.5 - math.log((1 + siny) / (1 - siny)) / (4 * math.pi)),
        )

    @staticmethod
    def find_nearest_corner(location, bounds):
        corner_lat_idx = np.argmin([
            np.abs(bounds[0] - location["latitude"]),
            np.abs(bounds[1] - location["latitude"]),
        ])
        corner_lon_idx = np.argmin([
            np.abs(bounds[2] - location["longitude"]),
            np.abs(bounds[3] - location["longitude"]),
        ])
        if (corner_lat_idx == 0) and (corner_lon_idx == 0):
            # closest is latmin, lonmin
            direction = "southwest"
        elif (corner_lat_idx == 0) and (corner_lon_idx == 1):
            direction = "southeast"
        elif (corner_lat_idx == 1) and (corner_lon_idx == 0):
            direction = "northwest"
        else:
            direction = "northeast"
        corner_coords = (bounds[corner_lat_idx], bounds[corner_lon_idx + 2])
        return corner_coords, direction

    @staticmethod
    def get_ajoining_tiles(tx, ty, direction):
        # remember that tile y grows towards the south, so ty+1 is the
        # southern neighbour and ty-1 the northern one
        if direction == "southwest":
            return [(tx - 1, ty), (tx - 1, ty + 1), (tx, ty + 1)]
        elif direction == "southeast":
            return [(tx + 1, ty), (tx + 1, ty + 1), (tx, ty + 1)]
        elif direction == "northwest":
            return [(tx - 1, ty - 1), (tx - 1, ty), (tx, ty - 1)]
        else:
            return [(tx + 1, ty - 1), (tx + 1, ty), (tx, ty - 1)]
We can see that location_to_tile_xy takes in a location dictionary and zoom level and returns the tile in which that point can be found. Another helpful method is tile_to_bounding_box, which finds the bounding coordinates of a specified grid cell. We need this if we're going to geolocate the cell and plot it on a map.
Let's see how this works inside the air_quality_tile function below, which is going to take in our client, location and a string indicating what type of tile we want to fetch. We also need to specify a zoom level, which can be difficult to choose at first and requires some trial and error. We'll discuss the get_adjoining_tiles argument shortly.
def air_quality_tile(
    client,
    location,
    pollutant="UAQI_INDIGO_PERSIAN",
    zoom=4,
    get_adjoining_tiles=True,
):
    # see https://developers.google.com/maps/documentation/air-quality/reference/rest/v1/mapTypes.heatmapTiles/lookupHeatmapTile
    assert pollutant in [
        "UAQI_INDIGO_PERSIAN",
        "UAQI_RED_GREEN",
        "PM25_INDIGO_PERSIAN",
        "GBR_DEFRA",
        "DEU_UBA",
        "CAN_EC",
        "FRA_ATMO",
        "US_AQI",
    ]
    # contains useful methods for dealing with the tile coordinates
    helper = TileHelper()
    # get the tile that the location is in
    world_coordinate, pixel_coord, tile_coord = helper.location_to_tile_xy(location, zoom_level=zoom)
    # get the bounding box of the tile
    bounding_box = helper.tile_to_bounding_box(tx=tile_coord[0], ty=tile_coord[1], zoom_level=zoom)
    if get_adjoining_tiles:
        nearest_corner, nearest_corner_direction = helper.find_nearest_corner(location, bounding_box)
        adjoining_tiles = helper.get_ajoining_tiles(tile_coord[0], tile_coord[1], nearest_corner_direction)
    else:
        adjoining_tiles = []
    tiles = []
    # get all the adjoining tiles, plus the one in question
    for tile in adjoining_tiles + [tile_coord]:
        bounding_box = helper.tile_to_bounding_box(tx=tile[0], ty=tile[1], zoom_level=zoom)
        image_response = client.request_get(
            "/v1/mapTypes/" + pollutant + "/heatmapTiles/" + str(zoom) + "/" + str(tile[0]) + "/" + str(tile[1])
        )
        # convert the PIL image to numpy
        try:
            image_response = np.array(image_response)
        except Exception:
            image_response = None
        tiles.append({
            "bounds": bounding_box,
            "image": image_response,
        })
    return tiles
From reading the code, we can see that the workflow is as follows: First, find the tile coordinates of the location of interest. This specifies the grid cell we want to fetch. Then, find the bounding coordinates of this grid cell. If we want to fetch the surrounding tiles, find the nearest corner of the bounding box and then use that to calculate the tile coordinates of the three adjoining grid cells. Then call the API and return each of the tiles as an image with its corresponding bounding box.
We can run this in the standard way, as follows:
client = Client(key=GOOGLE_MAPS_API_KEY)
location = {"longitude": -118.3, "latitude": 34.1}
zoom = 7
tiles = air_quality_tile(
    client,
    location,
    pollutant="UAQI_INDIGO_PERSIAN",
    zoom=zoom,
    get_adjoining_tiles=False,
)
And then plot with folium for a zoomable map! Note that I'm using leafmap here, because this package can generate Folium maps that are compatible with gradio, a powerful tool for generating simple user interfaces for Python applications. Take a look at this article for an example.
import leafmap.foliumap as leafmap
import folium

lat = location["latitude"]
lon = location["longitude"]
map = leafmap.Map(location=[lat, lon], tiles="OpenStreetMap", zoom_start=zoom)
for tile in tiles:
    latmin, latmax, lonmin, lonmax = tile["bounds"]
    AQ_image = tile["image"]
    folium.raster_layers.ImageOverlay(
        image=AQ_image,
        bounds=[[latmin, lonmin], [latmax, lonmax]],
        opacity=0.7,
    ).add_to(map)
Perhaps disappointingly, the tile containing our location at this zoom level is mostly sea, although it's still nice to see the air pollution plotted on top of a detailed map. If you zoom in, you can see that road traffic information is being used to inform the air quality signals in urban areas.
Setting get_adjoining_tiles=True gives us a much nicer map because it fetches the three closest, non-overlapping tiles at that zoom level. In our case that helps a lot to make the map more presentable.
I personally prefer the images generated when pollutant=US_AQI, but there are several different options. Unfortunately the API doesn't return a color scale, although it may be possible to generate one using the pixel values in the image and knowledge of what the colors mean.
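As a rough sketch of that idea, one could tally the distinct colors that appear in a tile's pixel array (unique_colors here is a hypothetical helper, and the 2x2 image stands in for a real tile, which you'd get from tile["image"]):

```python
import numpy as np

def unique_colors(image_array):
    # flatten an (H, W, C) image into a list of distinct RGB(A) tuples,
    # ordered by how often each color appears
    pixels = image_array.reshape(-1, image_array.shape[-1])
    colors, counts = np.unique(pixels, axis=0, return_counts=True)
    order = np.argsort(-counts)
    return [tuple(int(v) for v in colors[i]) for i in order]

# synthetic 2x2 RGBA "tile" standing in for a real heatmap tile:
# three green pixels and one red pixel
fake_tile = np.array(
    [[[255, 0, 0, 255], [0, 255, 0, 255]],
     [[0, 255, 0, 255], [0, 255, 0, 255]]],
    dtype=np.uint8,
)
print(unique_colors(fake_tile))  # [(0, 255, 0, 255), (255, 0, 0, 255)]
```

Mapping those colors back to AQI values would still require knowing the palette each style uses, which the API doesn't expose.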
Thanks for making it to the end! Here we explored how to use the Google Maps Air Quality APIs to deliver results in Python, which could be used in all manner of interesting applications. In future I hope to follow up with another article about the air_quality_mapper tool as it evolves further, but I hope that the scripts discussed here will be useful in their own right. As always, any suggestions for further development would be much appreciated!