Downloading the data#

All data files used in the lecture are available on this webserver.


For Week 02, we will use climatological data from the CERES (Clouds and the Earth’s Radiant Energy System) mission. We are going to use the EBAF-TOA and the EBAF-Surface data products (both freely available on this webpage) as climatologies (i.e. monthly averages 2005-2015).

The data quality summary of these data (PDF) can be found here, and more accessible publications can be found here for TOA and here for Surface.


ERA5 data#

Ready-to-use, low-resolution NetCDF files#

ERA5 is an atmospheric reanalysis product. We will use it a lot! Note that you can download the data yourself (I provide some sample scripts below), but for a start you can download some files I prepared for you:


Monthly surface (3D) data:

Monthly pressure-level (4D) data:

Monthly averaged (annual cycle) data:

Full average (annual) data:

File naming conventions:

  • LowRes means that I asked for a lower spatial resolution than available (0.75° instead of the 0.25° default).

  • Monthly means that I averaged the data to calendar months.

  • MonthlyAvg means that I averaged the data over all months (annual cycle).

  • AnnualAvg means that I averaged the data over all months and years (simple average).

  • 4D means that the data is also available on pressure levels.

  • t2m or tp are variable names (2 m temperature and total precipitation, respectively).

  • Invariant means that this file contains time invariant fields such as topography or land-sea mask.
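To make the averaging conventions above concrete, here is a small sketch using a toy dataset (the random values and the file name in the comment are placeholders, not real ERA5 data):

```python
import numpy as np
import pandas as pd
import xarray as xr

# Toy monthly series standing in for a file like ERA5_LowRes_Monthly_t2m.nc
# (values are random -- this only illustrates the averaging conventions).
time = pd.date_range('2005-01-01', '2015-12-01', freq='MS')
ds = xr.Dataset({'t2m': ('time', np.random.rand(len(time)))},
                coords={'time': time})

# "Monthly"    -> one value per calendar month of each year (as created above)
# "MonthlyAvg" -> annual cycle: average over the years for each calendar month
monthly_avg = ds.groupby('time.month').mean()

# "AnnualAvg"  -> simple average over all months and years
annual_avg = ds.mean('time')
```

After this, `monthly_avg` has a 12-entry `month` dimension, while `annual_avg` holds a single value per variable.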

Additional ERA5 data from the CDS servers (optional)#

You may want to download ERA5 data yourself if:

  • you’d like additional variables not listed above

  • you’d like to use high resolution data (0.25°) instead of the low resolution (0.75°) that I provided

  • you’d like to download hourly data (unfortunately, daily data are not available)

If you want to go down this path (not mandatory at all for the projects), you’ll need an account at the Copernicus Climate Data Store.

You may want to use their online platform to analyze/download the data, or you can use a script. To get you started, here is the script I used to download all the data listed above.
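As a rough sketch (this is not the exact script linked above), a minimal request with the cdsapi package could look like the following; the dataset name, variable, and years are example values to adapt, and the download call is commented out because it requires valid CDS credentials:

```python
# Hedged sketch of a CDS API request (assumes `pip install cdsapi` and a
# ~/.cdsapirc file with your CDS credentials).
request = {
    'product_type': 'monthly_averaged_reanalysis',
    'variable': '2m_temperature',                 # example variable
    'year': [str(y) for y in range(2005, 2016)],  # 2005-2015
    'month': [f'{m:02d}' for m in range(1, 13)],
    'time': '00:00',
    'grid': [0.75, 0.75],                         # ask for the lower resolution
    'format': 'netcdf',
}

# Uncomment to actually download (needs an account, see above):
# import cdsapi
# c = cdsapi.Client()
# c.retrieve('reanalysis-era5-single-levels-monthly-means',
#            request, 'ERA5_LowRes_Monthly_t2m.nc')
```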

CMIP5 and CMIP6 data#

Temperature and Precipitation projections, for a large number of scenarios and GCMs:

Reading data from an url#

You can also open files without downloading them locally. This is somewhat inefficient (it will download all the data into memory each time you run the notebook), but it might be useful e.g. on MyBinder, where you can’t store files. You will need the h5netcdf library installed for the following to work:

import io
import urllib.request

import xarray as xr

url = ''

# Download the file into memory, then open it with the h5netcdf engine
req = urllib.request.Request(url)
with urllib.request.urlopen(req) as resp:
    ds = xr.open_dataset(io.BytesIO(resp.read()), engine='h5netcdf')
ds

<xarray.Dataset>
Dimensions:    (longitude: 480, latitude: 241, time: 1)
Coordinates:
  * longitude  (longitude) float32 -179.6 -178.9 -178.1 ... 178.1 178.9 179.6
  * latitude   (latitude) float32 90.0 89.25 88.5 87.75 ... -88.5 -89.25 -90.0
  * time       (time) datetime64[ns] 1979-01-01
Data variables:
    lsm        (time, latitude, longitude) float32 ...
    wmb        (time, latitude, longitude) float32 ...
    z          (time, latitude, longitude) float32 ...
Attributes:
    Conventions:  CF-1.6
    history:      2019-11-18 09:24:36 GMT by grib_to_netcdf-2.14.0: /opt/ecmw...