Export Google Earth Engine Images directly to your computer.¶

Google Earth Engine lets you export images, map tiles, tables and videos from Earth Engine. By default, these exports can only be sent to your Google Drive account, to Google Cloud Storage or to a new Earth Engine asset. Satellite images are large, so free Drive storage fills up quickly and you are then forced to buy extra storage, which is not cheap if you live in a developing country. However, we can take advantage of the Google Earth Engine Python API to export images and image collections directly to your local computer without uploading them to Drive first.

The Google Earth Engine (GEE) Python API facilitates interacting with Earth Engine servers using the Python programming language. It is available as a Python package that can be installed locally or within the cloud and accessed from a command-line interpreter or within a Jupyter notebook.
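
If the API is not installed and authenticated yet, a minimal setup sketch is shown below (assuming a pip-based environment; the one-off authentication opens a sign-in page for your Google account):

In [ ]:
# minimal setup sketch (assumption: pip-based environment; not part of the original notebook)
# install the Earth Engine API once, e.g. from a terminal: pip install earthengine-api
import ee

# one-off authentication with your Google account
ee.Authenticate()

# every new session or script then only needs to be initialized
ee.Initialize()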

Set up the Python API for GEE and follow the steps below¶

Step 1

The Python API package is called ee. It must also be initialized for every new session and script.

In [1]:
import os
import ilwis
import requests
from zipfile import ZipFile
from osgeo import gdal
import numpy as np
import matplotlib.pyplot as plt

gdal connector: ilwis3 connector: Organizing data: 
In [2]:
# import the Google Earth Engine module
import ee

# assumes you have already authenticated Google Earth Engine access with your Google account
ee.Initialize()

Step 2

Load the Image or ImageCollection you need. In this example, a Sentinel-2 image is loaded for download.

In [3]:
# mask clouds in a Sentinel-2 image using the QA60 bitmask band and rescale the values to reflectance (0-1)
def maskS2clouds(image):
    qa = image.select('QA60')

    # Bits 10 and 11 are clouds and cirrus, respectively.
    cloudBitMask = 1 << 10
    cirrusBitMask = 1 << 11

    # Both flags should be set to zero, indicating clear conditions.
    mask = qa.bitwiseAnd(cloudBitMask).eq(0).And(qa.bitwiseAnd(cirrusBitMask).eq(0))

    return image.updateMask(mask).divide(10000)
In [4]:
#create a function to obtain a download URL for the image or imagecollection you desire
def downloader(ee_object, region):
    #download image
    if isinstance(ee_object, ee.image.Image):
        print('Its Image')
        url = ee_object.getDownloadUrl({
                'scale': 30, #note we are not using the full resolution, else the file size is too big
                'crs': 'EPSG:4326',
                'region': region
            })
        return url

    #download imagecollection (mosaicked to a single image first)
    elif isinstance(ee_object, ee.imagecollection.ImageCollection):
        print('Its ImageCollection')
        ee_object_new = ee_object.mosaic()
        url = ee_object_new.getDownloadUrl({
                'scale': 30,
                'crs': 'EPSG:4326',
                'region': region
            })
        return url
In [20]:
# see also 'https://en.wikipedia.org/wiki/Cyclone_Idai' and select the appropriate start and end time stamps
bandlist = ['B8']
geometry = ee.Geometry.Rectangle([34.3705, -20.2326, 34.7852, -19.2839]) #Beira Mozambique
CloudCoverMax = 5
startDate = '2018-06-01'
endDate = '2019-10-01'

#note cloud cover will be an issue when selecting a suitable S2 image, here two procedures are provided

#select image option 1
#MyImage1 = ee.ImageCollection('COPERNICUS/S2_SR').filterDate(startDate, endDate).filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE',1)) \
#.filterBounds(geometry).map(maskS2clouds).select(bandlist)

#Get Sentinel-2 data option 2
MyImage1 = ee.ImageCollection('COPERNICUS/S2_SR').filterDate(startDate,endDate).filterBounds(geometry).select(bandlist) \
           .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE',CloudCoverMax)).filter(ee.Filter.lt('CLOUD_COVERAGE_ASSESSMENT',CloudCoverMax))

#reduce the filtered collection to a single image by taking the per-pixel median
MyImage = MyImage1.median()
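
Before requesting a download it can help to verify that the filters actually return images; a small check like the one below (added for illustration, not part of the original notebook) prints the number of matching scenes:

In [ ]:
# optional sanity check (assumption: added for illustration)
# print how many Sentinel-2 scenes passed the date, bounds and cloud filters
print('Number of images in the filtered collection:', MyImage1.size().getInfo())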

Step 3

Create a Python function that downloads the file behind the generated link:

In [21]:
def download(url, outputfile):
    with requests.get(url, stream=True) as response:
        response.raise_for_status()
        #content length (if reported by the server) could be used for progress reporting
        total_length = response.headers.get('content-length')
        if total_length is not None:
            total_length = int(total_length)
        downloaded_length = 0
        #write the response to disk in chunks
        with open(outputfile, 'wb') as f:
            for chunk in response.iter_content(chunk_size=8192):
                downloaded_length += len(chunk)
                f.write(chunk)
In [22]:
work_dir = os.getcwd()+'/data'

print("current dir is: %s" % (os.getcwd()))
print("current working directory is:",work_dir) 

if os.path.isdir(work_dir):
    print("Folder exists")
else:
    print("Folder doesn't exists")
    os.mkdir(work_dir)
current dir is: /home/eoafrica/surface_flood
current working directory is: /home/eoafrica/surface_flood/data
Folder exists
In [23]:
region = geometry.toGeoJSONString() #region must be passed in GeoJSON format

url = downloader(MyImage,region) #call function

filename = work_dir +'/download.zip'

print(url) #print the download URL

download(url, filename)
Its Image
https://earthengine.googleapis.com/v1/projects/earthengine-legacy/thumbnails/1b67b0885e9527f715cee82dbcb4e04e-40448119ee1a649fb7459b12968a614c:getPixels

Final step:

Note the URL created; you can copy and paste it into any browser and the file will be downloaded to your local Downloads folder. The code cell below automatically downloads the file behind the created link into the active notebook folder /data.

Note the full download might take some time - be patient¶

Note that Google has a limitation of 1 GB per download.
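
If a request turns out to be too large, the usual remedies are selecting fewer bands, using a coarser 'scale', or clipping the image to the geometry; the sketch below (values are illustrative assumptions, not part of the original notebook) shows the latter two applied before requesting a URL:

In [ ]:
# illustrative sketch (assumption: not part of the original notebook)
# keep a download request small by clipping to the region of interest
# and using a coarser pixel size via 'scale' (here 60 m instead of 30 m)
small_url = MyImage.clip(geometry).getDownloadUrl({
        'scale': 60,
        'crs': 'EPSG:4326',
        'region': region
    })
print(small_url)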

In [24]:
# open the downloaded zip file and convert each GeoTIFF band to ILWIS format
z = ZipFile(filename)
for file in z.namelist():
    f = z.open(file)
    # copy the band into GDAL's in-memory file system
    mmap_name = '/vsimem/'+file
    gdal.FileFromMemBuffer(mmap_name, f.read())
    f.close()
    dataset = gdal.Open(mmap_name)
    # strip the 'download.' prefix and change the extension, e.g. resulting in 'B8.mpr'
    file = file.replace('.tif','.mpr')
    file = file.replace('download.','')
    file = 'data/' + file #note folder setting
    gdal.Translate(file, dataset, format='ILWIS')
    gdal.Unlink(mmap_name)
    os.remove(file + '.aux.xml')
z.close()
os.remove(filename)

Repeat procedure above and before you continue, ensure you have downloaded bands 2,3,4 and 8¶

Change the 'bandlist' parameter to select the appropriate spectral channels, i.e. 'B2', 'B3', 'B4' and 'B8', and re-run the download and conversion cells above (see the sketch below).
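
As a hedged sketch of that repetition (the loop, zip names and re-use of the earlier cells are illustrative assumptions, not part of the original notebook), the remaining bands could be fetched one at a time and then converted with the zip-to-ILWIS cell above:

In [ ]:
# illustrative sketch - assumes the downloader() and download() functions and the variables
# startDate, endDate, geometry, CloudCoverMax, region and work_dir defined above
for band in ['B2', 'B3', 'B4']:
    col = ee.ImageCollection('COPERNICUS/S2_SR').filterDate(startDate, endDate).filterBounds(geometry).select([band]) \
          .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', CloudCoverMax)).filter(ee.Filter.lt('CLOUD_COVERAGE_ASSESSMENT', CloudCoverMax))
    url = downloader(col.median(), region)
    zipname = work_dir + '/download_' + band + '.zip'
    download(url, zipname)
    # re-run the zip-to-ILWIS conversion cell above for each zip file (adapt 'filename' accordingly)

# quick check: list the converted ILWIS rasters in the data folder
print([f for f in os.listdir(work_dir) if f.endswith('.mpr')])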

Now use ILWISPy to pre-process the downloaded images

In [25]:
#to work with ILWIS, first set the working directory; here the 'data' subfolder of the current folder is used
data_dir = os.getcwd() + '/' + 'data'
ilwis.setWorkingCatalog(data_dir)
print(data_dir)
gdal connector: /home/eoafrica/surface_flood/data
ilwis3 connector: Organizing data: 

Load the raster data - check the dimensions

In [26]:
#import raster image extracted from GEE

B2 = ilwis.RasterCoverage('B2.mpr')

# get information on dimensions
print(B2.size().xsize)
print(B2.size().ysize)
print(B2.size().zsize)

# check the georeference of the raster
print (B2.geoReference().name())

B3 = ilwis.RasterCoverage('B3.mpr')
B4 = ilwis.RasterCoverage('B4.mpr')
B8 = ilwis.RasterCoverage('B8.mpr')
1540
3522
1
B2.grf

Conduct a linear stretch - convert to numpy

For each band a linear stretch is conducted to transform the input integer data to a byte range (note the syntax setvaluerange 0-255); a cut-off value of 1 percent is applied at the tails of the histogram. Each band is then converted to a NumPy array.

In [27]:
#uncomment the lines below with the B?s.store command to store the data to disk
B2s = ilwis.do('linearstretch',B2,1)
b2s = ilwis.do('setvaluerange', B2s, 0, 255, 1)
#B2s.store('s2_b2s.mpr')

b2_2np = np.fromiter(iter(b2s), np.ubyte, b2s.size().linearSize())
b2_2np = b2_2np.reshape((b2s.size().ysize, b2s.size().xsize))

B3s = ilwis.do('linearstretch',B3,1)
b3s = ilwis.do('setvaluerange', B3s, 0, 255, 1)
#B3s.store('s2_b3s.mpr')

b3_2np = np.fromiter(iter(b3s), np.ubyte, b3s.size().linearSize())
b3_2np = b3_2np.reshape((b3s.size().ysize, b3s.size().xsize))

B4s = ilwis.do('linearstretch',B4,1)
b4s = ilwis.do('setvaluerange', B4s, 0, 255, 1)
#B4s.store('s2_b4s.mpr')

b4_2np = np.fromiter(iter(b4s), np.ubyte, b4s.size().linearSize())
b4_2np = b4_2np.reshape((b4s.size().ysize, b4s.size().xsize))

B8s = ilwis.do('linearstretch',B8,1)
b8s = ilwis.do('setvaluerange', B8s, 0, 255, 1)
#B8s.store('s2_b8s.mpr')

b8_2np = np.fromiter(iter(b8s), np.ubyte, b8s.size().linearSize())
b8_2np = b8_2np.reshape((b8s.size().ysize, b8s.size().xsize))
setvaluerange:  in 0.414397 seconds

setvaluerange:  in 0.439187 seconds

setvaluerange:  in 0.411514 seconds

setvaluerange:  in 0.416056 seconds

Once the spectral bands are processed and converted to NumPy arrays, we can create band combinations, allowing us to display the data as a color composite. Below, two composites are created: a natural color and a false color composite. Review the MSI spectral channel table and note the spectral regions covered by each of these spectral bands and how they are combined to create the composites.

In [28]:
# create true color composite (in RGB)
nc = np.dstack((b4_2np, b3_2np, b2_2np))

# create false color composite (in RGB)
fc = np.dstack((b8_2np, b4_2np, b3_2np))
In [29]:
#create a plot

fig1 = plt.figure(figsize=(15, 10))
plt.subplots_adjust(left=0.1, bottom=0.1, right=0.6, top=0.9)
plt.subplot(1, 2, 1)
plt.imshow(nc)
plt.title('nc')
plt.subplot(1, 2, 2)
plt.imshow(fc)
plt.title('fc');
In [30]:
#save plot of fcc only

fig1 = plt.figure(figsize=(15, 10))
plt.imshow(fc)
#plt.title('fc');
fig1.savefig(work_dir + '/s2_cc.png')

Create overlay of S2 and S1¶

Note that this step can only be completed if the Sentinel-1 image has already been processed (so that 's1_cc.png' exists in the data folder)!
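
A small check like the sketch below (added for illustration, not part of the original notebook) can be used to verify that the Sentinel-1 composite is present before running the blend:

In [ ]:
# optional check (assumption: added for illustration)
# verify that the Sentinel-1 composite exists before attempting the overlay
s1_path = work_dir + "/s1_cc.png"
if not os.path.isfile(s1_path):
    print("'s1_cc.png' not found - process the Sentinel-1 image first")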

In [31]:
from PIL import Image

bg = Image.open(work_dir + "/s2_cc.png")
fg = Image.open(work_dir + "/s1_cc.png")

# blend the two composites with an alpha of 0.5 and save the result
Image.blend(bg, fg, .5).save(work_dir + "/s1_s2_blended.png")

fin_img = Image.open(work_dir +"/s1_s2_blended.png")

fig = plt.figure(figsize = (35, 25))
plt.imshow(fin_img)
plt.axis('off')
plt.show();

Navigate to the data folder and double click the image 's1_s2_blended.png'¶
