Before performing any kind of flood exposure analysis, it is essential to understand the metadata of your input raster datasets. Metadata such as spatial resolution, projection (CRS), and value ranges (e.g., minimum and maximum flood depths) tell you how suitable the data is for your analysis and whether preprocessing steps like reprojection or resampling are needed.
In Google Earth Engine (GEE), you can programmatically inspect these properties for multiple image assets. The code below loads each flood raster dataset, prints its projection and resolution, and computes the minimum and maximum water depth values.
// A list of flood raster assets from different years and sources
var assetList = [
  "projects/ee-desmond/assets/nbracer/ADeHaan_pluvial2050_effectenenimpacts_s0_wateroverlastWaterdiepteCm_hoog_2050_t1000",
  "projects/ee-desmond/assets/nbracer/AGistel_pluvial2050_effectenenimpacts_s0_wateroverlastWaterdiepteCm_hoog_2050_t1000",
  // ... (other 28 asset IDs)
];

// Loop through each dataset and print metadata
assetList.forEach(function(asset) {
  var image = ee.Image(asset);      // Load the raster image
  var proj = image.projection();    // Get the projection object
  var scale = proj.nominalScale();  // Spatial resolution in meters
  print("Dataset:", asset);
  print("Projection:", proj);
  print("Resolution (m):", scale);

  // Compute min and max values using reduceRegion
  var minMax = image.reduceRegion({
    reducer: ee.Reducer.minMax(),
    geometry: image.geometry(),
    scale: scale,
    bestEffort: true
  });
  print("Min/Max Water Depth (cm):", minMax);
  print("------------------------");
});
- ee.Image(asset): Loads a raster image from your Earth Engine asset folder.
- image.projection(): Retrieves the projection metadata, including the coordinate reference system (CRS).
- nominalScale(): Extracts the pixel resolution (usually in meters).
- reduceRegion(): Performs a statistical operation (here, finding the min and max pixel values) over the image's spatial extent.

This step is critical because it tells you whether all datasets are on the same spatial scale and CRS. Inconsistent projections or pixel sizes can lead to errors during mosaicking or overlaying with vector data, such as your 100 m² hexagonal grid for exposure assessment.
Once you have inspected the metadata and are confident the datasets are compatible (or know where fixes are needed), you can move on to data harmonization and processing.
Many climate hazard datasets lack predictions for every year of interest. In our case, we want to combine flood layers from different scenarios, and meaningful comparison requires uniform comparison years, as well as a common scale and resolution. The storm surge model data we downloaded provides predictions for 2017 and 2075, but 2050 (a key planning milestone) is missing. A standard way to estimate flood values for unmapped years is interpolation. Several interpolation techniques exist, but for this task a linear interpolation between the known 2017 and 2075 rasters is sufficient to estimate 2050 storm surge depths.
Let's now write a function to achieve this:
// Load storm surge datasets for 2017 and 2075
var img2017 = ee.Image("projects/ee-desmond/assets/nbracer/Astormesurge2017_effectenenimpacts_s0_stormvloedWaterdiepteCm_huidig_2017_t1000");
var img2075 = ee.Image("projects/ee-desmond/assets/nbracer/Astormesurge2075_effectenenimpacts_s0_stormvloedWaterdiepteCm_midden_2075_t1000");
// Define the interpolation factor for 2050
var factor = ee.Number(2050 - 2017).divide(2075 - 2017); // ≈ 0.56897
// Perform linear interpolation
var img2050 = img2017.add(img2075.subtract(img2017).multiply(factor)).rename('stormvloedWaterdiepteCm_2050');
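As a quick sanity check, the same interpolation arithmetic can be run in plain JavaScript, independent of Earth Engine (the depth values below are hypothetical, not taken from the rasters):

```javascript
// Linear interpolation between two known years, mirroring the
// Earth Engine expression above: v0 + (v1 - v0) * factor.
function interpolate(y0, v0, y1, v1, yTarget) {
  var factor = (yTarget - y0) / (y1 - y0); // fraction of the interval elapsed
  return v0 + (v1 - v0) * factor;
}

var factor2050 = (2050 - 2017) / (2075 - 2017);
console.log(factor2050.toFixed(5)); // 0.56897

// A hypothetical pixel at 100 cm in 2017 and 158 cm in 2075:
console.log(interpolate(2017, 100, 2075, 158, 2050)); // 133
```

Note that the factor weights the *change* between the two years, so pixels that stay constant between 2017 and 2075 remain constant in the 2050 estimate.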
- img2017 and img2075: The known water depth rasters.
- factor: The interpolation weight, computed as the proportion of the 2017–2075 interval that has elapsed by 2050.
- img2075.subtract(img2017): Computes the change in depth between the two years.
- ...multiply(factor): Scales this change to reflect how much of it should have occurred by 2050.
- img2017.add(...): Adds the scaled change to the 2017 baseline to create the 2050 estimate.

You now have an interpolated storm surge map for 2050. You can inspect and visualize the result using Earth Engine's Map API:
Map.centerObject(img2017, 10);
Map.addLayer(img2017, {min: 0, max: 2600, palette: ['white', 'lightblue', 'blue']}, 'Storm Surge 2017');
Map.addLayer(img2050, {min: 0, max: 2600, palette: ['white', 'navy']}, 'Interpolated Storm Surge 2050');
Map.addLayer(img2075, {min: 0, max: 2600, palette: ['white', 'darkblue']}, 'Storm Surge 2075');
Lastly, you should export this interpolated image as a new asset so that it can be used in further geoprocessing and flood exposure analysis.
Export.image.toAsset({
image: img2050,
description: 'Export_StormSurge_2050',
assetId: 'projects/ee-desmond/assets/nbracer/Astormesurge2050_interpolated',
region: img2017.geometry(),
scale: 20,
maxPixels: 1e13
});
This method allows you to estimate critical missing flood layers based on existing data, which is essential for consistent multi-year scenario comparisons and risk modeling.
Before conducting any geospatial analysis, it is essential to inspect and document the metadata of the input datasets. This step allows you to understand the underlying resolution, projection, and data value characteristics, which will influence further processing such as mosaicking, reprojection, or statistical extraction.
| Dataset | Projection (CRS) | Resolution (m) | Min Water Depth (cm) | Max Water Depth (cm) |
|---|---|---|---|---|
| DeHaan_pluvial2050 | EPSG:31370 | 2.0258 | 0.09999 | 313.55 |
| Gistel_pluvial2050 | EPSG:31370 | 2.0335 | 0.01678 | 455.24 |
| Ichtegem_pluvial2050 | EPSG:31370 | 2.0386 | 0.03009 | 455.24 |
| Oostende_pluvial2050 | EPSG:31370 | 2.0364 | 0.00973 | 554.53 |
| Oudenburg_pluvial2050 | EPSG:31370 | 2.0305 | 0.00043 | 410.66 |
| Bredene_pluvial2050 | EPSG:31370 | 2.0288 | 0.00973 | 311.70 |
| Middelkerke_pluvial2050 | EPSG:31370 | 2.0363 | 0.03033 | 300.72 |
| Ostend_pluvial2017 | EPSG:31370 | 2.0364 | 0.03388 | 376.91 |
| Stormsurge2017 | EPSG:31370 | 20.368 | 1 | 2589 |
| Stormsurge2075 | EPSG:31370 | 20.368 | 1 | 2571 |
Once the metadata has been inspected and the missing storm surge year interpolated, you need to ensure that all raster datasets are harmonized. This means every raster must have the same spatial resolution, projection, and alignment for consistent geospatial analysis—especially when mosaicking or overlaying data with analysis units like a hexagonal grid.
In this case, we reproject all rasters to the Belgian Lambert 72 coordinate reference system (EPSG:31370) and use a common spatial resolution of 2.05 meters. This resolution was chosen because it is the median of the native resolutions from the datasets (which ranged from approximately 2.025 to 2.054 meters). Choosing 2.05 m preserves most of the original spatial detail while aligning the grid across all inputs.
The storm surge datasets, however, had a much coarser resolution of approximately 20.37 meters. Since this would have caused a major mismatch in pixel size during the mosaicking stage, we resampled and reprojected those layers as well, using bilinear resampling to preserve depth gradients over larger areas.
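Bilinear resampling estimates each output value from the four nearest input cells, weighted by distance. The sketch below shows just the underlying math in plain JavaScript (not the Earth Engine implementation), with hypothetical depth values:

```javascript
// Bilinear interpolation of a value at fractional grid position (x, y)
// from a 2D array of cell values indexed as grid[row][col].
function bilinear(grid, x, y) {
  var x0 = Math.floor(x), y0 = Math.floor(y);
  var dx = x - x0, dy = y - y0;
  // Blend horizontally along the top and bottom rows, then vertically.
  var top    = grid[y0][x0]     * (1 - dx) + grid[y0][x0 + 1]     * dx;
  var bottom = grid[y0 + 1][x0] * (1 - dx) + grid[y0 + 1][x0 + 1] * dx;
  return top * (1 - dy) + bottom * dy;
}

// Depths (cm) on a coarse 2x2 grid; sample halfway between all four cells:
var depths = [[100, 200],
              [300, 400]];
console.log(bilinear(depths, 0.5, 0.5)); // 250
```

Because each output value is a distance-weighted blend rather than a copy of the nearest cell, the resampled storm surge surface keeps smooth depth gradients instead of blocky 20 m steps.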
// Define target projection (EPSG:31370) and common scale
var targetProj = ee.Projection('EPSG:31370').atScale(2.05);
// Utility function to reproject and mosaic a list of assets
function processAssets(assetList) {
  return ee.ImageCollection(assetList.map(function(path) {
    return ee.Image(path).reproject({crs: targetProj});
  })).mosaic();
}
// Define grouped asset collections
var pluvial2017_assets = [
  // ... (all pluvial 2017 asset IDs)
];
var pluvial2050_assets = [
  // ... (all pluvial 2050 asset IDs)
];
var fluvial2017_assets = [
  // ... (all fluvial 2017 asset IDs)
];
var fluvial2050_assets = [
  // ... (all fluvial 2050 asset IDs)
];

// Merge each group into a single harmonized mosaic
var flood_pluvial2017 = processAssets(pluvial2017_assets);
var flood_pluvial2050 = processAssets(pluvial2050_assets);
var flood_fluvial2017 = processAssets(fluvial2017_assets);
var flood_fluvial2050 = processAssets(fluvial2050_assets);
var stormsurge2017 = ee.Image("projects/ee-desmond/assets/nbracer/Astormesurge2017_effectenenimpacts_s0_stormvloedWaterdiepteCm_huidig_2017_t1000")
.resample('bilinear')
.reproject({crs: targetProj});
var stormsurge2050 = ee.Image("projects/ee-desmond/assets/nbracer/Astormesurge2050_interpolated")
.resample('bilinear')
.reproject({crs: targetProj});
Realigning all datasets ensures compatibility with downstream exposure modeling, particularly when using a 100 m² hexagonal grid as the unit of analysis. Without this alignment, mosaicked pixels of different sizes cannot line up, and overlays with the grid can mis-register.
With harmonized spatial scales, each pixel accurately represents surface water depth at a resolution compatible with the final unit of analysis.
Finally, export each raster to your Earth Engine Asset folder.
// Export example for Pluvial 2017
Export.image.toAsset({
  image: flood_pluvial2017,
  description: 'Export_Pluvial2017_Merged',
  assetId: 'projects/ee-desmond/assets/nbracer/flood_pluvial2017_merged',
  region: flood_pluvial2017.geometry(),
  scale: 2.05,
  maxPixels: 1e13
});

Export.image.toAsset({
  image: flood_pluvial2050,
  description: 'Export_Pluvial2050_Merged',
  assetId: 'projects/ee-desmond/assets/nbracer/flood_pluvial2050_merged',
  region: flood_pluvial2050.geometry(),
  scale: 2.05,
  maxPixels: 1e13
});

Export.image.toAsset({
  image: flood_fluvial2017,
  description: 'Export_Fluvial2017_Merged',
  assetId: 'projects/ee-desmond/assets/nbracer/flood_fluvial2017_merged',
  region: flood_fluvial2017.geometry(),
  scale: 2.05,
  maxPixels: 1e13
});

Export.image.toAsset({
  image: flood_fluvial2050,
  description: 'Export_Fluvial2050_Merged',
  assetId: 'projects/ee-desmond/assets/nbracer/flood_fluvial2050_merged',
  region: flood_fluvial2050.geometry(),
  scale: 2.05,
  maxPixels: 1e13
});

Export.image.toAsset({
  image: stormsurge2017,
  description: 'Export_StormSurge2017',
  assetId: 'projects/ee-desmond/assets/nbracer/stormsurge_2017_resampled',
  region: stormsurge2017.geometry(),
  scale: 2.05,
  maxPixels: 1e13
});

Export.image.toAsset({
  image: stormsurge2050,
  description: 'Export_StormSurge2050',
  assetId: 'projects/ee-desmond/assets/nbracer/stormsurge_2050_resampled',
  region: stormsurge2050.geometry(),
  scale: 2.05,
  maxPixels: 1e13
});
| Scenario | Effect on Exposure Estimation |
|---|---|
| Raster resolution << hex size | Reduces precision at fine-scale impact |
| Raster variation within hex | Requires summarizing (e.g., max, mean) |
| Misalignment of grid centroids | May introduce edge effect distortions |
| Flood footprint near boundaries | Partial inundation may be exaggerated |
Use reduceRegions() with ee.Reducer.max() or mean() to capture meaningful hazard statistics.

Each pixel in our flood data is like a small tile on the ground that is about 2 meters wide (covering ~4.2 m²). Each hexagon we'll use for exposure analysis is like a big honeycomb cell that covers 100 square meters.
Now, each hexagon will contain about 24–25 of these smaller flood pixels. So when we try to say how flooded a hexagon is, we need to summarize the information from those 24 small tiles into a single number — like the maximum water depth or the average across the hexagon.
Summary: Each flood raster pixel is about 2 meters wide, covering ~4.2 m². Your hexagon covers 100 m², meaning it encloses ~24 such pixels. To calculate exposure at the hexagon level, you need to summarize those ~24 flood pixels into one value, such as the maximum or the mean water depth across the hexagon.
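The pixel count and the summarization step can be sketched in plain JavaScript (the per-pixel depths below are hypothetical):

```javascript
// How many ~2.05 m pixels fit in a 100 m² hexagon, and how the
// per-pixel depths collapse to one value per hexagon.
var pixelArea = 2.05 * 2.05;          // ~4.2 m² per pixel
var pixelsPerHex = 100 / pixelArea;
console.log(pixelsPerHex.toFixed(1)); // 23.8

// Hypothetical depths (cm) of the pixels inside one hexagon:
var depths = [12, 0, 35, 8, 0, 51, 3, 22];
var maxDepth = Math.max.apply(null, depths);
var meanDepth = depths.reduce(function(a, b) { return a + b; }, 0) / depths.length;
console.log(maxDepth);  // 51 — conservative, flags the worst case in the hexagon
console.log(meanDepth); // the mean is lower and can mask local extremes
```

This is exactly the choice a reducer makes for you: max emphasizes the deepest point inside each hexagon, while mean describes typical conditions across it.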
| Situation | What Could Happen |
|---|---|
| Hexagon includes just a few flooded pixels | You might overestimate risk |
| Flood depth varies a lot inside hexagon | Mean might hide extremes, underestimating risk |
| Hexagons don't align with raster grid | Boundary errors may appear due to pixel cutoffs |
| Data is high-resolution (2.05 m) | You need to summarize carefully to avoid bias |
For key infrastructure like hospitals or wetlands, the choice of summary statistic matters most: the maximum flags any inundation at the site, while the mean reflects overall conditions around it. Resampling to 100 m can be useful but introduces trade-offs: aggregation simplifies the overlay with the hexagonal grid at the cost of smoothing away the local depth extremes that the 2.05 m data captures.
Recommendation: Keep the original resolution for analysis with reduceRegions(); resample to 100 m only if needed for exports or dashboards.
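This recommendation can be sketched in Earth Engine JavaScript. Note that hexGrid is a hypothetical FeatureCollection asset of 100 m² hexagons (not created in this section), and the flood layer is the merged pluvial mosaic exported earlier:

```
// Hypothetical hexagonal grid asset; substitute your own grid here.
var hexGrid = ee.FeatureCollection('projects/ee-desmond/assets/nbracer/hex_grid_100m2');
var flood = ee.Image('projects/ee-desmond/assets/nbracer/flood_pluvial2050_merged');

// Attach the maximum water depth within each hexagon as a property,
// keeping the raster at its native 2.05 m resolution.
var exposure = flood.reduceRegions({
  collection: hexGrid,
  reducer: ee.Reducer.max(),
  scale: 2.05
});
print(exposure.limit(10));
```

Swapping ee.Reducer.max() for ee.Reducer.mean() gives the average-depth variant discussed above.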
| Version | Author | Date |
|---|---|---|
| 38ca5cc | Desmond Lartey | 2025-01-02 |