Flood Raster Metadata

Before performing any kind of flood exposure analysis, it is essential to understand the metadata of your input raster datasets. Metadata such as spatial resolution, projection (CRS), and value ranges (e.g., minimum and maximum flood depths) tell you how suitable the data is for your analysis and whether preprocessing steps like reprojection or resampling are needed.

In Google Earth Engine (GEE), you can programmatically inspect these properties for multiple image assets. The code below loads each flood raster dataset, prints the resolution, projection, and computes the minimum and maximum water depth values.

// A list of flood raster assets from different years and sources
var assetList = [
  "projects/ee-desmond/assets/nbracer/ADeHaan_pluvial2050_effectenenimpacts_s0_wateroverlastWaterdiepteCm_hoog_2050_t1000",
  "projects/ee-desmond/assets/nbracer/AGistel_pluvial2050_effectenenimpacts_s0_wateroverlastWaterdiepteCm_hoog_2050_t1000",
  // ... (other 28 asset IDs)
];

// Loop through each dataset and print metadata
assetList.forEach(function(asset) {
  var image = ee.Image(asset); // Load the raster image
  var proj = image.projection(); // Get projection object
  var scale = proj.nominalScale(); // Get spatial resolution in meters

  print("Dataset:", asset);
  print("Projection:", proj);
  print("Resolution (m):", scale);

  // Compute min and max values using reduceRegion
  var minMax = image.reduceRegion({
    reducer: ee.Reducer.minMax(),
    geometry: image.geometry(),
    scale: scale,
    bestEffort: true
  });

  print("Min/Max Water Depth (cm):", minMax);
  print("------------------------");
});

Explanation of Code Functions

  • ee.Image(asset): Loads a raster image from your Earth Engine asset folder.
  • image.projection(): Retrieves the projection metadata, including the coordinate reference system (CRS).
  • nominalScale(): Extracts the pixel resolution (usually in meters).
  • reduceRegion(): Performs a statistical operation (in this case, finding the min and max pixel values) over the image’s spatial extent.

This step is critical because it tells you whether all datasets share the same spatial scale and CRS. Inconsistent projections or pixel sizes can lead to errors during mosaicking or overlaying with vector data, such as your 100 m² hexagonal grid for exposure assessment.

Once you've inspected the metadata and are confident of their compatibility (or know where to apply fixes), you can move on to data harmonization and processing.

Interpolating Missing Time Points for Storm Surge

In many climate hazard datasets, predictions are not available for every year of interest. In our case, we want to combine flood layers from different scenarios, and such comparisons are only meaningful if the layers share uniform reference years, as well as a common scale and resolution. The storm surge model data we downloaded provides predictions for the years 2017 and 2075, but the year 2050 (a key planning milestone) is missing. A practical way to estimate flood values for unmapped years is interpolation. Several interpolation techniques exist; for this task, we perform a linear interpolation between the known rasters for 2017 and 2075 to estimate 2050 storm surge depths.

Let's now write the code to achieve this:

// Load storm surge datasets for 2017 and 2075
var img2017 = ee.Image("projects/ee-desmond/assets/nbracer/Astormesurge2017_effectenenimpacts_s0_stormvloedWaterdiepteCm_huidig_2017_t1000");
var img2075 = ee.Image("projects/ee-desmond/assets/nbracer/Astormesurge2075_effectenenimpacts_s0_stormvloedWaterdiepteCm_midden_2075_t1000");

// Define the interpolation factor for 2050
var factor = ee.Number(2050 - 2017).divide(2075 - 2017);  // 33 / 58 ≈ 0.56897

// Perform linear interpolation
var img2050 = img2017.add(img2075.subtract(img2017).multiply(factor)).rename('stormvloedWaterdiepteCm_2050');

Explanation

  • img2017 and img2075: These represent the known water depth rasters.
  • factor: This is the weight of the interpolation, computed as a proportion of time between 2017 and 2075.
  • img2075.subtract(img2017): Computes the change in depth between the two years.
  • ...multiply(factor): Scales this change to reflect how much change should have occurred by 2050.
  • img2017.add(...): Adds the scaled change to the 2017 baseline to create the 2050 estimate.
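Outside Earth Engine, the same per-pixel arithmetic can be checked with plain numbers. The sketch below (plain JavaScript, using made-up depth values rather than real raster values) confirms how the weight behaves:

```javascript
// Linear interpolation of a single pixel's water depth between two known years.
// The depth values here are hypothetical, for illustration only.
function interpolateDepth(d2017, d2075, year) {
  var factor = (year - 2017) / (2075 - 2017); // 33 / 58 ≈ 0.56897 for 2050
  return d2017 + (d2075 - d2017) * factor;
}

var d2050 = interpolateDepth(100, 200, 2050);
console.log(d2050.toFixed(2)); // ≈ 156.90 cm
```

Note that the weight is slightly past the halfway point, because 2050 is closer to 2075 than to 2017.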

You now have an interpolated storm surge map for 2050. You can inspect and visualize the result using Earth Engine’s Map API:

Map.centerObject(img2017, 10);
Map.addLayer(img2017, {min: 0, max: 2600, palette: ['white', 'lightblue', 'blue']}, 'Storm Surge 2017');
Map.addLayer(img2050, {min: 0, max: 2600, palette: ['white', 'navy']}, 'Interpolated Storm Surge 2050');
Map.addLayer(img2075, {min: 0, max: 2600, palette: ['white', 'darkblue']}, 'Storm Surge 2075');

Lastly, you should export this interpolated image as a new asset so that it can be used in further geoprocessing and flood exposure analysis.

Export.image.toAsset({
  image: img2050,
  description: 'Export_StormSurge_2050',
  assetId: 'projects/ee-desmond/assets/nbracer/Astormesurge2050_interpolated',
  region: img2017.geometry(),
  scale: 20,
  maxPixels: 1e13
});

This method allows you to estimate critical missing flood layers based on existing data, which is essential for consistent multi-year scenario comparisons and risk modeling.

The table below documents the metadata retrieved for each input dataset. These characteristics (resolution, projection, and value range) determine which further processing steps are needed, such as mosaicking, reprojection, or statistical extraction.

Flood Raster Metadata Summary Table

Dataset                  Projection (CRS)  Resolution (m)  Min Water Depth (cm)  Max Water Depth (cm)
DeHaan_pluvial2050       EPSG:31370        2.0258          0.09999               313.55
Gistel_pluvial2050       EPSG:31370        2.0335          0.01678               455.24
Ichtegem_pluvial2050     EPSG:31370        2.0386          0.03009               455.24
Oostende_pluvial2050     EPSG:31370        2.0364          0.00973               554.53
Oudenburg_pluvial2050    EPSG:31370        2.0305          0.00043               410.66
Bredene_pluvial2050      EPSG:31370        2.0288          0.00973               311.70
Middelkerke_pluvial2050  EPSG:31370        2.0363          0.03033               300.72
Ostend_pluvial2017       EPSG:31370        2.0364          0.03388               376.91
Stormsurge2017           EPSG:31370        20.368          1                     2589
Stormsurge2075           EPSG:31370        20.368          1                     2571

Reprojecting and Merging Flood Hazard Data

Once the metadata has been inspected and the missing storm surge year interpolated, you need to ensure that all raster datasets are harmonized. This means every raster must have the same spatial resolution, projection, and alignment for consistent geospatial analysis—especially when mosaicking or overlaying data with analysis units like a hexagonal grid.

In this case, we reproject all rasters to the Belgian Lambert 72 coordinate reference system (EPSG:31370) and use a common spatial resolution of 2.05 meters. This resolution was chosen because it is the median of the native resolutions from the datasets (which ranged from approximately 2.025 to 2.054 meters). Choosing 2.05 m preserves most of the original spatial detail while aligning the grid across all inputs.

The storm surge datasets, however, had a much coarser resolution of approximately 20.37 meters. Since this would have caused a major mismatch in pixel size during the mosaicking stage, we resampled and reprojected those layers as well, using bilinear resampling to preserve depth gradients over larger areas.
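Bilinear resampling estimates each output pixel from the four nearest input pixels, weighted by distance, which is why it preserves smooth depth gradients better than nearest-neighbour sampling. The following plain-JavaScript sketch illustrates the idea (it is a simplified illustration, not Earth Engine's internal implementation):

```javascript
// Bilinear interpolation between four neighbouring pixel values.
// q00..q11 are depths at the four surrounding grid corners;
// fx, fy give the fractional position (0..1) within that cell.
function bilinear(q00, q10, q01, q11, fx, fy) {
  var top = q00 + (q10 - q00) * fx;     // interpolate along x (top edge)
  var bottom = q01 + (q11 - q01) * fx;  // interpolate along x (bottom edge)
  return top + (bottom - top) * fy;     // then interpolate along y
}

console.log(bilinear(0, 100, 100, 200, 0.5, 0.5)); // 100 (centre of the cell)
```

At the exact corners the function returns the corner values unchanged, so no spurious depths are introduced at known locations.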

Reprojecting and Mosaicking

// Define target projection (EPSG:31370) and common scale
var targetProj = ee.Projection('EPSG:31370').atScale(2.05);

// Utility function to reproject and mosaic a list of assets
function processAssets(assetList) {
  return ee.ImageCollection(assetList.map(function(path) {
    return ee.Image(path).reproject({crs: targetProj});
  })).mosaic();
}

Grouped Raster Sets and Storm Surge Resampling

// Define grouped asset collections
var pluvial2017_assets = [
  // ... all pluvial 2017 asset IDs
];

var pluvial2050_assets = [
  // ... all pluvial 2050 asset IDs
];

var fluvial2017_assets = [
  // ... all fluvial 2017 asset IDs
];

var fluvial2050_assets = [
  // ... all fluvial 2050 asset IDs
];

var stormsurge2017 = ee.Image("projects/ee-desmond/assets/nbracer/Astormesurge2017_effectenenimpacts_s0_stormvloedWaterdiepteCm_huidig_2017_t1000")
  .resample('bilinear')
  .reproject({crs: targetProj});

var stormsurge2050 = ee.Image("projects/ee-desmond/assets/nbracer/Astormesurge2050_interpolated")
  .resample('bilinear')
  .reproject({crs: targetProj});

Realigning all datasets ensures compatibility with downstream exposure modeling, particularly when using a 100 m² hexagonal grid as the unit of analysis. Without this alignment:

  • Values from different flood types and years could fall on misaligned pixels.
  • Exposure results would be over- or under-estimated due to pixel shifts or area mismatches.
  • Storm surge impacts would appear blurred or displaced because of coarse resolution if not resampled.

With harmonized spatial scales, each pixel accurately represents surface water depth at a resolution compatible with the final unit of analysis.

Exporting Merged Rasters

Finally, export each raster to your Earth Engine Asset folder.

// Build the merged rasters from the grouped asset lists
var flood_pluvial2017 = processAssets(pluvial2017_assets);
var flood_pluvial2050 = processAssets(pluvial2050_assets);
var flood_fluvial2017 = processAssets(fluvial2017_assets);
var flood_fluvial2050 = processAssets(fluvial2050_assets);

// Export example for Pluvial 2017
Export.image.toAsset({
  image: flood_pluvial2017,
  description: 'Export_Pluvial2017_Merged',
  assetId: 'projects/ee-desmond/assets/nbracer/flood_pluvial2017_merged',
  region: flood_pluvial2017.geometry(),
  scale: 2.05,
  maxPixels: 1e13
});

Export.image.toAsset({
  image: flood_pluvial2050,
  description: 'Export_Pluvial2050_Merged',
  assetId: 'projects/ee-desmond/assets/nbracer/flood_pluvial2050_merged',
  region: flood_pluvial2050.geometry(),
  scale: 2.05,
  maxPixels: 1e13
});

Export.image.toAsset({
  image: flood_fluvial2017,
  description: 'Export_Fluvial2017_Merged',
  assetId: 'projects/ee-desmond/assets/nbracer/flood_fluvial2017_merged',
  region: flood_fluvial2017.geometry(),
  scale: 2.05,
  maxPixels: 1e13
});

Export.image.toAsset({
  image: flood_fluvial2050,
  description: 'Export_Fluvial2050_Merged',
  assetId: 'projects/ee-desmond/assets/nbracer/flood_fluvial2050_merged',
  region: flood_fluvial2050.geometry(),
  scale: 2.05,
  maxPixels: 1e13
});

Export.image.toAsset({
  image: stormsurge2017,
  description: 'Export_StormSurge2017',
  assetId: 'projects/ee-desmond/assets/nbracer/stormsurge_2017_resampled',
  region: stormsurge2017.geometry(),
  scale: 2.05,
  maxPixels: 1e13
});

Export.image.toAsset({
  image: stormsurge2050,
  description: 'Export_StormSurge2050',
  assetId: 'projects/ee-desmond/assets/nbracer/stormsurge_2050_resampled',
  region: stormsurge2050.geometry(),
  scale: 2.05,
  maxPixels: 1e13
});

Resolution vs. Unit-of-Analysis: Hexagonal Grid at 100 m²

Technical Consideration

  • Raster resolution: 2.05 m → Pixel area ≈ 4.2 m²
  • Vector unit: 100 m² hexagon → inradius (apothem) ≈ 5.4 m, side length ≈ 6.2 m
  • This implies each hexagon contains ≈ 24 raster pixels (100 / 4.2)
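These figures follow directly from regular-hexagon geometry (area = (3√3/2) · side²). The short plain-JavaScript sketch below reproduces the arithmetic; it is a check on the numbers, not Earth Engine code:

```javascript
// Geometry of a 100 m² regular hexagon vs. a 2.05 m raster grid.
var pixelSize = 2.05;                   // m
var pixelArea = pixelSize * pixelSize;  // ≈ 4.2 m²
var hexArea = 100;                      // m²

// For a regular hexagon: area = (3 * sqrt(3) / 2) * side^2
var side = Math.sqrt((2 * hexArea) / (3 * Math.sqrt(3))); // ≈ 6.2 m
var apothem = side * Math.sqrt(3) / 2;                    // ≈ 5.4 m (inradius)
var pixelsPerHex = hexArea / pixelArea;                   // ≈ 23.8

console.log(pixelArea.toFixed(2), side.toFixed(2), apothem.toFixed(2), pixelsPerHex.toFixed(1));
```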

Implications

Scenario                          Effect on Exposure Estimation
Raster resolution << hex size     Reduces precision at fine-scale impact
Raster variation within hex       Requires summarizing (e.g., max, mean)
Misalignment of grid centroids    May introduce edge-effect distortions
Flood footprint near boundaries   Partial inundation may be exaggerated

Recommendations

  • Use reduceRegions() with ee.Reducer.max() or mean() to capture meaningful hazard statistics.
  • Consider spatial smoothing if using categorical thresholds.

What Does It Mean to Use 2.05 m Data with a 100 m² Hexagon?

Each pixel in our flood data is like a small tile on the ground that's about 2 meters wide (covering ~4.2 m²). Each hexagon we’ll use for exposure analysis is like a big honeycomb cell that covers 100 square meters.

Now, each hexagon will contain about 24–25 of these smaller flood pixels. So when we try to say how flooded a hexagon is, we need to summarize the information from those 24 small tiles into a single number — like the maximum water depth or the average across the hexagon.

  • Max value — good for understanding worst-case hazards (e.g. critical infrastructure).
  • Mean value — better for general trends and average conditions (e.g. ecosystems).

Summary: Each flood raster pixel is about 2 meters wide, covering ~4.2 m². Your hexagon covers 100 m², meaning it encloses ~24 such pixels. To calculate exposure at the hexagon level, you need to summarize those 24 flood pixels into one value. That could be:

  • Max value (for worst-case hazard)
  • Mean value (for average condition)
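As an illustration of how much the two summaries can diverge, here is a plain-JavaScript sketch with made-up depth values for one hexagon's pixels; in Earth Engine the equivalent step would use reduceRegions() with ee.Reducer.max() or ee.Reducer.mean():

```javascript
// Hypothetical water depths (cm) for the ~24 pixels inside one hexagon:
// mostly dry, with one small deep spot.
var depths = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
              0, 0, 0, 0, 0, 0, 0, 0, 5, 10, 20, 120];

var maxDepth = Math.max.apply(null, depths);
var meanDepth = depths.reduce(function(a, b) { return a + b; }, 0) / depths.length;

console.log(maxDepth);             // 120 — flags the hexagon as seriously exposed
console.log(meanDepth.toFixed(1)); // 6.5 — the average hides the deep spot
```

The same hexagon reads as either badly flooded or nearly dry depending on the reducer, which is exactly the sensitivity the table below describes.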

What’s the Impact?

Situation                                    What Could Happen
Hexagon includes just a few flooded pixels   You might overestimate risk
Flood depth varies a lot inside hexagon      Mean might hide extremes, underestimating risk
Hexagons don't align with raster grid        Boundary errors may appear due to pixel cutoffs
Data is high-resolution (2.05 m)             You need to summarize carefully to avoid bias

What Does This Mean for Infrastructure & Ecosystems?

For key infrastructure like hospitals or wetlands:

  • Overestimating risk: One small flooded pixel may classify the whole hex as exposed.
  • Underestimating risk: Averaging may downplay serious flood depth at sensitive spots.

What Should We Do?

  • Use max() if targeting worst-case impacts (e.g. emergency planning).
  • Use mean() for general conditions (e.g. ecological studies).
  • Run both and compare results as a sensitivity check.
  • Visual inspection: Always verify visually that patterns make sense.

Should You Resample to 100 m?

Resampling to 100 m can be useful but introduces trade-offs:

Pros:
  • Speeds up processing
  • Makes hex grid alignment cleaner
  • Reduces mismatches and artifacts
Cons:
  • Flood details get flattened
  • Small-scale flooding may disappear
  • Max resampling may exaggerate exposure

Recommendation: Keep original resolution for analysis with reduceRegions(), resample to 100 m only if needed for exports or dashboards.

Version  Author          Date
38ca5cc  Desmond Lartey  2025-01-02