
Combine LargeFire Workflow Additions

gcorradini requested to merge feature/combine_lf_archive into conus-dps

Changes

  1. add a MAAP to VEDA s3 copy
  2. add dask worker parallelism around the years loop (see the first sketch after this list)
  3. write a merge function that combines each layer across years (see the second sketch after this list)
  4. make sure writes before the merge always happen on local disk (as opposed to s3) to avoid race conditions
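
As a rough illustration of items 2 and 4, here is a minimal sketch of a dask-parallel years loop that writes each year's output to local disk before anything gets copied up to s3. The `process_year` helper, argument names, and paths are hypothetical and not taken from `combine_largefire.py`.

```python
# Illustrative sketch only; names and paths are assumptions, not the
# actual combine_largefire.py implementation.
import argparse
import tempfile

from dask.distributed import Client


def process_year(year: int, workdir: str) -> str:
    """Placeholder for the per-year work: read that year's LargeFire layers,
    merge them, and write the result to local disk (not s3) so concurrent
    workers never write to the same remote object."""
    out_path = f"{workdir}/largefire_{year}.gpkg"  # hypothetical local output
    # ... per-year read/merge/write logic would go here ...
    return out_path


def main() -> None:
    parser = argparse.ArgumentParser()
    parser.add_argument("-s", "--start-year", type=int, required=True)
    parser.add_argument("-e", "--end-year", type=int, required=True)
    parser.add_argument("-p", "--parallel", action="store_true")
    args = parser.parse_args()

    years = list(range(args.start_year, args.end_year + 1))
    with tempfile.TemporaryDirectory() as workdir:
        if args.parallel:
            # Fan the years out across dask workers and wait for all local
            # writes to finish before any copy up to s3 happens.
            client = Client()
            futures = client.map(process_year, years, workdir=workdir)
            local_outputs = client.gather(futures)
            client.close()
        else:
            local_outputs = [process_year(year, workdir) for year in years]
        # ... copy local_outputs (and/or the merged result) to s3 here ...


if __name__ == "__main__":
    main()
```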
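
A second sketch, for item 3: merging one layer across the yearly outputs. It assumes the yearly files are GeoPackages read with geopandas/fiona, which is an assumption about the data format rather than a description of the actual merge function.

```python
# Illustrative per-layer merge across years; geopandas/fiona and GPKG
# layers are assumptions for this sketch.
import fiona
import geopandas as gpd
import pandas as pd


def merge_layer_across_years(yearly_paths: list[str], layer: str, out_path: str) -> None:
    """Concatenate a single layer from every yearly file into one merged layer,
    writing to local disk so the s3 upload can happen as a single final step."""
    frames = [gpd.read_file(path, layer=layer) for path in yearly_paths]
    merged = gpd.GeoDataFrame(pd.concat(frames, ignore_index=True))
    merged.to_file(out_path, layer=layer, driver="GPKG")


def merge_all_layers(yearly_paths: list[str], out_path: str) -> None:
    # Treat the layer list of the first yearly file as the canonical set.
    for layer in fiona.listlayers(yearly_paths[0]):
        merge_layer_across_years(yearly_paths, layer, out_path)
```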

Testing

  1. log into ADE
  2. cd ./fireatlas_nrt/
  3. run the only year we have data for:
python3 combine_largefire.py -s 2023 -e 2023
# or run the dask worker version in parallel (doesn't do much until we can run multiple years)
python3 combine_largefire.py -s 2023 -e 2023 -p
  4. after it finishes running, check that the outputs in /projects/shared-buckets/gsfc_landslides/FEDSoutput-s3-conus/CONUS_NRT_DPS/LargeFire_Outputs/merged were updated (see the sketch after this list)
  5. check the tail of the logs (running.log) to make sure there are no errors
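
For step 4, a small convenience check along these lines can confirm the merged files were touched by the run you just finished; it assumes the merged directory is mounted at the shared-buckets path above and is an illustrative helper, not part of this merge request.

```python
# Illustrative helper: list merged outputs with their last-modified times.
from datetime import datetime
from pathlib import Path

MERGED_DIR = Path(
    "/projects/shared-buckets/gsfc_landslides/FEDSoutput-s3-conus/"
    "CONUS_NRT_DPS/LargeFire_Outputs/merged"
)

for path in sorted(MERGED_DIR.iterdir()):
    mtime = datetime.fromtimestamp(path.stat().st_mtime)
    print(f"{mtime:%Y-%m-%d %H:%M:%S}  {path.name}")
```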
