Changes
- add MAAP-to-VEDA S3 copy
- add Dask worker parallelism around the years loop
- write a merge function for each layer across years; make sure writes before the merge always happen on local disk (as opposed to S3) to avoid race conditions
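The "parallelism around the years loop" change could look roughly like the sketch below. `process_year` and `run_years` are hypothetical stand-ins for the real per-year work in `combine_largefire.py`, and the Dask usage is a minimal illustration (with a serial fallback if Dask is unavailable), not the actual implementation:

```python
# Sketch: wrap a per-year loop in Dask tasks so years run on workers in
# parallel. All names here are illustrative, not the real script's API.

def process_year(year):
    # placeholder for combining that year's large-fire outputs
    return f"processed {year}"

def run_years(start, end, parallel=False):
    years = list(range(start, end + 1))
    if parallel:
        try:
            from dask import delayed, compute
            # one delayed task per year; Dask schedules them across workers
            tasks = [delayed(process_year)(y) for y in years]
            return list(compute(*tasks))
        except ImportError:
            pass  # fall back to the serial loop if dask isn't installed
    return [process_year(y) for y in years]

print(run_years(2022, 2023, parallel=True))
```

With only one year of data (2023) the parallel path does little, which matches the note in the testing steps below.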
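The "write to disk before merging" rule can be sketched as follows. The file layout, layer names, and the upload step are assumptions for illustration; the point is that the merge only ever reads and writes complete local files, and S3 is touched once at the end:

```python
# Sketch: per-year layer files land on local disk, the merge happens on
# disk, and only the finished merged file would be copied to S3. File
# names and the upload step are illustrative.
import os
import tempfile

def merge_layer_across_years(layer, years, workdir):
    merged = []
    for year in years:
        # per-year files were written locally, never straight to S3, so
        # the merge never races a half-finished remote write
        with open(os.path.join(workdir, f"{layer}_{year}.txt")) as f:
            merged.extend(f.read().splitlines())
    out_path = os.path.join(workdir, f"{layer}_merged.txt")
    with open(out_path, "w") as f:
        f.write("\n".join(merged))
    # here the completed out_path would be uploaded to S3 in one step
    return out_path

# usage: fake two years of a "perimeter" layer in a temp dir, then merge
with tempfile.TemporaryDirectory() as d:
    for y, row in [(2022, "fire-a"), (2023, "fire-b")]:
        with open(os.path.join(d, f"perimeter_{y}.txt"), "w") as f:
            f.write(row)
    out = merge_layer_across_years("perimeter", [2022, 2023], d)
    with open(out) as f:
        print(f.read().splitlines())
```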
Testing
- log into ADE
- cd into `./fireatlas_nrt/`
- run the only year we have data for:

  ```shell
  python3 combine_largefire.py -s 2023 -e 2023
  # or run the Dask worker version in parallel (doesn't do much until we can run multiple years)
  python3 combine_largefire.py -s 2023 -e 2023 -p
  ```
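The flags in the commands above could be parsed along these lines; the long option names and help text are assumptions, since only `-s`, `-e`, and `-p` appear in this PR:

```python
# Sketch: how combine_largefire.py's CLI flags might be wired up with
# argparse. Long names and help strings are illustrative assumptions.
import argparse

def parse_args(argv):
    p = argparse.ArgumentParser(description="combine large-fire outputs")
    p.add_argument("-s", "--start-year", type=int, required=True,
                   help="first year to process")
    p.add_argument("-e", "--end-year", type=int, required=True,
                   help="last year to process (inclusive)")
    p.add_argument("-p", "--parallel", action="store_true",
                   help="run years in parallel on Dask workers")
    return p.parse_args(argv)

args = parse_args(["-s", "2023", "-e", "2023", "-p"])
print(args.start_year, args.end_year, args.parallel)
```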
- after it finishes running, check that the outputs in `/projects/shared-buckets/gsfc_landslides/FEDSoutput-s3-conus/CONUS_NRT_DPS/LargeFire_Outputs/merged` were updated, and inspect the tail of the logs (`running.log`) to make sure there are no errors
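The log check above can be scripted. This assumes error lines in `running.log` contain the word "error" (standard Python logging level names would); the sample log content is fabricated for the demo:

```python
# Sketch: scan only the tail of a log file for error lines. The "error"
# substring match is an assumption about the log format.
import os
import tempfile
from collections import deque

def tail_errors(log_path, n=100):
    # deque with maxlen keeps just the last n lines while streaming the file
    with open(log_path) as f:
        tail = deque(f, maxlen=n)
    return [line.rstrip("\n") for line in tail if "error" in line.lower()]

# usage: simulate a short running.log and scan its tail
with tempfile.NamedTemporaryFile("w", suffix=".log", delete=False) as f:
    f.write("INFO start\nINFO merged 2023\nERROR failed to upload\n")
    path = f.name
print(tail_errors(path))
os.remove(path)
```

An empty result means the tail of the log is clean.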