Commit d23ba9d9 authored by Neha Hunka

Code for Nature Scientific Data

parent a27ab646
@@ -7,14 +7,19 @@ nhunka@umd.edu
#####################################################################################################
The following process is set up for the classification of the world's forests into primary, young secondary and old
secondary forests, as per the IPCC 2019 Guidelines Table 4.7 for natural forests.
1. Various EO-derived and spatial datasets are downloaded from source (wget or curl commands)
2. Layers are spatially resampled and aligned to an approx. 30 m grid (gdal commands)
3. A Boolean set of conditions is applied to layers to classify into forest statuses/conditions (AWS DPS algorithm)
All steps are reproducible for batch processing on the AWS DPS cloud-computing system that supports the NASA MAAP.
For ease of use, step 1 and step 2 are broken down per 10 x 10 degree tile and described in the file
NOTES_data_download_and_preprocessing.ipynb such that they are implementable on local machines using R.
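
As an illustration of steps 1 and 2 for a single 10 x 10 degree tile, the sketch below shows the general pattern only:
the tile ID, source URL and file names are placeholders (the actual per-dataset download commands and target grids are
listed in NOTES_data_download_and_preprocessing.ipynb), and wget/gdalwarp are simply called from Python via subprocess.

import subprocess

# Placeholder tile ID and source URL -- the real per-tile sources are documented in
# NOTES_data_download_and_preprocessing.ipynb
tile = "N00E010"                                            # hypothetical 10 x 10 degree tile
src_url = "https://example.org/dataset/" + tile + ".tif"    # placeholder download location
raw_file = tile + "_raw.tif"
aligned_file = tile + "_aligned.tif"

# Step 1: download the source layer (curl -o would work equally well)
subprocess.run(["wget", "-O", raw_file, src_url], check=True)

# Step 2: resample and align to an approx. 30 m grid (0.00025 degrees is roughly
# 30 m at the equator); -te gives the tile extent as lon_min lat_min lon_max lat_max
subprocess.run([
    "gdalwarp",
    "-t_srs", "EPSG:4326",
    "-tr", "0.00025", "0.00025",
    "-te", "10", "0", "20", "10",
    "-r", "near",
    "-co", "COMPRESS=DEFLATE",
    raw_file, aligned_file,
], check=True)
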
Step 3 is provided as a DPS algorithm, which means that every 10 x 10 degree tile across the globe runs in parallel
on AWS. For ease of understanding, it is recommended to start with the file FOREST_Classification/IPCC_GEDI_Table4.7.py.
The Boolean combination used for the global forest classification is contained entirely within this file. It can be
run from the command line and executed for a single tile if needed.
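
To illustrate the form of the Boolean combination in step 3, the sketch below is a minimal example only, assuming three
hypothetical, already-aligned ~30 m layers (a primary/intact forest mask, a forest-cover mask and a year-of-disturbance
layer); the class codes, layer names and the illustrative 20-year age split are placeholders, and the actual conditions
and inputs used for the global map are those in FOREST_Classification/IPCC_GEDI_Table4.7.py.

import numpy as np
import rasterio

# Hypothetical, co-registered ~30 m input layers for one 10 x 10 degree tile
with rasterio.open("primary_mask_aligned.tif") as src:
    primary = src.read(1).astype(bool)    # 1 where primary/intact forest
    profile = src.profile
with rasterio.open("forest_mask_aligned.tif") as src:
    forest = src.read(1).astype(bool)     # 1 where forest cover is present
with rasterio.open("disturbance_year_aligned.tif") as src:
    dist_year = src.read(1)               # year of last disturbance, 0 = none observed

# Placeholder class codes and an illustrative 20-year split between young and old secondary forest
PRIMARY, OLD_SECONDARY, YOUNG_SECONDARY = 1, 2, 3
REFERENCE_YEAR, AGE_THRESHOLD = 2020, 20

out = np.zeros(forest.shape, dtype=np.uint8)                      # 0 = non-forest
age = np.where(dist_year > 0, REFERENCE_YEAR - dist_year, 9999)   # large age if never disturbed

# Boolean combination: every forest pixel falls into exactly one class
out[forest & primary] = PRIMARY
out[forest & ~primary & (age > AGE_THRESHOLD)] = OLD_SECONDARY
out[forest & ~primary & (age <= AGE_THRESHOLD)] = YOUNG_SECONDARY

profile.update(dtype="uint8", count=1, nodata=0)
with rasterio.open("forest_classes.tif", "w", **profile) as dst:
    dst.write(out, 1)

The real DPS algorithm combines more input layers and conditions per tile; the sketch only mirrors the per-pixel Boolean
structure that runs in parallel across tiles.
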
#####################################################################################################