Howto:Using QGIS to process NLCD data

From FlightGear wiki
Latest revision as of 17:35, 5 February 2022

This tutorial will describe how to use data from the National Land Cover Database (NLCD) to create US scenery that is on par with the existing European scenery.

The default scenery (World Scenery 2.0) in FlightGear is of higher quality in some areas than others. For example, European scenery is generally of very high quality due to the use of CORINE land cover data. Compared to the European scenery, the US scenery is much less detailed and less accurate. Thankfully, there is a wealth of freely-available geographical information for the US that can be used to improve the scenery.

Getting set up for scenery generation

Prerequisites

Please read and understand the Using TerraGear wiki article. This tutorial will follow the same process but replace the "Landuse data" section with custom shapefiles generated from NLCD data.

Tools

This tutorial will make use of the following GIS software:

  1. QGIS (version 3.10)
  2. GRASS (version 7.8, used as a QGIS plugin)
  3. TerraGear

Elevation data

Elevation data for the entire world is available from the Shuttle Radar Topography Mission (SRTM). It is highly recommended to use the void-filled version of these datasets, which can be obtained from the USGS EarthExplorer (note: it is required to create a login to download data from that page). The void-filled datasets reduce or eliminate the small gaps that can occur in scenery generated using the original SRTM datasets. The elevation data is in GeoTIFF format. To convert it to the hgt format expected by TerraGear, use gdal_translate as follows:

 gdal_translate -of SRTMHGT input_file.tif output_file.hgt

TerraGear expects the output files to follow the SRTM naming convention, such as N38W080.hgt, whereas the data from USGS follows a convention like n38_w080_3arc_v2.tif. The following sh/bash script will batch-convert all the .tif files in the working directory to correctly named .hgt files:

 for file in *.tif; do
     # e.g. n38_w080_3arc_v2.tif -> NORTH=38, WEST=080 -> N38W080.hgt
     base=$(echo "$file" | cut -f 1 -d '.')
     NORTH=$(echo "$base" | cut -f 1 -d '_' | cut -f 2 -d 'n')
     WEST=$(echo "$base" | cut -f 2 -d '_' | cut -f 2 -d 'w')
     gdal_translate -of SRTMHGT "$file" "N${NORTH}W${WEST}.hgt"
 done

NLCD data

You can download the NLCD data from the Multi-Resolution Land Characteristics (MRLC) Consortium website. At the time of this writing, the latest dataset is from 2016. For the continental US, choose the dataset labeled "NLCD 2016 Land Cover (CONUS)." There is also a separate dataset for Alaska. The download is a large (>1 GB) zip file. After downloading, unzip the files.

Processing the data

  • Fire up QGIS and select the WGS 84 template. If you don't have a template to choose from, don't worry about it, as we'll be setting the coordinate reference system for the project later. Just create a new project in that case.
  • In the menu bar, select Project -> Properties -> CRS. Pick WGS 84 - EPSG:4326 as the project coordinate reference system.
  • Set a basemap. An easy way to do this is to double click on XYZ Tiles -> OpenStreetMap in the browser on the left. You can also get them via the QuickMapServices plugin.
  • Browse to the location where you have unzipped the NLCD data in the QGIS project browser and double click the .img file to load it as a layer.
  • Right-click on the NLCD layer -> Set CRS -> Set Layer CRS and select NAD83 / Conus Albers - EPSG:5070 -> Transformation 1. This will place the layer in the right location on top of the basemap. Note: it may not look like it's in the right spot until you zoom in on it.
  • Now we need to clip the data to the region of interest. The best way to do this is to create a clipping polygon: click the New Shapefile Layer button (see screenshot below). In the dialog, change Geometry type to Polygon, save it as a shapefile, and then click OK.
  • Click the Toggle Editing button to edit the layer.
  • Click the Add Polygon Feature button to create a polygon. Now you can click on the map to create a polygon covering the area of interest. When you are done, you can right-click to stop adding vertices.
  • If you want to edit the vertex locations, you can click the Vertex Tool button, right-click on your polygon, and then edit the coordinates in the table on the bottom left of the GUI window.
  • When you are done editing the layer, click the Save Layer Edits button (next to the Toggle Editing button).

Note: when you are learning the process, start with a small polygon, say 1x1 degrees. Some of the steps take a very long time if processing very large sections of this data.

  • Once the polygon is completed, the next step is to clip the NLCD layer using this polygon. Select the NLCD layer and choose Raster -> Extraction -> Clip Raster by Mask Layer. Make sure the input layer is your NLCD layer and the mask layer is the polygon you have created. For Source CRS choose the Conus Albers CRS of the original NLCD layer, and for Target CRS choose the Project CRS, EPSG:4326. Near the bottom of the dialog box, where under Clipped (mask) it says [Save to temporary file], click the button to the right and choose Save to File. Select IMG files as the filter and save it somewhere in your working directory with a descriptive name like NLCD_clip.img. Click Run. Once it completes, if you hide the original NLCD layer and the polygon, you should see the new clipped layer. As long as it worked correctly, you can now safely remove the big original NLCD layer. The image below shows a small sample section in Idaho that has been clipped using this method.
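If you prefer to script this step, GDAL's gdalwarp can perform the same clip from the command line. This is a sketch rather than part of the original workflow: it assumes GDAL is installed, the clipping polygon was saved as mask.shp, and the NLCD file name below is an example to be replaced with your own.

```shell
# Command-line equivalent of Raster -> Extraction -> Clip Raster by Mask Layer.
# -s_srs is the NLCD raster's Conus Albers CRS, -t_srs the project CRS (WGS 84);
# -of HFA writes Erdas Imagine (.img) output. File names are examples.
gdalwarp -s_srs EPSG:5070 -t_srs EPSG:4326 \
    -cutline mask.shp -crop_to_cutline \
    -of HFA NLCD_2016_Land_Cover_L48.img NLCD_clip.img
```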
  • If you zoom in on the data, you will see that it is highly detailed but also pixelated due to the raster format. It needs to be simplified and smoothed out to look good in FlightGear and also to reduce the polygon count on the resulting shapefiles. We will first do some smoothing of the raster itself, and then we'll do additional smoothing after converting to vector format. In the Processing Toolbox, find GRASS -> r.neighbors. The clipped NLCD layer should be selected for both of the top input dropdowns. For Neighborhood Operation, select median with a Neighborhood size of 5. As before, you may want to choose an actual IMG file to save the output to instead of a temporary file. Click Run.

Note: never use an averaging neighborhood operation, as the raster values are uniquely associated with a specific type of landcover. The averaging operation will introduce new values that do not correspond to a landcover type.

  • Now we'll run r.neighbors once more on the resulting layer, using mode as the Neighborhood Operation with a Neighborhood size of 3. The result should look a lot smoother and a bit less detailed than the original dataset (but still very highly detailed compared to the default scenery). An image from the sample dataset is below. This area shows McCall, Idaho and the south end of Payette Lake.
  • We're now ready to convert the raster data to vector. In the Processing Toolbox, find GRASS -> r.to.vect. Choose the mode-filtered layer from the previous step as the input, choose area as the feature type, and check the option "Smooth corners of area features." This will make the output vector a little less pixelated than the input. As before, you may want to save to a real shapefile instead of a temporary one. Click Run. This step will take longer than any of the previous ones to process.
  • We could use this vector data as is, but it's still a bit blocky and unnatural looking. Instead, we'll run a couple of processing steps on the vector data, starting with GRASS -> v.generalize. Choose the default douglas algorithm with a tolerance of 0.0003 (this is in degrees, since our CRS is in degrees). You may want to experiment with different tolerance values until you find what works best. Click Run. If the algorithm reports errors, it will create an additional error layer, which can be deleted. Note: this step will take a very long time for large areas.
  • Now we'll use v.generalize once more to smooth the simplified layer. This time, choose the chaiken algorithm with the same tolerance. Click Run. Note: this step will take a very long time for large areas.
  • At this point, it is a good idea to compare the smoothed vector layer to the original NLCD raster data and make sure the smoothing steps have not introduced too much error in the shape of coastlines and other features. If they have, you may need to redo some or all of the last several steps (r.neighbors, r.to.vect, and v.generalize) adjusting the number of neighbors and tolerance until you get satisfactory results. However, I've found these parameters to produce quite good results.
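The smoothing and vectorization steps above (r.neighbors, r.to.vect, and v.generalize) can also be run from a GRASS session shell, which makes it easier to redo them with different parameters. This is only a sketch: it assumes the clipped NLCD raster has already been imported into a GRASS mapset as nlcd_clip, and all layer names are examples.

```shell
# Median filter (size 5), then mode filter (size 3), as in the QGIS steps.
r.neighbors input=nlcd_clip output=nlcd_median method=median size=5
r.neighbors input=nlcd_median output=nlcd_mode method=mode size=3
# Raster to vector; -s is "Smooth corners of area features".
r.to.vect -s input=nlcd_mode output=nlcd_vect type=area
# Simplify (douglas), then smooth (chaiken); threshold is in degrees.
v.generalize input=nlcd_vect output=nlcd_simp method=douglas threshold=0.0003
v.generalize input=nlcd_simp output=nlcd_smooth method=chaiken threshold=0.0003
```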
  • We now have a high-resolution vectorized dataset that can be used to define landclasses in FlightGear. We just need to save individual shape files representing each landclass. At this point, it is a good idea to have the NLCD map with the legend displayed to match to values in the shapefiles. That map can be found here:

https://www.mrlc.gov/viewer/

  • You will also want to have default_priorities.txt from your TerraGear installation open to map NLCD landclass types to available materials in FlightGear. Some artistic license is possible here, but I use this mapping:
 NLCD value  Legend label                        FlightGear material
 11          Open Water                          Lake or Ocean
 12          Perennial Ice/Snow                  SnowCover or Glacier
 21          Developed, Open Space               Greenspace
 22          Developed, Low Intensity            Town
 23          Developed, Medium Intensity         SubUrban
 24          Developed, High Intensity           Urban
 31          Barren Land (Rock/Sand/Clay)        Rock or Sand (depends on location)
 32          Unconsolidated Shore                Sand
 41          Deciduous Forest                    DeciduousForest
 42          Evergreen Forest                    EvergreenForest
 43          Mixed Forest                        MixedForest
 51, 52      Dwarf Scrub/Shrub/Scrub             Scrub
 71          Grasslands/Herbaceous               Grassland
 72-74       Herbaceous/Moss/Lichens (AK only)   not used here; check default_priorities.txt
 81          Pasture/Hay                         DryCrop
 82          Cultivated Crops                    IrrCrop
 90          Woody Wetlands                      Marsh
 95          Emergent Herbaceous Wetlands        Bog
  • To actually select and export individual shapefiles for each type, do the following:
    • Select the final smoothed vector data layer. Click the Deselect Features from All Layers button near the top middle of the QGIS toolbar. Then click the button right next to it, Select Features by Value. In the dialog, enter the number from the NLCD legend in the value field, make sure the filter says Equal to (=), and then click Select Features and Close. Alternatively, if you want to combine multiple NLCD values into a single output shapefile, rather than clicking Close, you could enter in another value and then choose Add to Current Selection in the dropdown next to Select Features.
    • Once you have the desired features selected, right click on the smoothed vector layer, click Export -> Save Selected Features As, choose the ESRI shapefile format, and save it somewhere under the project directory with a name corresponding to the appropriate area type in default_priorities.txt (e.g. data/shapefiles/Lake.shp). For CRS, choose the project CRS EPSG:4326 - WGS 84.
    • Repeat this process until you've gone over all the values in the NLCD data legend. When finished, you should have individual layers representing different area types covering the entire area of interest. The result for the sample area is shown in the image below.
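Since each export is just a filtered copy of the smoothed layer, the per-landclass exports can also be scripted with GDAL's ogr2ogr instead of clicking through the dialogs. The sketch below only prints the commands it would run (remove the echo to execute them); the file names are examples, the mapping covers only part of the table above, and the attribute column value is r.to.vect's default output column, so verify the column name in your layer's attribute table.

```shell
#!/bin/sh
# Print one ogr2ogr export command per NLCD value -> material pair.
SRC=smoothed.shp   # the final smoothed vector layer (example name)
OUT=data/shapefiles
mkdir -p "$OUT"
for pair in 11:Lake 21:Greenspace 22:Town 23:SubUrban 24:Urban \
            41:DeciduousForest 42:EvergreenForest 43:MixedForest \
            71:Grassland 81:DryCrop 82:IrrCrop 90:Marsh 95:Bog; do
    value=${pair%%:*}      # NLCD class value, e.g. 11
    material=${pair##*:}   # FlightGear material name, e.g. Lake
    # Remove "echo" to actually run the export.
    echo ogr2ogr -where "value = $value" -t_srs EPSG:4326 \
        "$OUT/$material.shp" "$SRC"
done
```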

Converting with ogr-decode

If you have saved the shapefiles with a name corresponding to a material in default_priorities.txt, then they can be batch-converted with ogr-decode as follows:

 # work and shapefiles directories
 WORK=./work
 DATADIR=./data/shapefiles/NLCD
 # Process landcover
 for file in "$DATADIR"/*.shp; do
     fname=$(basename "$file")
     area_type=$(echo "$fname" | cut -f 1 -d '.')
     if [ -d "$WORK/$area_type" ]; then
         rm -rf "$WORK/$area_type"
     fi
     ogr-decode --area-type "$area_type" "$WORK/$area_type" "$file"
 done

(The script above is a bash/sh script for Linux. Something similar could probably be done with a .bat script on Windows. The locations of the work and data directories will need to be edited if you use a different directory structure.)

Building with tg-construct

The final step in building the terrain is to run tg-construct. This process is described in the Using TerraGear wiki article. It should be done after adding any supplementary shapefiles, as described in the next section. When running tg-construct, you will want to use the --ignore-landmass flag so that lakes and other waterbodies are not assumed to be at sea level.

Supplementary data

You will probably want to supplement your custom scenery with some other data. Sources include:

  • National Hydrography Data for waterways and waterbodies. Though NLCD includes water landcover, it is not as accurate or complete as NHD, and the smoothing process described above will cause thin rivers to turn into "islands." I recommend using the NHDArea and NHDWaterbody datasets but skipping the line data. The data is in shapefile format, which can be viewed and manipulated with QGIS. Be sure to filter the data to remove extraneous elements. It includes things like marshes, dams, and intermittent streams that probably should not be included. The NHD data can be accessed here: Access National Hydrography Products. To filter the data, you will want to refer to this page: NHD FCodes.
  • OpenStreetMap for roads. QGIS can open .osm and .pbf files extracted from OpenStreetMap and edit them just like all the other datasets that have been discussed here. These can be "baked into" the terrain using TerraGear, or you can use:
  • osm2city, an autogen tool for FlightGear that creates buildings, roads, bridges, pylons, docks, and other details from OpenStreetMap data. This tool has a bit of a learning curve and does not flow through QGIS, but it can really make your project shine.
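For the NHD filtering step above, ogr2ogr can copy only the features you want by applying a -where clause on the FCode attribute, which avoids manual selection in QGIS. This is a sketch: the FCode values below are placeholders chosen to illustrate the syntax, so look up the actual codes you need on the NHD FCodes page.

```shell
# Keep only selected feature codes from NHDWaterbody. The FCode values here
# are illustrative placeholders; consult the NHD FCodes reference for the
# codes matching the features you want to keep.
ogr2ogr -where "FCode IN (39004, 39009)" \
    NHDWaterbody_filtered.shp NHDWaterbody.shp
```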

Sample results

Available scenery using NLCD data

Related content

Forum topic

* {{forum link|t=36913|title=How to create US scenery with NLCD land cover data and QGIS}}


[[Category:TerraGear]]
[[Category:Scenery enhancement]]