Welcome to GDAL!
This is the first of what should be several posts on GDAL, the bread-and-butter library and toolset for playing with raster data. There are several introductions to GDAL on the web already that eloquently explain what GDAL is and why it is useful—for example, Robert Simmon's 4-part series on Medium.
This series is geared toward working GIS professionals: people who understand projections and have used the raster tools in ArcMap or QGIS, but may be a little hesitant to dip into the command line. It will focus on the tools available and how they're used for common tasks, elaborating on the official docs with an emphasis on practical examples.
Installing GDAL on Windows
Installing GDAL used to be either really simple or really annoying, depending on your computing environment. GDAL is written and actively maintained by developers: people who are comfortable compiling from source and managing environment variables and installations. As such, it works great in their world but requires a little massaging for everyday GISers, especially those of us coming from a Windows-centric ArcMap world.
Then came Conda.
Conda is amazing. Conda is a package and environment manager for Python and Python-related packages. Conda makes installing and running GDAL easy.
So, how do you get it?
First, download and install Anaconda. The Anaconda Platform is the support structure for the Conda command line tools and includes helpful utilities and installers.
Next, we'll use the Anaconda Navigator to create a new environment. To do this, go to the Environments pane and select "Create" at the bottom. Give it a name—I call mine "gdal" for clarity's sake—and select the latest 3.x version of Python (unless you've got a good reason to use 2.7). Click "Create" to finish it off.
Note: It doesn't matter what version of Python is currently installed on your computer (like if you've got Python 2.7 from ArcGIS Desktop) because Conda creates its own sandbox for each environment. These sandboxes are completely separate from each other and from any other Python installation on your computer. That's the beauty of Conda!
Next, we're going to add the conda-forge repository ("repo"), as it usually has a more up-to-date version of GDAL. To do this, click on the "Channels" button and then select "Add." Type "conda-forge" into the text box and hit Enter.
Now you can search for and add the gdal package to your new environment. With your environment selected, change the dropdown menu at the top of the packages pane to "All" and type "gdal" (without quotes) into the search box. Select the checkbox next to the entry for gdal when it appears and click "Apply" in the bottom right corner. Anaconda will work out what other packages are needed and install everything.
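If you'd rather skip the Navigator, the same create, add-channel, and install steps can be run from the Anaconda Prompt. A minimal sketch, assuming you named your environment "gdal":

conda create --name gdal python=3
conda config --add channels conda-forge
conda install --name gdal gdal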
Another Note: Getting the latest version of GDAL gets you the latest updates and bug fixes. For example, GDAL 2.1.3 has a bug that prevents you from setting the altitude parameter when calling DEMProcessing() via the Python bindings (see this bug report). This bug was fixed in 2.2.0.
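For the curious, here's roughly what that call looks like through the Python bindings. This is a minimal sketch; dem.tif and hillshade.tif are placeholder file names for your own data:

from osgeo import gdal

print(gdal.__version__)  # confirm you're on 2.2.0 or later

# Build a hillshade from a DEM, explicitly setting the sun's altitude angle
# (the parameter affected by the 2.1.3 bug).
gdal.DEMProcessing("hillshade.tif", "dem.tif", "hillshade", azimuth=315, altitude=45)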
More Notes! As of this writing (end of 2017), I've noticed that installing the SciPy libraries with the Anaconda Navigator downgrades GDAL to 2.1.3. After you install SciPy, you should see a little green arrow in the checkbox for GDAL in the installed packages view, indicating an update is available. Go ahead and update GDAL back to the latest version by clicking this green arrow and then selecting "Apply" at the bottom.
Switching Environments
To play in your new environment, navigate to the Anaconda folder group in the Start Menu and select Anaconda Prompt. This opens a new command prompt that is aware of our Conda environments.
As I mentioned, Conda gives us a separate sandbox for each environment we create. We need to switch to our new environment by typing activate gdal (replacing gdal with whatever environment name you chose earlier). After a pause, the command prompt will change to something like (gdal) C:\Users\jadams. This means you're now in the gdal sandbox.
To verify GDAL is installed and working properly, type gdalinfo --version and hit Enter. You should get output that looks like GDAL 2.2.2, released 2017/09/15.
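Put together, a quick session in the Anaconda Prompt looks something like this (your username, version number, and release date will differ):

C:\Users\jadams> activate gdal
(gdal) C:\Users\jadams> gdalinfo --version
GDAL 2.2.2, released 2017/09/15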
If you create other environments, you can switch between them by activating the other environment with activate foo, just like you'd do to activate your first environment.
Conda on Other OSes
macOS
Installation on macOS is similar to Windows, and Anaconda Navigator works the same in both OSes. The only difference is that we'll use the standard Terminal app instead of the Anaconda Prompt. In addition, we have to type source activate gdal instead of just activate gdal. Beyond that, enjoy the flexibility of bash (or csh, or whatever shell you're using).
Linux
Conda on Linux requires more CLI-based magic than Windows or macOS. I've not tried it myself, but a quick search for "install conda ubuntu" turns up these instructions, which should work much the same on other distros as well.
Linux also shares the same source activate requirement as macOS to switch between environments.
Coming Up Next
The next post will focus on some common file creation considerations that you'll use whenever your GDAL commands result in a new file.
Resources
GDAL Homepage: http://www.gdal.org
Conda User Guide: https://conda.io/docs/user-guide/index.html
Anaconda User Guide: https://docs.anaconda.com/anaconda/user-guide/
GISInternals pre-built GDAL binaries for Windows: http://www.gisinternals.com/release.php
Alternate Windows Installation: https://sandbox.idre.ucla.edu/sandbox/tutorials/installing-gdal-for-windows
Tuesday, December 26, 2017
Thursday, August 10, 2017
Using SDE Feature Classes Inside a Python Geoprocessing Service
I finally had a breakthrough today on a problem at work that I've been picking at off and on for a while now, and I want to share it with you (and write it down so I remember it...).
Like a lot of places, we use a backend SDE to hold the data that we work on internally and expose publicly through map services. I've been trying to create a Geoprocessing Service based on a Python script tool to do some things we need done. Within this script, I directly access the parcels feature class in our SDE that has tens of thousands of features.
The script tool runs just fine from within ArcMap, with the path to the feature class either hardcoded in or provided as a tool parameter:
parcels_fc = "Database Connections\\connection.sde\\Cadastral\\parcels"
or
parcels_fc = arcpy.GetParameterAsText(4)
The Problem
However, sharing the results to our ArcGIS Server as a GP service never worked out. If I hardcoded the path, it would upload but the service would always fail when I'd try to run it in a web app. If I got it through a parameter and selected "Constant Value" for that parameter when sharing the service, the server would say that it couldn't find the data in its data store.
I tried all sorts of permutations to get the server to recognize that it had the SDE connections registered in its data store, but to no avail. A "Failed to consolidate data" error was common when trying to stage the service prior to uploading it to the server.
Solution
OK, enough background: what fixed it? The key was to put a copy of the SDE connection file somewhere it would get copied into the server's data store, and then to use a relative path to access it from within the script.
First, I created a folder in the same directory as the script (which we'll call "path" for this example) and copied the SDE connection file into it. Next, I used the sample code from the Esri documentation (scroll down to "Relative Paths to Folders") to create something like this:
import os
import sys

script_path = sys.path[0]  # directory where the script itself lives
folder_path = os.path.join(script_path, "path")  # the folder holding the copied connection file
parcels_fc = os.path.join(folder_path, "connection.sde", "Cadastral", "parcels")
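From there, the rest of the script treats parcels_fc like any other feature class path. A quick sketch (the PARCEL_ID field is hypothetical; substitute your own fields):

import arcpy

with arcpy.da.SearchCursor(parcels_fc, ["PARCEL_ID"]) as cursor:
    for row in cursor:
        print(row[0])  # ...or whatever magic you need done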
In the script tool properties, make sure "Store absolute paths as relative paths" is checked.
And voila. When I share the results as a GP service, the server picks up the "path" folder and copies it to its data store along with the connection file inside of it. The script knows to look for the connection file in the folder "path" that resides in the same directory as the script, then uses that connection to access the SDE and do whatever magic it is that you've told it to do.
Resources:
https://gis.stackexchange.com/questions/10503/creating-a-gp-script-to-update-a-table-in-sde
https://gis.stackexchange.com/search?q=geoprocessing+sde
Saturday, June 24, 2017
Mental Maps and The Shadows of Human Impact
I like maps. I like to stare at them, try to figure out what their story is. I especially like maps of the physical environment—I've spent hours on Google Maps' terrain view exploring different areas, trying to read the earth, what the bare ground is telling us.
But even with Google Maps, the roads are still there, still distracting us, still saying "look what humans have created! Look at this mighty road we built in this impossible canyon!" So I went to make my own terrain maps, free of the influence of humans.
My first attempt actually ended up telling more of humankind's story than I intended.
Looking for good-but-not-overwhelmingly-detailed elevation data, I downloaded the 5-meter auto-correlated DEM data for the Salt Lake valley from the Utah AGRC. Auto-correlated elevation models are created by examining the differences in stereo image pairs (see https://gis.stackexchange.com/a/61993) and thus show various parts of the built environment. What this means is that my attempt to scrub away humankind was beautifully flawed and ended up showing some really cool artifacts.
Let's start with freeways, because every post about the evils of human influence has to talk about freeways, right? This is an overpass. Even without showing the roads, the massive amount of earthmoving that's been done to raise one high-volume stream of metal, cloth, and human beings over another stream is a little stupefying.
Even old-school grid systems, the darling child of walkability nerds and new urbanists, get in on the act. They're easier to see where they're cut into a hillside and the streets square off the lots to accommodate our discomfort with weird, unordered shapes.
This is a wonderfully disjointed bit of geomorphology. Freeway planners of the 50s and 60s took advantage of a natural stream channel/canyon to build a freeway without having to tear down homes. The steep canyon walls gradually give way to the valley floor, however, and are replaced by the flat bed and smooth, flowing, engineered curves of the Interstate Freeway system.
For all our love of attacking modern humans' impact on the environment, let's not forget that something as Utah-Pioneer-'Merica!-ish as farming shows up as well. This is one of the last center-pivot irrigation fields I could find in the valley, though the latest satellite imagery shows it's been dug up in preparation for some sort of development.
File this one in the "What the ...?" department. A look at the satellite imagery shows this massive, angular cut into the hillside was made for... a park. A park?! I'm sure there's a story and reason behind it, but I can't for the life of me figure it out.

Look familiar? This is the Salt Lake valley. I've always wondered what it looked like before 1847, when the only human inhabitants were the Northwestern Shoshone. Like I said above, it's not perfect, but it still gives us a much better picture of what the land beneath our tires looks like.
————————————————————
But the point of this post isn't to tell this micro story, the story of how we've shaped the earth. No, the story I want to tell is this one.
The human mind is amazing at creating mental maps of areas. We know the routes that are familiar, the topological links between our house, the roads, the freeway, and our office or favorite shopping center. We know the canyons and tall mountains are "east," that Ogden is "north" and Provo is "south." We recognize the Bingham Canyon Mine slag piles as "west."
But this plays tricks on us as well. For years, I always thought of going downtown as "going up" somehow. Looking at the elevation color (AKA hypsometric tint), downtown is lower than most of the valley! My mental map doesn't match reality, and it doesn't even match the word "downtown." But the "downtown is up" concept was so ingrained in my brain that in Portland I always thought I was going north when I got off the train at Pioneer Square and walked uphill to the PSU campus... on the south end of downtown. But I was going up!
What I like about this map is that except for the few highly-identifiable freeway cuts and raises, you can't use any of your normal mental map clues to figure out where you are. You have to stop thinking about a home, the road, or an arbitrary address grid coordinate, and start thinking about river channels, glacial moraines, and canyons.
After looking at this map in detail, someone told me they couldn't figure out where was where. That's the point.
As Yoda once said, "you must unlearn what you have learned." Let go of your east side vs west side mentality, your downtown vs suburbs bias, your industrial wastelands vs green-grass dreamworld. Release your a priori knowledge, and see what you can find.
Read the story of the valley.
————————————————————
I didn't set out intending to write about such abstract ideas as mental maps and such conscientious topics as humankind's impact on the world. No, I wanted to share the discoveries I've made in this map of the place where I've lived for over twenty years of my life.
Looking at this map, I realize I barely know it.
I meant to write about the cartographic tools and techniques I used, and the thrill of getting something to work after trying half a dozen different ways. Someday I'll write about that, and about the many other techniques I've discovered in the two years since I first wrote this on my personal blog. But for now, I'll keep exploring.