Thursday, December 22, 2011
Thursday, December 15, 2011
My son is in second grade, and his teacher was absent for a couple of days this week, so I volunteered to teach the geography lesson. I spent about 45 minutes teaching the kids about geographic projections, and it was so much fun! The kids were all enthusiastic, attentive, and observant.
3. Take four pieces of tape, cross two, cross the other two, then cross the two sets (so you've created an asterisk of tape) and place it on the pointed end of one of the gores, so the sticky side faces upwards.
4. Bend the other 11 gores into the middle, so the tips meet at the same point.
5. Stick them down firmly on the tape. This end should now be cupped.
Don't be disappointed if your globe is not a perfect sphere. It is actually impossible to recreate a perfect sphere from a simple template like this! In fact, at the beginning of the lesson, I told the kids we were going to attempt two impossible things. The orange peel projection was the first, and creating a 3D sphere from a 2D sheet of paper was the second!
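For the curious, the shape of each gore can even be sketched mathematically. Here is a small Python sketch (the function name and the sinusoidal-gore approximation are my own illustration, not part of the lesson) showing why each gore tapers from its widest point at the equator to a point at the poles:

```python
import math

def gore_half_width(lat_deg, n_gores=12, radius=1.0):
    """Half-width of one flattened gore at latitude lat_deg.
    A full gore spans 2*pi*radius/n_gores at the equator and
    narrows with the cosine of latitude (sinusoidal approximation)."""
    return (math.pi * radius / n_gores) * math.cos(math.radians(lat_deg))

# The gore is widest at the equator and shrinks to a point at the pole.
equator = gore_half_width(0)
mid_lat = gore_half_width(60)
pole = gore_half_width(90)
```

This is why no flat template can ever wrap into a true sphere: the paper between the gores is simply missing, which is the whole point of the lesson.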
Tuesday, July 12, 2011
And at the second:
Hopefully the rest of the day will be better...?
- Dynamic legends will be supported (finally!!)
- Automatic image enhancement
- Dynamic viewing of LAS files (LiDAR)
- Right-click within an MXD to package and/or convert map into a service
- ArcGIS online now supports a wide range of data import options, including drag-and-drop of CSV for point data (that was one of the coolest demos of the plenary)
- ESRI is in the process of building a global, multi-resolution topographic mosaic that will also include user-contributed hi-res LiDAR. Not clear about when that will launch, but the data will be accessible as WMS *and* as points and rasters.
BEST NEWS OF ALL: ESRI is releasing 'ArcGIS Home' desktop + extensions for $100. The idea is to enable neophytes to get started, veterans to develop new skills, and anyone to use their software for volunteer projects.
Unfortunately, I felt like the message from ESRI at the plenary was WEB WEB WEB WEB and there was very little emphasis on any of the cool new analytical tools they are developing in the desktop, much less on how they're dealing with the huge list of bugs at 10.0 (for example, I still can't get dissolve to work, which is why I'm still about 90% in 9.3.1). I understand where they're coming from, but it would sure be great if they'd get the core functionality of their software working before they push us all into the bright shiny cloud-computing future.
After the morning plenary we had a two hour break, and after a great lunch with my former colleague (and current Cascade Land Conservancy GIS Manager) Christopher Walter, I took in some of the sights of the waterfront, including the San Diego Maritime Museum. It was really great to finally explore the Surprise, a ship I've read so much about in Patrick O'Brian's novels.
Monday, July 11, 2011
I'm presenting on Thursday morning at 10:15 in Room 30E on the work CORE GIS did for San Juan County. My co-presenter is Breece Roberts of the Trust for Public Land. If you're here at the conference, please stop by and check it out.
I'll be posting throughout the conference and will include many more photos. I'll also post the occasional tweet; you can follow me at http://twitter.com/COREGIS.
CORE GIS posters all mounted in the Map Gallery
My ride during the conference
Tuesday, March 22, 2011
An impromptu presentation by Greg Sutherland of San Juan County was particularly noteworthy--he showed a YouTube clip of a demo the PPI Group did for San Juan County Public Works. They showed up on the 10:00 am boat, drove their special data collection vehicle around Friday Harbor for a couple of hours, and prepared a demo by 1:00. They collected multi-angle high-resolution photos (similar to the Google StreetView photos) and a very dense LiDAR point cloud. The end result? Engineers are able to take measurements of rights-of-way, crosswalks, stop signs, utility poles, you name it, all from a street-level photo. I think the video might be 'invite only' since PPI is courting San Juan County as a client, but once I find the link I'll post it here.
My talk seemed to go fairly well, and I received some positive feedback afterwards and lots of helpful suggestions about some of the trickier analytical questions. Here is a PDF if you would like to check it out. Many thanks to Dan Siemann and all of the folks at the National Wildlife Federation for asking me to work on this interesting project, and the Mountaineers Foundation for providing some of the funding.
Thursday, March 17, 2011
March 18th, 2011
City of Bellingham, Fireplace Room
625 Halleck St, Bellingham
Click here for a PDF map.
I'll be talking about some recently completed work I did for the National Wildlife Federation examining floodplains, flooding, and wildlife. I'll post the presentation here and on the website next week.
Thursday, February 24, 2011
After doing a bit of research, I found a great source for integrated terrain and bathymetry at the NOAA coastal relief site. I found my area of interest, clicked on the link to download the data, and was somewhat surprised to see no obvious way to download the data--no 'download' button, no FTP link, nothing. I did notice a 'Create Custom Grid', but since I wanted all of the data shown in the map, that seemed like a hassle. So I called NOAA, and to my utter astonishment, reached an extremely helpful human being on my first attempt. She didn't know how to obtain the data either, but transferred me to someone who did.
Turns out, the ONLY way to obtain the data is by the 'Create Custom Grid' button, even if you want the whole enchilada. But it gets better--you can't specify the whole extent in the create custom grid dialogue, because the extractor limits your request to about 8,000 cells in either direction. So, I had to split the grid into fourths. Here are my notes outlining how to do that:
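The splitting itself is just arithmetic. Here is a quick Python sketch of how I think about it (the function name, the example coordinates, and the exact 8,000-cell limit are my own illustration; check the extractor's actual limit before relying on it):

```python
import math

def split_extent(xmin, ymin, xmax, ymax, cell_size, max_cells=8000):
    """Split a bounding box into sub-extents, each no more than
    max_cells cells in either direction, so each one fits within
    the extractor's per-request limit."""
    ncols = math.ceil((xmax - xmin) / cell_size)
    nrows = math.ceil((ymax - ymin) / cell_size)
    nx = math.ceil(ncols / max_cells)  # tiles needed across
    ny = math.ceil(nrows / max_cells)  # tiles needed down
    tiles = []
    for j in range(ny):
        for i in range(nx):
            tiles.append((
                xmin + i * max_cells * cell_size,
                ymin + j * max_cells * cell_size,
                min(xmin + (i + 1) * max_cells * cell_size, xmax),
                min(ymin + (j + 1) * max_cells * cell_size, ymax),
            ))
    return tiles

# A 1-degree square at 1/3 arc-second-ish resolution (~12,000 cells
# per side) exceeds the limit, so it splits into 2 x 2 = 4 requests.
tiles = split_extent(-123.0, 46.0, -122.0, 47.0, cell_size=1 / 12000)
```

Four requests, four ASCII files, and you are ready to mosaic.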
The grids are extracted and exported as ASCII text files, which I imported to ArcGIS GRIDs. But there's another catch: the upper value of each grid showed up as 65,535. This did not seem like a plausible elevation value (unless the Z values were in centimeters?!) so after a bit of Googling I found this helpful post by Thomas Ballatore on the ESRI Support website. It is in response to someone working with SRTM data, but the same procedure worked equally well on the NOAA coastal relief data:
The SRTM files contain values from 0 to 65535. In a given SRTM tile, most of the values will be "typical" elevation values like the 1,133m and below you mention. However, areas where there is no data (voids) are assigned a value of 32768. The setnull command mentioned above would indeed set those values to "nodata".
HOWEVER, the SRTM data also correctly contains areas below sea level. These are reported as decreasing values from 65536. For example, an elevation of -2m would have a value of 65534, an elevation of -10m would be 65526, and so on.
For every country I have worked with that has a coastline, there will be a number of these along the coastline. Whether they are truly below sea level or just an inaccuracy of the SRTM data would require further investigation.
Anyway, if you set all values greater than 1133 to nodata, you will incorrectly set these negative values to nodata as well. To avoid that, I use the following two steps in Raster Calculator to correctly prepare the SRTM data:
Step 1. Execute the following:
setnull([N30E119.bil] == 32768,[N30E119.bil])
where N30E119.bil is an SRTM file I was recently working with...change this to your tile's name. This will set the voids to nodata.
Step 2. Then execute this:
con([N30E119.bil] > 32768,[N30E119.bil] - 65536,[N30E119.bil])
This will convert the wrapped values back to true negative elevations. Remember that spaces are important in Raster Calculator!
After running both steps on each of my four grids, I mosaicked them into a single grid, derived some hillshade, and was good to go.
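For anyone working outside ArcGIS, the same two-step logic is easy to sketch in plain Python (this just mirrors the setnull/con expressions; the function name is my own, and the sample values come from the examples quoted above):

```python
# Emulate the two Raster Calculator steps on unsigned 16-bit cell
# values. None stands in for ArcGIS "nodata".
def fix_srtm_value(v):
    """Void (32768) -> nodata; wrapped negatives (>32768) -> v - 65536."""
    if v == 32768:       # step 1: setnull the voids
        return None
    if v > 32768:        # step 2: unwrap below-sea-level cells
        return v - 65536
    return v             # ordinary elevations pass through unchanged

cells = [1133, 32768, 65534, 65526, 0]
fixed = [fix_srtm_value(v) for v in cells]
# -> [1133, None, -2, -10, 0]
```

The key insight is that the values are really signed 16-bit integers being read as unsigned, which is why subtracting 65536 recovers the negative elevations.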
Once the Florida Resilient Habitats map is finalized and Sierra Club has signed off on it, I will post it here.
Saturday, February 5, 2011
Captain William Kidd's 315 year old commission from the King of England
Similarly, but on a much more localized scale, I have been working with a client to recreate the land ownership pattern of portions of southwest Washington based on a digital scan of a hard copy map that was produced in 1891. This 120 year old map was created by hand with pen and ink, but it is remarkably accurate given the limitations of surveying and cartographic technologies available at the time. But similarly to Zacks, all we need to do is use our eyes to interpret the map and determine what was happening on the landscape back in the 19th century.
Let's contrast that with the current situation. Most of the maps we create are transmitted as PDFs, JPGs, or some other electronic format, and rarely printed out (maybe 5-10% are printed out). When they are printed, most of the non-profit organizations and government agencies do not have any clear system for cataloging and archiving the maps (or data used to create them) for posterity. Compounding the problem, more and more maps are completely web-based: the base layers might come from a variety of sources--Google, USGS, NOAA, NRCS, you name it--and the 'value-added' content might be coming from any number of other servers. The various layers are widely distributed across dozens of servers, and the underlying information is updated at varying intervals. How do you even capture a "snapshot" of an interactive map like that? What are the chances that the full functionality of any web map can be reproduced 120 years from now? I think very slim indeed, but perhaps I am being overly pessimistic.
The great irony seems to be that we now have access to more information than any other generation in history, but future historians will be able to access only a tiny fraction of it, because we are not doing a sufficient job of preserving it.
Obviously I am not the first person to surf this particular brainwave--for example, see here, here, here, here, and especially here.
Monday, January 31, 2011
I volunteer on the board of my son's Parent-Teacher-Student Association. My role is to maintain and update the PTSA's website, but at a recent meeting one of the board members mentioned needing to find grant money to provide world maps to several classrooms in the school. My first thought was "How could it be possible that any classroom in any school lacks something as fundamental to a decent education as a world map?!?" My second thought was "Hey, I have a color plotter, and I know a bit about maps..." and before I realized what I was saying, I had volunteered to provide all of the maps as a donation to the school.
I love designing maps, and it would have been fun to create a world map from scratch; unfortunately, I did not have time for such a monumental undertaking, so I decided to search for a decent, public domain map. I found one here at the World Factbook.
So far I have provided eight poster-sized versions of the map, and they seem to be a hit with the teachers and the students.
A couple of future world travelers in front of the printed map.
Tuesday, January 25, 2011
An interesting spin-off of this work is the recreation of historic land ownership patterns. In order to fully understand the current land ownership and management situation in and around MSH, it's important to have an understanding of the previous policy decisions that got us here. I have been working with Charlie Raines to develop a variety of data from historic maps (100 to 130 years before the present day) and it is really fascinating.
On a personal level, it is humbling to work with these maps that were created by hand--with pen, ink and paper--before there were aerial photographs, orthophotos, satellites, even NAD 27! These old maps georeference remarkably well with the modern-day USPLS townships and sections.
However, there have been some major changes to this landscape over the past 100 years, both natural and man-made. Mt St Helens lost over 1,300 feet of elevation as a result of the 1980 eruption, and we humans built roads, railroads, and numerous large reservoirs on the Lewis, Cowlitz, Nisqually, Bumping, and Columbia river systems. As part of the historic mapping process, I've had a lot of fun 'reversing' these changes by deleting the reservoirs, re-drawing the rivers in their original channels, and erasing highways and interstates.
These maps are still a work in progress so I cannot share them just yet, but here are a couple of screen shots to give you a feel of the historic/modern juxtaposition.
A "split-screen" view, showing an 1891 Northern Pacific Railroad land grant map on the left, and the modern CORE GIS map on the right. Note the fold in the historic map running right-to-left between the 'river' label on the left and Swift Reservoir on the right. Click on thumbnail to embiggen.
Another view of the Northern Pacific Railroad land grant map, this one focused on the region around Mossyrock.
Modern-day view of the same extent.