Planet Labs wants YOU to work with their data!

They say: 

Are you a college student, researcher or professor? We’re looking for innovative academics, researchers and scientists to unlock the power of a one-of-a-kind dataset. You can now apply for access to Planet’s unique dataset for non-commercial research purposes. In an area as large as 2,000 square kilometers, you’ll have access to download imagery, analyze trends, and publish your results.

Check it: https://www.planet.com/products/education-and-research/

AAG 2017 Wrap Up: Day 3

Day 3: I opened the day with a lovely swim with Elizabeth Havice (in the largest pool in New England? Boston? The Sheraton?) and then embarked on a multi-mile walk around the fair city of Boston. The sun was out and the wind was up, showing the historical buildings and waterfront to great advantage. The 10-year-old Institute of Contemporary Art shows in a constrained space, but it did host an incredibly moving video installation from Steve McQueen (director of 12 Years a Slave) called “Ashes,” about the life and death of a young fisherman in Grenada.

My final AAG attendance involved two plenaries hosted by the Remote Sensing Specialty Group and the GIS Specialty Group, who, in their wisdom, decided to host plenaries by two absolute legends in our field – Art Getis and John Jensen – at the same time. #battleofthetitans. #gisvsremotesensing. So, I tried to get what I could from both talks. I started with the Waldo Tobler Lecture given by Art Getis: The Big Data Trap: GIS and Spatial Analysis. Compelling title! His perspective as a spatial statistician on the big data phenomenon is a useful one. He talked about how fast data are growing: every minute – 98K tweets; 700K FB updates; 700K Google searches; 168+M emails sent; 1,820 TB of data created. Big data is growing in spatial work: new analytical tools are being developed, data sets are generated, and repositories are growing and becoming more numerous. But there is a trap, and here it is. The trap of Big Data:

10 Erroneous assumptions to be wary of:

  1. More data are better
  2. Correlation = causation
  3. Gotta get on the bandwagon
  4. I have an impeccable source
  5. I have really good software
  6. I am good at creating clever illustrations
  7. I have taken requisite spatial data analysis courses
  8. It’s the scientific future
  9. Accessibility makes it ethical
  10. There is no need to sample

He then asked: what is the role of spatial scientists in the big data revolution? He says our role is to find relationships in a spatial setting; to develop technologies or methods; to create models and use simulation experiments; to develop hypotheses; to develop visualizations and to connect theory to process.

The summary from his talk is this: Start with a question; Differentiate excitement from usefulness; Appropriate scale is mandatory; and Remember more may or may not be better. 

When Dr Getis finished I made a quick run down the hall to hear the end of the living legend John Jensen’s talk on drones. This man literally wrote the book(s) on remote sensing, and he is the consummate teacher – always eager to teach and extend his excitement to a crowded room of learners.  His talk was entitled Personal and Commercial Unmanned Aerial Systems (UAS) Remote Sensing and their Significance for Geographic Research. He presented a practicum about UAV hardware, software, cameras, applications, and regulations. His excitement about the subject was obvious, and at parts of his talk he did a call and response with the crowd. I came in as he was beginning his discussion on cameras, and he also discussed practical experience with flight planning, data capture, and highlighted the importance of obstacle avoidance and videography in the future. Interestingly, he has added movement to his “elements of image interpretation”. Neat. He says drones are going to be routinely part of everyday geographic field research. 

What a great conference, and I feel honored to have been part of it. 

AAG Boston 2017 Day 1 wrap up!

Day 1: Thursday I focused on the organized sessions on uncertainty and context in geographical data and analysis. I’ve found AAGs to be more rewarding if you focus on a theme, rather than jump from session to session. But fewer steps on the iWatch, of course. There were nearly 30 (!) sessions of speakers presenting on these topics throughout the conference.

An excellent plenary session on New Developments and Perspectives on Context and Uncertainty started us off, with Mei-Po Kwan and Michael Goodchild providing overviews. We need to create reliable geographical knowledge in the face of the challenges brought up by uncertainty and context: for example, people and animals move through space, phenomena are multi-scaled in space and time, and data are heterogeneous, all of which makes our creation of knowledge difficult. There were sessions focusing on sampling, modeling, & patterns, on remote sensing (mine), on planning and sea level rise, on health research, on urban context and mobility, and on big data, data context, data fusion, and visualization of uncertainty. What a day! All of this is necessarily interdisciplinary. Here are some quick insights from the keynotes.

Mei-Po Kwan focused on uncertainty and context in space and time:

  • We all know about the MAUP concept, but what about the parallel with time? The MTUP: the modifiable temporal unit problem (a small illustration follows this list).
  • Time is very complex. There are many characteristics of time and change: momentary, time-lagged response, episodic, duration, cumulative exposure
    • sub-discussion: change has patterns as well - changes can be clumpy in space and time. 
  • How do we aggregate, segment and bound spatial-temporal data in order to understand process?
  • The basic message is that you must really understand uncertainty: Neighborhood effects can be overestimated if you don’t include uncertainty.
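
To make the MTUP concrete, here is a minimal sketch in Python with an invented time series: summarizing the same data with different temporal units gives different apparent results. The series, column names, and numbers are hypothetical, just to illustrate the point.

```python
import numpy as np
import pandas as pd

# Hypothetical hourly exposure readings over one week.
rng = pd.date_range("2017-04-03", periods=24 * 7, freq="H")
np.random.seed(0)
exposure = pd.Series(np.random.gamma(2.0, 10.0, len(rng)), index=rng)

# The same data aggregated with different temporal units (the MTUP):
# the apparent peak changes with the unit chosen.
print(exposure.resample("H").mean().max())   # hourly peak
print(exposure.resample("6H").mean().max())  # 6-hour-block peak
print(exposure.resample("D").mean().max())   # daily peak
```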

As expected, Michael Goodchild gave a master class in context and uncertainty. No one else can deliver such complex material so clearly, with a mix of theory and common sense. Inspiring. Anyway, he talked about:

  • Data are a source of context:
    • Vertical context – other things that are known about a location, that might predict what happens and help us understand the location;
    • Horizontal context – things about neighborhoods that might help us understand what is going on.
    • Both of these aspects have associated uncertainties, which complicate analyses.
  • Why is geospatial data uncertain?
    • Location measurement is uncertain
    • Any integration of location is also uncertain
    • Observations are non-replicable
    • Loss of spatial detail
    • Conceptual uncertainty
  • This is the paradox. We have abundant sources of spatial data, they are potentially useful. Yet all of them are subject to myriad types of uncertainty. In addition, the conceptual definition of context is fraught with uncertainty.
  • He then talked about some tools for dealing with uncertainty, such as areal interpolation and spatial convolution (a minimal areal interpolation sketch follows this list).
  • He finished with some research directions, including focusing on behavior and pattern, better ways of addressing confidentiality, and development of a better suite of tools that include uncertainty.
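
For the areal interpolation point, here is a minimal area-weighted sketch using geopandas. This is only the simplest (homogeneous-density) form of the technique, and the file names and the `pop` attribute are hypothetical.

```python
import geopandas as gpd

# Hypothetical layers: source polygons carry a count attribute ("pop"),
# target polygons are the zones we want estimates for.
source = gpd.read_file("source.gpkg")   # hypothetical file
target = gpd.read_file("target.gpkg")   # hypothetical file
target["tid"] = target.index

source["src_area"] = source.geometry.area

# Intersect source and target, then weight each count by the share of
# the source polygon that falls inside each target zone.
pieces = gpd.overlay(source, target, how="intersection")
pieces["w"] = pieces.geometry.area / pieces["src_area"]
pieces["pop_share"] = pieces["pop"] * pieces["w"]

estimate = pieces.groupby("tid")["pop_share"].sum()
print(estimate.head())
```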

My session went well. I chaired a session on uncertainty and context in remote sensing with four great talks: Devin White and Dave Kelbe from Oak Ridge National Laboratory gave a pair of talks on ORNL work in photogrammetry and stereo imagery, Corrine Coakley from Kent State is working on reconstructing ancient river terraces, and Chris Amante from the great CU is developing uncertainty-embedded bathy-topo products. My talk was on uncertainty in lidar inputs to fire models, and I got a great question from Mark Fonstad about the real independence of errors – canopy height and canopy base height are likely correlated, so aren’t their errors, and why treat them as independent? That question kind of blew my mind, but Qinghua Guo stepped in with some helpful words about the difficulties of sampling from a joint probability distribution in Monte Carlo simulations.
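Qinghua’s point can be made concrete with a tiny Monte Carlo sketch: drawing the two error terms independently versus drawing them jointly from a correlated distribution gives quite different spreads for any derived quantity. The standard deviations and the 0.7 correlation below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Invented error standard deviations (m) for canopy height (CH) and
# canopy base height (CBH), and an assumed error correlation of 0.7.
sd_ch, sd_cbh, rho = 2.0, 1.5, 0.7

# Independent sampling: errors drawn separately.
e_ch_ind = rng.normal(0, sd_ch, n)
e_cbh_ind = rng.normal(0, sd_cbh, n)

# Joint sampling: errors drawn from a multivariate normal with covariance.
cov = [[sd_ch**2, rho * sd_ch * sd_cbh],
       [rho * sd_ch * sd_cbh, sd_cbh**2]]
e_ch_joint, e_cbh_joint = rng.multivariate_normal([0, 0], cov, n).T

# A derived quantity like CH - CBH (crown fuel depth) is much more
# variable when the errors are treated as independent than when they co-vary.
print(np.std(e_ch_ind - e_cbh_ind))      # larger spread
print(np.std(e_ch_joint - e_cbh_joint))  # smaller spread
```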

Plus we had some great times with Jacob, Leo, Yanjun and the Green Valley International crew who were showcasing their series of Lidar instruments and software. Good times for all!

GIF Bootcamp 2017 wrap up!

Our third GIF Spatial Data Science Bootcamp has wrapped!  We had an excellent 3 days with wonderful people from a range of locations and professions and learned about open tools for managing, analyzing and visualizing spatial data. This year's bootcamp was sponsored by IGIS and GreenValley Intl (a Lidar and drone company). GreenValley showcased their new lidar backpack, and we took an excellent shot of the bootcamp participants. What is Paparazzi in lidar-speak? Lidarazzi? 

Here is our spin: We live in a world where the importance and availability of spatial data are ever increasing. Today’s marketplace needs trained spatial data analysts who can:

  • compile disparate data from multiple sources;
  • use easily available and open technology for robust data analysis, sharing, and publication;
  • apply core spatial analysis methods;
  • and utilize visualization tools to communicate with project managers, the public, and other stakeholders.

At the Spatial Data Science Bootcamp you learn how to integrate modern spatial data science techniques into your workflow through hands-on exercises that leverage today's latest open source and cloud/web-based technologies. 

Women in GIS interview!

Hi all! I was recently profiled for the excellent website Women in GIS (or WiGIS). WiGIS is a group of technical-minded women who maintain the site to feature women working in the geospatial industry through their Who We Are spotlight series. In addition, the group makes its presence known at conferences like CalGIS and the ESRI UCs, and plans to host a number of online resources women might find useful to start or navigate a GIS career.

Excellent time, and thanks for the opportunity!

Dronecamp coming in July. Check it!

IGIS is pleased to announce a three-day "Dronecamp" to be held July 25-27, 2017, in Davis. This bootcamp style workshop will provide "A to Z" training in using drones for research and resource management, including photogrammetry and remote sensing, safety and regulations, mission planning, flight operations (including 1/2 day of hands-on practice), data processing, analysis, and visualization. The workshop content will help participants prepare for the FAA Part 107 Remote Pilot exam. Participants will also hear about the latest technology and trends from researchers and industry representatives.

Dronecamp builds upon a series of workshops that have been developed by IGIS and Sean Hogan starting in 2016. Through these workshops and our experiences with drone research, we've learned that the ability to use mid-range drones as scientifically robust data collection platforms requires a proficiency in a diverse set of skills and knowledge that exceeds what can be covered in a traditional workshop. Dronecamp aims to cover all the bases, helping participants make a great leap forward in their own drone programs.

Dronecamp is open to all but will have a focus on applications in agriculture and natural resources. No experience is necessary. We expect interest to exceed the number of seats, so all interested participants must fill in an application before they can register. Applications are due on April 15, 2017. For further information, please visit http://igis.ucanr.edu/dronecamp/. Dronecamp Flier

New GPS/GLONASS Base Station installed on UC Berkeley campus. Happy geo-locationing!

California Surveying and Drafting recently installed a GPS/GLONASS base station antenna on McCone Hall, and to reciprocate they’re allowing Berkeley researchers to use the real-time correction signal for free. This could be useful for anyone doing research in California with access to a mapping-grade or survey-grade GNSS unit such as a Trimble Geoexplorer. You’ll need to tether your Trimble to a strong 3G/4G data connection (for example, a wifi hotspot from your cell phone), so this approach will only work in regions with cellular reception. 

Initial tests show under 5cm of error with a Trimble GeoXH 6000 unit on campus. Thanks Nico!
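
For anyone curious how a number like “under 5 cm” gets computed, here is a minimal sketch: log corrected fixes over a known benchmark and take the RMS of the horizontal errors. The coordinates below are made up for illustration.

```python
import numpy as np

# Hypothetical benchmark coordinate (UTM Zone 10N, metres) and a handful
# of real-time-corrected fixes logged at the same point.
benchmark = np.array([564321.10, 4191876.55])
fixes = np.array([
    [564321.13, 4191876.57],
    [564321.07, 4191876.52],
    [564321.12, 4191876.59],
    [564321.08, 4191876.53],
])

# Horizontal error of each fix, and the RMS error overall.
err = np.linalg.norm(fixes - benchmark, axis=1)
rmse = np.sqrt(np.mean(err**2))
print(err, rmse)  # values on the order of a few centimetres
```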

Summary of our pilot Soil Vegetation Map digitization project in Sonoma County

Between 1949 and 1979, the Pacific Southwest Research Station of the U.S. Forest Service published two series of maps: 1) the Soil-Vegetation Maps, and 2) the Timber Stand Vegetation Maps. To our knowledge these maps have not been digitized; they exist in paper form in university library collections, including the UC Berkeley BioScience and Natural Resources Library.

Collection Description

The Soil-Vegetation Maps use blue or black symbols to show the species composition of woody vegetation, series and phases of soil types, and the site-quality class of timber. A separate legend entitled “Legends and Supplemental Information to Accompany Soil-Vegetation Maps of California” allows for the interpretation of these symbols in maps published in 1963 or earlier. Maps released after 1963 are usually accompanied by a report including legends, or a set of “Tables”. These maps are published on USGS quadrangles at two scales, 1:31,680 and 1:24,000. Each 1:24,000 sheet represents about 36,000 acres. See Figure 1 for the original index key.

The Timber Stand Vegetation Maps use blue or black symbols to show broad vegetation types, the density of woody vegetation, and the age-size, structure, and density of conifer timber stands, along with other information about the land and vegetation resources. The accompanying “Legends and Supplemental Information to Accompany Timber Stand-Vegetation Cover Maps of California” allows for interpretation of those symbols. Unlike the Soil-Vegetation Maps, a single issue of the legend is sufficient for interpretation. See Figure 2 for the original index key.

Methods

We found 22 quad sheets for Sonoma County in the Koshland BioScience Library at UC Berkeley.

Scanning

Using a large-format scanner at UC Berkeley’s Earth Science and Map Library, we scanned each original quad at a standard 300 dpi resolution. The Earth Science Library staff complete the scans and provide an online portal from which to download them. The current library recharge rate is $10 per quad sheet. Coordinating the release of the maps from the UC Berkeley BioScience Library and their subsequent transfer to the Earth Science and Map Library currently requires a UC member with valid library privileges to check out the maps. 

Georeferencing

Georeferencing of the maps was done in ArcGIS Desktop using the georeferencing toolbar. For the Sonoma County quads, which are at a standard 1:24,000 scale, we used the USGS 24k quad index file to obtain corner reference points and manually georeference each quad, using the upper-right, upper-left, lower-right, and lower-left corners as tie points. The USGS quads are in the polyconic NAD 1927 UTM Zone 10 projection, so we adjusted our data frame to match this original projection and registered each image. For a step-by-step description of this process see “Georeferencing Steps in ArcMap”.
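
For those who prefer a scriptable route, the same corner-point georeferencing can be sketched with GDAL’s Python bindings. This is not what we did (we used the ArcGIS toolbar), and the file names, pixel dimensions, and corner coordinates below are placeholders; in practice the map coordinates come from the USGS 24k quad index file.

```python
from osgeo import gdal, osr

# Corner ground control points for one scanned quad:
# gdal.GCP(map_x, map_y, z, pixel, line). Values are placeholders.
gcps = [
    gdal.GCP(502000.0, 4268000.0, 0, 0,    0),     # upper left
    gdal.GCP(513000.0, 4268000.0, 0, 7200, 0),     # upper right
    gdal.GCP(502000.0, 4254000.0, 0, 0,    9000),  # lower left
    gdal.GCP(513000.0, 4254000.0, 0, 7200, 9000),  # lower right
]

srs = osr.SpatialReference()
srs.ImportFromEPSG(26710)  # NAD27 / UTM Zone 10N

# Attach the GCPs to the scanned image, then warp to a registered GeoTIFF.
src = gdal.Open("60A-3_scan.tif")  # hypothetical filename
with_gcps = gdal.Translate("60A-3_gcps.tif", src, GCPs=gcps,
                           outputSRS=srs.ExportToWkt())
gdal.Warp("60A-3_georef.tif", with_gcps, dstSRS="EPSG:26710",
          resampleAlg="near")
```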

Error estimation

Georeferencing historical datasets inevitably introduces error, which we capture as the root mean squared error (RMSE) of the tie points. Across these 22 quads the minimum RMSE is 4.9 m, the maximum is 15.6 m, and the mean is 9.9 m. This information must be recorded before the image is registered. See Table 1 below for individual RMSE scores for all 22 quads. 
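
For reference, the RMSE reported by georeferencing tools is simply the root mean square of the tie-point residual distances. A quick sketch with invented residuals:

```python
import numpy as np

# Residuals (metres) between each tie point's true corner coordinate and
# where the fitted transform places it -- values here are invented.
residuals = np.array([
    [3.1, -4.2],   # upper left  (dx, dy)
    [-5.0, 2.8],   # upper right
    [4.4, 3.9],    # lower left
    [-2.7, -5.1],  # lower right
])

# Root mean squared error over the tie points.
rmse = np.sqrt(np.mean(np.sum(residuals**2, axis=1)))
print(round(rmse, 2))
```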

Table 1: Quad original name, quad name from the downloaded USGS 24k file, and the RMSE of the georeferencing process. 

Original Quad Name    USGS 24k Quad Name      RMSE (m)
60A-3                 Whispering Pines         7.48705
60B-3                 Asti                    12.7461
60B-4                 The Geysers              6.84357
60C-1                 Jimtown                  7.66811
60C-2                 Geyserville              6.60752
60C-3                 Guerneville             14.8663
60D-12                Mount Saint Helena      10.7671
61A-3                 Big Foot Mountain        9.77075
61A-4                 Cloverdale               9.37442
61B-3                 McGuire Ridge            7.90499
61B-4                 Gube Mountain           15.3223
61C-1                 Annapolis                5.66674
61C-2                 Stewarts Point          14.8612
61C-4                 Plantation               4.91229
61D-1                 Warm Springs Dam        15.562
61D-2                 Tombs Creek             12.995
61D-3                 Fort Ross                9.06434
61D-4                 Cazadero                13.0045
62A-4                 Gualala                 11.1405
63A-1                 Duncans Mills            7.44373
63A-2                 Arched Rock              5.55524
64B-2                 Camp Meeker              8.91102

Aerial photography archives

Notes on where to find historical aerial imagery (thanks to Kass Green): The USDA has an archive of aerial imagery in Salt Lake City at APFO: http://www.fsa.usda.gov/programs-and-services/aerial-photography/index. There is an ArcGIS Online map of the tiles and dates of these photos; search in ArcGIS Online for the APFO Historical Availability Tile Layer. USDA is in the process of scanning these photos, but you can order them now through a manual process (which can take a long time). 

The EROS Data Center in Sioux Falls also has an archive of high-altitude photos of the US from the 1980s. Also check out https://lta.cr.usgs.gov/NHAP and https://lta.cr.usgs.gov/NAPP. These photos are available digitally, but are not terrain corrected or georeferenced.

Great links from class today

Today's class was WebGIS and the Geoweb (I know, we could do a whole semester), and we rounded up some nice resources. 

  1. Open Street Map interactions (from Vanessa):
    1. Here is Overpass Turbo, the OSM data filtering site. https://overpass-turbo.eu
    2. Here is Tag Info, where you can find the keys to query information on Overpass Turbo: https://taginfo.openstreetmap.org/ (a minimal Python query against the Overpass API follows this list)
  2. Privacy (from Wyeth): Radiolab did a great piece on the intersection between GIS data and privacy.
    1. Link to the article: http://www.radiolab.org/story/update-eye-sky/ (this is the updated article after changes from the original broadcast in June 2015 [http://www.radiolab.org/story/eye-sky/] ) 
    2. Also, the company that developed from this: http://www.pss-1.com/
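
If you want to skip the Overpass Turbo interface and hit the Overpass API directly from Python, here is a minimal sketch; the amenity tag and the Boston bounding box are just examples, so swap in whatever keys you find on Tag Info.

```python
import requests

# Overpass QL: drinking-water nodes inside a small bounding box over
# central Boston (south, west, north, east).
query = """
[out:json][timeout:25];
node["amenity"="drinking_water"](42.35,-71.08,42.37,-71.05);
out body;
"""

resp = requests.get("https://overpass-api.de/api/interpreter",
                    params={"data": query}, timeout=60)
resp.raise_for_status()
for element in resp.json()["elements"]:
    print(element["id"], element["lat"], element["lon"])
```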

New digitization project: Soil-Vegetation Map Collection

Between 1949 and 1979, the Pacific Southwest Research Station of the U.S. Forest Service published two series of maps: 1) the Soil-Vegetation Maps, and 2) the Timber Stand Vegetation Maps. To our knowledge these maps have not been digitized; they exist in paper form in university library collections, including the UC Berkeley Koshland BioScience Library.

Index map for the Soil-Vegetation Maps

The Soil-Vegetation Maps use blue or black symbols to show the species composition of woody vegetation, series and phases of soil types, and the site-quality class of timber. A separate legend entitled “Legends and Supplemental Information to Accompany Soil-Vegetation Maps of California” allows for the interpretation of these symbols in maps published in 1963 or earlier. Maps released after 1963 are usually accompanied by a report including legends, or a set of “Tables”. These maps are published on USGS quadrangles at two scales, 1:31,680 and 1:24,000. Each 1:24,000 sheet represents about 36,000 acres. 

The Timber Stand Vegetation Maps use blue or black symbols to show broad vegetation types, the density of woody vegetation, and the age-size, structure, and density of conifer timber stands, along with other information about the land and vegetation resources. The accompanying “Legends and Supplemental Information to Accompany Timber Stand-Vegetation Cover Maps of California” allows for interpretation of those symbols. Unlike the Soil-Vegetation Maps, a single issue of the legend is sufficient for interpretation. 

We found 22 quad sheets for Sonoma County in the Koshland BioScience Library at UC Berkeley, and embarked upon a test digitization project. 

Scanning. Using a large-format scanner at UC Berkeley’s Earth Science and Map Library, we scanned each original quad at a standard 300 dpi resolution. The Earth Science Library staff complete the scans and provide an online portal from which to download them. 

Georeferencing. Georeferencing of the maps was done in ArcGIS Desktop using the georeferencing toolbar. For the Sonoma County quads, which are at a standard 1:24,000 scale, we used the USGS 24k quad index file to obtain corner reference points and manually georeference each quad. 

Error estimation. Georeferencing historical datasets produces error, which we capture as the root mean squared error (RMSE) of the tie points. Across these 22 quads the minimum RMSE is 4.9 m, the maximum is 15.6 m, and the mean is 9.9 m. This information must be recorded before the image is registered. See Table 1 for individual RMSE scores for all 22 quads. 

Conclusions. Super fun exercise, and we look forward to hearing about how these maps are used. Personally, I love working with old maps, and bringing them into modern data analysis. Just checking out the old and the new can show change, as in this snap from what is now Lake Sonoma, but was the Sonoma River in the 1930s.

Thanks Kelly and Shane for your work on this!

ISECCI historical ecology working group wrap-up

Last week Kelly and I travelled with others to the Sierra Nevada Aquatic Research Lab (SNARL) in the eastern Sierra Nevada, just south of Mono Lake, for a research retreat. SNARL is part of the UC's Natural Reserve System, which comprises nearly 40 properties across the state. These are preserves that foster research, education and collaboration. They have much in common with ANR's REC system. I've been to a few of them now, and look forward to visiting more. I love the east side of the Sierra, and that iconic Highway 395. 

This trip was a retreat for the ISECCI historical ecology working group, led by the inspirational Peter Alagona from UCSB. We discussed our existing projects, including the VTM work (see figure below), and talked about potentials for more collaborative research and further integration between NRS and ANR. We have a list of wishes for digitization, and if anyone out there has ideas about pitching these to donors, please let me know. For example: 

  • Kelly and I want to digitize the Leiburg maps from the northern Sierra to add to the VTM stack;
  • We want to find a better way to index and view historical aerial photography state-wide. Something like this for historical maps: http://ngmdb.usgs.gov/maps/TopoView/help/

And we had a field trip looking at Mono Lake water issues. Great time spent!

Density of VTM features across the collections

Digitizing old maps for science: the Soil-Vegetation Map Series

As many of you know, my lab at Berkeley has been involved in rescuing, digitizing and sharing historical ecological data for about a decade. Our big push has been working with the Wieslander Vegetation Type Mapping (VTM) project in California. The VTM collection, created in the 1920s and 1930s, has been described as “the most important and comprehensive botanical map of a large area ever undertaken anywhere on the earth’s surface” (Jepson et al. 2000). It was pioneered by Albert E Wieslander, an employee of the Forest Service Forest and Range Experiment Station in Berkeley, CA. Overall, the collection covers about 28 million ha or just over a quarter of the state including natural areas exclusive of the deserts and the larger agricultural areas. The collection includes over 200 vegetation maps, 18,000 vegetation plots, over 3,000 photographs, and over 100 plant voucher specimens.  It is a detailed, extensive (although not complete), and multi-modal description of the vegetation of California in the early 20th century, and its availability in digital form presents multiple opportunities to examine, characterize and understand changes to California landscapes. Many groups around the state have helped in the digitization, and now the four parts of the collection are reunited in digital space. Here is a nice pic of their coverage in the state. 

The data have been used in many scientific publications, but one of the VTM project's under-sung roles has been to provide a foundation for many subsequent mapping efforts in the state. For example, the protocols developed by Wieslander and his crew became the foundation for the State Cooperative Soil-Vegetation surveys, which covered 4.6 million ha of land between 1947 and 1977. Those early surveys paved the way for many of the contemporary vegetation classification schemes used today in California, including the Manual of California Vegetation, the National Vegetation Classification System, and the California Gap Analysis Program.

The maps from the State Cooperative Soil-Vegetation Survey of California, the first state-wide program to use aerial photo interpretation to produce vegetation and timber maps, have not to our knowledge been digitized, and they might add to our understanding of California flora in the post-war period. We are about to embark on a pilot project to examine the feasibility of digitizing and georeferencing these soil-vegetation maps and adding them to the digital collection. I think the georeferencing process will have to rely simply on the marked corner coordinates. Stay tuned! Here are some snapshots so you know what we are up against.

Map examples: a) detail of veg polygons; b) index indicating year of aerial photo used; c) example from timber series of maps.

For more about the VTM project, see my blog posts here, and the vtm.berkeley.edu website. 

New datum for Australia: catch me if you can!

In the US, we have revised our geodetic datum over the years, most famously in the switch from NAD27 to NAD83, as improved instrumentation enabled more measurements and a more accurate model. But in Australia, they are creating a new datum because the continent is MOVING 7 cm a year. 

Read here: http://www.bbc.com/news/technology-36912700

Check it:

The Geocentric Datum of Australia, the country's local co-ordinate system, was last updated in 1994. Since then, Australia has moved about 1.5 metres north.

So on 1 January 2017, the country's local co-ordinates will also be shifted further north - by 1.8m.

The over-correction means Australia's local co-ordinates and the Earth's global co-ordinates will align in 2020.
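
The numbers hold up to a little back-of-envelope arithmetic (rate and dates from the BBC piece):

```python
# At ~7 cm/year of plate motion, a 1.8 m shift applied to a datum fixed
# in 1994 lines local coordinates back up with the ground around 2020.
rate = 0.07                            # metres per year
print(round(rate * (2017 - 1994), 2))  # ~1.6 m of drift since 1994
print(round(1994 + 1.8 / rate))        # ~2020, when the 1.8 m shift is "caught"
```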

I for one welcome our ESRI overlords: Wrap Up from the 2016 ESRI User Conference

What a full week! Here is my wrap-up from a great 2016 ESRI User Conference.

I haven't been here in many years, and I am glad I came. I learned a lot, have some new ideas for workshops and classes and for how IGIS can be of more service to ANR, and I got started on ArcGIS Pro, ESRI's eventual replacement for ArcGIS Desktop. Pro has a completely new user interface that is very clear; you can visualize, edit, and perform analysis in both 2D and 3D; it is super fast via multithreading & 64-bit processing (finally); and it has new icons and a bunch of new processing tools. A bunch of very cool stuff comes with 1.3 soon. 

Day 1: Monday was spent in big-picture, inspirational talks from camera-beautiful GIS people. 15,000 in the audience, 10 massive screens. I loved it, I felt like I was at a really intense but sedate rock concert. Note to ESRI: could you put the chairs any closer together? The highlight was keynote speaker Andrea Wulf, talking about her obsession with Alexander Von Humboldt. Note to self: get the book. In addition to the big picture stuff, Day 1 is ESRI's chance to highlight this year's software improvements as we continue the voyage away from the desktop workflow: Pro, integrated 3D, green design, Apps, seamless integration with the web. 

Day 2: Tuesday focused on workshops. I picked four workshops from the Spatial Statistics Team at ESRI. These were led by Lauren Bennett and her crew (Flora Vale, Jenora D'Acosta). Uniformly fantastic. I had downloaded Pro the night before, and with some trepidation got started and followed along. I am happy to report that it seems very intuitive. I have read elsewhere about worries that there is loss of cartographic control, and I will look into that. I learned about the Spatial Stats toolbox in Pro, and some very new capacity in optimization of pattern analysis (you know how difficult it is to pick that distance kernel), and in the new space-time cube capabilities. The space-time capabilities make very complex analyses doable, and are very exciting, but still a bit constraining if you don't know how to update the code. Oh yeah, and space-time uses netCDF format. 
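
Since the space-time cube is written as netCDF, you can also peek at one outside of ArcGIS, for example with xarray. A minimal sketch; the file, variable, and dimension names below are assumptions, so inspect the dataset for the real ones.

```python
import xarray as xr

# Open a space-time cube written as netCDF. The variable and dimension
# names ("COUNT", "time", "x", "y") are assumptions -- check ds.data_vars
# and ds.dims for the actual names in your cube.
ds = xr.open_dataset("incident_cube.nc")  # hypothetical file
print(ds)

counts = ds["COUNT"]
# Collapse time to see the overall spatial pattern, or collapse space
# to see the overall trend through time.
print(counts.sum(dim="time"))
print(counts.sum(dim=("x", "y")))
```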

Day 3: For Wednesday's workshops I chose workshops that would help me update class labs: Python + Raster analysis; Site suitability in Pro; Cost Connectivity in Pro; and some crazy cool prediction tool called Empirical Bayesian Kriging, which I will be checking out. I guess EBK has been around for a while, but it is now implemented in ESRI software. The new suite of tools in site suitability + connectivity are going to be key. Kevin M. Johnston and Elizabeth Graham led the Site Suitability and Connectivity workshops, and Eric Krause led the Kriging workshop. 

Day 4: All day was spent in Pix4D immersion with the excellent Pix4D support/training team. Pix4D is the gold standard for drone imagery workflow; it also serves as the backend engine for ESRI's Drone2Map application, which I have not tried. Most of the morning was spent in basics: workflow basics, application domains, super wow factor examples like 0.5cm resolution imagery processing. We also looked at workflow and best practices, accuracy, and some example projects. The room was full of 50+ people, many with specific questions about a range of projects. Got to hang out a bit with Greg Crustinger, who is now at Parrot. Even more excited now about our new Sequoia cameras. 

Thoughts:

  • Little Italy has some great restaurants. 
  • We need to move to Pro soon. Probably not in time for Fall's class, but soon. GIF and IGIS workshops are going to have to get updated. 
  • I need to get more in touch with imagery analysis in Pro. Especially with the segmentation and classification part. 
  • I want to recreate the workflow for site suitability + locate regions + cost connectivity soon. 
  • Performing complex analyses in a GUI is increasingly easy, but is that a good thing? We have to be increasingly vigilant about training the fundamentals as well. 
  • One frustration about these workshops that I bet my workshop participants share: the data are all perfect and ready to go. We need to keep talking about where to get data, and how to wrangle it into shape. 
  • Could drones mean the resurrection of photogrammetry? At least in the classroom?

Trends: 

  • Hyper granularity: how do we processes these increasingly fine resolution datasets? 
  • Global to local focus in modeling: GWR, optimized Getis-Ord, and empirical Bayesian kriging all try to deal with and model local variability across a complex study area (a minimal local Getis-Ord sketch follows this list);
  • Incorporation of permutations and distribution functions in modeling has been made way easier;
  • Big Data, multidimensional data, temporal data: ESRI is really trying to be a better informatics platform for research;
  • ESRI seems to be increasingly relying on external and open standards for new data formats/products; this is a great trend;
  • Decision-making: all these analyses need to support decision-making; communication remains critical, tools for web-based interaction continue to expand.
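
As a companion to the local-modeling trend above, here is a minimal local Getis-Ord Gi* sketch using the open PySAL stack rather than ESRI's optimized hot spot tool; the file and field names are hypothetical.

```python
import geopandas as gpd
from libpysal.weights import Queen
from esda.getisord import G_Local

# Hypothetical polygon layer with an attribute to test for hot spots.
gdf = gpd.read_file("tracts.gpkg")   # hypothetical file
w = Queen.from_dataframe(gdf)        # contiguity-based spatial weights
w.transform = "r"

# Local Getis-Ord Gi* statistic (star=True includes the focal unit).
gistar = G_Local(gdf["incidents"].values, w, star=True)
gdf["gi_z"] = gistar.Zs      # z-scores: high positive = hot spot
gdf["gi_p"] = gistar.p_sim   # pseudo p-values from permutations
print(gdf[["gi_z", "gi_p"]].head())
```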

Searching for patterns in high res imagery: template matching

From two friends in the space of a week! While I was away, this tool made the rounds: 

http://sf.terrapattern.com/: This is the alpha version of Terrapattern, a visual search tool for satellite imagery. The project provides journalists, citizen scientists, and other researchers with the ability to quickly scan large geographical regions for specific visual features.  

It is a great deal like some of the template-matching routines in Definiens eCognition, among other proprietary software tools.  

Here is an article about it: http://techcrunch.com/2016/05/25/terrapattern-is-a-neural-net-powered-reverse-image-search-for-maps/ 

They say:

Terrapattern is a visual search engine that, from the first moment you use it, you wonder: Why didn’t Google come up with this 10 years ago? Click on a feature on the map — a baseball diamond, a marina, a roundabout — and it immediately highlights everything its algorithm thinks looks like it. It’s remarkably fast, simple to use and potentially very powerful. 

Go ahead and give it a try first to see how natural it is to search for something. How does that work? And how did a handful of digital artists and developers create it — and for under $35,000?

The secret, as with so many other interesting visual computing projects these days, is a convolutional neural network. It’s essentially an AI-like program that extracts every little detail from an image and looks for patterns at various levels of organization — similar to how our own visual system works, though the brain is infinitely more subtle and flexible.
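
The idea in that last paragraph can be sketched in a few lines: embed each image tile with a pretrained CNN, then run a nearest-neighbor search over the embeddings to find look-alikes. This is not Terrapattern's actual code, just a rough illustration; the tile listing is hypothetical.

```python
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image
from sklearn.neighbors import NearestNeighbors

# Pretrained CNN with the classification head removed -> feature extractor.
resnet = models.resnet18(pretrained=True)
resnet.fc = torch.nn.Identity()
resnet.eval()

prep = T.Compose([T.Resize((224, 224)), T.ToTensor(),
                  T.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225])])

def embed(path):
    """Return a 512-d feature vector for one imagery tile."""
    with torch.no_grad():
        x = prep(Image.open(path).convert("RGB")).unsqueeze(0)
        return resnet(x).squeeze(0).numpy()

# tiles.txt: one tile path per line (hypothetical listing of imagery tiles).
paths = [p.strip() for p in open("tiles.txt")]
features = np.stack([embed(p) for p in paths])

# Index the embeddings; querying with one tile returns visually similar tiles.
nn = NearestNeighbors(n_neighbors=10).fit(features)
_, idx = nn.kneighbors(features[:1])
print([paths[i] for i in idx[0]])
```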