I for one welcome our ESRI overlords: Wrap Up from the 2016 ESRI User Conference

What a full week! Here is my wrap-up from a great 2016 ESRI User Conference.

I haven't been here in many years, and I am glad I came. I learned much, I have some new ideas for workshops and classes and for how IGIS can be of more service to ANR, and I got started on ArcGIS Pro - ESRI's eventual replacement for ArcGIS Desktop. Pro has a completely new user interface that is very clear; you can visualize, edit, and perform analysis in both 2D and 3D; it is super fast via multithreading & 64-bit processing (finally); and it has new icons and a bunch of new processing tools. A bunch of very cool stuff is coming soon with version 1.3.

Day 1: Monday was spent in big-picture, inspirational talks from camera-friendly GIS people: 15,000 in the audience, 10 massive screens. I loved it; I felt like I was at a really intense but sedate rock concert. Note to ESRI: could you put the chairs any closer together? The highlight was keynote speaker Andrea Wulf, talking about her obsession with Alexander von Humboldt. Note to self: get the book. In addition to the big-picture stuff, Day 1 is ESRI's chance to highlight this year's software improvements as we continue the voyage away from the desktop workflow: Pro, integrated 3D, green design, apps, seamless integration with the web.

Day 2: Tuesday focused on workshops. I picked four workshops from the Spatial Statistics Team at ESRI, led by Lauren Bennett and her crew (Flora Vale, Jenora D'Acosta). Uniformly fantastic. I had downloaded Pro the night before, and with some trepidation got started and followed along. I am happy to report that it seems very intuitive. I have read elsewhere about worries that there is some loss of cartographic control, and I will look into that. I learned about the Spatial Stats toolbox in Pro, some very new capabilities in optimized pattern analysis (you know how difficult it is to pick that distance kernel), and the new space-time cube capabilities. The space-time capabilities make very complex analyses doable, and are very exciting, but still a bit constraining if you don't know how to update the code. Oh yeah, and space-time uses the netCDF format.
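
As a rough sketch of what those space-time tools do conceptually (my reading of the workshops, not ESRI's code): observations get binned into a (time, y, x) cube stored as netCDF, and a trend test - Mann-Kendall, in the emerging hot spot workflow - is run on each bin's time series. The core statistic is only a few lines:

```python
import numpy as np

# Mann-Kendall S statistic for one bin's time series:
# the sum of sign(z_j - z_i) over all ordered pairs i < j.
# S > 0 suggests an increasing trend, S < 0 a decreasing one.
def mann_kendall_s(series):
    z = np.asarray(series, dtype=float)
    diffs = np.sign(z[None, :] - z[:, None])   # entry (i, j) = sign(z_j - z_i)
    i, j = np.triu_indices(len(z), k=1)        # indices of all pairs with i < j
    return int(diffs[i, j].sum())
```

For a strictly increasing series of length n, every pair contributes +1, so S = n(n-1)/2; a flat series gives S = 0.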

Day 3: For Wednesday I chose workshops that would help me update class labs: Python + raster analysis; site suitability in Pro; cost connectivity in Pro; and a crazy cool prediction tool called Empirical Bayesian Kriging, which I will be checking out. I guess EBK has been around for a while, but it is now implemented in ESRI software. The new suite of tools in site suitability + connectivity are going to be key. Kevin M. Johnston and Elizabeth Graham led the site suitability and connectivity workshops, and Eric Krause led the kriging workshop.
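
EBK itself is more involved than ordinary kriging (it repeatedly simulates and refits local semivariograms), but the single kriging solve at its core can be sketched in a few lines. This is a minimal illustration, not EBK, and the linear variogram is a placeholder assumption:

```python
import numpy as np

# Ordinary kriging at one prediction location: solve the bordered
# semivariogram system for weights that are unbiased (sum to 1).
def ordinary_krige(xy, z, target, variogram=lambda h: h):
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)  # pairwise distances
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)           # gamma(h_ij) block
    A[n, n] = 0.0                      # Lagrange-multiplier corner
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(xy - target, axis=1))
    w = np.linalg.solve(A, b)[:n]      # drop the Lagrange multiplier
    return float(w @ z)                # weighted sum of the sample values
```

Kriging is an exact interpolator, so predicting at a sample location returns that sample's value - a handy sanity check.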

Day 4: All day was spent in Pix4D immersion with the excellent Pix4D support/training team. Pix4D is the gold standard for the drone imagery workflow; it also serves as the backend engine for ESRI's Drone2Map application, which I have not tried. Most of the morning was spent on basics: workflow, application domains, and super-wow-factor examples like 0.5 cm resolution imagery processing. We also looked at best practices, accuracy, and some example projects. The room was full of 50+ people, many with specific questions about a range of projects. Got to hang out a bit with Greg Crutsinger, who is now at Parrot. Even more excited now about our new Sequoia cameras.

Thoughts:

  • Little Italy has some great restaurants. 
  • We need to move to Pro soon. Probably not in time for Fall's class, but soon. GIF and IGIS workshops are going to have to get updated. 
  • I need to get more in touch with imagery analysis in Pro. Especially with the segmentation and classification part. 
  • I want to recreate the workflow for site suitability + locate regions + cost connectivity soon. 
  • The ability to perform complex analyses in a GUI is increasingly easy, but is that a good thing? We have to be increasingly vigilant about training the fundamentals as well. 
  • One frustration about these workshops that I bet my workshop participants share: the data are all perfect and ready to go. We need to keep talking about where to get data, and how to wrangle it into shape. 
  • Could drones mean the resurrection of photogrammetry? At least in the classroom?

Trends: 

  • Hyper granularity: how do we process these increasingly fine-resolution datasets? 
  • Global-to-local focus in modeling: GWR, optimized Getis-Ord, and empirical Bayesian kriging all try to deal with and model local variability across a complex study area;
  • Incorporation of permutations and distribution functions in modeling has been made way easier;
  • Big Data, multidimensional data, temporal data: ESRI is really trying to be a better informatics platform for research;
  • ESRI seems to be increasingly relying on external and open standards for new data formats/products; this is a great trend;
  • Decision-making: all these analyses need to support decision-making; communication remains critical, tools for web-based interaction continue to expand.

Searching for patterns in high res imagery: template matching

From two friends in the space of a week! While I was away, this tool made the rounds: 

http://sf.terrapattern.com/: This is the alpha version of Terrapattern, a visual search tool for satellite imagery. The project provides journalists, citizen scientists, and other researchers with the ability to quickly scan large geographical regions for specific visual features.  

It is a great deal like some of the template matching routines in Definiens eCognition, among other proprietary software tools.  

Here is an article about it: http://techcrunch.com/2016/05/25/terrapattern-is-a-neural-net-powered-reverse-image-search-for-maps/ 

They say:

Terrapattern is a visual search engine that, from the first moment you use it, you wonder: Why didn’t Google come up with this 10 years ago? Click on a feature on the map — a baseball diamond, a marina, a roundabout — and it immediately highlights everything its algorithm thinks looks like it. It’s remarkably fast, simple to use and potentially very powerful. 

Go ahead and give it a try first to see how natural it is to search for something. How does that work? And how did a handful of digital artists and developers create it — and for under $35,000?

The secret, as with so many other interesting visual computing projects these days, is a convolutional neural network. It’s essentially an AI-like program that extracts every little detail from an image and looks for patterns at various levels of organization — similar to how our own visual system works, though the brain is infinitely more subtle and flexible.
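
Terrapattern itself uses a convolutional neural net, but the older template-matching idea mentioned above (as in eCognition-style tools) is much simpler: slide the template across the image and score each window by its correlation with the template. A brute-force toy sketch:

```python
import numpy as np

# Normalized cross-correlation template matching: each window and the
# template are standardized (zero mean, unit std), so the score is a
# correlation in [-1, 1] and the exact match scores ~1.0.
def match_template(image, template):
    """Return the (row, col) of the best-matching window."""
    th, tw = template.shape
    t = (template - template.mean()) / (template.std() + 1e-12)
    best_score, best_rc = -np.inf, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = (patch - patch.mean()) / (patch.std() + 1e-12)
            score = (t * p).mean()
            if score > best_score:
                best_score, best_rc = score, (r, c)
    return best_rc
```

Real implementations (e.g. FFT-based correlation) avoid the nested loops, but the scoring logic is the same.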

National Park Maps All in One Place

Kudos to Matt Holly, a member of the National Park Service's (NPS) Natural Resource Stewardship and Science Directorate.  Matt has been uploading all of the NPS maps into a single portal available online.  At the moment these maps are available in GIF, JPEG, and PDF...but maybe shapefiles will follow??  You can search the maps alphabetically by park, or by state.  Access the website here.  

Mapping the Housing Divide

The Washington Post, using data from Black Knight Financial Services, recently published an amazing series of maps showing disparities in the United States' housing recoveries.  They argue that these disparities have exacerbated inequality and have particularly worked against Americans of moderate means and minority neighborhoods.  Check the full article out here and explore the maps.  

Wrap up on the Hopland Bioblitz 2016

This text excerpted from the Hopland Newsletter:

Over 70 scientists and naturalists descended upon HREC from April 8-10 in our first Hopland Bioblitz. During the weekend, over 400 species, from the recently discovered blind silverfish to the characterful kangaroo rat, were observed and recorded on the HREC iNaturalist page.

You can still get involved with our bioblitz efforts by logging onto our iNaturalist page and seeing if you can help identify any unknown species. Enjoy more of our discoveries by taking a look through our photography competition entries.

This bioblitz was supported by a grant from the University of California Agriculture and Natural Resources and was organized by Kip Will, Maggi Kelly, George Roderick, and Rosemary Gillespie. IGIS's Shane Feirer helped set up the IT infrastructure for the day.

Hopland Bioblitz is on!

Our big Hopland scientific bioblitz is this weekend (9-10 April, with some events on the 8th) and I look forward to seeing many of you there. If you can't make it to HREC, there are many ways you can remotely help us and check out what is happening all weekend long.

HELP US OUT. http://www.inaturalist.org/ Many people will be using iNaturalist to make and share observations. Helping out the effort is easy. Find observations at the iNaturalist site by searching for "Hopland" in the "Projects" pulldown menu and choosing "Hopland Research and Extension Center". Once there, you can browse the plants and animals needing identification or confirmation. Every identification counts toward our goal of massively increasing the knowledge of HREC's flora and fauna.
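
For the programmatically inclined, iNaturalist also exposes observations as JSON, so the identification queue could be pulled by script rather than browsed. This sketch only builds the query URL; the endpoint and parameter names are my assumptions about that API, not documented fields:

```python
from urllib.parse import urlencode

# Hypothetical helper: build a URL for a project's observations that
# still need identification. Parameter names are assumptions.
def observation_query(project_slug, per_page=50, quality="needs_id"):
    params = urlencode({
        "projects[]": project_slug,   # assumed project filter parameter
        "quality_grade": quality,     # assumed "needs ID" filter
        "per_page": per_page,
    })
    return "http://www.inaturalist.org/observations.json?" + params
```

You could then fetch the URL with any HTTP client and loop over the returned JSON records.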

VOTE ON IMAGES.  http://www.hoplandbioblitz.org/ We are hosting an image contest for the plants and animals of HREC. Great prizes will be given for the images that get the most votes (REI gift cards and a GoPro grand prize!). Please visit the site and vote for your favorites frequently during the weekend, share them, and then sit back and watch the slide show.

CHECK US OUT. http://geoportal.ucanr.edu/# Our new app will graphically show you our progress for the bioblitz observations. Results will be updated every 15 minutes. See how your favorite groups are doing in the challenge to document as many species as possible.

Look for #HoplandBioblitz on Twitter and Instagram

Follow along on Facebook https://www.facebook.com/HoplandREC/

AAG 2016 wrap up

1870s-ish map of SF

Aside from all the sunshine, architecture, maps, and food, at the 2016 AAG conference Kelly and I participated in four organized sessions on Historical Ecology. On Saturday, we heard from a number of fantastic researchers, and here is my wrap-up. (Alas, these sessions overlapped with the sessions on CyberGIS. Who says you can't be interested in both? Cyber-historical-environmental-spatial-data-science.)
  • History: We heard from researchers working on data from the Holocene, to pre-history, to the 20th century.
  • Focus: Ecosystems included prairie, forests (Maryland, New York, California, Florida, Ohio); and Wetlands (China, California, etc.); Land Use and Agriculture (Mexico, Brazil); Fire (Arizona); and biological collections. 
  • Data included inventory (PLS system: land appraisal value; Cadastral surveys); Imagery (Landsat, aerial imagery); and biological (paleo; tree ring; resurveys; pollen records; bird census; and PLS system: witness trees, survey lines, FIA data). 
  • Methods: Comparison between past and present from existing inventory data, as well as comparison between historic and modern resurveys; digitization of multiple data sources; narrative analysis; ecological modeling; ecosystem services modeling; fire behavior modeling; OBIA of historic imagery; and some really neat modeling work. 
  • Emerging Themes from the sessions included: 
    • Data. Most people used digital data from an existing source - websites, clearinghouse, existing digital source. Many digitized their data. One person used an API. 
    • Accuracy. About half of the speakers have thought about, or incorporated, data quality or uncertainty in their work, but this is difficult to do quantitatively. Some used 'multiple lines of evidence' from diverse datasets to increase confidence in results. 
    • Tools. We heard about a number of tools, including GIS as a desktop tool, open tools, backcasting with landcover models, and complex modeling approaches; one paper used OBIA methods, and one discussed big historic data (maybe moving toward the CyberGIS overlap). 
    • Theoretical frameworks: A few papers used resilience as a framework, social, ecological and coupled; and several papers used a landscape ecology framework. 
    • New terms: I learned a new term: “Terrageny”: a record of how a landscape became fragmented through time, containing information on the ‘ancestry’ of fragments and showing how an initially continuous landscape was progressively divided into fragments of decreasing size. Ewers et al. 2013. Gorgeous word. Must incorporate into cocktail party discussion. 

We also sent out a survey to the speakers prior to the talks, and here are some preliminary results. 

Question: What are the three top challenges that you see for historical ecology research?
  • Data/Logistical/Availability
    • The further back in time we look, the more sparse the data. 
  • Technical 
    • Lack of metadata: Current data deluge may attract attention/urgency away from the discovery and digitization of historical data;
    • Few models capable of incorporating human and environment interactions over long time scales. 
  • Theoretical
    • Maintaining perceived relevance in the context of the novel ecosystem/no-analog system conversation - not having historical ecology be the baby that is thrown out with the bathwater.
  • Operational 
    • Many respondents mentioned issues with funding - these projects are by nature interdisciplinary, often require large programs to fund at achievable levels, and not many funding sources exist.
  • Communication
    • We need to focus on communicating the importance of understanding past conditions to inspire and guide current design proposals. 
Question: What exciting future directions do you envision for historical ecology research?
  • The importance of historical data and analysis:
    • Historical data is essential: multi- and inter-disciplinary research needs historical research, particularly so that we can understand 1) historical reference conditions, but also so that we can understand 2) when we might have novel interactions between species and the ecosphere. 
  • Practicality of historical data and analysis: 
    • Historical ecology is critical for restoration projects and for studying climate change, and for its power to communicate through environmental education with the public.
  • New data/Big data/Data Fusion: 
    • Increase in digitally available historical sources (longer ecological and climate records and reconstructions), plus the availability of large, high-resolution datasets to assess change (thinking LiDAR, government reports, survey data...)  
    • There is also increasing sophistication of analysis and visualization tools.
    • But, the current data deluge may attract attention/urgency away from the discovery and digitization of historical data.

A fantastic time was had by all!

Spatial Data Science Bootcamp 2016!

Last week we held another bootcamp on Spatial Data Science. We had three packed days learning about the concepts, tools and workflow associated with spatial databases, analysis and visualizations. Our goal was not to teach a specific suite of tools but rather to teach participants how to develop and refine repeatable and testable workflows for spatial data using common standard programming practices.

2016 Bootcamp participants

On Day 1 we focused on setting up a collaborative virtual data environment through virtual machines and spatial databases (PostgreSQL/PostGIS) with multi-user editing and versioning (GeoGig). We also talked about open data and open standards, and modern data formats and tools (GeoJSON, GDAL).  On Day 2 we focused on open analytical tools for spatial data, in particular Python (i.e. PySAL, NumPy, PyCharm, IPython Notebook) and R tools.  Day 3 was dedicated to the web stack, and visualization via ESRI Online, CartoDB, and Leaflet. Web mapping is great, and as OpenGeo.org says: “Internet maps appear magical: portals into infinitely large, infinitely deep pools of data. But they aren't magical, they are built of a few standard pieces of technology, and the pieces can be re-arranged and sourced from different places.…Anyone can build an internet map."
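
For anyone wondering what "open formats" means in practice: a GeoJSON FeatureCollection is just nested JSON you can build by hand, no GDAL required. A minimal sketch (the coordinates are an arbitrary example point, lon/lat in WGS84):

```python
import json

# Build one point Feature and wrap it in a FeatureCollection,
# then serialize to a GeoJSON string any web map can consume.
feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-122.259, 37.872]},
    "properties": {"name": "Berkeley"},
}
collection = {"type": "FeatureCollection", "features": [feature]}
geojson = json.dumps(collection, indent=2)
```

The same string drops straight into Leaflet's `L.geoJSON(...)` or a CartoDB upload, which is exactly why the bootcamp leaned on it.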

All-in-all it was a great time spent with a collection of very interesting mapping professionals from around the country. Thanks to everyone!

ASTER Data Open - No April Fools!

We know about the amazing success for science, education, government, and business that has resulted from the opening of the Landsat archive in 2008. Now more encouraging news about open data:

On April 1, 2016, NASA's Land Processes Distributed Active Archive Center (LP DAAC) began distributing ASTER Level 1 Precision Terrain Corrected Registered At-Sensor Radiance (AST_L1T) data products over the entire globe at no charge. Global distribution of these data at no charge is a result of a policy change made by NASA and Japan.

The AST_L1T product provides a quick turn-around of consistent GIS-ready data as a multi-file product, which includes a HDF-EOS data file, full-resolution composite images (FRI) as GeoTIFFs for tasked telescopes (e.g., VNIR/SWIR and TIR ), and associated metadata files. In addition, each AST_L1T granule contains related products including low-resolution browse and, when applicable, a Quality Assurance (QA) browse and QA text report.

More than 2.95 million scenes of archived data are now available for direct download through the LP DAAC Data Pool and for search and download through NASA's Earthdata Search Client, as well as through USGS' GloVis and USGS' EarthExplorer. New scenes will be added as they are acquired and archived.

ASTER is a partnership between NASA, Japan's Ministry of Economy, Trade and Industry (METI), the National Institute of Advanced Industrial Science and Technology (AIST) in Japan, and Japan Space Systems (J-spacesystems).

Visit the LP DAAC ASTER Policy Change Page to learn more about ASTER. Subscribe to the LP DAAC listserv for future announcements.

It's AAG time again!

The annual AAG conference is rolling into town next week, and several of us will be there. 

  • Kelly and Jenny will be presenting; 
    • Kelly: Disentangling drivers of change in California Forests: management and climate 
    • Jenny: Spatial Data Science for Collaborative Geospatial Research
  • Alice is a discussant on THREE panels; and 
  • I am a discussant on the Historical Ecology session. 

Former kellylabbers will also be in force: 

  • John Connors is presenting (and organizing, and moderating, and all kinds of things): 
    • Disentangling Diversity: Agrobiodiversity, Livelihoods, and Food Security in the Kilombero Valley, Tanzania
  • Desheng Liu will be there: 
    • Reconstructing Land Cover Trajectories from Dense MODIS Time Series
  • Ellen Kersten will be presenting:
    • Got health? Using spatial and temporal analysis to achieve health equity for children 

 Have a great time everyone! (If I have missed anyone, let me know!)

Bootcamp is here! Spatial Data Science 2016

Spatial Data Science for Professionals

We live in a world where the importance of spatial data is ever increasing. Many of the societal challenges we face today — fire response, energy distribution, efficient resource allocation, land use, food scarcity, invasive species, climate change, privacy and safety — are associated with big spatial data.  Addressing these challenges will require trained analysts fluent in:
  • integrating disparate data, from aircraft, satellites, mobile phones, historic collections, public records, the internet;
  • using easily available and open technology for robust data analysis, sharing, and publication;
  • understanding and applying core spatial analysis methods;
  • and applying visualization tools to communicate with project managers, policy-makers, scientists and the public.

Mastering these challenges requires Spatial Data Science: big data tools, geospatial analytics, and visualization. Today’s marketplace needs trained analysts who know how to find, evaluate, manage, analyze and publish spatial data in a variety of environments. With this hands-on Spatial Data Science Bootcamp for professionals, you can expand your GIS skill level and learn how to integrate open source and web-based solutions into your GIS toolkit by gaining an understanding of spatial data science techniques.

The goal of this Spatial Data Science Bootcamp is to familiarize participants with the modern spatial data workflow and explore open source and cloud/web based options for spatial data management, analysis, visualization and publication. We’ll use hands-on exercises that leverage open source and cloud/web based technologies for a variety of spatial data applications.

https://iep.berkeley.edu/spatial/iep-spatial-bootcamp-overview

LandFire is looking for field data! Add yours now.

I wanted to send out a friendly reminder that the data submission deadline for the current data call is March 31, 2016.  Data submitted before March 31 are evaluated for inclusion in the appropriate update cycle, and submissions after March 31 are typically considered in subsequent updates.  

This is the last call for vegetation/fuel plot data that can be used for the upcoming LANDFIRE Remap. If you have any plot data you would like to contribute please submit the data by March 31 in order to guarantee the data will be evaluated for inclusion in the LF2015 Remap. LANDFIRE is also accepting contributions of polygon data from 2015/2016 for disturbance and treatment activities. Please see the attached data call letter for more information.

Brenda Lundberg, Senior Scientist

Stinger Ghaffarian Technologies (SGT, Inc.)

Contractor to the U.S. Geological Survey (USGS)

Earth Resources Observation & Science (EROS) Center

Phone: 406.329.3405

Email: blundberg@usgs.gov

IGIS exploring applications of drones for UC Agriculture and Natural Resources

IGIS is excited to be working with 3D Robotics (3DR) to explore new applications of small unmanned aerial systems (sUAS) for monitoring agriculture and natural resources.  This technology has never been more practical for scientific exploration; however, there is still much to be learned about how to best utilize sUAS in this way.

DEM from drone flight

IGIS is now developing protocols for safe and efficient deployment of a 3DR Solo sUAS.  Equipped with a common 12-megapixel GoPro Hero camera, this platform can survey up to 75 acres at 3 inches of spatial resolution in less than 20 minutes, while flying a pre-defined flight path at 23 miles per hour at 300 feet above ground level.  Then, thanks to Pix4D mapping software, which combines the pictures collected by the sUAS's GoPro into a single image mosaic, automated photogrammetric processing can render a digital terrain model with a vertical accuracy close to the 3-inch spatial resolution of the original image collection.
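
Those coverage figures pencil out with some back-of-envelope arithmetic. This sketch ignores turns and image overlap, so a real mission's flight-line spacing would be tighter than this effective value:

```python
# Effective flight-line spacing implied by covering 75 acres
# in a 20-minute flight at 23 mph (lawnmower pattern).
ACRE_SQFT = 43_560
area_sqft = 75 * ACRE_SQFT                  # 75-acre survey area
distance_ft = 23 * 5280 * (20 / 60)         # ground track: 23 mph for 20 min
line_spacing_ft = area_sqft / distance_ft   # area / distance = effective spacing
```

That works out to roughly 80 feet between flight lines, which is plausible for a wide-angle camera flown at 300 feet above ground level.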

IGIS has introduced sUAS and remote sensing training into our workshop schedule for this year.  Please check out our IGIS training calendar by clicking here for more information.

New IGIS Academic Coordinator: Andy Lyons

I would like to welcome Andy Lyons as Academic Coordinator III in the ANR Informatics and GIS (IGIS) Statewide Program. Andy comes to ANR from Stanford and, before that, Berkeley, where he completed his PhD in ESPM. He has an exceptionally broad academic training, bridging both the social and natural sciences, with considerable experience in both academic and not-for-profit business environments. He has a strong combination of ecology, social science, data science, and computer application skills (programming, data management, multimedia, web, modeling), plus grant and report writing. He is also a gifted teacher who has won awards for his teaching at Cal, including an Outstanding Graduate Instructor Award in 2004, and he has been one of our instructors at the GIF's Spatial Data Science Bootcamp.

Data Science for the 21st Century - External Partners Invited!

Developing data-driven solutions in the face of rapid global change

Global environmental change poses critical environmental and societal challenges, and the next generation of students will be part of the solutions.  This National Science Foundation Research Traineeship (NRT) in Data Science for the 21st Century prepares graduate students at the University of California, Berkeley with the skills and knowledge needed to evaluate how rapid environmental change impacts human and natural systems, and to develop and evaluate data-driven solutions in public policy, resource management, and environmental design that will mitigate negative effects on human well-being and the natural world.  Trainees will research topics such as management of water resources, regional land use, and responses of agricultural systems to economic and climate change, and will develop skills in data visualization, informatics, software development, and science communication.

In a final semester innovative team-based problem-solving course, trainees will collaborate with an external partner organization to tackle a challenge in global environmental change that includes a significant problem in data analysis and interpretation of impacts and solutions. This collaboration is a fundamental and distinguishing component of the NRT program. We hope this collaboration will not only advance progress on the grand challenges of national and global importance, but also be memorable and useful for the trainees, and for the partners.

An Invitation to Collaborate

We are inviting collaboration with external partners to work with our students on their Team Research Project in 2016-17. Our students would greatly benefit from working with research agencies, non-profits, and industry.

  • Our first cohort of 14 students comes from seven different schools across campus, each student bringing new skillsets, backgrounds, and perspectives.
  • Team projects will be designed and executed in the spring of 2017.
  • Partners are welcome to visit campus, engage with students and take part in our project activities.
    • Join us at our first annual symposium on May 6th 4-7 pm.
    • Participate in workplace/campus exchange.
    • Contact the program coordinator at hconstable@berkeley.edu
    • Visit us at http://ds421.berkeley.edu/ for more information.

This new NSF-funded DS421 program is in the first of five years. We look forward to building ongoing collaborations between partners and UC Berkeley.

With Drought Comes Disease

You’ve heard that millions of California’s trees have died from drought and bark beetles. Weakened by lack of water over four consecutive years of drought, over 29 million conifers and hardwood trees were unable to fight off attacks from bark beetles and died. Check out where and when these trees have died across California using the new Tree Mortality Viewer from FRAP, CalFire, and the USFS.

Tree Mortality Viewer: Mortality at Point Reyes National Seashore

Overlaid on a map of California, the Viewer's layers show:

  • Tree Mortality — Results of 2012-2015 aerial tree-mortality surveys. See how the situation has dramatically worsened over the years.

You can download the original data here, too!