Measuring impact in extension programming: case study of IGIS

At UC Berkeley and at UC ANR, my outreach program involves the creation, integration, and application of research-based technical knowledge for the benefit of the public, policy-makers, and land managers. My work focuses on environmental management, vegetation change, vegetation monitoring, and climate change. Critical to my work is the ANR Statewide Program in Informatics and GIS (IGIS), which I began in 2012 and which is now going strong thanks to our excellent IGIS team. We developed the program to provide research technology and data support for ANR's mission objectives through the analysis and visualization of spatial data. We use state-of-the-art web, database, and mapping technology to provide acquisition, storage, and dissemination of large data sets critical to the ANR mission. We develop and deliver training on research technologies related to important agricultural and natural resource issues statewide. We facilitate networking and collaboration across ANR and UC on issues related to research technology and data. And we deliver research support through a service center for project-level work that has Division-wide application. Since I am off on sabbatical, I have decided to take some time to think about my outreach program and how to evaluate its impact.

There is a rich literature on the history of extension since its 1914 beginnings, and specifically on how extension programs around the nation have been measuring impact. Extension has explored a variety of ways to measure the value of engagement for the public good (Franz 2011, 2014). Early attempts to measure performance focused on activity and reach: the number of individuals served and the quality of the interaction with those individuals. Over time, extension turned its attention to program outcomes. Recently, we've been focusing on articulating the public value of extension via condition-change metrics (Rennekamp and Engle 2008). One popular evaluation method has been the logic model, used by extension educators to evaluate the effectiveness of a program by developing a clear workflow or plan that links program outcomes or impacts with outputs, activities, and inputs. We developed a fair number of these models for the Sierra Nevada Adaptive Management Program (SNAMP), for example. Impacts include measures of change in learning, behavior, or condition across engagement efforts. More recently, change in policy has become an additional measure of impact. I also think measuring reach is needed, and possible.

So, just to throw it out there, here is the master table of impact measures that I try to use for evaluating my outreach program, and I'd be interested to hear what you all think of it.

  • Change in reach: geographic scope, location of events, number of users, etc.
  • Change in activity: usage, engagement with a technology, new users, sessions, average session duration
  • Change in learning: participants have learned something new from delivered content
  • Change in action, behavior, or method: new efficiencies, streamlined protocols, adoption of new data, adoption of best practices
  • Change in policy: evidence of contributions to local, state, or federal regulations
  • Change in outcome: measured conditions have improved (i.e., condition change)

I recently used this framework to help me think about impact of the IGIS program, and I share some results here.

Measuring Reach. The IGIS program has developed and delivered workshops throughout California, through the leadership of Sean Hogan, Shane Feirer, and Andy Lyons (http://igis.ucanr.edu/IGISTraining). We manage and track all this activity through a custom data tracking dashboard that IGIS developed (using Google Sheets as a database linked to ArcGIS online to render maps - very cool), and thus can provide key metrics about our reach throughout California. Together, we have delivered 52 workshops across California since July 2015 and reached nearly 800 people. These include workshops on GIS for Forestry, GIS for Agriculture, Drone Technology, WebGIS, Mobile Data Collection, and other topics. This is an impressive record of reach: these workshops have served audiences throughout California. We have delivered workshops from Humboldt to the Imperial Valley, and the attendees (n=766) have come from all over California. Check this map out:

[Map: IGIS workshop locations and attendee origins across California, 2015-2018]
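Because the tracking dashboard is ultimately just tabular rows, the reach metrics fall out of simple aggregation. Here is a hypothetical sketch in Python; the field names and numbers below are invented for illustration, not our actual schema:

```python
from collections import Counter

# Hypothetical rows from a workshop-tracking sheet (ours lives in Google
# Sheets and feeds an ArcGIS Online map). Values here are made up.
workshops = [
    {"topic": "Drone Technology",       "city": "Davis",     "attendees": 18},
    {"topic": "WebGIS",                 "city": "Eureka",    "attendees": 12},
    {"topic": "GIS for Agriculture",    "city": "Holtville", "attendees": 15},
    {"topic": "Mobile Data Collection", "city": "Davis",     "attendees": 20},
]

total_attendees = sum(w["attendees"] for w in workshops)
by_city = Counter(w["city"] for w in workshops)   # geographic reach

print(f"{len(workshops)} workshops, {total_attendees} attendees")
# 4 workshops, 65 attendees
print(by_city.most_common())
# [('Davis', 2), ('Eureka', 1), ('Holtville', 1)]
```

From rows like these you can roll up workshops delivered, people reached, and the geographic spread of events, which is essentially what our dashboard maps.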

Measuring Impact. At each workshop, we provide a feedback mechanism via an evaluation form and use this input to understand client satisfaction, reported changes in learning, and reported changes in participant workflow. We’ve been doing this for years, but I now think the questions we ask on those surveys need to change. We are really capturing the client satisfaction part of the process, and we need to do a better job on the change in learning and change in action parts of the work.

Having done this exercise, I can clearly see that reach and activity are perhaps the easiest things to measure. We have information tools at our fingertips to do this: online web mapping of participant zip codes, Google Analytics to track website activity. Measuring the other impacts (change in action, contributions to policy, and actual condition change) is tough. I think extension will continue to struggle with these, but they are critical to help us articulate our value to the public. More work to do!

References
Franz, Nancy K. 2011. “Advancing the Public Value Movement: Sustaining Extension During Tough Times.” Journal of Extension 49 (2): 2COM2.
———. 2014. “Measuring and Articulating the Value of Community Engagement: Lessons Learned from 100 Years of Cooperative Extension Work.” Journal of Higher Education Outreach and Engagement 18 (2): 5.
Rennekamp, Roger A., and Molly Engle. 2008. “A Case Study in Organizational Change: Evaluation in Cooperative Extension.” New Directions for Evaluation 2008 (120): 15–26.

Wrap up from the Esri Imagery and Mapping Forum

Recently, Esri has been holding an Imagery and Mapping Forum prior to the main User Conference. This year I was able to join as an invited panelist for the Executive Panel and Closing Remarks session on Sunday. During the day I hung out in the Imaging and Innovation Zone, in front of the Drone Zone (gotta get one of these for ANR). This was well worth attending: smaller conference - focused topics - lots of tech reveals - great networking. 

Notes from the day: Saw demos from a range of vendors, including:

  • Aldo Facchin from Leica gave a slideshow about the Leica Pegasus: Backpack. Their backpack unit workflow uses SLAM; challenges include fusion of indoor and outdoor environments (from transportation networks above and below ground). Main use cases were industrial, urban, infrastructure. http://leica-geosystems.com/en-us/products/mobile-sensor-platforms/capture-platforms/leica-pegasus-backpack
  • Jamie Ritche from UrtheCast talked about "Bringing Imagery to Life". He says our field is "a teenager that needs to be an adult": in many cases, businesses don't know what they need to know. Their solution is in apps, "the simple and the quick": quick, easy, disposable, and useful. Four themes: revisit, coverage, time, quality. Their portfolio includes DEIMOS-1, Theia, Iris, DEIMOS-2, and PanGeo+. Deimos-1 focuses on agriculture. UrtheDaily: 5m pixels, 20TB daily (40x the Sentinel output); available in 2019. They see their constellation and products as very comparable to Sentinel, Landsat, and RapidEye. They've been working with Land O'Lakes as a main imagery delivery partner, stressing the ability of apps and cloud image services to deliver quick, meaningful information to users. https://www.urthecast.com/
  • Briton Vorhees from senseFly gave an overview of "senseFly's Drone Designed Sensors". They are owned by Parrot, and have a fleet of fixed-wing drones (e.g. the eBee models), plus drone-optimized cameras: shock-proof, fixed lens, etc. (e.g. SODA). These can be used as a fleet of sensors (he gave a citizen-science example from Zanzibar (ahhh Zanzibar)). They also use Sequoia cameras on eBees for a range of applications. https://www.sensefly.com/drones/ebee.html
  • Rebecca Lasica and Jarod Skulavik from Harris Geospatial Solutions presented "The Connected Desktop". They showcased their new ENVI workflow implemented in ArcGIS Pro, through a Geospatial Services Framework that "lifts" ENVI off the desktop and creates an ENVI Engine. They showed some interesting crop applications, which they call "Crop Science". http://www.harrisgeospatial.com/
  • Jeff Cozart and McCain McMurray from Juniper Unmanned shared "The Effectiveness of Drone-Based Lidar" and talked about the advantages of drone-based lidar for terrain mapping and other applications. They talked through a few projects, and highlighted that the main advantages of drone-based lidar are in the data, not in the economics per se. But the economics do work out too. (They partner with Riegl and with YellowScan from France.) They showcased an example from Colorado that compared lidar (I think it was a Riegl on a DJI Matrice) and traditional field survey: the lidar cost was 1/24th that of the field survey. They did a live demo of ArcGIS tools with their CO data: classification of ground, feature extraction, etc. http://juniperunmanned.com/
  • Aerial Imaging Productions talked about their indoor scanning; this linking-indoor-to-outdoor (i.e. making point cloud data truly geo) is a big theme here. Also, OBJ came up as a data format. (From Wikipedia: "The OBJ file format is a simple data-format that represents 3D geometry alone — namely, the position of each vertex, the UV position of each texture coordinate vertex, vertex normals, and the faces that make each polygon defined as a list of vertices, and texture vertices.") It is used in the 3D graphics world, but increasingly for indoor point clouds in our field.
  • My-Linh Truong from Riegl talked about their new static, mobile, airborne, and UAV lidar platforms. They've designed some mini lidar sensors for smaller UAVs (3 lbs; 100 kHz; 250 m range; ~40 pts/m2). Their Esri workflow is called LMAP, and it relies on some proprietary Riegl software processing at the front end, then transfer to ArcGIS Pro (I think). http://www.rieglusa.com/index.html
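On the OBJ note above: the format really is that simple, to the point where a few lines of Python can pull the geometry out. A minimal sketch (vertex positions and faces only, ignoring texture coordinates and normals):

```python
# Minimal OBJ vertex/face parser, to show how simple the format is.
# In OBJ text, lines starting with "v" are vertex positions, "vt" texture
# coordinates, "vn" normals, and "f" faces (lists of 1-based vertex indices).

def parse_obj(text):
    """Return (vertices, faces) from an OBJ-format string."""
    vertices, faces = [], []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":          # vertex position: v x y z
            vertices.append(tuple(float(p) for p in parts[1:4]))
        elif parts[0] == "f":        # face entries may be v, v/vt, or v/vt/vn;
            # keep only the vertex index
            faces.append([int(p.split("/")[0]) for p in parts[1:]])
    return vertices, faces

# A single triangle in OBJ text:
sample = """
v 0.0 0.0 0.0
v 1.0 0.0 0.0
v 0.0 1.0 0.0
f 1 2 3
"""
verts, faces = parse_obj(sample)
print(len(verts), faces)   # 3 [[1, 2, 3]]
```

Real indoor point-cloud OBJ files are of course enormous, but they are built from exactly these pieces.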

We wrapped up the day with a panel discussion, moderated by Esri's Kurt Schwoppe, and including Lawrie Jordan from Esri, Greg Koeln from MDA, Dustin Gard-Weiss from NGA, Amy Minnick from DigitalGlobe, Hobie Perry from USFS-FIA, David Day from PASCO, and me. We talked about the promise and barriers associated with remote sensing and image processing from all of our perspectives. I talked a lot about ANR and IGIS and the use of geospatial data, analysis, and visualization in our work. Some fun things that came out of the panel discussion:

  • Cool stuff:
    • Lawrie Jordan started Erdas!
    • Greg Koeln wears Landsat ties (and has a Landsat sportcoat). 
    • DigitalGlobe launched their 30cm resolution WorldView-4. One key case study was a partnership with the Associated Press to find a pirate fishing vessel in action in Indonesia. They found it and busted it, ultimately helping to free some 2,000 enslaved fishermen.
    • The FIA is increasingly working on understanding uncertainty in their products, and they are moving from an image-based to a raster-based method for stratification.
    • Greg Koeln, from MDA (he of the rad tie- see pic below) says: "I'm a fan of high resolution imagery...but I also know the world is a big place".
  • Challenges: 
    • We all talked about the need to create actionable, practical, management-relevant, useful information from the wealth of imagery we have at our fingertips: #remotesensible. 
    • Multi-sensor triangulation (or, georeferencing a stack of imagery from multiple sources, to you and me) is a continual problem, and it's going to get worse before it gets better as more imagery comes in from UAVs. On that note, Esri bought the patent for "SIFT", a Microsoft algorithm to automate the relative registration of an image stack.
    • Great question at the end about the need to continue funding for the public good: ANR is critical here!
    • Space Junk.
  • Game-changers: 
    • Opening the Landsat archive: leading to science (e.g. Hansen et al. 2013), leading to tech (e.g. GEE and other cloud-based processors). Greg pointed out that back in the day, his former organization (Ducks Unlimited) paid $4,400 per Landsat scene to map wetlands nationwide! That's a big bill.
    • Democratization of data collection: drones, smart phones, open data...
The panel in action

Notes and stray thoughts:

  • Esri puts on a quality show always. San Diego always manages to feel simultaneously busy and fun, while not being crowded and claustrophobic. Must be the ocean, the light and the air.
  • Trying to get behind the new "analytics" replacement of "analysis" in talks. I am not convinced everyone is using analytics correctly ("imagery analytics such as creating NDVI"), but hey, it's a thing now: https://en.wikipedia.org/wiki/Analytics#Analytics_vs._analysis
  • 10 years ago I had a wonderful visitor to my lab from Spain - Francisco Javier Lozano - and we wrote a paper: http://www.sciencedirect.com/science/article/pii/S003442570700243X. He left to work at some crazy startup company called Deimos in Spain, and Lo and Behold, he is still there, and the company is going strong. The Deimos satellites are part of the UrtheCast fleet. Small world!
  • The gender balance at the Imagery portion of the Esri UC is not. One presenter at a talk said to the audience with a pointed stare at me: "Thanks for coming Lady and Gentlemen".

Good fun! Now more from Shane and Robert at the week-long Esri UC!

Wrap up from the FOODIT: Fork to Farm Meeting

UC ANR was a sponsor of the FOODIT: Fork to Farm meeting in June 2017: http://mixingbowlhub.com/events/food-fork-farm/. Many of us were there to learn what was happening in the food-data-tech space and how UC ANR can be of service. It was pretty cool. First, it was held in the Computer History Museum, which is rad. Second, the idea of the day was to link partners, industry, scientists, funders, and foodies around sustainable food production, distribution, and delivery. Third, there were some rad snacks (pic below).

We had an initial talk from Mikiel Bakker of Google Food, whose team has broadened its thinking about food to include not just feeding Googlers, but the overall food chain and food system sustainability. They have developed 5 "foodshots" (i.e. like "moonshot" thinking): 1) enable individuals to make better choices, 2) shift diets, 3) increase food system transparency, 4) reduce food losses, and 5) make the food system closed and circular.

We then had a series of moderated panels.

The Dean's List session featured a panel of university deans, moderated by our very own Glenda Humiston @UCANR, and including Helene Dillard (UC Davis), Andy Thulin (Cal Poly), and Wendy Wintersteen (Iowa State). Key discussion points included lack of food system transparency, science communication and literacy, making money with organics, education and training, farm sustainability and efficiency, market segmentation (e.g. organics), downstream processing, and consumer power to change food systems. Plus the Amazon purchase of Whole Foods.

The Tech-Enabled Consumer session featured four speakers from companies using tech around food: Katie Finnegan from Walmart, David McIntyre from Airbnb, Barbara Shpizner from Mattson, and Michael Wolf from The Spoon. Pretty neat discussion around the ways these diverse companies use tech to customize the customer experience, provide cost savings, source food, and contribute to a better food system. 40% of food waste happens in homes, and another 40% elsewhere in the consumer arena. So much to be done!

The session on Downstream Impacts for the Food Production System featured Chris Chochran from ReFed @refed_nowaste, Sabrina Mutukisna from The Town Kitchen @TheTownKitchen, Kevin Sanchez from the Yolo Food Bank @YoloFoodBank, and Justin Siegel from UC Davis International Innovation and Health. We talked about nutrition for all, schemes for minimizing food waste and waste streams, food banks and the distribution of produce and protein to those who need them (@refed_nowaste and @YoloFoodBank), and creating high-quality jobs for young people of color in the food business (@TheTownKitchen). David Lee from ARPA-E also spoke to the amount of energy involved in the food system: 7% of US energy use inadvertently goes to creating food waste. Yikes!

The session on Upstream Production Impacts from New Consumer Food Choices featured Ally DeArman from Food Craft Institute @FoodCraftInst, Micke Macrie from Land O' Lakes, Nolan Paul from Driscoll's @driscollsberry, and Kenneth Zuckerberg from Rabobank @Rabobank. This session got cut a bit short, but it was pretty interesting. Especially the Food Craft Institute, whose mission is to help "the small guys" succeed in the food space.

The afternoon sessions included some pitch competitions, deep dive breakouts and networking sessions. What a great day for ANR.

Day 2 Wrap Up from the NEON Data Institute 2017

First of all, Pearl Street Mall is just as lovely as I remember, but OMG it is so crowded, with so many new stores and chains. Still, good food, good views, hot weather, lovely walk.

Welcome to Day 2! http://neondataskills.org/data-institute-17/day2/
Our morning session focused on reproducibility and workflows with the great Naupaka Zimmerman. Remember the characteristics of reproducibility: organization, automation, documentation, and dissemination. We focused on organization, and spent an enjoyable hour sorting through an example messy directory of miscellaneous data files and code. The directory looked a bit like many of my directories. Lesson learned. We then moved to working with new data and git to reinforce yesterday's lessons. Git was super confusing to me 2 weeks ago, but now I think I love it. We also went back and forth between Jupyter notebooks and standalone Python scripts, and abstracted variables, and lo and behold I got my script to run. All the git stuff is from http://swcarpentry.github.io/git-novice/

The afternoon focused on lidar (yay!). Prior to coding we talked about discrete and waveform lidar data and collection, and the OpenTopography project (http://www.opentopography.org/) with Benjamin Gross. The OpenTopography talk was really interesting: they are not just a data distributor any more, they also provide an HPC framework (mostly TauDEM for now) on their servers at SDSC (http://www.sdsc.edu/). They are going to roll out user-initiated HPC functionality soon, so stay tuned for their new "pluggable assets" program. This is well worth checking into. We also spent some time live coding in Python with Bridget Hass, working with a CHM from the SERC site, and had a nerve-wracking code challenge to wrap up the day.
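For those curious what "working with a CHM" looks like in code, the core of it is array math on a canopy height raster. Here is a hedged NumPy sketch: a synthetic array stands in for the real NEON GeoTIFF, and the nodata flag, heights, and cover threshold are illustrative, not NEON's actual values:

```python
import numpy as np

# Synthetic stand-in for a canopy height model (CHM) tile. In the workshop
# this array came from a NEON raster; we simulate one here so the sketch is
# self-contained. Values are canopy height in meters; -9999 marks nodata.
rng = np.random.default_rng(42)
chm = rng.uniform(0, 35, size=(100, 100))
chm[rng.random((100, 100)) < 0.05] = -9999      # sprinkle in nodata pixels

valid = chm[chm != -9999]                        # mask out nodata before stats
cover = np.mean(valid > 2.0)                     # share of pixels taller than 2 m

print(f"mean height: {valid.mean():.1f} m")
print(f"max height:  {valid.max():.1f} m")
print(f"canopy cover (>2 m): {cover:.0%}")
```

Swap the synthetic array for a real raster read and the same three lines of masking and summarizing carry over unchanged.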

Fun additional take-home messages/resources:

Thanks to everyone today! Megan Jones (our fearless leader), Naupaka Zimmerman (Reproducibility), Tristan Goulden (Discrete Lidar), Keith Krause (Waveform Lidar), Benjamin Gross (OpenTopography), Bridget Hass (coding lidar products).

Day 1 Wrap Up
Day 2 Wrap Up 
Day 3 Wrap Up
Day 4 Wrap Up

Our home for the week

Day 1 Wrap Up from the NEON Data Institute 2017

I left Boulder 20 years ago on a wing and a prayer with a PhD in hand, overwhelmed with bittersweet emotions. I was sad to leave such a beautiful city, nervous about what was to come, but excited to start something new in North Carolina. My future was uncertain, and as I took off from DIA that final time I basically had Tom Petty's Free Fallin' and Learning to Fly on repeat on my walkman. Now I am back, and summer in Boulder is just as breathtaking as I remember it: clear blue skies, the stunning flatirons making a play at outshining the snow-dusted Rockies behind them, and crisp fragrant mountain breezes acting as my Madeleine. I'm back to visit the National Ecological Observatory Network (NEON) headquarters and attend their 2017 Data Institute, and re-invest in my skillset for open reproducible workflows in remote sensing. 

What a day! http://neondataskills.org/data-institute-17/day1/
Attendees (about 30) included graduate students, old dogs (new tricks!) like me, and research scientists interested in developing reproducible workflows in their work. We are a pretty even mix of ages and genders. The morning session focused on learning about the NEON program (http://www.neonscience.org/): its purpose, sites, sensors, data, and protocols. NEON, funded by NSF and managed by Battelle, was conceived in 2004 and will come fully online in Jan 2018 for a 30-year mission providing free and open data on the drivers of and responses to ecological change. NEON data come from IS (instrumented systems), OS (observation systems), and RS (remote sensing). We focused on the Airborne Observation Platform (AOP), which uses two (soon to be three) aircraft, each with a payload of a hyperspectral sensor (from JPL; 426 5-nm bands (380-2510 nm), 1 mrad IFOV, 1 m resolution at 1000 m AGL), lidar sensors (Optech, soon to be Riegl; discrete and waveform), and an RGB camera (PhaseOne D8900). These sensors produce co-registered raw data that are processed at NEON headquarters into various levels of data products. Flights are planned to cover each NEON site once, timed to capture 90% or higher of peak greenness, which is pretty complicated when distance and weather are taken into account. Pilots and techs are on the road and in the air from March through October collecting these data.

In the afternoon session, we got through a fairly immersive dunk into Jupyter notebooks for exploring hyperspectral imagery in HDF5 format. We did exploration, band stacking, widgets, and vegetation indices. We closed with a fast discussion about TGF (The Git Flow): the way to store, share, control versions of your data and code to ensure reproducibility. We forked, cloned, committed, pushed, and pulled. Not much more to write about, but the whole day was awesome!
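To give a flavor of the band-stacking and vegetation-index exercises, here is a simplified stand-in (not the actual NEON notebook; the reflectance values below are made up):

```python
import numpy as np

# In the workshop, reflectance bands came out of a NEON HDF5 file; two tiny
# synthetic bands stand in here so the example runs anywhere.
red = np.array([[0.05, 0.08], [0.10, 0.30]])    # red reflectance
nir = np.array([[0.45, 0.40], [0.35, 0.32]])    # near-infrared reflectance

stack = np.stack([red, nir])                     # "band stacking": (bands, y, x)

# NDVI = (NIR - red) / (NIR + red); healthy vegetation approaches +1
ndvi = (stack[1] - stack[0]) / (stack[1] + stack[0])
print(ndvi.round(2))
```

The real exercise adds HDF5 reading, scale factors, and nodata handling, but the index math is exactly this.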

Fun additional take-home messages:

Thanks to everyone today, including: Megan Jones (Main leader), Nathan Leisso (AOP), Bill Gallery (RGB camera), Ted Haberman (HDF5 format), David Hulslander (AOP), Claire Lunch (Data), Cove Sturtevant (Towers), Tristan Goulden (Hyperspectral), Bridget Hass (HDF5), Paul Gader, Naupaka Zimmerman (GitHub flow).


Women in GIS interview!

Hi all! I was recently profiled for the excellent website Women in GIS (or WiGIS). This is a group of technical-minded women who maintain the site to feature women working in the geospatial industry through their Who We Are spotlight series. The group also makes its presence known at conferences like CalGIS and Esri's UCs, and plans to host a number of online resources women might find useful to start or navigate a GIS career.

Excellent time, and thanks for the opportunity!

ISECCI historical ecology working group wrap-up

Last week Kelly and I travelled with others to the Sierra Nevada Aquatic Research Lab (SNARL) in the eastern Sierra Nevada, just south of Mono Lake, for a research retreat. SNARL is part of the UC's Natural Reserve System, which comprises nearly 40 properties across the state. These are preserves that foster research, education, and collaboration, and they have much in common with ANR's REC system. I've been to a few of them now, and look forward to visiting more. I love the east side of the Sierra, and that iconic Highway 395.

This trip was a retreat for the ISECCI historical ecology working group, led by the inspirational Peter Alagona from UCSB. We discussed our existing projects, including the VTM work (see figure below), and talked about the potential for more collaborative research and further integration between the NRS and ANR. We have a wish list for digitization, and if anyone out there has ideas about pitching these to donors, please let me know. For example:

  • Kelly and I want to digitize the Leiburg maps from the northern Sierra to add to the VTM stack;
  • We want to find a better way to index and view historical aerial photography state-wide. Something like this for historical maps: http://ngmdb.usgs.gov/maps/TopoView/help/

And we had a field trip looking at Mono Lake water issues. Great time spent!

Density of VTM features across the collections

Spatial Data Science Bootcamp 2016!

Last week we held another bootcamp on Spatial Data Science. We had three packed days learning about the concepts, tools and workflow associated with spatial databases, analysis and visualizations. Our goal was not to teach a specific suite of tools but rather to teach participants how to develop and refine repeatable and testable workflows for spatial data using common standard programming practices.

2016 Bootcamp participants

On Day 1 we focused on setting up a collaborative virtual data environment through virtual machines and spatial databases (PostgreSQL/PostGIS) with multi-user editing and versioning (GeoGig). We also talked about open data and open standards, and modern data formats and tools (GeoJSON, GDAL). On Day 2 we focused on open analytical tools for spatial data, concentrating on Python (i.e. PySAL, NumPy, PyCharm, iPython Notebook) and R tools. Day 3 was dedicated to the web stack, and visualization via ArcGIS Online, CartoDB, and Leaflet. Web mapping is great, and as OpenGeo.org says: “Internet maps appear magical: portals into infinitely large, infinitely deep pools of data. But they aren't magical, they are built of a few standard pieces of technology, and the pieces can be re-arranged and sourced from different places. … Anyone can build an internet map.”
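One reason we teach GeoJSON as an open interchange format is that it is plain JSON, so every language's standard tooling can produce and consume it without a GIS stack. A tiny Python sketch (the coordinates are just an example point):

```python
import json

# A minimal GeoJSON FeatureCollection with one point feature.
feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-122.259, 37.872]},  # lon, lat
    "properties": {"name": "Berkeley, CA"},
}
collection = {"type": "FeatureCollection", "features": [feature]}

geojson = json.dumps(collection, indent=2)       # serialize for sharing
parsed = json.loads(geojson)                     # round-trips cleanly
print(parsed["features"][0]["geometry"]["coordinates"])
# [-122.259, 37.872]
```

The same text drops straight into Leaflet, CartoDB, or GDAL, which is exactly the kind of tool-agnostic workflow the bootcamp aims for.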

All-in-all it was a great time spent with a collection of very interesting mapping professionals from around the country. Thanks to everyone!

LandFire is looking for field data! Add yours now.

I wanted to send out a friendly reminder that the data submission deadline for the current data call is March 31, 2016.  Data submitted before March 31 are evaluated for inclusion in the appropriate update cycle, and submissions after March 31 are typically considered in subsequent updates.  

This is the last call for vegetation/fuel plot data that can be used for the upcoming LANDFIRE Remap. If you have any plot data you would like to contribute please submit the data by March 31 in order to guarantee the data will be evaluated for inclusion in the LF2015 Remap. LANDFIRE is also accepting contributions of polygon data from 2015/2016 for disturbance and treatment activities. Please see the attached data call letter for more information.

Brenda Lundberg, Senior Scientist

Stinger Ghaffarian Technologies (SGT, Inc.)

Contractor to the U.S. Geological Survey (USGS)

Earth Resources Observation & Science (EROS) Center

Phone: 406.329.3405

Email: blundberg@usgs.gov

Spatial Data Science Bootcamp March 2016

Register now for the March 2016 Spatial Data Science Bootcamp at UC Berkeley!

We live in a world where the importance and availability of spatial data are ever increasing. Today’s marketplace needs trained spatial data analysts who can:

  • compile disparate data from multiple sources;
  • use easily available and open technology for robust data analysis, sharing, and publication;
  • apply core spatial analysis methods;
  • and utilize visualization tools to communicate with project managers, the public, and other stakeholders.

To help meet this demand, International and Executive Programs (IEP) and the Geospatial Innovation Facility (GIF) are hosting a 3-day intensive Bootcamp on Spatial Data Science on March 23-25, 2016 at UC Berkeley.

With this Spatial Data Science Bootcamp for professionals, you will learn how to integrate modern Spatial Data Science techniques into your workflow through hands-on exercises that leverage today's latest open source and cloud/web-based technologies. We look forward to seeing you here!

To apply and for more information, please visit the Spatial Data Science Bootcamp website.

Limited space available. Application due on February 19th, 2016.

Our summary from SNAMP: 31 integrated recommendations

The following forest management recommendations consider the SNAMP focal resources (forest, water, wildlife), as well as public participation, as an integrated group. These recommendations were developed by the UC Science Team working together. Although each recommendation was written by one or two authors, the entire team has provided input and critique for the recommendations. The entire UC Science Team endorses all of these integrated management recommendations. Click at the bottom of the post for the full description of each recommendation. 

Section 1: Integrated management recommendations based directly on SNAMP science

Wildfire hazard reduction

1. If your goal is to reduce severity of wildfire effects, SPLATs are an effective means to reduce the severity of wildfires. 

SPLAT impacts on forest ecosystem health 

2. If your goal is to improve forest ecosystem health, SPLATs have a positive effect on tree growth efficiency.

SPLAT impact assessment

3. If your goal is to integrate across firesheds, an accurate vegetation map is essential, and a fusion of optical, lidar and ground data is necessary. 

4. If your goal is to understand the effects of SPLATs, lidar is essential to accurately monitor the intensity and location of SPLAT treatments.

SPLAT impacts on California spotted owl and Pacific fisher

5. If your goal is to maintain existing owl and fisher territories, SPLATs should continue to be placed outside of owl Protected Activity Centers (PACs) and away from fisher den sites, in locations that reduce the risk of high-severity fire occurring within or spreading to those areas.

6. If your goal is to maintain landscape connectivity between spotted owl territories, SPLATs should be implemented in forests with lower canopy cover whenever possible.

7. If your goal is to increase owl nest and fisher den sites, retain oaks and large conifers within SPLAT treatments.

8. If your goal is to maintain fisher habitat quality, retention of canopy cover is a critical consideration.

9. If your goal is to increase fisher foraging activity, limit mastication and implement more post-mastication piling and/or burning to promote a faster recovery of the forest floor condition. 

10. If your goal is to understand SPLAT effects on owl and fisher, it is necessary to consider a larger spatial scale than firesheds.

SPLAT impacts on water quantity and quality

11. If your goal is to detect increases in water yield from forest management, fuel treatments may need to be more intensive than the SPLATs that were implemented in SNAMP.

12. If your goal is to maintain water quality, SPLATs as implemented in SNAMP have no detectable effect on turbidity.

Stakeholder participation in SPLAT implementation and assessment

13. If your goal is to increase acceptance of fuel treatments, employ outreach techniques that include transparency, shared learning, and inclusiveness that lead to relationship building and the ability to work together.

14. If your goal is the increased acceptance of fuel treatments, the public needs to understand the tradeoffs between the impacts of treatments and wildfire.

Successful collaborative adaptive management processes

15. If your goal is to establish a third party adaptive management project with an outside science provider, the project also needs to include an outreach component.

16. If your goal is to develop an engaged and informed public, you need to have a diverse portfolio of outreach methods that includes face to face meetings, surveys, field trips, and web-based information.

17. If your goal is to understand or improve outreach effectiveness, track production, flow, and use of information.

18. If your goal is to engage in collaborative adaptive management at a meaningful management scale, secure reliable long term sources of funding.

19. If your goal is to maintain a successful long-term collaborative adaptive management process, establish long-term relationships with key people in relevant stakeholder groups and funding agencies.

Section 2: Looking forward - Integrated management recommendations based on expert opinion of the UC Science Team

Implementation of SPLATs

20. If your goal is to maximize the value of SPLATs, complete treatment implementation, especially the reduction of surface fuels.

21. If your goal is to efficiently reduce fire behavior and effects, SPLATs need to be strategically placed on the landscape.

22. If your goal is to improve SPLAT effectiveness, increase heterogeneity within treatment type and across the SPLAT network.

Forest ecosystem restoration

23. If your goal is to restore Sierra Nevada forest ecosystems and improve forest resilience to fire, SPLATs can be used as initial entry, but fire needs to be reintroduced into the system or allowed to occur as a natural process (e.g., managed fire).

24. If your goal is to manage the forest for long-term sustainability, you need to consider the pervasive impacts of climate change on wildfire, forest ecosystem health, and water yield.

Management impacts on California spotted owl and Pacific fisher 

25. If your goal is to enhance landscape habitat condition for owl and fisher, removal of large hazard trees should be carefully justified.

26. If your goal is to minimize the effects of SPLATs on fisher, SPLAT treatments should be dispersed through space and time.

Management impacts on water quantity and quality

27. If your goal is to optimize water management, consider the range of potential fluctuations in precipitation and temperature.

Successful collaborative adaptive management processes

28. If your goal is to implement collaborative adaptive management, commit enough time, energy, and training of key staff to complete the adaptive management cycle.

29. The role of a third party science provider for an adaptive management program can be realized in a variety of ways.

30. If your goal is to implement adaptive management, adopt clear definitions and guidelines for how new information will be generated, shared, and used to revise subsequent management as needed.

31. If your goal is to increase forest health in the Sierra Nevada, we now know enough to operationalize some of the aspects of SNAMP more broadly.

Read More

SNAMP wrap up: Forest Service should implement proposed forest treatments

SNAMP field trip: photo from Shufei Lei

Full press release: http://ucanr.edu/?blogpost=19857&blogasset=81020

After conducting extensive forest research and taking into consideration all aspects of forest health – including fire and wildlife behavior, and water quality and quantity – a group of distinguished scientists has concluded that enough is now known about proposed U.S. Forest Service landscape management treatments for them to be implemented in Sierra Nevada forests. We say:

“There is currently a great need for forest restoration and fire hazard reduction treatments to be implemented at large spatial scales in the Sierra Nevada.”

“The next one to three decades are a critical period: after this time it may be very difficult to influence the character of Sierra Nevada forests, especially old forest characteristics.”

The scientists' recommendation is in the final report of a unique, 10-year experiment in collaboration: the Sierra Nevada Adaptive Management Project (SNAMP). A 1,000-page final report on the project was submitted to the U.S. Forest Service at the end of 2015. In it, scientists reached 31 points of consensus about managing California forests to reduce wildfire hazards and protect wildlife and human communities.

SNAMP – funded with $15 million in grants mainly from the U.S. Forest Service, with support from U.S. Fish and Wildlife, the California Natural Resources Agency, and the University of California – ran from 2007 to 2015. The project ended with the submission of the final report, which contains details about the study areas, the treatment processes, and reports from each of the six science teams.

A key chapter in the publication is titled Integrated Management Recommendations. In it, the 31 points of consensus are outlined.

“The integration in this project is also unique,” said Susie Kocher, CE advisor. “Scientists tend to work in their own focus areas, but we can learn a lot from each other's research projects.”

GIS Day Wrap Up (a bit late...)

GIS Day 2015! Happy 10th Birthday to the GIF! 

Panel of mapping innovators @ GIS Day 2015

A quick look at the past decade:

The GIF began in November 2005 on a wave of excitement around geospatial technology. In the months leading up to our first GIS Day in 2005, Google Maps launched, then went mobile; Google Earth launched in the summer; and NASA Blue Marble arrived. Hurricane Katrina changed the way we map disasters in real time. The opening up of the Landsat archive at no cost by the USGS revolutionized how we monitor the Earth's surface by allowing dense time-series analysis. These and other developments made viewing our world with detail, ease, and beauty commonplace, yet they were nothing short of revolutionary, spurring new developments in science, governance, and business. The decade since then has been one of intense innovation, and we have seen a rush in geospatial technologies that have enriched our lives immeasurably.

As 2015 ends we can recognize a similar wave of excitement around geospatial technology as we experienced a decade ago, yet one that is more diverse and far reaching than in 2005. This GIS Day we sought to highlight the societal benefit derived from innovators across academia, non-profits, government, and industry. 

GIS Day/GIF 10th Anniversary

On November 18 we co-hosted GIS Day with BayGeo (formerly BAAMA), as we have in the past, and had well over 180 attendees. Our GIS Day featured posters, lightning talks, presentations, and a panel session that included local innovators from Bay Area industry, government, and non-profits. Our panel speakers included: Cindy Schmidt (NASA); Gregory Crutsinger (3D Robotics); Karin Tuxen-Bettman (Google); Ken-ichi Ueda (iNaturalist); Sara Dean (Stamen Designs); Jeffrey Miller (GeoWing); and Kyle Brazil (Urthecast). The discussion covered what skills they look for in recruiting and where they see the geospatial world going in the next five years. It was a fun evening and, personally, I learned a ton. Many levels of appreciation go out to those who spoke, those who came, and those who helped make the day happen.

2005-2015: A decade of intense innovation in mapping

In November 2015 we can recognize a similar wave of excitement around geospatial technology as we experienced a decade ago, one that is more diverse and far reaching than in 2005. This GIS Day we would like to highlight the societal benefit derived from innovators across academia, non-profits, government, and industry. Our panel discussion on the 18th will include representatives from several local innovators in the field, including Stamen Designs, GeoWing, Planet Labs, 3D Robotics, NASA, iNaturalist.org, and Google, who will discuss their perspectives on the boom in Bay Area mapping.

Please think about joining us at GIS Day!

http://gif.berkeley.edu/gisday.html

Spatial Data Science @ Berkeley May 2015

Bootcamp participants outside historic Mulford Hall

Our bootcamp on Spatial Data Science has concluded. We had three packed days learning about the concepts, tools, and workflow associated with spatial databases, analysis, and visualizations.

Our goal was not to teach a specific suite of tools but rather to teach participants how to develop and refine repeatable and testable workflows for spatial data using common standard programming practices.

On Day 1 we focused on setting up a collaborative virtual data environment through virtual machines, spatial databases (PostgreSQL/PostGIS) with multi-user editing and versioning (GeoGig). We also talked about open data and open standards, and modern data formats and tools (GeoJSON, GDAL).
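The open formats we covered on Day 1 are easy to demonstrate: a GeoJSON feature is plain JSON, so it can be built and round-tripped with nothing but the Python standard library. A minimal sketch (the coordinates and the "Mulford Hall" property are illustrative, not bootcamp data):

```python
import json

# Build a minimal GeoJSON Feature by hand (stdlib only; no GDAL required).
feature = {
    "type": "Feature",
    "geometry": {
        "type": "Point",
        "coordinates": [-122.266, 37.873],  # [lon, lat] order per the GeoJSON spec
    },
    "properties": {"name": "Mulford Hall"},
}

collection = {"type": "FeatureCollection", "features": [feature]}

# GeoJSON is plain JSON, so serializing and parsing it is an ordinary round trip.
text = json.dumps(collection)
parsed = json.loads(text)
print(parsed["features"][0]["properties"]["name"])  # → Mulford Hall
```

Because the format is just text, the same file can be committed to version control, diffed, and read by GDAL/OGR, PostGIS, or a web map without conversion, which is exactly what makes open standards attractive for collaborative workflows.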

Analyzing spatial data is the best part! On Day 2 we focused on open analytical tools for spatial data, and in particular on one class of analysis: pattern analysis. We used Python (e.g., PySAL, NumPy, PyCharm, IPython Notebook) and RStudio (e.g., the raster, sp, maptools, rgdal, and shiny packages) to look at spatial autocorrelation and spatial regression.
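The core idea behind the spatial autocorrelation exercises can be sketched without PySAL at all: below is a from-scratch global Moran's I on a toy 4x4 grid with rook-contiguity weights (cells sharing an edge are neighbors with weight 1). The grids are made-up examples, not bootcamp data.

```python
# Global Moran's I: I = (n / W) * sum_ij[ w_ij (x_i - mean)(x_j - mean) ] / sum_i[(x_i - mean)^2]
def morans_i(grid):
    rows, cols = len(grid), len(grid[0])
    n = rows * cols
    values = [v for row in grid for v in row]
    mean = sum(values) / n
    num = 0.0    # sum of w_ij * (x_i - mean) * (x_j - mean) over ordered neighbor pairs
    w_sum = 0.0  # W: sum of all weights
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):  # rook neighbors
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    num += (grid[r][c] - mean) * (grid[nr][nc] - mean)
                    w_sum += 1
    den = sum((v - mean) ** 2 for v in values)
    return (n / w_sum) * (num / den)

clustered = [[1, 1, 0, 0],   # like values sit next to each other
             [1, 1, 0, 0],
             [1, 1, 0, 0],
             [1, 1, 0, 0]]
checker = [[0, 1, 0, 1],     # like values never touch
           [1, 0, 1, 0],
           [0, 1, 0, 1],
           [1, 0, 1, 0]]
print(round(morans_i(clustered), 3))  # → 0.667 (positive: spatial clustering)
print(round(morans_i(checker), 3))    # → -1.0 (negative: checkerboard repulsion)
```

PySAL's `esda` module computes the same statistic (plus significance via permutation), but seeing the weights matrix and cross-products written out is a good way to demystify what the library returns.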

Wait, visualizing spatial data is the best part! Day 3 was dedicated to the web stack and visualization. We started with web mapping (web stack, HTML/CSS, JavaScript, Leaflet), and then focused on web-based visualizations (D3). Web mapping is great, and as OpenGeo.org says: “Internet maps appear magical: portals into infinitely large, infinitely deep pools of data. But they aren't magical, they are built of a few standard pieces of technology, and the pieces can be re-arranged and sourced from different places. … Anyone can build an internet map.”
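To make the “anyone can build an internet map” point concrete, here is a small Python script that writes a self-contained Leaflet page. The Leaflet CDN version, the OpenStreetMap tile URL, and the marker location are assumptions for illustration, not part of the bootcamp materials:

```python
# Write a minimal single-file Leaflet web map. The page pulls Leaflet from a
# CDN, adds an OpenStreetMap base layer, and drops one marker.
html = """<!DOCTYPE html>
<html>
<head>
  <link rel="stylesheet" href="https://unpkg.com/leaflet@1.9.4/dist/leaflet.css"/>
  <script src="https://unpkg.com/leaflet@1.9.4/dist/leaflet.js"></script>
  <style>#map { height: 400px; }</style>
</head>
<body>
  <div id="map"></div>
  <script>
    // Center on Berkeley (illustrative coordinates) at zoom level 13.
    var map = L.map('map').setView([37.873, -122.266], 13);
    L.tileLayer('https://tile.openstreetmap.org/{z}/{x}/{y}.png', {
      attribution: '&copy; OpenStreetMap contributors'
    }).addTo(map);
    L.marker([37.873, -122.266]).bindPopup('Mulford Hall').addTo(map);
  </script>
</body>
</html>
"""

with open("map.html", "w") as f:
    f.write(html)
print("wrote map.html")
```

Opening `map.html` in a browser shows the slippy map: three standard pieces (a JavaScript library, a tile server, and a marker layer) rearranged, exactly as the OpenGeo quote suggests.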

All in all, it was a great time spent with a collection of very interesting mapping professionals from around the country (and Haiti!). Thanks to everyone!

Kelly Lab SPUR Students Visit Point Reyes National Seashore

Kelly Lab SPUR students Drew Adamski and Ryan Avery have been participating in lab research all semester, in particular helping classify trails within the Pacific West's national parks. This month we were lucky enough to travel with them to Point Reyes National Seashore to see some of those trails in person. We also spent the day with Chief Ranger Schifsky, who was kind enough to talk with us about the issues different trails in the park were facing and which trails seemed to be changing most rapidly, and to show us some of the points in the park where the landscape had changed dramatically over time due to fire, restoration projects, or differing management strategies. Overall it was a really inspiring and informative trip!

Crowdsourced view of global agriculture: mapping farm size around the world

From Live Science. Two new maps released Jan. 16 considerably improve estimates of the amount of land farmed in the world — one map reveals the world's agricultural lands to a resolution of 1 kilometer, and the other provides the first look at the sizes of the fields being used for agriculture.

The researchers built the cropland database by combining information from several sources, such as satellite images, regional maps, video and geotagged photos, which were shared with them by groups around the world. Combining all that information would be an almost-impossible task for a handful of scientists to take on, so the team turned the project into a crowdsourced, online game. Volunteers logged into "Cropland Capture" on a computer or a phone and determined whether an image contained cropland or not. Participants were entered into weekly prize drawings.