NASA Data and the Distributed Active Archive Centers

I’ve been away from the blog for a while, but thought I’d catch up a bit. I am in beautiful Madison, Wisconsin (Lake Mendota! 90 degrees! Rain! Fried cheese curds!) for the NASA LP DAAC User Working Group meeting. This is a cool deal where imagery and product users meet with NASA team leaders to review products and tools. Since this UWG process is new to me, I am highlighting some of the key fun things I learned.

What is a DAAC?
A DAAC is a Distributed Active Archive Center, run by NASA's Earth Observing System Data and Information System (EOSDIS). These are discipline-specific facilities located throughout the United States. These institutions are custodians of EOS mission data and ensure that data will be easily accessible to users. Each of the 12 EOSDIS DAACs processes, archives, documents, and distributes data from NASA's past and current Earth-observing satellites and field measurement programs. For example, if you want to know about snow and ice data, visit the National Snow and Ice Data Center (NSIDC) DAAC. Want to know about social and population data? Visit the Socioeconomic Data and Applications Center (SEDAC). These centers of excellence are our taxpayer money at work collecting, storing, and sharing Earth systems data that are critical to science, sustainability, the economy, and well-being.

What is the LP DAAC?
The Land Processes Distributed Active Archive Center (LP DAAC) is one of several discipline-specific data centers within the NASA Earth Observing System Data and Information System (EOSDIS). The LP DAAC is located at the USGS Earth Resources Observation and Science (EROS) Center in Sioux Falls, South Dakota. LP DAAC promotes interdisciplinary study and understanding of terrestrial phenomena by providing data for mapping, modeling, and monitoring land-surface patterns and processes. To meet this mission, the LP DAAC ingests, processes, distributes, documents, and archives data from land-related sensors and provides the science support, user assistance, and outreach required to foster the understanding and use of these data within the land remote sensing community.

Why am I here?
Each NASA DAAC has established a User Working Group (UWG). There are 18 people on the LP DAAC committee, including 12 members from the land remote sensing community at large, like me! There is some cool stuff going on. Such as...

New Sensors
Two upcoming launches are super interesting and important to what we are working on. First, GEDI (Global Ecosystem Dynamics Investigation) will produce the first high-resolution laser ranging observations of the 3D structure of the Earth. Second, ECOSTRESS (The ECOsystem Spaceborne Thermal Radiometer Experiment on Space Station) will measure the temperature of plants: stressed plants get warmer than plants with sufficient water. ECOSTRESS will use a multispectral thermal infrared radiometer to measure surface temperature. The radiometer will acquire the most detailed temperature images of the surface ever acquired from space and will be able to measure the temperature of an individual farmer's field. Both of these sensors will be deployed on the International Space Station, so data will be in swaths, not continuous global coverage. Also, we got an update from USGS on the USGS/NASA plan for the development and deployment of Landsat 10. Landsat 9 launches in 2020; Landsat 10 follows around 2027.

Other Data Projects
We heard from other data providers, and of course we heard from NEON! Remember, I posted a series of blogs about the excellent NEON open remote sensing workshop I attended last year. NEON also hosts a ton of important ecological data, and has been thinking through the issues associated with cloud hosting. Tristan Goulden was here to give an overview.

Tools Cafe
NASA staff gave us a series of demos on their WebGIS services, AppEEARS, and their data website. Their WebGIS site uses ArcGIS Enterprise, and serves web image services, web coverage services, and web mapping services from the LP DAAC collection. This might provide some key help for us in IGIS and our REC ArcGIS Online toolkits. AppEEARS is their way of providing bundles of LP DAAC data to scientists: it is a data extraction and exploration tool. They also previewed their LP DAAC data website redesign (coming soon), which was necessitated in part by the requirement for a permanent DOI for each data product.

User Engagement
LP DAAC is going full-force in user engagement: they do workshops, collect user testimonials, write great short pieces on “data in action”, work with the press, and generally get the story out about how NASA LP DAAC data is used to do good work. This is a pretty great legacy and they are committed to keep developing it. Lindsey Harriman highlighted their excellent work here.

Grand Challenges for remote sensing
Some thoughts about our Grand Challenges: 1) Scaling: From drones to satellites. It occurs to me that an integration between the ground-to-airborne data that NEON provides and the satellite data that NASA provides had better happen soon; 2) Data Fusion/Data Assimilation/Data Synthesis, whatever you want to call it. Discovery through datasets meeting for the first time; 3) Training: new users and consumers of geospatial data and remote sensing will need to be trained; 4) Remote Sensible: Making remote sensing data work for society. 

A primer on cloud computing
We spent some time on cloud computing. It has been said that cloud computing is just putting your stuff on “someone else’s computer”, but it is also making your stuff “someone else’s problem”, because the cloud provider handles all the painful aspects of serving data: power requirements, buying servers, speccing floor space for your servers, etc. Plus, cloud computing has many advantages:

  • Elasticity. Elastic in computing and storage: you can scale up, scale down, or scale sideways. Elastic in terms of money: you pay for only what you use.
  • Speed. Commercial cloud CPUs are faster than ours, and you can use as many as you want: near-real-time processing, massive processing, compute-intensive analysis, deep learning.
  • Size. You can customize this: you can be fast and expensive, or slow and cheap. You use as much as you need, from short-term storage of large interim results to long-term storage of data that you might use one day.

Image courtesy of Chris Lynnes


We can use the cloud as infrastructure, for sharing data and results, and as software (e.g. ArcGIS Online, Google Earth Engine). Above is a cool graphic showing one vision of the cloud as a scaled and optimized workflow: from pre-processing, to an analytics-optimized data store, to analysis, to visualization. Why this is a better vision: some massive processing engines, such as Spark, require that data be organized in a particular way (e.g. Google Bigtable, Parquet, or a data cube). This means we can really crank on processing, especially with giant raster stacks. And at each step in the workflow, end-users (be they machines or people) can interact with the data. Those are the green boxes in the figure above. Super fun discussion, leading to the importance of training, and how to do this best. Tristan also mentioned CyVerse, an NSF project, which they are testing out for their workshops.
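As a toy illustration of the "analytics-optimized data store" idea: once a raster stack is organized as a data cube (here a plain numpy array stands in for a chunked on-disk store like Parquet or Zarr), per-pixel time-series operations become single vectorized reductions. This is a sketch, not any particular system's API:

```python
import numpy as np

# Toy "data cube": a stack of 12 monthly rasters, indexed (time, row, col).
# Analytics-optimized stores lay data out so that a per-pixel time series
# can be pulled without reading whole scenes.
rng = np.random.default_rng(42)
cube = rng.random((12, 100, 100)).astype(np.float32)

# With the cube organized this way, per-pixel temporal statistics are
# single vectorized reductions along the time axis:
mean_map = cube.mean(axis=0)    # 100 x 100 map of each pixel's temporal mean
anomaly = cube - mean_map       # each scene's departure from that mean

print(mean_map.shape, anomaly.shape)   # (100, 100) (12, 100, 100)
```

The same reduction pattern is what the big engines parallelize across many machines once the layout supports it.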

Image attribution: Corey Coyle


Super fun couple of days. Plus: Wisconsin is green. And warm. And Lake Mendota is lovely. We were hosted at the University of Wisconsin by Mutlu Ozdogan. The campus is gorgeous! On the banks of Lake Mendota (image attribution: Corey Coyle), the 933-acre (378 ha) main campus is verdant and hilly, with tons of gorgeous 19th-century stone buildings, as well as modern ones. Founded when Wisconsin achieved statehood in 1848, UW–Madison is the flagship campus of the UW System. It was the first public university established in Wisconsin and remains the oldest and largest public university in the state. It became a land-grant institution in 1866. UW hosts nearly 45K undergrad and graduate students. It is big! It has a med school, a business school, and a law school on campus. We were hosted in the UW red-brick Romanesque-style Science Hall (opened in 1887). Not only is it the host building for the geography department, it also has the distinction of being the first building in the country to be constructed of all masonry and metal materials (wood was used only in window and door frames and for some floors), and may be the only one still extant. How about that! Bye Wisconsin!

Mapping fires and fire damage in real time: available geospatial tools

Many of us have watched in horror and sadness over the previous week as fires consumed much of the beautiful hills and parts of the towns of Napa and Sonoma Counties. Many of us know people who were evacuated with a few minutes’ notice - I met a retired man who left his retirement home with the clothes on his back. Many other friends lost everything - house, car, pets. It was a terrible event - or series of events, as there were many active fires. During those 8+ days all of us were glued to our screens searching for up-to-date and reliable information on where the fires were, and how they were spreading. This information came from reputable, reliable sources (such as NASA, or the USFS), from affected residents (via Twitter and other social media), from businesses (like Planet, Esri, and DigitalGlobe, who were sometimes creating content and sometimes distilling existing content), and from the media (who were often using all of the above). As a spatial data scientist, I am always thinking about mapping, and the ways in which geospatial data and analysis play an increasingly critical role in disaster notification, monitoring, and response. I am collecting information on the technological landscape of the various websites, media and social media, map products, data and imagery that played a role in announcing and monitoring the #TubbsFire, #SonomaFires and #NapaFires. I think a retrospective of how these tools, and in particular the citizen science aspect of all of this, helped and hindered society will be useful.

In the literature, the theoretical questions surrounding citizen science or volunteered geography revolve around:

  • Accuracy – how accurate are these data? How do we evaluate them?  

  • Access – Who has access to the data? Are there technological limits to dissemination?

  • Bias/Motivation – Who contributes, and what sampling issues result?

  • Effectiveness – how effective are the sites? Some scholars have argued that VGI can be inhibiting. 

  • Control - who controls the data, and how and why?

  • Privacy - Are privacy concerns lessened post disaster?

I think I am most interested in the accuracy and effectiveness questions, but all of them are important.  If any of you want to talk more about this or have more resources to discuss, please email me: maggi@berkeley.edu, or Twitter @nmaggikelly.

Summary so far. This will be updated as I get more information.

Outreach from ANR About Fires

Core Geospatial Technology During Fires

Core Technology for Post-Fire Impact


Wrap up from #DroneCamp2017!

UC ANR's IGIS program hosted 36 drone enthusiasts for a three-day DroneCamp in Davis, California. DroneCamp was designed for participants with little to no experience in drone technology, but who are interested in using drones for a variety of real-world mapping applications. The goals of DroneCamp were to:

  • Gain a broader understanding of the drone mapping workflow, including:
    • Goal setting, mission planning, data collection, data analysis, and communication & visualization
  • Learn about the different types of UAV platforms and sensors, and match them to specific mission objectives;
  • Get hands-on experience with flight operations, data processing, and data analysis; and
  • Network with other drone-enthusiasts and build the California drone ecosystem. 

The IGIS crew, including Sean Hogan, Andy Lyons, Maggi Kelly, Robert Johnson, Kelly Easterday, and Shane Feirer were on hand to help run the show. We also had three corporate sponsors: GreenValley Intl, Esri, and Pix4D. Each of these companies had a rep on hand to give presentations and interact with the participants.

Day 1 of #DroneCamp2017 covered some of the basics - why drones are an increasingly important part of our mapping and field equipment portfolio; different platforms and sensors (and there are so many!); software options; and examples. Brandon Stark gave a great overview of the University of California UAV Center of Excellence and regulations, and Andy Lyons got us all ready to take the Part 107 license test. We hope everyone here gets their license! We closed with an interactive panel of experienced drone users (Kelly Easterday, Jacob Flanagan, Brandon Stark, and Sean Hogan) who shared experiences planning missions, flying and traveling with drones, and project results. A quick evaluation of the day showed that the vast majority of people had learned something specific that they could use at work, which is great. Plus we had a cool flight simulator station for people to practice flying (and crashing).

Day 2 was a field day - we spent most of the day at the Davis hobbycraft airfield, where we practiced taking off, landing, mission planning, and emergency maneuvers. We had an excellent lunch provided by the Street Cravings food truck. What a day! It was hot hot hot, but there was lots of shade and a nice breeze, and everyone got their hands on the controls. Our Esri rep Mark Romero gave us a demo on Esri's Drone2Map software, and some of the lidar functionality in ArcGIS Pro.

Day 3 focused on data analysis. We had three workshops for the group to choose from: forestry, agriculture, and rangelands. Prior to the workshops we had great talks from Jacob Flanagan of GreenValley Intl and Ali Pourreza from Kearney Research and Extension Center. Ali is developing a drone-imagery-based database of the individual trees and vines at Kearney - he calls it the "Virtual Orchard". Jacob talked about the overall mission of GVI and how the company is moving into more comprehensive field and drone-based lidar mapping and software. Angad Singh from Pix4D gave us a master class in mapping from drones, covering georeferencing, the Pix4D workflow, and some of the quality checks produced for you at the end of processing.

One of our key goals for DroneCamp was to jump-start our California Drone Ecosystem concept. I talk about this in my CalAg editorial. We are still in the early days of this emerging field, and we can learn a lot from each other as we develop best practices for workflows, platforms and sensors, software, outreach, etc. Our research and decision-making teams have become larger, more distributed, and multi-disciplinary, with experts and citizens working together; these kinds of collaboratives are increasingly important. We need to collaborate on data collection, storage, and sharing; innovation; analysis; and solutions. If any of you out there want to join us in our California drone ecosystem, drop me a line.

Thanks to ANR for hosting us, thanks to the wonderful participants, and thanks especially to our sponsors (GreenValley Intl, Esri, and Pix4D). Specifically, thanks for:

  • Mark Romero and Esri for showing us Drone2Map, and the ArcGIS Image repository and tools, and the trial licenses for ArcGIS;
  • Angad Singh from Pix4D for explaining Pix4D, for providing licenses to the group; and
  • Jacob Flanagan from GreenValley Intl for your insights into lidar collection and processing, and for all your help showcasing your amazing drones.

#KeepCalmAndDroneOn!

Wrap up from the Esri Imagery and Mapping Forum

Recently, Esri has been holding an Imagery and Mapping Forum prior to the main User Conference. This year I was able to join as an invited panelist for the Executive Panel and Closing Remarks session on Sunday. During the day I hung out in the Imaging and Innovation Zone, in front of the Drone Zone (gotta get one of these for ANR). This was well worth attending: smaller conference - focused topics - lots of tech reveals - great networking. 

Notes from the day: Saw demos from a range of vendors, including:

  • Aldo Facchin from Leica gave a slideshow about the Leica Pegasus: Backpack. Their backpack unit workflow uses SLAM; challenges include fusion of indoor and outdoor environments (from transportation networks above and below ground). Main use cases were industrial, urban, infrastructure. http://leica-geosystems.com/en-us/products/mobile-sensor-platforms/capture-platforms/leica-pegasus-backpack
  • Jamie Ritche from Urthecast talked about "Bringing Imagery to Life". He says our field is "a teenager that needs to be an adult": in many cases businesses don't know what they need to know. Their solution is in apps - "the simple and the quick": quick, easy, disposable, and useful. Four themes: revisit, coverage, time, quality. Their portfolio includes DEIMOS-1, Theia, Iris, DEIMOS-2, and PanGeo+. Deimos-1 focuses on agriculture. UrtheDaily: 5 m pixels, 20 TB daily (40x the Sentinel output); available in 2019. They see their constellation and products as very comparable to Sentinel, Landsat, and RapidEye. They've been working with Land O'Lakes as their main imagery delivery partner, stressing the ability of apps and cloud image services to deliver quick, meaningful information to users. https://www.urthecast.com/
  • Briton Vorhees from senseFly gave an overview of "senseFly's Drone Designed Sensors". They are owned by Parrot, and have a fleet of fixed-wing drones (e.g. the eBee models) as well as drone-optimized cameras - shock-proof, fixed lens, etc. (e.g. SODA). These can be used as a fleet of sensors (he gave a citizen-science example from Zanzibar (ahhh Zanzibar)). They also use Sequoia cameras on eBees for a range of applications. https://www.sensefly.com/drones/ebee.html
  • Rebecca Lasica and Jarod Skulavik from Harris Geospatial Solutions presented "The Connected Desktop". They showcased their new ENVI workflow implemented in ArcGIS Pro, through a Geospatial Services Framework that "lifts" ENVI off the desktop and creates an ENVI Engine. They showed some interesting crop applications - they call it "Crop Science". http://www.harrisgeospatial.com/
  • Jeff Cozart and McCain McMurray from Juniper Unmanned shared "The Effectiveness of Drone-Based Lidar" and talked about the advantages of drone-based lidar for terrain mapping and other applications. They talked through a few projects, and highlighted that the main advantages of drone-based lidar are in the data, not in the economics per se - but the economics do work out too. (They partner with Riegl, and with YellowScan from France.) They showcased an example from Colorado that compared lidar (I think it was a Riegl on a DJI Matrice) and traditional field survey - the lidar survey cost 1/24th as much as the field survey. They did a live demo of ArcGIS tools with their Colorado data: ground classification, feature extraction, etc. http://juniperunmanned.com/
  • Aerial Imaging Productions talked about their indoor scanning - this linking-indoor-to-outdoor (i.e. making point cloud data truly geo) is a big theme here. Also worth knowing: the OBJ data format. (From Wikipedia: "The OBJ file format is a simple data-format that represents 3D geometry alone — namely, the position of each vertex, the UV position of each texture coordinate vertex, vertex normals, and the faces that make each polygon defined as a list of vertices, and texture vertices.") It is used in the 3D graphics world, but increasingly for indoor point clouds in our field.
  • My-Linh Truong from Riegl talked about their new static, mobile, airborne, and UAV lidar platforms. They've designed some mini lidar sensors for smaller UAVs (3 lbs; 100 kHz; 250 m range; ~40 pts/m2). Their Esri workflow is called LMAP, and it relies on some proprietary Riegl software processing at the front end, then transfer to ArcGIS Pro (I think). http://www.rieglusa.com/index.html
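Since OBJ is plain text, it is easy to peek inside one. A minimal sketch (using an inline toy mesh, not a real scan) that pulls out the vertex positions:

```python
# Minimal sketch: extracting vertex positions from OBJ text. Lines starting
# with "v " carry x, y, z coordinates; "f" lines list the faces.
obj_text = """\
v 0.0 0.0 0.0
v 1.0 0.0 0.0
v 0.0 1.0 0.0
f 1 2 3
"""

vertices = [
    tuple(float(t) for t in line.split()[1:4])
    for line in obj_text.splitlines()
    if line.startswith("v ")        # skips "vt"/"vn" and face records
]
print(vertices)   # [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
```

Real indoor scans add texture coordinates and normals, but the record structure is the same.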

We wrapped up the day with a panel discussion, moderated by Esri's Kurt Schwoppe, and including Lawrie Jordan from Esri, Greg Koeln from MDA, Dustin Gard-Weiss from NGA, Amy Minnick from DigitalGlobe, Hobie Perry from USFS-FIA, David Day from PASCO, and me. We talked about the promise and barriers associated with remote sensing and image processing from all of our perspectives. I talked a lot about ANR and IGIS and the use of geospatial data, analysis, and viz for our work in ANR. Some fun things that came out of the panel discussion were:

  • Cool stuff:
    • Lawrie Jordan started Erdas!
    • Greg Koeln wears Landsat ties (and has a Landsat sportcoat). 
    • DigitalGlobe launched their 30 cm resolution WorldView-4. One key case study was a partnership with the Associated Press to find a pirate fishing vessel in action in Indonesia. They found it, and busted it, and found 2,000 slaves on board.
    • The FIA is increasingly working on understanding uncertainty in their product, and they are moving from an image-based to a raster-based method for stratification.
    • Greg Koeln, from MDA (he of the rad tie- see pic below) says: "I'm a fan of high resolution imagery...but I also know the world is a big place".
  • Challenges: 
    • We all talked about the need to create actionable, practical, management-relevant, useful information from the wealth of imagery we have at our fingertips: #remotesensible. 
    • Multi-sensor triangulation (or georeferencing a stack of imagery from multiple sources, to you and me) is a continual problem, and it's going to get worse before it gets better with more imagery from UAVs. On that note, Esri bought the patent for "SIFT", an algorithm to automate the relative registration of an image stack.
    • Great question at the end about the need to continue funding for the public good: ANR is critical here!
    • Space Junk.
  • Game-changers: 
    • Opening the Landsat archive: leading to science (e.g. Hansen et al. 2013), leading to tech (e.g. GEE and other cloud-based processors). Greg pointed out that in the day, his former organization (Ducks Unlimited) paid $4,400 per LANDSAT scene to map wetlands nationwide! That's a big bill. 
    • Democratization of data collection: drones, smart phones, open data...
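On the registration point above: SIFT matches keypoints between images, but the core idea of recovering the offset between two images in a stack can be sketched with a simpler classic technique, phase correlation. This is a toy numpy sketch on synthetic images, not what Esri ships:

```python
import numpy as np

def phase_correlate(ref, moving):
    """Return the (row, col) shift to apply (e.g. via np.roll) to `moving`
    so that it lines up with `ref`, estimated by phase correlation."""
    cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(moving))
    cross /= np.abs(cross) + 1e-12            # keep only the phase
    corr = np.fft.ifft2(cross).real           # sharp peak at the offset
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape))
    shape = np.array(ref.shape)
    peak[peak > shape // 2] -= shape[peak > shape // 2]  # wrap to negative shifts
    return tuple(int(p) for p in peak)

# Synthetic check: shift a random image by (5, -3) and recover the offset.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, shift=(5, -3), axis=(0, 1))
print(phase_correlate(img, shifted))   # (-5, 3): rolls `shifted` back onto `img`
```

Feature-based methods like SIFT handle rotation, scale, and perspective on top of translation, which is why they are the workhorse for multi-sensor stacks.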
The panel in action


Notes and stray thoughts:

  • Esri puts on a quality show always. San Diego always manages to feel simultaneously busy and fun, while not being crowded and claustrophobic. Must be the ocean, the light and the air.
  • Trying to get behind the new "analytics" replacement of "analysis" in talks. I am not convinced everyone is using analytics correctly ("imagery analytics such as creating NDVI"), but hey, it's a thing now: https://en.wikipedia.org/wiki/Analytics#Analytics_vs._analysis
  • 10 years ago I had a wonderful visitor to my lab from Spain - Francisco Javier Lozano - and we wrote a paper: http://www.sciencedirect.com/science/article/pii/S003442570700243X. He left to work at some crazy startup company called Deimos in Spain, and Lo and Behold, he is still there, and the company is going strong. The Deimos satellites are part of the UrtheCast fleet. Small world!
  • The gender balance at the Imagery portion of the Esri UC is not. One presenter at a talk said to the audience with a pointed stare at me: "Thanks for coming Lady and Gentlemen".

Good fun! Now more from Shane and Robert at the week-long Esri UC!

Distillation from the NEON Data Institute

So much to learn! Here is my distillation of the main take-homes from last week. 

Notes about the workshop in general:

NEON data and resources:

Other misc. tools:

Day 1 Wrap Up
Day 2 Wrap Up 
Day 3 Wrap Up
Day 4 Wrap Up

Day 2 Wrap Up from the NEON Data Institute 2017

First of all, Pearl Street Mall is just as lovely as I remember, but OMG it is so crowded, with so many new stores and chains. Still, good food, good views, hot weather, lovely walk.

Welcome to Day 2! http://neondataskills.org/data-institute-17/day2/
Our morning session focused on reproducibility and workflows with the great Naupaka Zimmerman. Remember the characteristics of reproducibility - organization, automation, documentation, and dissemination. We focused on organization, and spent an enjoyable hour sorting through an example messy directory of misc data files and code. The directory looked a bit like many of my directories. Lesson learned. We then moved to working with new data and git to reinforce yesterday's lessons. Git was super confusing to me 2 weeks ago, but now I think I love it. We also went back and forth between Jupyter and stand-alone Python scripts, and abstracted variables, and lo and behold I got my script to run. All the git stuff is from http://swcarpentry.github.io/git-novice/

The afternoon focused on lidar (yay!), and prior to coding we talked about discrete and waveform data and collection, and the OpenTopography project (http://www.opentopography.org/) with Benjamin Gross. The OpenTopography talk was really interesting. They are not just a data distributor any more; they also provide an HPC framework (mostly TauDEM for now) on their servers at SDSC (http://www.sdsc.edu/). They are going to roll out a user-initiated HPC functionality soon, so stay tuned for their new "pluggable assets" program. This is well worth checking into. We also spent some time live coding in Python with Bridget Hass, working with a CHM from the SERC site, and had a nerve-wracking code challenge to wrap up the day.
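The CHM idea from the coding session is simple to sketch: subtract a bare-earth terrain model from a first-return surface model. The toy arrays below stand in for rasters you would read from NEON GeoTIFFs (e.g. with rasterio):

```python
import numpy as np

# Toy 2x2 rasters stand in for lidar-derived elevation products (meters).
dtm = np.array([[100.0, 101.0],
                [102.0, 103.0]])     # bare-earth digital terrain model
dsm = np.array([[112.5, 101.0],
                [120.0, 102.8]])     # first-return digital surface model

chm = dsm - dtm            # canopy height above ground
chm[chm < 0] = 0           # clamp noise where the DSM dips below the DTM

print(chm)                 # 12.5 m and 18 m trees; zeros where no canopy
```

The real exercise layers on raster I/O, georeferencing, and no-data handling, but this difference is the heart of it.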

Fun additional take-home messages/resources:

Thanks to everyone today! Megan Jones (our fearless leader), Naupaka Zimmerman (Reproducibility), Tristan Goulden (Discrete Lidar), Keith Krause (Waveform Lidar), Benjamin Gross (OpenTopography), Bridget Hass (coding lidar products).


Our home for the week


Day 1 Wrap Up from the NEON Data Institute 2017

I left Boulder 20 years ago on a wing and a prayer with a PhD in hand, overwhelmed with bittersweet emotions. I was sad to leave such a beautiful city, nervous about what was to come, but excited to start something new in North Carolina. My future was uncertain, and as I took off from DIA that final time I basically had Tom Petty's Free Fallin' and Learning to Fly on repeat on my walkman. Now I am back, and summer in Boulder is just as breathtaking as I remember it: clear blue skies, the stunning flatirons making a play at outshining the snow-dusted Rockies behind them, and crisp fragrant mountain breezes acting as my Madeleine. I'm back to visit the National Ecological Observatory Network (NEON) headquarters and attend their 2017 Data Institute, and re-invest in my skillset for open reproducible workflows in remote sensing. 

What a day! http://neondataskills.org/data-institute-17/day1/
Attendees (about 30) included graduate students, old dogs (new tricks!) like me, and research scientists interested in developing reproducible workflows into their work. We are a pretty even mix of ages and genders. The morning session focused on learning about the NEON program (http://www.neonscience.org/): its purpose, sites, sensors, data, and protocols. NEON, funded by NSF and managed by Battelle, was conceived in 2004 and will go online in Jan 2018 for a 30-year mission providing free and open data on the drivers of and responses to ecological change. NEON data comes from IS (instrumented systems), OS (observation systems), and RS (remote sensing). We focused on the Airborne Observation Platform (AOP), which uses two (soon to be three) aircraft, each with a payload of a hyperspectral sensor (from JPL; 426 bands, 5 nm wide (380-2510 nm); 1 mrad IFOV; 1 m resolution at 1000 m AGL), lidar (Optech, and soon to be Riegl; discrete and waveform) sensors, and an RGB camera (PhaseOne D8900). These sensors produce co-registered raw data, which are processed at NEON headquarters into various levels of data products. Flights are planned to cover each NEON site once, timed to capture 90% or higher peak greenness, which is pretty complicated when distance and weather are taken into account. Pilots and techs are on the road and in the air from March through October collecting these data.

In the afternoon session, we got through a fairly immersive dunk into Jupyter notebooks for exploring hyperspectral imagery in HDF5 format. We did exploration, band stacking, widgets, and vegetation indices. We closed with a fast discussion about TGF (The Git Flow): the way to store, share, control versions of your data and code to ensure reproducibility. We forked, cloned, committed, pushed, and pulled. Not much more to write about, but the whole day was awesome!
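A tiny sketch of the band-math part of the exercise: NDVI from red and near-infrared reflectance. The toy arrays below stand in for band slices pulled from the NEON HDF5 reflectance cube:

```python
import numpy as np

# Toy red and near-infrared reflectance arrays; in the workshop these were
# slices of the hyperspectral reflectance array read from HDF5.
red = np.array([[0.05, 0.20],
                [0.10, 0.30]])
nir = np.array([[0.45, 0.25],
                [0.50, 0.30]])

# NDVI = (NIR - red) / (NIR + red); the small epsilon avoids divide-by-zero
# over pixels where both bands approach zero.
ndvi = (nir - red) / (nir + red + 1e-9)
print(ndvi.round(2))   # 0.8 for vigorous vegetation, ~0 for bare ground
```

With a 426-band sensor you would first pick (or average) the bands nearest the red and NIR wavelengths you want.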

Fun additional take-home messages:

Thanks to everyone today, including: Megan Jones (Main leader), Nathan Leisso (AOP), Bill Gallery (RGB camera), Ted Haberman (HDF5 format), David Hulslander (AOP), Claire Lunch (Data), Cove Sturtevant (Towers), Tristan Goulden (Hyperspectral), Bridget Hass (HDF5), Paul Gader, Naupaka Zimmerman (GitHub flow).


Planet Lab wants YOU to work with their data!

They say: 

Are you a college student, researcher or professor? We’re looking for innovative academics, researchers and scientists to unlock the power of a one-of-a-kind dataset. You can now apply for access to Planet’s unique dataset for non-commercial research purposes. In an area as large as 2,000 square kilometers, you’ll have access to download imagery, analyze trends, and publish your results.

Check it: https://www.planet.com/products/education-and-research/

ASTER Data Open - No April Fools!

We know about the amazing success for science, education, government, and business that has resulted from the opening of the Landsat archive in 2008. Now more encouraging news about open data:

On April 1, 2016, NASA's Land Processes Distributed Active Archive Center (LP DAAC) began distributing ASTER Level 1 Precision Terrain Corrected Registered At-Sensor Radiance (AST_L1T) data products over the entire globe at no charge. Global distribution of these data at no charge is a result of a policy change made by NASA and Japan.

The AST_L1T product provides a quick turn-around of consistent GIS-ready data as a multi-file product, which includes an HDF-EOS data file, full-resolution composite images (FRI) as GeoTIFFs for tasked telescopes (e.g., VNIR/SWIR and TIR), and associated metadata files. In addition, each AST_L1T granule contains related products, including low-resolution browse and, when applicable, a Quality Assurance (QA) browse and QA text report.

More than 2.95 million scenes of archived data are now available for direct download through the LP DAAC Data Pool and for search and download through NASA's Earthdata Search Client, USGS' GloVis, and USGS' EarthExplorer. New scenes will be added as they are acquired and archived.

ASTER is a partnership between NASA, Japan's Ministry of Economy, Trade and Industry (METI), the National Institute of Advanced Industrial Science and Technology (AIST) in Japan, and Japan Space Systems (J-spacesystems).

Visit the LP DAAC ASTER Policy Change Page to learn more about ASTER. Subscribe to the LP DAAC listserv for future announcements.

IGIS exploring applications of drones for UC Agriculture and Natural Resources

IGIS is excited to be working with 3D Robotics (3DR) to explore new applications of small unmanned aerial systems (sUAS) for monitoring agriculture and natural resources.  This technology has never been more practical for scientific exploration; however, there is still much to be learned about how to best utilize sUAS in this way.

IGIS is now developing protocols for safe and efficient deployment of a 3DR Solo sUAS. Equipped with a common 12-megapixel GoPro Hero camera, this platform can survey up to 75 acres at 3 inches of spatial resolution in less than 20 minutes, while flying a pre-defined flight path at 23 miles per hour, 300 feet above ground level. Then, thanks to Pix4D mapping software, which combines the pictures collected by the sUAS's GoPro into a single image mosaic, automated photogrammetric processes can render a digital terrain model from the images with a vertical accuracy close to the same 3-inch spatial resolution found in the original image collection.
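For the curious, the ground resolution falls out of simple camera geometry: pixel footprint scales linearly with flying height. A back-of-envelope sketch (the sensor and lens numbers below are illustrative assumptions for a 12 MP GoPro-class camera, not official specs):

```python
# Back-of-envelope ground sample distance (GSD) for a nadir-pointing camera.
sensor_width_mm = 6.17     # assumed 1/2.3" sensor width
focal_length_mm = 3.0      # assumed effective focal length
image_width_px = 4000      # a ~12 MP frame is about 4000 pixels wide
altitude_m = 91.4          # 300 ft above ground level

# GSD = (pixel pitch) * height / focal length
gsd_m = (sensor_width_mm / image_width_px) * altitude_m / focal_length_mm
print(f"GSD ~ {gsd_m * 100:.1f} cm/pixel ({gsd_m / 0.0254:.1f} in)")
```

With these assumed numbers the result lands in the few-centimeters-per-pixel range quoted above; the exact figure depends on the real sensor and lens specs.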

IGIS has introduced sUAS and remote sensing training into our workshop schedule for this year.  Please check out the IGIS training calendar for more information.

2005-2015: A decade of intense innovation in mapping

The GIF began in November 2005 on a wave of excitement around geospatial technology. In the months leading up to our first GIS Day in 2005, Google Maps launched, then went mobile; Google Earth launched in the summer; and NASA Blue Marble arrived. Hurricane Katrina changed the way we map disasters in real time. The opening of the Landsat archive at no cost by the USGS revolutionized how we monitor the Earth's surface by allowing dense time-series analysis. These and other developments made viewing our world with detail, ease, and beauty commonplace, yet they were nothing short of revolutionary, spurring new developments in science, governance, and business.

The decade since then has been one of intense innovation, and we have seen a rush of geospatial technologies that have enriched our lives immeasurably. In November 2015 we can recognize a similar wave of excitement around geospatial technology, one that is more diverse and far-reaching than in 2005. This GIS Day we would like to highlight the societal benefit derived from innovators across academia, non-profits, government, and industry. Our panel discussion on the 18th includes representatives from several local innovators in the field, including Stamen Design, Geowing, Planet Labs, 3D Robotics, NASA, iNaturalist.org, and Google, who will discuss their perspectives on the boom in Bay Area mapping.

Please think about joining us at GIS Day!

http://gif.berkeley.edu/gisday.html

Fun with drones and trees

A bit late, but better late than never. Our October trip to Hopland with IGIS and 3D Robotics was great fun, and very informative. We tested the ‘Solo’ UAV with three different cameras: a standard GoPro, a NIR GoPro (with after-market monkeying with filters to capture near-infrared), and a high-resolution Canon.

Key points:

  • 3DR’s flight planning software is ridiculously easy to set up and use.  
  • 3DR’s new software package can do the mosaicking. 
  • 3D models from multiple images seem easy to create. 
  • Still want to put a better scientific camera with more bands on the Solo. 

Still, lots of fun, stay tuned for more pics and an evaluation of the collected imagery. 

 

MODIS and R: a dream partnership

Found by Natalie: 

Tuck, Sean L., Helen R.P. Phillips, Rogier E. Hintzen, Jörn P.W. Scharlemann, Andy Purvis, and Lawrence N. Hudson. "MODISTools – downloading and processing MODIS remotely sensed data in R." Ecology and Evolution 4, no. 24 (2014): 4658-4668. And it is Open Access!

Abstract

Remotely sensed data available at medium to high resolution across global spatial and temporal scales are a valuable resource for ecologists. In particular, products from NASA's MODerate-resolution Imaging Spectroradiometer (MODIS), providing twice-daily global coverage, have been widely used for ecological applications. We present MODISTools, an R package designed to improve the accessing, downloading, and processing of remotely sensed MODIS data. MODISTools automates the process of data downloading and processing from any number of locations, time periods, and MODIS products. This automation reduces the risk of human error, and the researcher effort required compared to manual per-location downloads. The package will be particularly useful for ecological studies that include multiple sites, such as meta-analyses, observation networks, and globally distributed experiments. We give examples of the simple, reproducible workflow that MODISTools provides and of the checks that are carried out in the process. The end product is in a format that is amenable to statistical modeling. We analyzed the relationship between species richness across multiple higher taxa observed at 526 sites in temperate forests and vegetation indices, measures of aboveground net primary productivity. We downloaded MODIS derived vegetation index time series for each location where the species richness had been sampled, and summarized the data into three measures: maximum time-series value, temporal mean, and temporal variability. On average, species richness covaried positively with our vegetation index measures. Different higher taxa show different positive relationships with vegetation indices. Models had high R2 values, suggesting higher taxon identity and a gradient of vegetation index together explain most of the variation in species richness in our data.
MODISTools can be used on Windows, Mac, and Linux platforms, and is available from CRAN and GitHub (https://github.com/seantuck12/MODISTools).
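MODISTools itself is an R package, but the final summarization step the abstract describes, collapsing each vegetation-index time series to its maximum, temporal mean, and temporal variability, is easy to illustrate. A minimal Python sketch, assuming "variability" means the sample standard deviation (the paper's exact statistic may differ):

```python
import statistics

def summarize_vi(series):
    """Collapse a vegetation-index time series into the three summaries
    described in the paper: maximum value, temporal mean, and temporal
    variability (taken here as sample standard deviation -- an assumption)."""
    return {
        "max": max(series),
        "mean": statistics.fmean(series),
        "variability": statistics.stdev(series),
    }

# Made-up EVI values for one site, one per 16-day composite
evi = [0.21, 0.35, 0.52, 0.61, 0.58, 0.40, 0.27]
print(summarize_vi(evi))  # max 0.61, mean 0.42, sd ~0.15
```

In the paper's workflow these three numbers per site become the predictors in the species-richness models.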

Where is the best source for NAIP information for California?

How many times has NAIP been acquired for California? 
According to DFG, we have:

  • NAIP 2014 aerial imagery, 1 m, 4 variations (natural color, 4-band, CIR/false color, NDVI)
  • NAIP 2012 aerial imagery, 1 m, 4 variations (natural color, 4-band, CIR/false color, NDVI) 
  • NAIP 2010 aerial imagery, 1 m, 4 variations (natural color, 4-band, CIR/false color, NDVI) 
  • NAIP 2009 aerial imagery, 1 m, 4 variations (natural color, 4-band, CIR/false color, NDVI) 
  • NAIP 2005 aerial imagery, 1 m (natural color)

I was not aware the flight schedule was this frequent. 

Still, I can't find a definitive source of information on NAIP acquisitions for California.

Wow! New world view Chrome plugin

Kelly turned us on to this plugin from Google. Each time you open a new tab in your browser, you get treated to a new picture of the Earth! But, check this one out:

From Drebkau, Germany. I have no idea what this is an image of. Could it be grain fields of some kind, or is it just a carpet from someone at Google's garage sale? Any thoughts?

Here is another view of the same area:

Satellites can be vulnerable to solar storms

I don't use ocean color data, but found this report of interest nonetheless. From the HICO website. HICO is the Hyperspectral Imager for the Coastal Ocean.

HICO Operations Ended. March 20, 2015

In September 2014 during an X-class solar storm, HICO’s computer took a severe radiation hit, from which it never recovered.  Over the past several months, engineers at NRL and NASA have attempted to restart the computer and have conducted numerous tests to find alternative pathways to communicate with it.  None of these attempts have been successful.  So it is with great sadness that we bid a fond farewell to HICO.

Yet we rejoice that HICO performed splendidly for five years, despite being built in only 18 months from non-space-hardened, commercial off-the-shelf parts for a bargain price.  Having met all its Navy goals in the first year, HICO was granted a two-year operations extension from the Office of Naval Research, and then NASA stepped in to sponsor this ISS-based sensor, extending HICO's operations another two years.  All told, HICO operated for 5 years, during which it collected approximately 10,000 hyperspectral scenes of the Earth.

Most of the HICO scenes taken over sites worldwide are available now, and will remain accessible to researchers through two websites:  http://oceancolor.gsfc.nasa.gov/ and http://hico.coas.oregonstate.edu.  HICO will live on through research conducted by scientists using HICO data, especially studies exploring the complexities of the world’s coastal oceans.

Lidar + hyperspectral, ortho and field data released by NEON

http://www.neoninc.org/data-resources/get-data/airborne-data

From the LASTools list:

The National Ecological Observatory Network (NEON) this week published airborne remote sensing data, including full-waveform and discrete-return LiDAR data and LiDAR derivatives (DTM, DSM, CHM), as well as corresponding hyperspectral data, orthophotos, and field data on vegetation structure, foliar chemistry, and ASD field spectra.
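One of the LiDAR derivatives mentioned, the canopy height model (CHM), is just the difference of the other two: the DSM (top of canopy) minus the DTM (bare earth). A minimal sketch of that per-cell arithmetic, with assumed nodata handling and plain lists standing in for raster rows (real NEON rasters would of course be processed with GIS tooling):

```python
# CHM = DSM - DTM, cell by cell. The nodata sentinel and the clamping of
# small negative heights (interpolation noise) are assumptions for this
# sketch, not NEON processing rules.
NODATA = -9999.0

def canopy_height(dsm_row, dtm_row):
    """Subtract terrain from surface elevation to get canopy height,
    propagating nodata cells and clamping negatives to zero."""
    chm = []
    for s, t in zip(dsm_row, dtm_row):
        if s == NODATA or t == NODATA:
            chm.append(NODATA)
        else:
            chm.append(round(max(s - t, 0.0), 2))
    return chm

print(canopy_height([105.2, 120.0, NODATA], [100.0, 100.5, 100.0]))
# → [5.2, 19.5, -9999.0]
```

The same identity is why the three rasters are usually distributed together: any two of DTM, DSM, and CHM determine the third.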

NEON Airborne Data Set