Wow! New world view Chrome plugin

Kelly turned us on to this plugin from Google. Each time you open a new tab in your browser, you get treated to a new picture of the earth! But check this one out:

From Drebkau, Germany. I have no idea what this is an image of. Could it be grain fields of some kind, or is it just carpet from someone at Google's garage sale? Any thoughts?

Here is another view of the same area:

Governor Brown's new Executive Order, issued today, is a banner day for our climate change efforts

From Bruce Riordan, at the Climate Readiness Institute. 

Bay Area Climate Stakeholders: Governor Brown's new Executive Order, issued today, is a banner day for our climate change efforts.

1. The Executive Order sets a new interim goal for GHG reduction—40% below 1990 levels by 2030.

2. The Executive Order, for the first time, outlines a series of steps the State will take to address climate adaptation and resilience. 

See the press release, reaction from world leaders, and the full Executive Order at: http://gov.ca.gov/home.php

Kelly Lab SPUR Students Visit Point Reyes National Seashore

Kelly Lab SPUR students Drew Adamski and Ryan Avery have been participating in lab research all semester, helping classify trails within the Pacific West's National Parks. This month we were lucky enough to travel out with them to Point Reyes National Seashore to see some of those trails in person. We also spent the day with Chief Ranger Schifsky, who was kind enough to talk with us about the issues different trails in the park are facing and which trails seem to be changing most rapidly. He also showed us some of the points in the park where the landscape has changed dramatically over time due to fire, restoration projects, or differing management strategies. Overall it was a really inspiring and informative trip!

AAG wrap up 2015

Photo of Chicago from Frank Kehren, Flickr Creative Commons License.

I focused on a series of CyberGIS sessions at AAG this year. This was partly to better situate our spatial data science ideas within the terminology and discipline of Geography, and partly to focus on a topic that is new for me at AAG conferences. There were a number of organized sessions over three days, including a plenary by Timothy Nyerges from UW. Talks ranged in topic: online collaboration, participatory analytics, open tool development such as Python-based tools for parallelization of GIS operations, case studies of large-area computation, and introductions to languages that might be less familiar to geographers (e.g., Julia, R).

There was a session focused on education, which covered, among other things, the challenges of teaching “cyberGIS” to undergraduate students. Additionally, Tim Nyerges gave the CyberGIS plenary, "Computing Complex Sustainable Systems Resilience", in which he made the case that CyberGIS is a framework for studying socio-economic systems, resilience, and system feedbacks.

About the term Cyber. I am not alone in my dislike of the term "CyberGIS" (Matrix 4, anyone?), but it seems to have stuck here at AAG. In many of the talks “cyber” meant “bigger". There were mentions of the “cyber thing”, which I took to be a placeholder for cluster computing. However, speakers used many other terms as well. For example, I saw talks that focused on participatory, structured analytic-deliberation from UW, and on high performance geocomputation from ORNL; the latter term, I think, better captures what earth system science people might recognize. Many talks used the proliferation of data that characterizes modern geography and life as their entry point to Cyber.

These sessions were organized through an NSF-funded center: The CyberGIS Center for Advanced Digital and Spatial Studies http://cybergis.illinois.edu/.  Their formal definition of CyberGIS is:  “geographic information science and systems (GIS) based on advanced infrastructure of computing, information, and communication technologies (aka cyberinfrastructure)". They say it "has emerged over the past several years as a vibrant interdisciplinary field and played essential roles in enabling computing-, data- and collaboration-intensive geospatial research and education across a number of domains with significant societal impact."

And of course, we had excellent talks by the Kellys: Kelly presented on our VTM work: "Quantifying diversity and conservation status of California's Oak trees using the historic Vegetation Type Mapping (VTM) dataset” as part of an organized Historical Ecology session. Alice presented her paper: "Policing Paradise: The Evolution of Law Enforcement in US National Parks" as part of the session on Green Violence 2: Interrogating New Conflicts over Nature and Conservation.

Goodbye Chicago! You provided a wonderful venue, despite the cold!

Historical data and the NRS

Just got off a call with a group of people focusing on historical data discovery at the Natural Reserve System (NRS). This process is part of the recently funded Institute for the Study of Ecological Effects of Climate Impacts (ISEECI). People in the group include:  

Of particular note was the introduction of the Online Archive of California (OAC), which is a collection of metadata about historical archives. Peter is adding all his data to the OAC. His work was funded through a Research Opportunity Fund grant through UCOP and an NSF grant. The process the NRS has used is different from what we have done with the REC data. They have assembled metadata from the research reports from the stations, and full digitization can be opportunistic and focused on particular questions. There is a Zotero database of publications that have resulted from the reserves.

Other important links:

The metadata from research applications submitted through RAMS. This tends to be incomplete, as we rely on PIs to proof the entry and then submit it.

http://nrs.ucop.edu/MetaData.htm

The reference database. This has had extensive work done on it and should be fairly complete. Lynn is working on a complementary database for Santa Cruz Island historic data, which will be made available.

http://nrs.ucop.edu/bibliography.htm

Climate data. Currently hosted on DRI's website; the data should be available for download.

http://www.wrcc.dri.edu/ucnrs/index.html

SimplyMap & PolicyMap

Today I went to a great D-Lab Workshop on Demographic Mapping Tools. Berkeley's GIS and Map Librarian, Susan Powell, walked us through two easy-to-use mapping tools available through UC Berkeley. Both are really great for quickly visualizing data from many different sources.

#1: SimplyMap:  http://sm2.simplymap.com/index.html

Pros: This interface allows easy visualization of census data (back to 1980), crime data, and lifestyle and market data. SimplyMap is accessible with a UC Berkeley login through the Berkeley Library website. It allows you to export data as shapefiles or image files, has a table-building function, and allows limited data filtering and masking. The data provided come with metadata. Most available data can be visualized down to the census tract or ZIP code level. You can save and share maps from your private account.

Cons: You cannot combine variables or years of data in the map itself, but you can do this in SimplyMap's table building function and export that.  The user interface is not always simple or straightforward. 

Above: Dollar amount spent at restaurants in Berkeley in 2014, by census tract. Map created using SimplyMap.

#2 PolicyMap: http://ucberkeley.policymap.com/maps

Pros: PolicyMap includes census data (back to 2000), housing, health, government programs, crime, and education data.  Like SimplyMap this allows the quick and easy visualization of data in a single year.  PolicyMap also allows you to upload and overlay your own data with its existing datasets and generally allows for a bit more overlaying of datasets--point data can be added on top of polygons.  You can generate quick pre-defined reports on specific cities or areas.  You can also define a custom study area in PolicyMap.  It has a table-builder as well as a really great data-dictionary that explains where its data come from.  

Cons: There are no private accounts. All of Berkeley shares a single account (you log in through UC Berkeley's website to gain access), so you can see everyone else's data, and they can see yours. This data sharing may not be an absolute con, but it is a little weird. PolicyMap does not allow you to export shapefiles, but it does allow you to build tables that can be easily joined with shapefiles if need be. It too has some user-interface quirks that could probably be improved upon.
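Joining an exported PolicyMap table back to a shapefile's attribute table is just a keyed join on a shared tract identifier. Here is a minimal sketch in plain Python; the field names (GEOID, median_rent, tract_name) are illustrative, not PolicyMap's actual export schema:

```python
# Hypothetical rows exported from PolicyMap's table builder.
policymap_rows = [
    {"GEOID": "06001420100", "median_rent": 1850},
    {"GEOID": "06001420200", "median_rent": 2100},
]

# Stand-in for the attribute table of a census-tract shapefile.
tract_attrs = [
    {"GEOID": "06001420100", "tract_name": "Tract 4201"},
    {"GEOID": "06001420200", "tract_name": "Tract 4202"},
]

# Index the exported table by GEOID, then attach its fields to each tract.
by_geoid = {row["GEOID"]: row for row in policymap_rows}
joined = [{**tract, **by_geoid.get(tract["GEOID"], {})} for tract in tract_attrs]
```

In practice the same join is a one-line merge in a GIS package or in pandas once the shapefile's attributes are loaded as a table.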

New VTM retakes, this time from Heather

Plus ça change, plus c'est la même chose (the more things change, the more they stay the same). Thanks to Heather Constable, who went out exploring near Morro Bay. Here is one of her retakes.

Date of original photo: Feb 25, 1936, taken in San Luis Obispo County, California, US. Looking north toward Morro Bay. Shows almost dense stand of Arctostaphylos morroensis in foreground. Quad name: Cayucos. Quad number: 132B. Reference to map: 1. Photographer: Albert Wieslander.

Mapping the Berkeley Boom: Social Media and Mapping Help Unravel a Mystery

Last night we heard the Berkeley Boom again. We’ve been hearing this thunderous boom quite frequently in the last month here in Berkeley, but this one sounded bigger than most. Car alarms went off on the street. The dog jumped. “What IS that?” I wondered aloud. With a quick search on the internet I found that the Berkeley Boom is a phenomenon whose Twitter reports are being actively mapped. While Berkeley police and residents still have no idea what the mystery boom is, through the combined powers of social media and mapping we are gathering an understanding of where it is happening. As Berkeley residents continue reporting the boom (#BerkeleyBoom), perhaps we’ll get to the bottom of this, the newest of Berkeley’s many mysteries.

For more on the Berkeley Boom see the Berkeleyside article: http://www.berkeleyside.com/2015/03/31/the-unsolved-mystery-of-the-berkeley-boom/

Map from Berkeleyside Article:

The drought indeed hits home: Berkeley water less than its usual quality

The hills and lawns might still look green, but the drought has hit the East Bay hard. The sparkling, clean, tasty water we usually have delivered through our taps from the Mokelumne River Basin in the Sierra Nevada is not its usual self. Get out the Britas!

From our favorite and fastest source for local news, Berkeleyside:

The drinking water for 1 million customers of East Bay Municipal Utilities District had an “off” odor and taste over the weekend and, while EBMUD is fixing the issue, customers might have to get used to it. The culprit? The drought.

EBMUD usually draws the drinking water for the majority of its customers from the bottom of Pardee Reservoir, about 100 miles east of Berkeley, according to Abby Figueroa, a spokeswoman for EBMUD. But on Thursday, the water district started taking water from the top portion of the reservoir. The water there is warmer and contains some algae, so even though it was treated before gushing into pipes in Berkeley, Oakland and elsewhere, there was a peculiar smell.

Route from the Mokelumne River Basin in the Sierra Nevada to the East Bay.

Accordingly, there was a run on Brita filters at all local hardware/houseware stores.

New water restrictions for California announced.

Satellites can be vulnerable to solar storms

I don't use ocean color data, but found this report of interest nonetheless. From the HICO website. HICO is the Hyperspectral Imager for the Coastal Ocean.

HICO Operations Ended. March 20, 2015

In September 2014 during an X-class solar storm, HICO’s computer took a severe radiation hit, from which it never recovered.  Over the past several months, engineers at NRL and NASA have attempted to restart the computer and have conducted numerous tests to find alternative pathways to communicate with it.  None of these attempts have been successful.  So it is with great sadness that we bid a fond farewell to HICO.

Yet we rejoice that HICO performed splendidly for five years, despite being built in only 18 months from non-space-hardened, commercial off-the-shelf parts for a bargain price. Having met all its Navy goals in the first year, HICO was granted a two-year operations extension from the Office of Naval Research, and then NASA stepped in to sponsor this ISS-based sensor, extending HICO’s operations another two years. All told, HICO operated for five years, during which it collected approximately 10,000 hyperspectral scenes of the earth.

Most of the HICO scenes taken over sites worldwide are available now, and will remain accessible to researchers through two websites:  http://oceancolor.gsfc.nasa.gov/ and http://hico.coas.oregonstate.edu.  HICO will live on through research conducted by scientists using HICO data, especially studies exploring the complexities of the world’s coastal oceans.

Data science for the 21st century: building a new team of researchers

Berkeley received one of eight new awards from the National Science Foundation's recently launched NSF Research Traineeship (NRT) program. Innovative approaches to graduate training used across these projects include industry internships, international experiences, citizen science engagement, interdisciplinary team projects, and training in communication with the media, policy makers, and the general public.

Our program at UC Berkeley is called Data Science for the 21st Century: DS421. Three Grand Challenges motivate our program:

  1. Data: data acquisition, assimilation, and analysis, and the resulting challenges and opportunities for the research community and society at large. The data revolution is a potentially disruptive advance that challenges the norms and traditions of scientific research. Data science is an opportunity, entailing a revolution in training and a reorientation of research priorities. Open science— open access to datasets, literature, scripted workflows and the like—is a fundamental transformation that integrates scientific publication with the underlying data, analysis, and reasoning, using metadata and machine-readable research products to facilitate a semantic web of knowledge. These practices will make our research reproducible and transparent, documenting the evidentiary basis for scientific conclusions and their implications for policy.
  2. System dynamics: coupled human-natural systems and their responses to rapid environmental change. Social-ecological systems display a complex array of ecological and social processes interconnected across broad spatial, temporal, and socio-political scales. Our current approach to understanding ecological and economic systems is dominated by partial equilibrium models that are poorly suited to the dynamics of rapidly changing systems. Important research avenues include: characterizing the dynamics and feedbacks among and within systems to better plan for cross-scale and nonlinear uncertainties; identifying the proximity of tipping points or other critical transitions; understanding how the spatial structure of interactions affects system dynamics; and detecting and attributing responses to environmental and climatic drivers. Real-time data analytics combined with long-term monitoring and forecasting are critical tools to address these challenges.
  3. Action: evidence-based proposals in public policy, natural resource management, and environmental design to mitigate the impacts of rapid environmental change, and enhance societal resilience and sustainability. Effective decision-making depends on networks of diverse stakeholders, with rapid feedback between individuals and groups to evaluate the impact, efficiency, equity, and efficacy of policy and management actions. This third component is at the core of a practical data science ethic critical for translating science to societal benefit, and makes use of our partnerships with academic, private, governmental, and non-governmental organizations.

Cutting across these challenges, all students, and especially those engaged in interdisciplinary research,
need excellent communication skills and the ability to adjust content and style to reach their audiences. Welcome to the new cohort!

Mapsense talk at BIDS for your viewing pleasure

Here is Erez Cohen's excellent talk from the BIDS feed: http://bids.berkeley.edu/resources/videos/big-data-mapping-modern-tools-geographic-analysis-and-visualization

Title: Big Data Mapping: Modern Tools for Geographic Analysis and Visualization

Speaker: Erez Cohen, Co-Founder and CEO of Mapsense

We'll discuss how smart spatial indexes can be used for performant search and filtering for generating interactive and dynamic maps in the browser over massive datasets. We'll go over vector maps, quadtree indices, geographic simplification, density sampling, and real-time ingestion. We'll use example datasets featuring real-time maps of tweets, California condors, and crimes in San Francisco. 
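As a flavor of how a quadtree index supports that kind of fast filtering, here is a minimal sketch (my own illustration, not Mapsense's implementation) that encodes a point into a Bing-style quadkey. Each character narrows the world to one of four quadrants, so nearby points share long key prefixes and can be retrieved with a cheap prefix scan:

```python
import math

def quadkey(lat, lon, zoom):
    """Encode a lat/lon into a quadtree tile key (Bing-style quadkey)."""
    # Web Mercator tile coordinates at the given zoom level.
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.log(math.tan(lat_rad) + 1 / math.cos(lat_rad)) / math.pi) / 2.0 * n)
    # Interleave the bits of x and y, one base-4 digit per zoom level.
    key = ""
    for z in range(zoom, 0, -1):
        digit = 0
        mask = 1 << (z - 1)
        if x & mask:
            digit += 1
        if y & mask:
            digit += 2
        key += str(digit)
    return key

# Two nearby points in San Francisco land in the same zoom-12 tile,
# while a point in Chicago gets a different key.
sf1 = quadkey(37.7749, -122.4194, 12)
sf2 = quadkey(37.7750, -122.4195, 12)
chi = quadkey(41.8781, -87.6298, 12)
```

Because keys sort lexicographically by spatial containment, "all points in this tile" becomes a prefix range query, which is what makes this kind of index fast over massive datasets.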

The BIDS Data Science Lecture Series is co-hosted by BIDS and the Data, Science, and Inference Seminar. 

About the Speaker

Erez is co-founder and CEO at Mapsense, which builds software for the analysis and visualization of massive spatial datasets. Previously Erez was an engineer at Palantir Technologies, where he worked with credit derivatives and mortgage portfolio datasets. Erez holds a BS/MS from UC Berkeley's Industrial Engineering and Operations Research Department, and was a PhD candidate in the same department at Columbia University.

print 'Hello World (from FOSS4G NA 2015)'

FOSS4G NA 2015 is going on this week in the Bay Area, and so far, it has been a great conference.

Monday had a great line-up of tutorials (including mine on PySAL and Rasterio), and yesterday was full of inspiring talks. Highlights of my day: PostGIS Feature Frenzy; a new geoprocessing Python package called PyGeoprocessing, released just last Thursday(!) by our colleagues down at Stanford who work on the Natural Capital Project; and a very interesting talk about AppGeo's history and future of integrating open source geospatial solutions into their business applications.

The talk by Michael Terner from AppGeo echoed my own ideas about tool development (ideas also shared by many others, including ESRI): open source, closed source, and commercial ventures are not mutually exclusive, and they can often be leveraged together in one project to maximize the benefits each brings. No one tool will satisfy all needs.

In fact, at the end of my talk yesterday on Spatial Data Analysis in Python, someone had a great comment related to this: "Every time I start a project, I always wonder if this is going to be the one where I stay in Python all the way through..."  He encouraged me to be honest about that reality and also about how Python is not always the easiest or best option.
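In that spirit, here is what staying in Python can look like for a basic spatial primitive: a ray-casting point-in-polygon test. This is a hypothetical illustration; in a real project you would reach for Shapely or PySAL rather than writing it yourself.

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: count crossings of a horizontal ray from (x, y);
    an odd count means the point is inside. `polygon` is a list of
    (x, y) vertices in order."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge cross the horizontal line at height y?
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses that line.
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
```

Twenty lines of plain Python for the simple case, but edge cases (points exactly on a boundary, self-intersecting polygons) are exactly where the dedicated libraries earn their keep.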

Similarly, in his talk about the history and future of PostGIS features, Paul Ramsey from CartoDB reflected on how PostGIS is really great for geoprocessing because it leverages the benefits of database functionality (SQL, spatial querying, indexing), but that it is not so strong at spatial data analysis that requires mathematical operations like interpolation, spatial autocorrelation, etc. He ended by saying that he is interested in expanding those capabilities, but the reality is that there are so many other tools that already do that. PostGIS may never be as good at mathematical functions as those other options, and why should we expect one tool to be great at everything? I completely agree.
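For a sense of the kind of mathematical operation Ramsey had in mind, here is a minimal pure-Python sketch of Moran's I, a standard measure of spatial autocorrelation, using binary contiguity weights. This is an illustration of the statistic itself, not how any particular tool implements it:

```python
def morans_i(values, neighbors):
    """Moran's I with binary weights. `neighbors[i]` lists the indices
    adjacent to observation i (symmetric). Values near +1 indicate
    spatial clustering, near 0 randomness; negative values indicate
    dispersion."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    # Cross-product of deviations over all neighbor pairs (both directions).
    num = sum(dev[i] * dev[j] for i in range(n) for j in neighbors[i])
    den = sum(d * d for d in dev)
    total_weight = sum(len(nb) for nb in neighbors)
    return (n / total_weight) * (num / den)

# A four-cell strip with rook adjacency: 0-1-2-3.
neighbors = [[1], [0, 2], [1, 3], [2]]
clustered = morans_i([10, 10, 1, 1], neighbors)  # high/low split -> positive
dispersed = morans_i([10, 1, 10, 1], neighbors)  # checkerboard -> negative
```

Expressing the double sum over a neighbor structure is natural in Python (or in PySAL, which provides this directly), and rather awkward in SQL, which is Ramsey's point about division of labor between tools.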

10-year anniversary for the GIF

I'm musing, contemplating and writing on the decade 2005-2015, as this is the GIF's 10-year anniversary. What a decade it was. Here I'll post and add to some of the key events that helped transform mapping (and the GIF) in the last 10 years.

Key background events

  • 1996. Mapquest launched.
  • 1997. Skynet becomes self-aware.
  • May 2000. Selective Availability on GPS turned off, leading the way for GPS in smartphones.
  • The Scan Line Corrector (SLC) on the Landsat 7 ETM+ instrument failed May 31, 2003.
  • 2004. Open Street Map founded.
  • March 2004. Yahoo! Maps launched, with the first slippy maps (click and drag to pan and zoom the map).
  • 2004. NASA releases WorldWind.
  • October 2004. Google acquires Where 2 allowing AJAX map tiling to a desktop client.
  • October 2004. Google acquires Keyhole.

What made 2005 such a crazy year

  • Google Maps launches in February, and goes mobile in April.
  • The first mashup: Paul Rademacher's Housingmaps.org. His original post on Craigslist asking for feedback: https://forums.craigslist.org/?ID=26638141
  • Google Maps API launches in June.
  • NASA's Blue Marble Next Generation released.
  • Google Earth launches in June.
  • Hurricane Katrina hits in August. Simple webmaps for the disaster proliferate, and ESRI and GE get on the scene.
  • Kellylab's first blog post in September.
  • GIF launches and hosts our first GIS Day in November with Michael Jones, formerly of Keyhole.
  • The back-up solar array drive on Landsat 5 began failing and was not able to provide the power needed to charge the batteries. November 26.

Where we are in 2015

We've gone through a number of transitions in the world of mapping:

  • Data have transitioned from being siloed in clearinghouses to being open and provided through APIs.
  • We’ve moved from desktop computing to cloud computing.
  • Webmaps have transitioned from using proprietary stacks to networks with multiple open and proprietary options.
  • We’ve moved from imagery gathered monthly or seasonally to daily; footprints are smaller, and our coverage has shifted from local to global.
  • Our planimetric 2D view is changing with lidar and radar sensors.
  • Visualization has moved from static cartography or simple animations to dynamic interactive visualization.
  • Finally, mapped content is no longer anonymous or regulated, but highly personal and narrative.

Key GIF milestones:

  • 2005 GIIF (Geospatial Imaging and Informatics Facility) launches
  • 2006 OakMapper changes from ArcIMS to Google Earth API
  • 2008 GIIF becomes GIF
  • 2008 OakMapper 2.0 launches
  • 2008 SNAMP website launches
  • 2011 Cal-Adapt goes live
  • 2013 EcoEngine/HOLOS goes live
  • 2014 LandCarbon launches
  • 2014 GIF and Cal-Adapt go to the White House
  • 2014 vtm.berkeley.edu goes live, built from the HOLOS API
  • 2015 Spatial Data Science bootcamp in May

Onwards and upwards!

Lidar + hyperspectral, ortho and field data released by NEON

http://www.neoninc.org/data-resources/get-data/airborne-data

From the LAStools list:

The National Ecological Observatory Network (NEON) this week published airborne remote sensing data, including full waveform and discrete return LiDAR data and LiDAR derivatives (DTM, DSM, CHM), as well as corresponding hyperspectral data, orthophotos, and field data on vegetation structure, foliar chemistry, and ASD field spectra.

NEON Airborne Data Set

Questions about the Spatial Data Science Bootcamp? Read on!

In May, the GIF will be hosting a 3-day bootcamp on Spatial Data Science.

What is the significance of Spatial Data Science?

We live in a world where the importance and availability of spatial data are ever increasing, and the value of Spatial Data Science (big data tools, geospatial analytics, and visualization) is on the rise. There are many new and distributed tools available to the geospatial professional, and the ability to efficiently evaluate and integrate this wide array of options is a critical skill for the 21st-century marketplace. Spatial Data Science offers a modern workflow that integrates data from multiple sources and scales; uses open-source and web-based technology for robust data analysis and publication; applies core spatial concepts and spatial analysis methods; and enables collaboration among people: companies, scientists, policy-makers, and the public.

Why come to the GIF to learn about it?

The Geospatial Innovation Facility (GIF) at UC Berkeley is the premier research and educational facility in the Bay Area that focuses on a broad vision of Spatial Data Science. The GIF has a decade-long history of successful GIS and remote sensing research projects. The GIF has also trained many students, researchers, and community members in geospatial techniques and applications through our popular workshop series and private consultation. With more recent advances in web-based mapping capabilities, the GIF has been at the forefront of complex web-based spatial data informatics (web-based data sharing and visualization), such as the Cal-Adapt  tool, which provides a wealth of data and information about California’s changing climate. Participants will get the benefit of our decade-long focus on Spatial Data Science: collaborative project development, rigorous spatial analysis methods, successful interaction with clients, and delivery of results to project managers, the public, and other stakeholders.

What are the key elements of the Bootcamp?

This Bootcamp is designed to familiarize participants with some of the major advances in geospatial technology today: big data wrangling, open-source tools, and web-based mapping and visualization. You will learn how and when to implement a wide range of modern tools that are currently in use and under development by leading Bay Area mapping and geospatial companies, as well as explore a set of repeatable and testable workflows for spatial data using common standard programming practices. Finally, you will learn other technical options that you can call upon in your day-to-day workflows. This 3-day intensive training will jump start your geospatial analysis and give you the basic tools you need to start using open source and web-based tools for your own spatial data projects.  

Interested in integrating open source and web-based solutions into your GIS toolkit? Come join us at our May 2015 Bootcamp: Spatial Data Science for Professionals. Applications due: 3/16/2015. Sign up here!

Information on the GIST Minor and Graduate Certificate

Hi all,

Our gis.berkeley.edu website had to be taken down. Information on the GIST Minor and Graduate Certificate can be found here:

 Thanks!