CartoDB launches tools for visualizing temporal data

CartoDB, a robust and easy-to-use web mapping application, today launched "Torque," a new feature enabling visualization of temporal datasets.

From the CartoDB team:

Torque is a library for CartoDB that allows you to create beautiful visualizations with temporal datasets by bundling HTML5 browser rendering technologies with an efficient data transfer format using the CartoDB API. You can see an example of Torque in action on the Guardian's Data Blog, and grab the open source code from here.
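
The "efficient data transfer format" works by aggregating time-stamped rows into compact per-cell time series before they are shipped to the browser for rendering. Here is a minimal Python sketch of that aggregation idea - an illustration only, not CartoDB's actual implementation, and the cell and time-slice sizes are arbitrary:

```python
from collections import defaultdict

def torque_style_bins(rows, cell_size, time_step):
    """Aggregate (x, y, t) records into per-cell time series,
    the general idea behind Torque's compact transfer format.
    Illustrative sketch only, not CartoDB's actual code."""
    cells = defaultdict(lambda: defaultdict(int))
    for x, y, t in rows:
        cell = (int(x // cell_size), int(y // cell_size))
        cells[cell][int(t // time_step)] += 1
    # One compact record per cell: location plus a sparse time series.
    return [{"cell": cell, "times": sorted(series.items())}
            for cell, series in cells.items()]

# Example: three events; the first two share a cell and time slice.
print(torque_style_bins([(1.0, 2.0, 5.0), (1.2, 2.1, 6.0), (9.0, 9.0, 50.0)],
                        cell_size=4, time_step=10))
```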

Be sure to check out the example based on location data recorded from captains' logs from the British Royal Navy during the First World War. Amazing stuff!

Bing Maps completes Global Ortho project for US

The Bing Maps team has announced the completion of the Global Ortho Project for the US. The project provides 30 cm resolution imagery for the entire US, all acquired within the last two years. You can access all of the imagery now through Bing Maps; it is pretty amazing to see such detail for all of the far-off places that typically don't get high-resolution attention.

Find out more about the project from the Bing Maps Blog, or view the data for yourself.

Fire Forecast

NPR recently created a neat interactive web map depicting locations of large active wildfires (updated every 30 minutes) and current wildfire danger forecasts (Low to Extreme) for areas in the lower 48 states (updated daily). Check out the map here and the accompanying news story here. The map was created by Matt Stiles, Stephanie D'Otreppe, and Brian Boyer.

Screenshot of NPR's Fire Forecast map

New open datasets for City of Oakland and Alameda County

Following in the footsteps of the City and County of San Francisco's open data repository at data.sfgov.org, two new beta open data repositories have recently been released for the City of Oakland and Alameda County. This development coincides with last week's 2012 Code for Oakland hackathon, which aimed to make city and county government more transparent by using web and mobile apps to ease public access to government data.

The City of Oakland's open data repository at data.openoakland.org includes crime reports at a variety of spatial scales, along with tabular and geographic data such as parcels, roads, trees, public infrastructure, and locations of new development, to name a few. It is important to note that the Oakland repository is currently not officially run or maintained by the City of Oakland; it is maintained by members of the community and the OpenOakland Brigade. Alameda County's open data repository at data.acgov.org includes Sheriff's crime reports, restaurant health reports, solar generation data, public health department data, and a variety of other tabular and geographic data. Data can be viewed in a browser as an interactive table or map, or downloaded in a variety of formats. Both sites are still in their infancy, so expect more datasets to come online soon. On a related note, the Urban Strategies Council recently released a new version of their InfoAlamedaCounty webGIS data visualization and map viewer - check it out.
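
For programmatic access, Alameda County's portal runs on the Socrata platform, which exposes each dataset as JSON under a /resource/ endpoint. A hedged sketch of pulling rows that way - the dataset ID below is a placeholder, so look up real IDs in the portal's catalog:

```python
import json
import urllib.request

# Hypothetical dataset ID ("xxxx-xxxx"); Socrata portals serve rows
# at /resource/<dataset-id>.json, and $limit caps the row count.
url = "https://data.acgov.org/resource/xxxx-xxxx.json?$limit=5"

with urllib.request.urlopen(url) as resp:
    rows = json.load(resp)

for row in rows:
    print(row)
```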

Screenshot of City of Oakland Open Data: data.openoakland.org

Screenshot of Alameda County Open Data: data.acgov.org

Crowdsourced neighborhood boundaries

Andy Woodruff and Tim Wallace from Bostonography discuss the first preliminary results of an experiment they set up with an interactive webGIS tool that lets people draw polygons where they think each of Boston's neighborhoods is located. About 300 neighborhood maps have been submitted so far, and the compiled data show many areas of agreement and disagreement about where neighborhood boundaries lie. Bostonography created maps showing a gradient of agreement for each neighborhood's boundary. The exercise is reminiscent of the work of Kevin Lynch and is an interesting attempt to see whether there is a consensus on where people think neighborhood boundaries are, as opposed to how they are defined officially by the city. For the full blog post and maps on Bostonography click here. For an article in the Atlantic Cities that discusses the maps click here.
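
A gradient of agreement like this can be computed by counting, for each location, how many submitted polygons contain it. A minimal sketch of that counting step using shapely, with two hypothetical submissions standing in for the roughly 300 real ones:

```python
from shapely.geometry import Point, Polygon

# Two hypothetical submissions for the same neighborhood.
submissions = [
    Polygon([(0, 0), (4, 0), (4, 4), (0, 4)]),
    Polygon([(2, 2), (6, 2), (6, 6), (2, 6)]),
]

# Count, per grid cell, how many submitted polygons cover it:
# higher counts mean stronger agreement that the cell is "in".
for y in range(7):
    row = ""
    for x in range(7):
        votes = sum(p.contains(Point(x + 0.5, y + 0.5)) for p in submissions)
        row += str(votes)
    print(row)
```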

Strength, or density, of polygon line placement for crowdsourced neighborhood boundaries

New Trulia commute time maps

Trulia recently released a new commute time map that shows, in real time, estimated travel times from your location to all points in a region. The service uses OpenStreetMap data and General Transit Feed Specification (GTFS) feeds to calculate travel times. Drive times are available nationwide, with public transit travel times available only in select cities for now. Read the full story here or click here for the map.
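
Under the hood, a commute-time map like this boils down to a one-to-all shortest-path search over a graph whose edge weights come from street segments (OSM) and timetabled transit hops (GTFS). A toy sketch of that computation - the graph here is hypothetical, and this is not Trulia's actual code:

```python
import heapq

def travel_times(graph, origin):
    """One-to-all shortest travel times (Dijkstra's algorithm) over a
    weighted graph, the core computation behind a commute-time map."""
    best = {origin: 0.0}
    heap = [(0.0, origin)]
    while heap:
        t, node = heapq.heappop(heap)
        if t > best.get(node, float("inf")):
            continue  # stale queue entry
        for neighbor, minutes in graph.get(node, []):
            nt = t + minutes
            if nt < best.get(neighbor, float("inf")):
                best[neighbor] = nt
                heapq.heappush(heap, (nt, neighbor))
    return best

# Hypothetical graph: edges carry travel time in minutes
# (street segments from OSM, transit hops from GTFS).
graph = {
    "home": [("bus_stop", 5), ("freeway_onramp", 3)],
    "bus_stop": [("downtown", 20)],
    "freeway_onramp": [("downtown", 25)],
}
print(travel_times(graph, "home"))  # downtown is reached fastest via bus
```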

Screenshot from Trulia commute map

Livehoods: Dynamic maps of place via social networking

Livehoods is an interesting research project from the School of Computer Science at Carnegie Mellon University that maps social networking activity and patterns, using tweets and check-ins to examine the hidden structure of cities and neighborhoods. For example, on the map below, each point represents a check-in location, and groups of nearby points of the same color represent a Livehood. Within a Livehood, statistics aggregated from check-ins over time depict how a place is used. For more information on Livehoods click here.
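
The project's published method applies spectral clustering to a venue-similarity measure that mixes geographic distance with overlap in who checks in where. As a much-simplified illustration, the sketch below clusters hypothetical check-in venues by location alone using k-means; it shows the grouping step, not the project's actual algorithm:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical check-in venues as (lon, lat); two spatial groups.
venues = np.array([
    [-79.950, 40.440], [-79.951, 40.441], [-79.949, 40.439],
    [-79.920, 40.460], [-79.921, 40.461], [-79.919, 40.459],
])

# Real Livehoods uses spectral clustering on a similarity matrix that
# also encodes shared check-in patterns; k-means on raw coordinates
# is only a stand-in to show the grouping step.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(venues)
print(labels)  # venues sharing a label form one livehood-like group
```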

Livehoods Screenshot

ASPRS 2012 Wrap-up

ASPRS 2012, held in Sacramento, California, drew about 1,100 participants. I am back to being bullish about our organization, as I now recognize that ASPRS is the only place in the geospatial sciences where members of government, industry, and academia can meet, discuss, and network in a meaningful way. I saw a number of great talks, met with some energetic and informative industry reps, and got to catch up with old friends. Some highlights: Wednesday's keynote speaker was David Thau from Google Earth Engine, whose talk "Terapixels for Everyone" showcased the ways in which the public's awareness of imagery, and its ability to interact with geospatial data, are increasing. He calls this phenomenon (and GEE plays a big role here) "geo-literacy for all," and discussed new technologies for data and imagery acquisition, processing, and dissemination to a broad public that can include policy makers, land managers, and scientists. USGS's Ken Hudnut gave Thursday's keynote, with a sobering message about California earthquakes and the need for (and use of) geospatial intelligence in disaster preparedness.

Berkeley was well represented: Kevin and Brian from the GIF gave a great workshop on open source web mapping, Kevin presented new developments in Cal-Adapt, Lisa and Iryna presented chapters from their respective dissertations, both relating to wetlands, and our SNAMP lidar session with Sam, Marek, and Feng (with Wenkai and Jacob from UC Merced) was just great!

So, what is in the future for remote sensing/geospatial analysis as told at ASPRS 2012? Here are some highlights:

  • Cloud computing, massive datasets, and data/imagery fusion are everywhere, but principles of basic photogrammetry still come into play;
  • We saw neat examples of scientific visualization, including smooth rendering across scales, fast transformations, and immersive web experiences;
  • Evolving, scalable algorithms for regional or global classification and/or change detection; for real-time results rendering with interactive (on-the-fly) algorithm parameter adjustment; often involving open source software and machine learning;
  • Geospatial data and analysis are heavily, but inconsistently, deployed throughout the US for disaster response;
  • Landsat 8 goes up in January (party anyone?), and USGS/NASA are looking for other novel partnerships to extend the Landsat lifespan beyond that;
  • Lidar is still big: with new deployable and cheaper sensors like FLASH lidar on the one hand, and increasing point density on the other;
  • OBIA, OBIA, OBIA! We organized a nice series of OBIA talks and saw some great presentations on accuracy, lidar+optical fusion, and object movements; but thorny issues about segmentation accuracy and object ontology remain;
  • Public interaction with imagery and data is critical. The public can be a broader scientific community, or an informed and engaged community that can presumably use these types of data to support public policy engagement, disaster preparedness, and response.

AAG 2012 Wrap-up

NY skyline from Tim DeChant's blog

AAG was a moderately large conference this year (just under 9,000 attendees), held in midtown New York. It was a brief trip for me, but I did go to some great talks across remote sensing, GIScience, cartography, and VGI. I also went to a very productive OpenGeo Suite workshop hosted by OpenGeo. Some brief highlights from the conference: Muki Haklay discussed participation inequities in VGI: when you mine geoweb data, you are mining outliers, not society; there are biases in gender, education, age, and enthusiasm. Agent-based modeling is still hot, and still improving; I saw some great talks on ABM for understanding land use change, and Peter Deadman showed how new markets in a hot crop (like Acai) can transform a region quite quickly. Landsat 8 will likely be launched in early 2013, but further missions are less certain. My talk was in a historical ecology session, where Qinghua Guo and I highlighted some new modeled results of historic oak diversity in California using VTM data and Maxent.

Saturday evening I had the great pleasure of being locked in after hours at the NY Public Library for a session on historic maps. David Rumsey, with Humphrey Southall (University of Portsmouth) and Petr Pridal (Moravian Library) led a presentation introducing a new website: oldmapsonline.org. The website's goal is to provide a clearer way to find old maps, and provide them with a stable digital reference. 

New Map Tool and Widgets: What’s Your Coastal Flood Risk?

This new interactive website, SurgingSeas, a project of Climate Central, lets you see the combined coastal flood threat from sea level rise and storm surge, town by town and city by city, from coast to coast. Type in your ZIP code or the name of your community, choose a water level anywhere from 1 to 10 feet above the current high-tide line, and you can see what areas might be at risk of flooding from water that high. You can also go to any one of the 55 tide gauges we studied around the country and see the odds we've calculated for how soon flood waters may reach different elevations as the sea continues to rise. There are gauges close to most major coastal cities. If you want to embed the map in your own blog or website, there's a widget for that, and you can make any view your default - not just the national one. - Michael D. Lemonick
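
The water-level slider corresponds to what is often called a "bathtub" model: flag every cell of an elevation surface at or below the chosen level above the high-tide line. A minimal numpy sketch with hypothetical elevations, illustrating the idea rather than Climate Central's actual analysis:

```python
import numpy as np

# Hypothetical elevations in feet above the local high-tide line.
elevation = np.array([
    [0.5, 1.2, 3.0],
    [0.8, 2.5, 4.1],
    [1.9, 3.3, 6.0],
])

def at_risk(elevation_ft, water_level_ft):
    """Simple 'bathtub' flood mask: True where the chosen water level
    (1-10 ft above high tide on the site) meets or exceeds the ground."""
    return elevation_ft <= water_level_ft

print(at_risk(elevation, 2))  # mask of cells flooded at +2 ft
```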

New 2011 SOD Confirmations Added to OakMapper!

New SOD Blitz 2011 Data

New confirmed cases of Sudden Oak Death (SOD; caused by P. ramorum) have been added to OakMapper, a project that tracks the spread of Sudden Oak Death using data collected by citizens and organizations. All official SOD cases are collected and confirmed by the California Department of Food and Agriculture or the University of California; community SOD cases are submitted by citizens via the OakMapper website and iPhone application. In all, 621 new points collected in 2011 have been added to OakMapper, bringing the total number of confirmed SOD locations to 2,191. The new data consist of laboratory-confirmed cases collected during the annual SOD Blitz campaign of 2011 by the Forest Pathology and Mycology Lab, run by Dr. Matteo Garbelotto.

Click on the image below to view a close-up of the new confirmed SOD data (in green) from the SOD Blitz 2011. 

Explore the new data online here.

OakMapper.org

Troubling report of OSM vandalism

From Sarah. This is a troubling story from ReadWriteWeb reporting that someone at a range of Google IP addresses in India has been editing OpenStreetMap, the collaboratively made map of the world, in some very unhelpful ways, like moving and deleting information and reversing the direction of one-way streets on the map.

Update: Google sent the following statement to ReadWriteWeb on Tuesday morning. "The two people who made these changes were contractors acting on their own behalf while on the Google network. They are no longer working on Google projects."

A Google spokesperson told BoingBoing on Friday that the company was "mortified" by the discovery, but now it appears the same Google contractor may be behind mayhem rippling throughout one of the world's biggest maps. Google says it is investigating these latest allegations.

SOD: still spreading in the bay area

A nice article in the SF Chronicle on Matteo's citizen science approach to mapping new SOD spread.

The article states: The deadly pathogen known as sudden oak death is spreading throughout the Bay Area, infecting more trees in more places than have ever been seen before, according to scientists tracking the disease. The Forest Pathology and Mycology Laboratory at UC Berkeley used 10,000 tree and plant samples collected by 500 citizens between April and June this year to document a dramatic increase in the infection rate from Napa to the Carmel Valley and virtually everywhere in between.

Travel time and housing prices map

Our Bay Area regional planning agencies have just released a new interactive map that lets you visualize your housing options given your employment location, income, and desired commute time and mode. It's all part of the regional planning efforts that are happening statewide as a result of SB 375, which requires the integration of housing and land use planning to encourage people to drive less.

The press release gives more details: "If you're in the market to buy a home in the Bay Area, wouldn't it be nice to know how long it would take to commute from neighborhoods in your price range to your work place? Well, now you can, thanks to a new mapping tool on OneBayArea.org.

The interactive map shows you approximately how far you can get from any address within the nine-county region by car, public transit, bike, or on foot, at different times of the day. You can customize your view by the travel time between areas, and the median price of homes in each area."

Wetland Tracker site, updated with new wetland data

Berkeley close-up: orange are planned wetland restoration sites; yellow lines are impacted streams; blue lines are natural streams.

SFEI's Wetland Tracker site has been updated with new wetland data. Specifically, the site makes available BAARI data. BAARI - the Bay Area Aquatic Resource Inventory - is a detailed base map of the Bay Area's aquatic features that includes all wetlands, open water, streams, ditches, tidal marshes and flats, and riparian areas. The BAARI data will be used to track changes in the extent and condition of aquatic habitat, aid in ecological sample drawing, and is featured on the California Wetlands Portal, where users can browse the area's aquatic features and restoration projects on an interactive map.

New York City Solar Map Released

An interactive web-based map called the New York City Solar Map was recently released by the New York City Solar America City Partnership, led by Sustainable CUNY. The map allows users to search by neighborhood or address, or to explore the map interactively, zoom in, and click on a building or draw a polygon to calculate a number of metrics related to building rooftops and potential solar power capacity, including potential energy savings, kilowatt output (as a time series), carbon emission reductions, and payback, plus a calculator for examining different solar installation options and savings with your utility provider. The map is intended to encourage solar panel installations and make information about solar capacity easier to access. Lidar data covering the entire city was collected last year and used to compute the metrics that determine solar panel capacity.
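
Metrics like these reduce to arithmetic on the lidar-derived usable roof area. The sketch below shows the general shape of such a calculation; every constant in it is an illustrative placeholder, not a value from the CARSI model:

```python
def roof_solar_estimate(usable_roof_m2,
                        watts_per_m2=150,        # placeholder panel density
                        sun_hours_per_day=4.0,   # placeholder NYC average
                        dollars_per_kwh=0.20):   # placeholder utility rate
    """Back-of-envelope rooftop solar metrics of the kind the NYC
    Solar Map reports; all constants here are stand-in assumptions."""
    kw = usable_roof_m2 * watts_per_m2 / 1000.0
    kwh_per_year = kw * sun_hours_per_day * 365
    return {
        "capacity_kw": round(kw, 1),
        "annual_kwh": round(kwh_per_year),
        "annual_savings_usd": round(kwh_per_year * dollars_per_kwh),
    }

print(roof_solar_estimate(200))  # a hypothetical 200 m^2 usable roof
```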

Solar Energy Calculator

The data reveal that New York City has the potential to generate up to 5,847 megawatts of solar power; the installed solar capacity in the US today is only 2,300 megawatts. Some 66.4 percent of the city's buildings have roof space suitable for solar panels, and if panels were installed on those rooftops, they could meet 49.7 percent of the current estimated daytime peak demand and about 14 percent of the city's total annual electricity use.

This map showcases the utility and power of webGIS and how it can be used to disseminate complex geographic information to anyone with a browser, putting the information needed to jump start solar panel installation in the hands of the city’s residents. The map was created by the Center for Advanced Research of Spatial Information (CARSI) at CUNY’s Hunter College and funded primarily by a United States Department of Energy grant.

Source: Click here for a NYTimes article on the project.

Click here to view the New York City Solar Map.

New York City Solar Map

Cal-adapt goes live: making California climate change data available to all

California - 2090 - Annual Average Temperature - High Emissions

The exciting project the GIF staff have been working on for 9 months is ready to be revealed. Cal-Adapt is a web-based climate adaptation planning tool that will help local governments respond to climate change. The site was developed by UC Berkeley's Geospatial Innovation Facility with funding and oversight from the California Energy Commission's Public Interest Energy Research Program. The information for Cal-Adapt was gathered from California's scientific community and represents the most current data available.

“Cal-Adapt will allow people to identify climate change risks in specific areas around the state.” said Secretary for Natural Resources, John Laird. “This tool will be especially beneficial to government agencies and city and county planners, as they will now have access to climate change information in a very user-friendly application.”

UC Berkeley press release.