For a brief introduction to this concept, see “Envisioning the City as a Communications Platform”, an interview with New York City’s Chief Digital Officer Rachel Haot.
Tim O’Reilly started his Gov 2.0 conference in 2009 and proposed thinking of the “city as a platform”. San Francisco was one of the first cities to embrace this (ex-mayor Gavin Newsom has written a book about it: Citizenville: How to Take the Town Square Digital and Reinvent Government) and started its open data initiative the next year; the initiative has been renewed and grown every year since. The Obama administration has also adopted this idea. In fact, the president recently signed an executive order to make government data more accessible so that entrepreneurs can build applications and services we haven’t even imagined yet.
The idea is to make a variety of city-related information available in machine-readable formats (e.g., not scans of printouts) so that the data can easily be read, mashed up and correlated with other sources, visualized, and leveraged in new types of applications – by third parties (commercial, non-profit and private) as well as by other city departments. Cabspotting and Crimespotting by Stamen Design are classics of innovative data mashups. More recently, analytics based on data like this have helped Memphis and Santa Clara target high-risk areas and times of day with predictive policing, resulting in lower crime rates.
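At its core, a data mashup is often nothing more than a join of two machine-readable datasets on a shared key. A minimal sketch, using two entirely invented toy datasets (the neighborhood names and counts are illustrative, not real city data):

```python
# Two tiny, made-up open datasets keyed by neighborhood: crime counts and
# streetlight outages. A "mashup" is often little more than a join on a
# shared key (here a neighborhood name; in practice, usually geography).
crimes  = {"Downtown": 42, "Riverside": 7, "Hillcrest": 19}
outages = {"Downtown": 12, "Riverside": 1, "Hillcrest": 2}

# Correlate the two sources: crimes per streetlight outage, highest first.
combined = sorted(
    ((area, crimes[area], outages[area]) for area in crimes),
    key=lambda row: row[1] / row[2],
    reverse=True,
)
for area, c, o in combined:
    print(f"{area}: {c} crimes, {o} outages, ratio {c / o:.1f}")
```

Real mashups join on geography (points within a polygon, distance to a stop) rather than a clean key, but the principle is the same – which is why machine-readable publication matters so much more than scanned printouts.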
Another example that started much earlier is Portland’s public transport data initiative. In 2005, Portland offered its transit data to MapQuest, Yahoo and Google, asking each if they had plans to incorporate public transit into their mapping applications. Only Google replied. The Google Transit trip planner began as a “20% project” but quickly grew. Google published the “General Transit Feed Specification” (GTFS), a data format that allows public and private transit operators to publish their routes, schedules, fares and even the real-time locations of cars, trains, etc. Hundreds of municipalities and cities have adopted the format, and it is a potent addition to directions in Google Maps. (Try it)
And here is a list of several hundred transit agencies that provide public and private data feeds in this format (for example, Santa Clara VTA). Here is Trillium Solutions’ paper “Opportunities to leverage GTFS”, which outlines applications well beyond Google Maps that this transit data enables.
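Part of GTFS’s appeal is how low the barrier to entry is: a feed is just a zip of plain CSV files. A minimal sketch of reading one – the column names (stop_id, stop_name, stop_lat, stop_lon) come from the actual specification’s stops.txt, but the two stops themselves are invented:

```python
import csv
import io

# A tiny, invented GTFS stops.txt snippet. The columns are real GTFS
# fields; the stop names and coordinates are made up for illustration.
stops_txt = """stop_id,stop_name,stop_lat,stop_lon
S1,Main St & 1st Ave,37.7793,-122.4193
S2,Main St & 5th Ave,37.7820,-122.4160
"""

# GTFS files are plain CSV, so the standard library is all you need.
stops = list(csv.DictReader(io.StringIO(stops_txt)))
for s in stops:
    print(s["stop_id"], s["stop_name"], float(s["stop_lat"]), float(s["stop_lon"]))
```

In a real feed you would unzip the archive and read stops.txt, routes.txt, trips.txt and stop_times.txt the same way – which is exactly why so many small agencies and weekend hackers can publish and consume it.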
Now entrepreneurs, building on this rich base of transit data, are creating customized mobile apps that make intermodal traveling faster and more predictable; Nimbler is an example. These apps allow routing that includes walking, biking and public transport, and their makers are working on integrating rideshare, ad-hoc car rentals, and finding and reserving parking spaces. Without the open data, this innovation could not happen – negotiating individual data sets with governments, one entrepreneur at a time, would be too hard or take too long.
As cities grow denser (already 80% of the US population lives in cities), the cost and effort of owning a car is too much for many individuals. These types of applications, delivered just when needed on a mobile device, make public transport much more attractive; they can increase ridership and therefore build a stronger case for public transport infrastructure, and for safer and less congested streets. In addition, these applications can feed data about ridership, demand hotspots, etc. back to city and transport planners and allow more load-balanced or dynamic resource allocation. They also free up downtown parking and garage space and improve air quality.
In the private sector, Uber has succeeded in reducing its estimated arrival time from an original 17 minutes to a 7-minute average two years later, based on data collection and ‘big data’ predictive analytics. For example, their algorithms noticed 2am demand spikes around certain places in SF (night clubs closing) or at the end of Giants games at Pac Bell Park. Now they can predict these and pre-position drivers in the right place at the right time. This makes their operations much more efficient and competitive with alternatives. Why not do the same with trains, trams, buses and subways? (In fact, NYC has just engaged Uber to co-opt the taxi lobby in an e-hailing experiment, leveraging Uber’s data and app infrastructure but using yellow cabs as the transport implementation.)
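The simplest version of this kind of demand prediction is just bucketing historical pickups by hour and place and flagging the hot buckets. A toy sketch – the pickup log, zone names and threshold are all invented, and Uber’s actual models are far more sophisticated:

```python
from collections import Counter

# A hypothetical pickup log as (hour_of_day, zone) pairs. In reality this
# would be months of trip data; these rows and zone names are made up.
pickups = [(2, "SoMa"), (2, "SoMa"), (2, "SoMa"),
           (14, "Sunset"), (2, "Mission"), (22, "Mission Bay")]

demand = Counter(pickups)

# Flag (hour, zone) buckets whose historical demand crosses a naive
# threshold -- a toy stand-in for pre-positioning drivers where and when
# spikes (club closings, game endings) are predicted.
THRESHOLD = 3
hot = [bucket for bucket, n in demand.items() if n >= THRESHOLD]
print(hot)
```

A transit agency could run the same kind of analysis on fare-card taps or GTFS-realtime loads to decide where extra buses or trains should wait.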
Another really interesting class of applications, both for end users and for city planners (zoning and infrastructure development), is mashups between maps and transit data. For example, you can ask OpenTripPlanner Analyst what areas of your city are accessible by public transport and on foot within a certain amount of time. You can see whether your workplace or your kid’s school is reachable in a reasonable time before signing that apartment lease.
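Under the hood, this kind of reachability question is a shortest-path search over the transit network with a time budget. A minimal sketch with a four-stop toy graph (the stop names and travel times are invented; OpenTripPlanner’s real engine works on full GTFS and street data):

```python
import heapq

# A toy transit/walking graph: edge weights are minutes between stops.
# All stops and times here are invented for illustration.
graph = {
    "Home":     [("StationA", 10)],
    "StationA": [("StationB", 5), ("Downtown", 15)],
    "StationB": [("Downtown", 4)],
    "Downtown": [],
}

def reachable_within(start, budget_min):
    """Dijkstra with a cutoff: all stops reachable within the budget."""
    best = {start: 0}
    heap = [(0, start)]
    while heap:
        t, stop = heapq.heappop(heap)
        if t > best.get(stop, float("inf")):
            continue  # stale heap entry
        for nxt, dt in graph[stop]:
            nt = t + dt
            if nt <= budget_min and nt < best.get(nxt, float("inf")):
                best[nxt] = nt
                heapq.heappush(heap, (nt, nxt))
    return best

print(reachable_within("Home", 20))
```

Draw the boundary around every reachable point and you get the “lily pad” isochrone maps these tools render.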
Mapnificent is another very well done mashup of this kind. You can even combine two proximity maps: “Where can you and I meet for lunch such that it takes us both less than 20 minutes to get there?” You can see the Caltrain stations like lily pads of easy access in this snapshot:
Again, there is something in it for city planners here too: coders at the Urban Data Challenge hackathon mashed up economic data with transit information and produced a map of “urban transit quality and equity” – providing better access to poorer areas can make them more desirable and lead to economic growth.
The auto industry and related companies and startups have also noticed young potential car buyers backing away, and they are looking to engage with alternative, on-demand transportation. Mercedes invested in Car2Go and BMW in DriveNow as defensive measures to preserve car sales and brand awareness; conversely, Avis acquired Zipcar and Hertz launched Hertz On Demand as offensive measures, capitalizing on those trends. This is in addition to about 30 car-sharing/ridesharing/carpooling startups (Lyft, Uber, RelayRides, etc.).
Top Down: Smart Cities
There is a lot of talk about “smart cities”. Definitions usually include: distributed sensing (road, traffic, air quality, etc. aka Internet of Things or M2M), crowdsourcing / civic involvement, mobility and connectivity (smartphones, Wifi) and big data fusion and analytics for better decision making. Hopefully all with a healthy dose of transparency thrown in.
IBM, Cisco, SAP, etc. would all love to sell every city in the world their (often monolithic) products and services. In reality, there will probably be a mix of bottom-up as well as top-down solutions. Megaprojects like the $400M, 30-year LA traffic light synchronization or the California High-Speed Rail project are, however, often completely overtaken by newer technology or use cases by the time they are finally funded and ready for implementation. When technology is in heavy flux, it is probably better to make many small, incremental bets rather than a single mega bet. For example, think back just six years to 2007, pre-iPhone. Most people used their phone to TALK! Seems quaint now.
The Internet of Things – connected streetlights, trash cans, parking spaces and the like – is one example.
Here is a list of “10 Most Impressive Smart Cities on Earth” with a brief description of what they are doing for each (SF, Amsterdam, Tokyo, Xinjiang, Seattle, Copenhagen, Stockholm, Vienna, NYC, Santiago).
Not to be left behind, oil rich Arab states have recently convened the Arab Future Cities Summit. Qatar’s Ministry for Municipality and Urban Planning alone has earmarked more than US$140 billion for mega projects across the energy, transport, education, health and tourism sectors, much of which is expected to be completed in time for the 2022 FIFA World Cup.
Bottom Up: Hackathons
An interesting and fairly cheap way to spread the API and data love, as well as to get feedback and application ideas, is to organize hackathons around certain themes (transit data, car rentals, electric vehicles and connected homes, smart grid, etc.). These are usually 1-2 day overnight events where ad-hoc teams of 2-5 coders and designers form to bang out an application prototype; at the end of the day-and-night sprint, each team presents for 5 minutes to compete for a modest pot of prizes. You can read my blog entries and see videos from recent hackathons by Hertz (ad-hoc rentals, electric fleet, business and leisure travel) and BMW (sustainable driving, homes). Jim and I plan to run these at ProspectSV for connected-car themes – to generate a community of innovators and stakeholders, to give cities and car makers insight into product and data demand, and to surface new application ideas.
URBAN PROTOTYPING is a global series of festivals held in cities around the world, exploring how technology, art, and design can serve as tools for civic participation.
The Urban Data Challenge ran from February to March 2013 with events, participants and data from San Francisco, Geneva and Zurich. You can see a list of projects, videos, etc. at their website. The current one runs in London until June 26th.
Here in Silicon Valley, Ford has a Personalized Fuel Efficiency App Challenge that closes in early August with $50K in total prize money.
Key to these hackathons is the availability of easy-to-use REST-over-HTTP APIs and lots of data that can be leveraged (including government data). Just to give a flavor, here are some examples from recent hackathons:
- Data.gov – 373,029 raw and geospatial datasets
- genability – comprehensive realtime energy prices, tariffs, load serving entities, time of day & peak pricing etc.
- eia.gov – US Energy Information Administration: per state net generation, consumption, cost, price of fossil fuels, electricity. Stocks of fuels, quality (sulfur, ash), power plant level data (465K data sets)
- General Transit Feed Specification – and a list of available public and private transit agency feeds
- OpenStreetMap – open source base maps, including bike, indoor and accessibility-focused (blind, wheelchair) maps and pluggable routing
- twilio – voice calling, sms messaging
- CSRHub – corporate social responsibility (CSR) and sustainability ratings and information
- DriveNow – BMW ad-hoc EV rentals
- chargepoint – (formerly Coulomb) EV charging station network and reservations
- Nokia – maps, routing, gas prices, EV charging stations, POIs
- VoicePark – parking spot reservations
- Scoot – SF scooter rentals
- AT&T – Call management, SMS, speech
- Hertz Global Reservation APIs, in car nav system APIs, in car telematics APIs
- GM – in car infotainment APIs and smartgrid developer network
- Tons more from ProgrammableWeb
How well prepared is a complex infrastructure like a modern city for emergencies like storms, terrorist attacks or earthquakes? Based on recent experiences in New Orleans (Katrina), New York City (Sandy), Haiti (earthquake), Mumbai (terrorist shooting attack) and Pakistan (floods), there is room for improvement. Surprisingly, it is not the traditional public sector or NGOs that provide most of the innovation or improvement here – it is private startups leveraging networks and mobile devices.
In Haiti, for example, it took weeks for the Red Cross to actually distribute tents to the people who needed them. Meanwhile, first responders needed to dig people out of the rubble and even figure out how to get to them (roads all over Port-au-Prince were clogged by collapsed buildings). So a small, ad-hoc group of volunteer geeks around the world (!) formed and built tools to collect SMS messages from people in the rubble and crowdsource their translation from Creole (the local language) into English (for the first responders). GeoEye’s satellites flew over Haiti and Google followed up with high-resolution imagery. Crowdsourced, collaborative mappers all around the world, as well as on the ground in Haiti, updated street maps so that trucks and equipment could be routed through still-passable streets. They marked up fires, contaminated water supplies, trapped persons, survivors, collapsed structures and schools. It was an incredibly useful grassroots effort that has led to more persistent crowdsourced information aid organizations like Crisis Mappers, the Google Crisis Response team, and several others.
In the immediate aftermath of a disaster, chaos often reigns. This was the case in the Mumbai attacks, where it was unclear how many attackers were involved, which areas of the city were affected, etc. An open source platform called Ushahidi, originally created to monitor election violence in Kenya (again, crowdsourced via mobile phone SMS), was ready to provide much needed information. Its creators had essentially built a platform to instantly generate a crisis site: it includes maps, chat and data feeds, and integrates data from SMS, email, Twitter, maps and the web. It provided the ability for family members to say “I am here, I am ok”, to issue calls to donate blood, and to show open traffic routes for ambulances and available hospitals for the victims. The platform is evolving to provide automatic deduplication and deconflicting tools to help with the information overload in the hours during and after a disaster.
Hurricane Sandy hit New York City hard – it caused $19B in damage to NYC, and more storms like it are expected. During and after the storm, Rachel Haot, New York City’s chief digital officer (!), dealt with social media overload. She was responsible for organizing responses to the hundreds of Twitter questions flooding in and making sure the 200 people who manage social media for the City of New York were uniformly accurate and calm as they disseminated information across multiple channels. Haot said this was much more effective than putting the onus of approving updates on only a few people, which could have led to a bottleneck.
And though 200 people may seem like a lot of staff to oversee during an emergency, Jessica Lawrence, the executive director of New York Tech Meetup (NYTM), tapped 30,000 technologists for their help in the first days after the storm. One of their first projects was a “coworking crowdmap” built with Ushahidi: the public could add available short- and long-term office spaces with Internet and phone access to the map. Very shortly they had over 80 office spaces on the map where people could work pretty much immediately after the storm. NY Tech Responds – Google Sandy Crisis Map
Since the storm, Haot has created a new organization, Code Corps, which she hopes will help the city to better tap into its existing community in the wake of the next disaster. “This is a way for the city to partner with organizations that are interested in helping to build technology for life-saving purposes,” Haot said. Haot also spearheaded the City’s first official hackathon, Reinvent NYC.GOV.
Mayor Bloomberg is of course no newcomer to data and technology, having built his commercial empire on just that. He outlined a vision to make NYC the world’s leading digital city, with a digital roadmap covering infrastructure, education, data (open government), engagement and industry. He hired Mike Flowers as the City’s Chief Analytics Officer; Flowers runs the Office of Policy and Strategic Planning, aka “The Mayor’s Geek Squad”, which collects, correlates and publishes the city’s “big data”.
For example (see the video of Mike’s presentation to Code for America), New York City gets roughly 25,000 illegal-conversion complaints a year, but it has only 200 inspectors to handle them. Initially, much of the data wasn’t in usable form. For instance, the city’s record keepers did not use a single, standard way to describe location; every agency and department seemed to have its own approach. The buildings department assigns every structure a unique building number. The housing preservation department has a different numbering system. The tax department gives each property an identifier based on borough, block and lot. The police use Cartesian coordinates. The fire department relies on a system of proximity to “call boxes” related to the location of firehouses, even though call boxes are defunct. Flowers’ team cleaned up the data and figured out which factors are strong predictors of high risk (catastrophic fire, in this case). Even with a simple targeting algorithm, the rate of inspections resulting in vacate orders went from 13% to a – sustained – 70% because of better prediction. And they were able to allocate inspections based on risk (rather than on 311 complaint frequency, which was used previously) – and consequently save tenants and firefighters from death and injury.
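The targeting idea itself is simple: score every complaint on a handful of risk factors and send the 200 inspectors to the highest scores first. A toy sketch – the records, factor names and weights below are all invented, not NYC’s actual model:

```python
# Hypothetical complaint records with a few made-up risk factors. NYC's
# real predictors and weights are not public here; this only shows the
# shape of "score, sort, inspect in order".
complaints = [
    {"id": 101, "tax_lien": True,  "prior_vacate": False, "building_age": 95},
    {"id": 102, "tax_lien": False, "prior_vacate": False, "building_age": 20},
    {"id": 103, "tax_lien": True,  "prior_vacate": True,  "building_age": 60},
]

def risk_score(c):
    """Naive weighted sum of risk factors (weights are illustrative)."""
    score = 0.0
    score += 3.0 if c["tax_lien"] else 0.0
    score += 4.0 if c["prior_vacate"] else 0.0
    score += c["building_age"] / 50.0   # older buildings score higher
    return score

# Inspect highest-risk complaints first, instead of by 311 call volume.
queue = sorted(complaints, key=risk_score, reverse=True)
print([c["id"] for c in queue])
```

Even a crude score like this beats first-come-first-served triage once the location data has been cleaned up enough to join the factors together.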
A second, less sexy example dealt with illegal yellow-grease dumping, mostly from restaurants, which pollutes water, clogs piping and increases fire risk. Correlating waste hauler data with restaurant licenses AND manhole blockages yielded the insight that restaurants with no waste haulers for solid waste and brown grease are 3.6 times (!) as likely to have problem manholes within 600 feet.
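That “3.6 times as likely” is a relative risk: the blockage rate near no-hauler restaurants divided by the rate near restaurants with a hauler. A quick illustration with made-up counts chosen only to reproduce the ratio (NYC’s actual counts are not given in the source):

```python
# Toy 2x2 contingency table (counts are invented for illustration):
# restaurants with vs. without a licensed waste hauler, and whether a
# blocked manhole occurred within 600 feet.
no_hauler_blocked, no_hauler_total = 18, 100
hauler_blocked,    hauler_total    = 5,  100

# Relative risk: how much likelier is a blockage near a no-hauler restaurant?
rr = (no_hauler_blocked / no_hauler_total) / (hauler_blocked / hauler_total)
print(round(rr, 1))
```

With these assumed counts the ratio comes out to 3.6 – the kind of number that turns a vague hunch about grease dumping into an enforceable lead.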
More details about data driven decision making in New York City:
Slate: “Big Data in the Big Apple”
O’Reilly: Predictive data analytics is saving lives and taxpayer dollars in New York City
Agencies like OpenPlans or Trillium Solutions have made it their business to provide tools and consulting to local governments looking to open up their data.
Final thought (finally…)
By establishing a digital plan, a data and media team, and open data policies and workflows, a city not only prepares itself better for disasters; it also becomes more efficient, creates services more responsive to its citizens’ needs, makes better use of assets (like transportation infrastructure) and saves money.
If all of that sounds politically overwhelming or impossible, perhaps starting with just an open data initiative, modeled on San Francisco’s, would be a great kick-off.
PS: with apologies to the patient reader, “If I had more time, I would have written a shorter blog post.”