Monday, 22 September 2014

Lessons Learned: Exploring the Value of Open Data on Capitol Hill

Government data helps drive our economy and will only become more important in the future. On Thursday, I had the opportunity to speak on this topic at a congressional briefing hosted by U.S. Senator Mark Warner (D-VA), Chairman of the Budget Committee’s Government Performance Task Force, and the Center for Data Innovation. Panelists included Daniel Castro, Director of the Center for Data Innovation; Kathleen Phillips, COO of Zillow; Tom Schenk, Chief Data Officer for the City of Chicago; and Steven Adler, IBM’s Chief Information Strategist.

We explored how government data is the foundation of the ongoing data revolution, fostering innovation, creating jobs, and driving better decision-making in both the private and public sectors. The federal government is, and will continue to be, the only provider of credible, comprehensive, and consistent data on our people, economy, and climate. We also pointed to the findings of our recently released report, “Fostering Innovation, Creating Jobs, Driving Better Decisions: The Value of Government Data,” which found that billions of dollars in economic output and trillions of dollars in resource decisions are driven by federal data.
Daniel Castro, Director of the Center for Data Innovation, urged attendees to make sure Congress continues to invest in our data infrastructure. He highlighted the value of open data and of ensuring that data flows more seamlessly between the public and private sectors. Castro also focused on the need to consider new ways to enable cooperation between government and industry so that the benefits of big data reach the greatest number of people in society.

Zillow’s Chief Operating Officer Kathleen Phillips discussed how her company uses a wide variety of federal and local data to better connect buyers and sellers in the real estate marketplace. Zillow provides critical information in an easy-to-digest mapping format for over 50 million properties around the country. Its Zillow Home Value Forecast, fed in part by federal datasets, also predicts local home values. Zillow uses data from the Census Bureau, the Bureau of Labor Statistics, the Bureau of Economic Analysis, the Federal Housing Finance Agency, and other federal sources to provide a real-time evaluation of local real estate markets.
IBM’s Chief Information Strategist Steven Adler explained how IBM is working with cities like Palo Alto and San Jose, California, to publish high-quality open data sets and create open government partnerships. For example, in the Code for America hackathon, private citizens volunteered their skills to augment government services. IBM has also been very active in efforts to standardize data formats across governments within the U.S. and around the world, and has fostered the development of an SMS application for farmers in Africa to collect and report data on agricultural assets.
Chicago’s Chief Data Officer Tom Schenk said Chicago relies on U.S. Census, BLS, and BEA data as a big contributor to the city’s operational efficiency. For example, he said the city dispatches trucks and crews for city projects using predictive models and forecasting based on federal, state, and local data. Chicago has made more than 600 data sets available to the public, including the amount and types of energy use by Census block; continuous data on beach conditions; and the location of every reported crime since 2001.
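To make the dispatch idea concrete, here is a minimal, purely illustrative sketch of the kind of forecast-then-staff logic such a system might use. Chicago's actual models are not described in this post; the moving-average method, the request counts, and the requests-per-crew ratio below are all hypothetical.

```python
# Toy example: forecast tomorrow's service-request volume from recent
# history, then size the crew dispatch. All figures are hypothetical.

def moving_average_forecast(history, window=7):
    """Forecast the next value as the mean of the last `window` observations."""
    if len(history) < window:
        raise ValueError("need at least `window` observations")
    return sum(history[-window:]) / window

def crews_needed(forecast_requests, requests_per_crew=25):
    """Convert a demand forecast into a whole number of crews (rounding up)."""
    return -(-int(forecast_requests) // requests_per_crew)  # ceiling division

# Hypothetical daily pothole-repair requests over the past week
daily_requests = [180, 210, 195, 240, 220, 160, 175]
forecast = moving_average_forecast(daily_requests)
print(round(forecast, 1))      # forecast demand, ~197.1 requests
print(crews_needed(forecast))  # crews to dispatch: 8
```

A production system would replace the moving average with richer predictive models drawing on federal, state, and local data, but the shape of the pipeline (historical data in, staffing decision out) is the same.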

Thanks to all our partners and the participants who contributed to this briefing. By gaining a better understanding of the value and benefits derived from government data, we can more effectively position our federal agencies to take fuller advantage of this resource, provide a boost to the economy, and support policy decisions that further enhance our capability to improve people’s lives.
