by Mikeal Rogers, Wired Magazine, March 7, 2013
GitHub was intended to be an open software collaboration platform, but it’s become a platform for much, much more than code. It’s now being used by artists, builders, homeowners, everyone in between, entire companies … and cities. GitHub is doing to open source what the internet did to the publishing industry. “Anyone can now change the data when new bike paths are built, when roads are under construction, and when new buildings are erected,” the city of Chicago recently announced. … Perhaps not so surprisingly, he has about 17 open “pull” requests for changes. And of course, GitHub is still used by programmers and developers flying AR Drones with Node.js or building websites with jQuery.
For full text of this article, visit The GitHub Generation: Why We’re All in Open Source Now | Wired Opinion | Wired.com.
- The GitHub Revolution: We’re All in Open Source Now (wired.com)
- Chicago Using GitHub Has Potential For More Citizen Participation in Government (architects.dzone.com)
by Joe Francica, Directions Magazine, March 4, 2013
FedGeoDay, held in Washington, D.C. this past week, can best be described as an advocacy forum for open source geospatial technology and data. Some of the leading organizations, government agencies and companies invested in open source tech sponsored the conference. Editor in Chief Joe Francica attended this first-time event, which drew over 250 people.
For full text of this article, please visit FedGeoDay: Advocating for Open Source – Directions Magazine.
Two articles by the Wilson Center highlighting the need for #opensource and #agile (#FedGeoDay) include:
1) Mike Byrne’s report “The National Broadband Map: A Case Study on Open Innovation for National Policy”
2) “Too Big to Succeed: The Need for Federal IT Reform”
- FedGeoDay Schedule Out, Jam Packed (mapbox.com)
All Points Blog, Feb 25, 2013
Tim de Troye from the State of South Carolina offered a presentation on an ongoing issue among state and local governments: how to distribute geospatial data collected with taxpayer money. He noted that some organizations copyright their data, and that data in South Carolina, for example, is available, but through different agreements depending on whether it is spatial or not.
The big question in geospatial data licensing remains: to license or not to license?
For full text of this article, please visit To License or Not to License Geospatial Data: Still a Challenge for Government Agencies – All Points Blog.
- Spatial experts added to Immigration’s skills shortage list (computerworld.co.nz)
- New NRC Report: Future U.S. Workforce for Geospatial Intelligence (geodatapolicy.wordpress.com)
The National Broadband Map: A Case Study on Open Innovation for National Policy
The National Broadband Map is a powerful consumer protection tool developed by the FCC to provide consumers nationwide with reliable information on broadband internet connections. Through consistent public engagement and the use of emerging crowdsourcing technologies and open-source software, the project was able to promote government transparency and trust in government, while finishing on time and avoiding cost overruns. The National Broadband Map is a vital example of the benefits to all when government prioritizes transparency, allows itself to be guided by the public, and directs national policy based on robust and reliable data. Published by the Commons Lab of the Science and Technology Innovation Program, Woodrow Wilson International Center for Scholars, Washington, DC, September 2012.
To download a copy of the REPORT, click on the Commons Lab Scribd webpage here.
To watch the archived VIDEO on the rollout event, visit the Commons Lab YouTube page.
- Commons Lab and FCC Releases New Report on the National Broadband Map (geodatapolicy.wordpress.com)
The following is part of a special series of policy briefs by the Woodrow Wilson International Center for Scholars running until inauguration day. This piece, written by Commons Lab Early Career Scholar Zachary Bastian, tackles the need for reform in federal information technology.
As the world has become more dependent on information technology (IT), so have the federal government and its constituencies. Leveraged effectively, technical tools can engage the public, create cost savings, and improve outcomes. These benefits are obscured by regular reminders that federal IT is fundamentally flawed. It is too big to succeed. For IT to become sustainable, the federal government must enable change in three categories: 1) embracing agile development, modular contracting, and open-source software, 2) prioritizing small business participation, and 3) shifting the federal IT culture toward education and experimentation. The adoption of these reforms is vital. The current state of federal IT undermines good work through inefficiency and waste.
- Too Big to Succeed: The Need for Federal IT Reform (disaster-net.com)
by Jennifer Chan, US News and World Report, Op-Eds, November 23, 2012
Dr. Jennifer Chan, a Public Voices fellow at the OpEd Project, is the director of Global Emergency Medicine in the Department of Emergency Medicine at Northwestern University’s Feinberg School of Medicine and an associate faculty member of the Harvard Humanitarian Initiative.
In the wake of Sandy’s destruction, digital volunteers mobilized again. From their homes and offices, using iPads and laptops, hundreds of volunteers crowd-sourced information and took on microtasks to help FEMA and other agencies process large swaths of information and speed humanitarian response.
For instance, in the first 48 hours after the hurricane, 381 aerial photos collected by the Civil Air Patrol were viewed by hundreds of volunteers, with the goal of quickly giving an overview of the extent of storm and flood damage. This project was called the Humanitarian OpenStreetMap MapMill project. In response to a request from FEMA, project developer Schuyler Erle volunteered to launch and lead the project. By mid-afternoon November 2nd, more than 3,000 volunteers had assessed 5,131 images, viewing them more than 12,000 times. Just a week later, more than 24,000 images had been assessed. Each view from a digital volunteer—a mother, a researcher, a friend, a colleague—helped FEMA determine the degree of damage along the eastern seaboard, assessing the condition of buildings, roads, and houses, with the aim of helping the agency in its post-disaster recovery and planning. That’s an amazing effort.
But did it actually help?
For full text of the op-ed, visit How To Make Crowdsourcing Disaster Relief Work Better – US News and World Report.
- How To Make Crowdsourcing Disaster Relief Work Better (usnews.com)
- Crowdsourcing the Evaluation of Post-Sandy Building Damage Using Aerial Imagery (irevolution.net)
The National Broadband Map: A Case Study on Open Innovation for National Policy
Commons Lab Blog, October 2012
The National Broadband Map, designed to provide consumers nationwide with reliable information on broadband internet connections, was built incorporating emerging technology. It protects consumers, holds the government and private sector accountable, and engages the public across the United States. In a time of budgetary constraint, the project made a series of remarkable policy innovations that allowed it to be completed in minimal time and at reduced cost. The public was engaged before, during, and after the project. Citizens generated speed-testing data. They provided comments and feedback on improving internet connectivity. They used a National Broadband Map crowdsourcing utility to let the FCC know whether the information posted was accurate. The data collected is open and freely available to anyone. The application itself was built using open-source software unencumbered by licensing fees, enhancing its flexibility and accessibility. The development process broke from traditional government procurement, and programmers regularly communicated with users to better understand the needs of the project: this avoided cost overruns and unused features.