Tag Archive | Open Source

The GitHub Generation: Why We’re All in Open Source Now

by Mikeal Rogers, Wired Magazine, March 7, 2013

GitHub was intended to be an open software collaboration platform, but it’s become a platform for much, much more than code. It’s now being used by artists, builders, homeowners, everyone in between, entire companies … and cities. GitHub is doing to open source what the internet did to the publishing industry. “Anyone can now change the data when new bike paths are built, when roads are under construction, and new buildings are erected,” the city of Chicago recently announced. … Perhaps not so surprisingly, there are about 17 open “pull” requests for changes. And of course, GitHub is still used by programmers and developers flying AR Drones with Node.js or building websites with jQuery.

For the full text of this article, visit The GitHub Generation: Why We’re All in Open Source Now | Wired Opinion | Wired.com.

 

FedGeoDay: Advocating for Open Source

by Joe Francica, Directions Magazine, March 4, 2013

FedGeoDay, held in Washington, D.C. this past week, can best be described as an advocacy forum for open source geospatial technology and data. Some of the leading organizations, government agencies, and companies invested in open source tech sponsored the conference. Editor-in-Chief Joe Francica attended this first-time event, which drew over 250 people.

For the full text of this article, please visit FedGeoDay: Advocating for Open Source – Directions Magazine.

Two articles by the Wilson Center highlighting the need for #opensource and #agile (#FedGeoDay):
1) Zachary Bastian and Michael Byrne’s report “The National Broadband Map: A Case Study on Open Innovation for National Policy”
2) “Too Big to Succeed: The Need for Federal IT Reform”

 

To License or Not to License Geospatial Data: Still a Challenge for Government Agencies

All Points Blog, Feb 25, 2013

Tim de Troye of the State of South Carolina offered a presentation on an ongoing issue among state and local governments: how to distribute geospatial data collected with taxpayer money. He noted that some organizations copyright their data, and that data in South Carolina, for example, is available, but under different agreements depending on whether or not it is spatial.

The big question in geospatial data remains: to license or not to license?

For the full text of this article, please visit To License or Not to License Geospatial Data: Still a Challenge for Government Agencies – All Points Blog.

 

Wilson Center Report and Video on Crowdsourcing for the National Broadband Map

The National Broadband Map: A Case Study on Open Innovation for National Policy

by Zachary Bastian, Wilson Center’s Commons Lab, and Michael Byrne, FCC.

The National Broadband Map is a powerful consumer protection tool developed by the FCC to provide consumers nationwide with reliable information on broadband internet connections. Through consistent public engagement and the use of emerging crowdsourcing technologies and open-source software, the project promoted government transparency and public trust while finishing on time and avoiding cost overruns. The National Broadband Map is a vital example of the benefits to all when government prioritizes transparency, allows itself to be guided by the public, and directs national policy based on robust and reliable data. Published by the Commons Lab of the Science and Technology Innovation Program, Woodrow Wilson International Center for Scholars, Washington, DC, September 2012.

To download a copy of the REPORT, click on the Commons Lab Scribd webpage here.

To watch the archived VIDEO of the rollout event, visit the Commons Lab YouTube page.

Too Big to Succeed: The Need for Federal IT Reform

The following is part of a special series of policy briefs by the Woodrow Wilson International Center for Scholars running until Inauguration Day. This piece, written by Commons Lab Early Career Scholar Zachary Bastian, tackles the need for reform in federal information technology.

As the world has become more dependent on information technology (IT), so have the federal government and its constituencies. Leveraged effectively, technical tools can engage the public, create cost savings, and improve outcomes. These benefits are obscured by regular reminders that federal IT is fundamentally flawed. It is too big to succeed. For IT to become sustainable, the federal government must enable change in three categories: 1) embracing agile development, modular contracting, and open-source software, 2) prioritizing small business participation, and 3) shifting the federal IT culture toward education and experimentation. The adoption of these reforms is vital. The current state of federal IT undermines good work through inefficiency and waste.

Click here to read the remainder of this brief on Scribd.

How To Make Crowdsourcing Disaster Relief Work Better

by Jennifer Chan, US News and World Report, Op-Eds, November 23, 2012

Dr. Jennifer Chan, a Public Voices fellow at the OpEd Project, is the director of Global Emergency Medicine in the Department of Emergency Medicine at Northwestern University’s Feinberg School of Medicine and an associate faculty member of the Harvard Humanitarian Initiative.

In the wake of Sandy’s destruction, digital volunteers mobilized again. From their homes and offices, using iPads and laptops, hundreds of volunteers crowdsourced information and took on microtasks to help FEMA and other agencies process large swaths of information and speed humanitarian response.

For instance, in the first 48 hours after the hurricane, 381 aerial photos collected by the Civil Air Patrol were viewed by hundreds of volunteers, with the goal of quickly giving an overview of the extent of storm and flood damage. This project was called the Humanitarian OpenStreetMap MapMill project. In response to a request from FEMA, project developer Schuyler Erle volunteered to launch and lead the project. By mid-afternoon on November 2nd, more than 3,000 volunteers had assessed 5,131 images, viewing them more than 12,000 times. Just a week later, more than 24,000 images had been assessed. Each view from a digital volunteer—a mother, a researcher, a friend, a colleague—helped FEMA determine the degree of damage along the Eastern Seaboard, assessing the condition of buildings, roads, and houses, with the aim of helping the agency in its post-disaster recovery and planning. That’s an amazing effort.
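For readers curious how thousands of overlapping volunteer ratings like these might be boiled down to one damage estimate per image, here is a purely illustrative Python sketch. The rating labels, function names, and view threshold are invented for this example and are not taken from the actual MapMill codebase:

    # Hypothetical sketch: reducing redundant volunteer ratings to a single
    # damage estimate per image via simple majority vote. The names and the
    # rating scale are invented; this is not MapMill's real logic.
    from collections import Counter

    def consensus_damage(ratings, min_views=3):
        """Return the majority damage label for an image, or None if the
        image has not yet been viewed enough times to trust a result."""
        if len(ratings) < min_views:
            return None
        return Counter(ratings).most_common(1)[0][0]

    # Each image accumulates independent assessments from volunteers.
    image_ratings = {
        "cap_0001.jpg": ["ok", "ok", "minor-damage"],
        "cap_0002.jpg": ["severe-damage", "severe-damage", "minor-damage"],
        "cap_0003.jpg": ["ok"],  # too few views so far
    }

    for image, ratings in image_ratings.items():
        print(image, "->", consensus_damage(ratings))

A real pipeline would also have to handle ties and rater disagreement, which is part of what the op-ed asks about next.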

But did it actually help?

For full text of the op-ed, visit How To Make Crowdsourcing Disaster Relief Work Better – US News and World Report.

 

Commons Lab and FCC Release New Report on the National Broadband Map

The National Broadband Map: A Case Study on Open Innovation for National Policy

To download the report and watch the archived video, click here.

Commons Lab Blog, October 2012

The National Broadband Map, designed to provide consumers nationwide with reliable information on broadband internet connections, was built by incorporating emerging technology. It protects consumers, holds the government and private sector accountable, and engages the public across the United States. In a time of budgetary constraint, the Map made a series of remarkable policy innovations that allowed the project to be completed in minimal time and at a reduced cost.

The public was engaged before, during, and after the project. Citizens generated speed-testing data. They provided comments and feedback on improving internet connectivity. They used a National Broadband Map crowdsourcing utility to let the FCC know whether the information posted was accurate. The data collected is open and freely available to anyone. The application itself was built using open-source software unchained by licensing fees, enhancing its flexibility and accessibility. The development process broke from traditional government procurement, and programmers regularly communicated with users to better understand the needs of the project: this avoided cost overruns and unused features.
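As a purely illustrative sketch of what such a crowdsourced record might look like, consider the following Python snippet; the class, fields, and reliability heuristic are all invented for this example and do not reflect the FCC’s actual data model:

    # Hypothetical sketch of a crowdsourced broadband record: a citizen-run
    # speed test plus later "is this accurate?" feedback. All names are
    # invented for illustration; this is not the FCC's real implementation.
    from dataclasses import dataclass

    @dataclass
    class SpeedTestReport:
        census_block: str       # where the test was run
        provider: str           # ISP being measured
        download_mbps: float
        upload_mbps: float
        confirmations: int = 0  # citizens who flagged the posted data accurate
        disputes: int = 0       # citizens who flagged it inaccurate

        def record_feedback(self, accurate: bool) -> None:
            if accurate:
                self.confirmations += 1
            else:
                self.disputes += 1

        def looks_reliable(self) -> bool:
            """Crude heuristic: more confirmations than disputes."""
            return self.confirmations > self.disputes

    report = SpeedTestReport("170318391001", "ExampleISP", 12.4, 1.1)
    report.record_feedback(accurate=True)
    report.record_feedback(accurate=False)
    report.record_feedback(accurate=True)
    print(report.looks_reliable())  # True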

Read More…

Live Webcast: The National Broadband Map: A Case Study on Open Innovation for National Policy

Live webcast, Oct 15 at 9:30 AM Eastern:

The National Broadband Map, designed to provide consumers nationwide with reliable information on broadband internet connections, was built by incorporating emerging technology. It protects consumers, holds the government and private sector accountable, and engages the public across the United States. In a time of budgetary constraint, the Map made a series of remarkable policy innovations that allowed the project to be completed in minimal time and at a reduced cost.

The public was engaged before, during, and after the project. Citizens generated speed-testing data. They provided comments and feedback on improving internet connectivity. They used a National Broadband Map crowdsourcing utility to let the FCC know whether the information posted was accurate. The data collected is open and freely available to anyone. The application itself was built using open-source software unchained by licensing fees, enhancing its flexibility and accessibility. The development process broke from traditional government procurement, and programmers regularly communicated with users to better understand the needs of the project: this avoided cost overruns and unused features.

The incorporation of geographic information systems allows users to identify broadband internet options in their area, and policymakers to identify geographic gaps in service that need support. This combination of techniques created a flexible resource that has already guided appropriations through the Connect America Fund. It continues to be applied to other communications challenges, such as mobile broadband connectivity. The National Broadband Map demonstrates that there is room for agencies to innovate and promotes a national conversation on how to improve government outcomes in the 21st century.
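The gap analysis described above can be sketched in a few lines of Python; the coverage data, area identifiers, and speed threshold below are invented for illustration and are not drawn from the Map’s real datasets:

    # Hypothetical sketch of finding service gaps in open coverage data:
    # which areas lack any provider at a minimum advertised speed?
    coverage = {
        # area id -> list of (provider, advertised download Mbps)
        "block-A": [("ISP-1", 25.0), ("ISP-2", 6.0)],
        "block-B": [("ISP-2", 3.0)],
        "block-C": [],
    }

    MIN_MBPS = 4.0  # an invented policy threshold for "served"

    def unserved_areas(coverage, min_mbps=MIN_MBPS):
        """Return areas where no provider meets the speed threshold."""
        return [
            area for area, providers in coverage.items()
            if not any(speed >= min_mbps for _, speed in providers)
        ]

    print(unserved_areas(coverage))  # ['block-B', 'block-C']

The same idea, run against the Map’s open datasets with real geographic boundaries, is what lets policymakers target support programs like the Connect America Fund.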

The National Broadband Map is a vital example of the benefits available to all when government prioritizes transparency, allows itself to be guided by the public, and directs policy based on robust and reliable data.

To RSVP for the event, watch the live webcast, or download a copy of the report, click here.

Follow on Twitter with #NBMcrowd

International Open Government Data Conference 2012

Data.gov and the World Bank are joining forces to sponsor the second International Open Government Data Conference (IOGDC), to be held July 10-12, 2012, in Washington, D.C., at the World Bank Headquarters at 1818 H Street NW. The IOGDC will gather policymakers, developers, and others with a keen interest in open government data to share lessons learned, stimulate new ideas, and demonstrate the power of democratizing data.

The IOGDC will bring together the world’s foremost experts on open government data. From policy to technology, IOGDC promises to be filled with thoughtful, dynamic discussion around the historic opportunity presented by open government data to foster collaboration, transparency, and interactive public participation. There is no cost to attend, but preregistration is required.

The full agenda is at http://www.data.gov/communities/conference, and you can download a PDF version. The event will be streamed live online at http://bit.ly/IOGDC-Live. You can follow and tweet about the event using the hashtag #IOGDC; there will also be a daily recap featured on the World Bank Open Data Blog.

Tech@State: Data Visualization

The next Tech@State, scheduled for Sept 23-24, will feature innovative and fascinating new data visualization techniques. The event will also be streamed live on the Internet.

Agenda for Data Visualization

DAY 1:

8:00 AM – Doors Open

8:50 – 9:00 AM – Introduction, Suzanne Hall – Senior Advisor for Innovation, Bureau of Educational and Cultural Affairs

9:00 – 9:15 AM – Welcome, Dr. Kerri-Ann Jones, Assistant Secretary of State, Bureau of Oceans and International Environmental and Scientific Affairs

9:15 – 10:15 AM – Keynote Address – ‘Policy and Technology’, Edward Tufte

10:15 – 10:30 AM – Coffee Break

10:30 – 11:25 AM – Panel on ‘Development Challenge: Open Data to Making Sense of Data’

11:25 AM – 12:20 PM – Panel on ‘Latest Trends in Data Visualization’

12:20 – 12:30 PM – Showing of ‘Connected’ Trailer & Declaration of Interdependence Project

12:30 – 1:30 PM – Lunch

— Afternoon Breakouts —

1:30 – 3:00 PM

Session A

1.  Supporting Disaster Response and Coordination – Panelist Bios & Photos

2.  Visualizations for Aid Transparency and Management – Panelist Bios & Photos

3.  Best Practices for Visualization Interoperability – Panelist Bios & Photos

4.  State Department and USAID Data Visualization Projects – Panelist Bios & Photos

3:00 – 3:30 PM – Coffee Break

3:30 – 5:00 PM

Session B

1.  Using Climate and Health Data to Monitor Food Insecure Areas – Panelist Bios & Photos

2.  Mobile Technology and New Media:  Trends and Opportunities – Panelist Bios & Photos

3.  Turning Information into Insight – Panelist Bios & Photos

4.  New Ways to Visualize Development Data – Panelist Bios & Photos

 

 
