Data Management Key To Federal Open Data Policy

White House order means agencies will need to give more attention to data lifecycle management.

Wyatt Kash, former Editor, InformationWeek Government

May 16, 2013

2 Min Read

The CIOs of federal agencies are assessing the full scope of a White House executive order, introduced May 9, that requires agencies to make government data available in open, machine-readable formats. One likely outcome is that federal IT teams will have to give data sets the same lifecycle management attention they already apply to hardware and software investments, said Simon Szykman, CIO at the Department of Commerce.

"Data will be part of what gets addressed early on in the lifecycle [of new systems or applications]," Szykman said during a May 14 presentation at the FOSE conference in Washington, D.C. "This makes it a more systematic requirement and a bigger part of what we'll be doing in the future."

Szykman predicted the executive order will lead to "less ad hoc modeling and sharing and more systematic sharing of data," in response to questions about the policy.

[ For more on the Obama administration's Open Data policy, see White House Releases Open Data Policy. ]

Richard Holgate, assistant director for science and technology and CIO at the Bureau of Alcohol, Tobacco, Firearms & Explosives (ATF), said the executive order is significant not only for making government data more readily available to the public, but also for sharing data internally between government components.

The Department of Justice, of which ATF is one of many component organizations, publishes a variety of crime statistics that are useful to the public. But the majority of data it produces revolves around case files used by law enforcement agencies, attorneys, the courts, and other parts of the criminal justice system.

"We do a relatively poor job of passing along all that information," Holgate said. Stricter requirements to make data machine readable from the start would help make such data more readily discoverable, more accurate and timely, and reduce duplication.

The executive order was accompanied by a new Open Data Policy, issued by the Office of Management and Budget and the Office of Science and Technology Policy, which instructs federal agencies to manage government information as an asset.

Under the new policy, agencies are required to create an internal inventory of their data, publish a listing of their publicly available data sets, and identify additional data that can be made public. Within 30 days of the policy's issuance, agencies will get access to an open online repository of tools and best practices to assist them in integrating the policy into their operations.

Federal CIO Steven VanRoekel and CTO Todd Park will be responsible for regularly updating the online repository so that agencies continue adopting open data practices, according to the executive order. VanRoekel, speaking at FOSE on May 15, said the goal of the new policy is to make interoperable, machine-readable data "the new default" in how federal agencies manage their data.


About the Author

Wyatt Kash

former Editor, InformationWeek Government

Wyatt Kash is a former Editor of InformationWeek Government and currently VP for Content Strategy at ScoopMedia. He has covered government IT and technology trends since 2004, as Editor-in-Chief of Government Computer News and Defense Systems (owned by The Washington Post Co. and subsequently 1105 Media). He was also part of a startup venture at AOL, where he helped launch AOL Government. His editorial teams have earned numerous national journalism awards. He is the 2011 recipient of the G.D. Crain Award, bestowed annually on one individual nationally for outstanding career contributions to editorial excellence in American business media.
