Federal requirement for open access: Seeing what you paid for

In early May, President Obama signed an executive order titled "Making Open and Machine Readable the New Default for Government Information."

This new order continues a process the President started on his first day in office with a memorandum to executive departments and agencies that set out an official openness policy for his administration. (An observation: the referenced web page is on whitehouse.gov but does not include a date for the memo -- something I think a complete historical record would require.) While the Obama push is a welcome one, not everyone is pleased with the progress to date.

The most recent executive order was accompanied by a memorandum that "requires agencies to collect or create information in a way that supports downstream information processing and dissemination activities," and to do so "using machine-readable and open formats, data standards, and common core and extensible metadata for all new information creation and collection efforts," while reviewing the information for privacy, confidentiality and security.
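To make the "machine-readable and open formats" and "common core metadata" requirements concrete, here is a minimal sketch, in Python, of the kind of dataset record the memorandum contemplates. The field names are modeled loosely on the common-core metadata style agencies use in their data catalogs, and the dataset itself is invented, so treat both as illustrative assumptions rather than an official schema.

    # An illustrative machine-readable dataset record. The field names
    # (title, keyword, publisher, accessLevel) loosely follow the
    # common-core metadata style; the dataset shown is hypothetical.
    import json

    record = {
        "title": "Executive Orders Issued by the President",
        "description": "Full text and metadata for executive orders.",
        "keyword": ["executive orders", "federal register"],
        "modified": "2013-05-09",
        "publisher": "Hypothetical Federal Agency",
        "accessLevel": "public",
    }

    # Serializing to JSON yields an open, machine-readable format that
    # downstream tools can consume without scraping documents.
    print(json.dumps(record, indent=2))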

The Obama administration's primary information-sharing portal is data.gov, which was established just about four years ago. The site provides access to data files on all sorts of things (including President Obama's executive orders). There is a lot of data on data.gov, but much of it is in raw files that need to be downloaded before they can be used. The new orders direct that more work be done to create APIs that would enable interactive access to the information.
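As a rough illustration of what API access buys over raw file downloads, the following sketch queries the data.gov catalog for datasets matching a search term. It assumes the CKAN-style "package_search" endpoint that the data.gov catalog exposes; the endpoint URL and the shape of the response follow CKAN conventions and are assumptions here, not details drawn from the orders themselves.

    # A minimal sketch of searching the data.gov catalog programmatically,
    # assuming a CKAN-style action API at catalog.data.gov.
    import json
    import urllib.parse
    import urllib.request

    BASE = "https://catalog.data.gov/api/3/action/package_search"

    def search_datasets(query, rows=5):
        """Return the titles of up to `rows` datasets matching `query`."""
        url = BASE + "?" + urllib.parse.urlencode({"q": query, "rows": rows})
        with urllib.request.urlopen(url) as resp:
            payload = json.load(resp)
        # CKAN wraps results as {"success": ..., "result": {"results": [...]}}
        return [ds["title"] for ds in payload["result"]["results"]]

    if __name__ == "__main__":
        for title in search_datasets("executive orders"):
            print(title)

With an interactive endpoint like this, a program can find and fetch just the records it needs instead of downloading and parsing entire raw files.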

Getting direct access to zillions of bytes of information on what the government is doing with (or to) our money is a good thing, even if the modes of access could be made better. But perhaps as important are the rules John Holdren, the head of the White House Office of Science and Technology Policy, published in late February.

The U.S. government spends about $30 billion per year in support of basic research. That sounds like a lot of money, but it is almost a round-off error next to a federal budget of more than $3.5 trillion. Still, that round-off error supports a lot of researchers at places like Harvard.

Traditionally, researchers would publish their results in peer-reviewed scholarly journals, and libraries would buy subscriptions to those journals, often for hundreds of dollars per journal per year. The journal publishers use the revenue to support the publishing and peer-review processes, and for-profit publishers would also like to make some money. Even though federal rules have, for years, required researchers to provide access to their raw data so other researchers could verify their work, that data has generally been hard to get hold of.

Holdren's new rules require that most federal funding agencies develop plans for easy Internet-based access to research papers, and to the underlying raw data, within a year or so of a paper's publication. The delay lets scholarly journal publishers preserve their existing business models, since the value of a paper tends to be greatest in the year after publication. The rules do permit some wiggle room on when papers must be made available, but the fact that independent researchers and the general public (read: taxpayers) can get reasonably quick access to the results of the research we pay for is a Good Thing.

Some universities, including Harvard, have been pushing for this type of open access for years. It is good to see the feds working toward the same goals.

Disclaimer: The above-mentioned efforts and rules generally apply to information, not processes. I will not opine on whether the Obama administration is transparent -- in the understanding-how-decisions-are-made sense -- as well as open.

Bradner is Harvard University's Senior Technology Consultant. Reach him at sob@sobco.com.
