New tech transforms transparency into privacy

Preserving privacy by keeping information secret isn't working. Consumers give away precious data for online baubles. Data breaches, large and small, spill data all over the Web. Marketers indiscriminately gather details about the online lives of people in their target markets.

Does that mean we should be reading the last rites over privacy? Not necessarily, say two researchers at the Massachusetts Institute of Technology, who are working on a new Internet protocol that could preserve privacy by making information less private.

The researchers at MIT's Computer Science and Artificial Intelligence Lab -- graduate student Oshani Seneviratne and principal research scientist Lalana Kagal -- call the protocol HTTPA -- HTTP with Accountability. The protocol doesn't attempt to shroud data in secrecy. Rather, it allows the owner of the data to attach conditions for its use. It also allows usage of the data to be audited, so its owner can determine whether those conditions are being followed.

Under the scheme, data would be assigned a Uniform Resource Identifier. Just as a Uniform Resource Locator identifies a page on the Web, a URI would identify a piece of data there. URIs are already in use on the Semantic Web, but most Net surfers rarely encounter them, because communication on the Semantic Web is largely done between machines, not humans.

The creator of a data resource could use its URI to attach conditions to it. Those conditions could describe who should or should not be looking at the information, how it may be used, maybe even when it should be destroyed. The URI would also be used to keep tabs on the data: records of how it is accessed would be stored on secure servers across the Internet, and the only person with access to those records would be the data's owner.
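
As a rough sketch only -- the URI, field names and restriction vocabulary below are illustrative assumptions, not anything drawn from a published HTTPA specification -- such conditions might be represented like this:

    # Hypothetical sketch: usage restrictions keyed by a resource's URI.
    # The URI, field names and vocabulary are illustrative assumptions only.
    restrictions = {
        "https://example.org/data/patient-record-42": {
            "allowed_viewers": ["primary-physician", "patient"],  # who may look at it
            "permitted_uses": ["treatment"],                      # how it may be used
            "delete_after": "2030-01-01",                         # when it should be destroyed
        }
    }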

When a data owner wants to see what's happened to their data, they can request an audit from the system of servers, which will identify everyone who accessed the data and what they've done with it.
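
A minimal sketch, assuming a hypothetical log format rather than HTTPA's actual audit interface, shows the basic idea: an audit amounts to filtering the transaction log for a given resource's URI.

    # Hypothetical sketch of an audit over HTTPA-style access logs.
    # The log format and field names are assumptions for illustration.
    def audit(log_entries, resource_uri):
        """Return every recorded access to the given resource."""
        return [entry for entry in log_entries if entry["resource"] == resource_uri]

    # Example entries as the secure log servers might store them.
    log = [
        {"resource": "https://example.org/data/patient-record-42",
         "accessed_by": "primary-physician",
         "purpose": "treatment",
         "timestamp": "2014-06-13T09:30:00Z"},
    ]

    print(audit(log, "https://example.org/data/patient-record-42"))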

Initially, the researchers see the system as voluntary. It would be left to developers to support the protocol, although the researchers believe implementing the technology isn't difficult. Whenever an HTTP request is made, a server would inform the requester of the restrictions on the resource and log the transaction on one of the secure servers. "Every resource on the Web that needs a usage restriction would have one attached," explained researcher Seneviratne. "In that way, you can easily track how the resource has been propagated across the Web using the protocol."
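
The flow Seneviratne describes might look roughly like the following sketch, in which the "Usage-Restrictions" header and the logging interface are hypothetical stand-ins, not headers or calls defined by HTTPA itself:

    # Minimal sketch of the request flow described above: the server returns the
    # resource together with its usage restrictions and records the access so the
    # owner can audit it later. Header name and log format are hypothetical.
    import json
    import time

    def handle_request(resource_uri, requester, resources, restrictions, log):
        """Serve a resource, advertise its restrictions, and log the access."""
        headers = {"Content-Type": "application/json"}
        if resource_uri in restrictions:
            # Tell the requester what conditions apply to this resource.
            headers["Usage-Restrictions"] = json.dumps(restrictions[resource_uri])
            # Record the transaction on a (here, in-memory) secure log.
            log.append({
                "resource": resource_uri,
                "accessed_by": requester,
                "timestamp": time.time(),
            })
        return headers, resources[resource_uri]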

She explained that there could be hundreds of resources on a web page. "HTTPA would just be applied to resources that need protection," she said. "You don't want to protect every resource, just those that contain sensitive information or protected content."

Voluntarism and transparency may seem like naive methods for protecting privacy, but without a commitment to "do the right thing" all attempts to preserve privacy may be doomed to fail. "We have to rely on the majority of people in the community wanting to respect privacy," said Steve Wilson, vice president and principal analyst with Constellation Research, a business and technology consultancy. "I think most businesses want to do the right thing, although they don't always agree what the right thing is."

Getting all the stakeholders to line up behind a scheme like HTTPA will also be challenging, but not without precedent. For example, the bar code had voluntary origins, and it has become ubiquitous today. "In the case of HTTPA, you will have to have the producers of information and the consumers of information all understand and use the standard," explained Allan Friedman, a research scientist at George Washington University. "The trick is making sure all the systems talk to one another."

What sets HTTPA apart from the typical system designed to preserve privacy is that it can also be used to foster data sharing. "There's a real benefit to be gained from the judicious application of data analytics on all of our data," Friedman said. "But if you let that process run rampant, everyone will have everyone else's data, and that is a very dangerous position."

"HTTPA is a way of allowing us to begin to tap the benefits of sharing data across different sources while still allowing some personal preferences for privacy," he added.

In an era when privacy is on the ropes and about to be counted out, HTTPA has some intriguing potential, maintains Alfred Essa, vice president for analytics and R&D at McGraw-Hill Education. "Traditional mechanisms, like access control and encryption, are important and necessary, but they're not enough," he said. "We need more to put the user in the driver's seat when it comes to protecting their own sensitive data."

"That's what HTTPA, or other systems and frameworks like it, can do when it comes to protecting data," he added.

Putting users in control of how their data is used, however, will have limited impact on privacy if it requires too much effort. Users have shown that, given a choice between giving up data and proactively protecting it, they'll take the path of least resistance and give up the data, although there are signs that's changing. "If a framework like HTTPA is to be practical," Essa noted, "the end user should not have to spend time tracking this stuff down."

That's not the case with HTTPA at the moment, and that's the way the researchers want it for now. "The current implementation does not include enforcement," Seneviratne said. "It's designed for after-the-fact checking."

"If it included enforcement, it would be like a DRM [Digital Rights Management] system," she explained. "We wanted to stay away from a system that enforces things." DRM has always been a controversial technology and its critics contend that attempts to spread its use on the Internet will degrade the Web experience for users.

Although HTTPA is designed for public Internet use, it could have applications in the enterprise. "If an enterprise is afraid of using the global tracking network, they could use their own network and use it to log transactions within the enterprise," Seneviratne said.

As with any new technology -- especially one that involves overseeing sensitive data -- entrenched habits are hard to overcome; the continued resistance to cloud computing is a case in point. So Seneviratne has no illusions about the road ahead for HTTPA. "By making everything transparent and accountable, you can ensure privacy," she said, "but not many people are favorable to that line of thought because privacy implies information is hidden from people."
