Data security professor says it’s time to improve internet privacy laws

The laws that govern privacy on the internet fall short of protecting users as much as they should, says Northeastern professor Woodrow Hartzog. Photo by iStock.

High-profile hacks of private data have made one thing abundantly clear to privacy advocates and lawmakers alike, says Northeastern professor Woodrow Hartzog: The laws that govern privacy on the internet fall short of protecting users as much as they should.

Some lawmakers believe that the solution to improving internet privacy lies in revamping privacy policies and terms of service agreements, says Hartzog, who teaches law and computer science. But he has found this approach to be “hopelessly flawed.”

Woodrow Hartzog. Photo by Matthew Modoono/Northeastern University

“I think that now is the time for some regulatory and legislative imagination,” Hartzog says. “I think that it would be unwise to simply keep doing more of what we’ve always done when the technology and the reality of data in the world is so different than it was when we first created these principles earlier this century.”

Hartzog, who specializes in privacy, data protection, robotics, and automated technologies, will argue these points Wednesday at a hearing on what Congress should do to address risks to consumers and better protect the private data of all Americans. 

Here, Hartzog explains in greater detail what he plans to say at the hearing, which will be held by the Senate’s Committee on Commerce, Science, and Transportation in Washington, D.C.

What's the problem with the privacy protections that we currently have in place?

Companies will say that people's privacy is very important to them, and in isolated contexts a number of companies are going to great lengths to protect it. But a lot of that protection right now is voluntary rather than mandatory, and it can change over time. It's contextually specific and patchwork, so without a more robust regulatory framework designed to ensure that trust, people are still at the mercy of companies that act in their own financial interests. The problem is that when you leave it to companies in a self-regulatory model, they are incentivized to collect as much information as possible, because that's what's valuable.

What is the “notice and choice” approach and why do you say that it is ineffective?

Notice and choice is a regulatory approach: as long as a company tells you what it does with your personal information, and you have the option of not using the service, then [you have consented to that company's data practices]. It shows up as consent requirements, which manifest as the millions of "I agree" buttons we encounter on the internet, and as general transparency requirements, which typically manifest in the form of privacy policies.

One possible solution would be to do more of that. But for the past 10 years, I've been arguing that these are, in fact, the wrong goals to pursue in the first place, which is what I'm hoping to highlight in my testimony. When we pursue transparency and consent as controls, what we're actually doing is putting the burden and the risk of disclosing information online on people themselves. We're asking them to engage in an impossibly complex calculus multiple times a day about how their information might be risky, and expecting them to understand the full scope of data protection is simply asking too much.

We can all relate to the fatigue and despondency of clicking "I agree" yet again when we have no idea what we're agreeing to, and it's irrational to think that we could ever read all of the privacy policies that exist.

What are some solutions that you plan to advocate?

One of the alternatives that I've been arguing for over the past couple of years is a set of trust-based rules designed to create non-waivable obligations on companies to be discreet, to be honest, to be protective, and to be loyal to the people who entrust them with personal information. These rules are designed to keep that trust, relieve people of the burden of that risk calculus, and rebalance the burden of protection toward companies, which are better able to carry it.

Of course, not every company wants to take on the full set of data obligations, but there are other options, like very specific data collection limitations that say that if you're going to collect and use data, you have to have a legitimate interest in doing so. Lawmakers could take that concept and get serious about it.

For media inquiries, please contact media@northeastern.edu.