Spark

Start treating 'intimate data' collection as a civil rights issue, says researcher

We all know tech companies collect a lot of data about us and sell it to third parties. So is there a tipping point when it comes to how much of our intimate data, about our health, conversations and dating habits, tech ought to be able to access? Danielle Citron makes the case for treating data protection as a civil rights issue, and Sandra Wachter discusses the risks of algorithmic grouping and discrimination.
Privacy experts say we need better protections for our intimate data: information about our health, private conversations and dating profiles. (Zyabich/Shutterstock)

It's no secret that the real cost of many free online apps is the user data they collect.

And the ways in which that data is collected only get more sophisticated. Just this week, for example, Apple unveiled a new watch feature that measures the wearer's body temperature and predicts ovulation.

Privacy expert Danielle Citron calls this kind of information "intimate data," and she studies how this data is surveilled and shared with third parties.

"So much of the information that we're sharing, that we are thinking is really private and really just for ourselves or people we trust, is being collected hand over fist and unfortunately often sold to the highest bidder and that includes data brokers, as well as governments," Citron told Spark host Nora Young.

Privacy concerns accompany this unprecedented access, especially in the U.S., where biometric data and private social media messages have already been used to bring criminal charges against people seeking abortions.

"As states criminalize abortion, the information that's being amassed and collected and shared and obtained through data brokers is going to provide the circumstantial evidence that law enforcement needs to think, 'Alright, this is the person we're going to investigate,'" said Citron, who is a professor of law at the University of Virginia.

In the U.S. and Canada, there are privacy laws designed to safeguard personal information. The 1996 Health Insurance Portability and Accountability Act in the U.S. aims to protect sensitive patient health information from being disclosed without the patient's consent or knowledge. In Canada, the Privacy Act and the Personal Information Protection and Electronic Documents Act (PIPEDA) promise similar protections.

But as Citron explains, our intimate health data is more than just our patient records. "There's so much data that we reveal about our health: our Fitbits, when we search WebMD, when we go to the Mayo Clinic website. All of that information is being tracked, sold to advertisers and marketers, and sold to data brokers," she said.

Algorithms that group and discriminate

Another concern is the use of machine learning algorithms and artificial intelligence (AI) to extract patterns from the large volumes of data we generate.

And according to AI researcher Sandra Wachter, algorithms can find meaningful patterns in intimate data, revealing things that even the users themselves never meant to share. For example, an algorithm could determine that a user is a parent from their location data, search history and membership in parenting groups on social media.

"It's not just about being careful to not volunteer [that] information. The issue is that you might actually be giving it up anyway without being aware. So it's, you know, a very nosy algorithm, if you will, that learns very intimate things about you without you being aware," Wachter, a professor of technology and regulation at the University of Oxford, told Spark.

Wachter studies AI grouping, and recently wrote a paper on the need for regulation to protect against algorithmic harms.

Canadian lawmakers are currently working on this kind of legislation. In June, the federal government introduced Bill C-27, which, if passed, would update PIPEDA and create Canada's first-ever AI legislation. The latter would impose a set of regulations on the design and use of AI systems.

Wachter applauds these efforts, but says the issue can't be solved by laws alone. "We also need to change society, so that tech can be the plaster until we start healing the wounds of inequality that plagues our society," she said.

Intimate data privacy as a civil rights issue

One of the central ideas in Citron's new book, The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age, is that tech companies, not individuals, should bear the responsibility to protect user data.

"They should treat our intimate data as if they're the guardians of that data," she said.

Some of this can be addressed through legislation like Canada's PIPEDA, or the proposed American Data Privacy and Protection Act south of the border. But tech companies can also commit to collecting less intimate data in the first place. "If you shrink those data reservoirs, then there's only so much [the] government can access," Citron said.

"We should understand intimate privacy as a fundamental, civil right and a human right that needs rigorous protection. And that anyone handling our intimate information that includes individuals, companies and governments have duties of care, loyalty and not to sell and share information."


Written by Olsy Sorokina. Interviews with Danielle Citron and Sandra Wachter produced by Adam Killick and Nora Young.
