It is evident that the COVID disruption ushered in a wave of digital transformation for businesses – some thrived and some struggled. Meanwhile, big tech companies have been growing at enormous speed over the past decade, and many lawmakers, activists, and others have been calling for regulation of companies like Google, Apple, and Facebook.
There are many battles between lawmakers and big tech, antitrust being at the forefront. However, I want to focus on data privacy because it presents complex dilemmas and it seems that Congress will not regulate it anytime soon.
First, companies have been profiting from the sale of personal data for years. Maybe we should cut out the middleman and sell it ourselves…?
With efforts to specifically regulate data privacy stalled in Congress, it seems public pressure has pushed these companies to improve the protection of consumers’ data.
(On a side note, whenever I read or hear that big tech is protecting our privacy, I am skeptical.)
Which brings me to a dilemma – what if giving up some of your private data meant gaining access to something of significant personal value? Or what if giving up your private data meant bettering society? One might argue that this happens already, but I believe that as technology advances, the decision will become more complex.
To answer the first question, I want to focus on wearable tech and personal health data as an example. More and more people are using wearable technology, with sales expected to increase 18.1% to $81.5 billion this year. Wearable tech has improved to better track health data such as blood pressure, O2 levels, and more. The tragedy and destruction caused by COVID may have pushed people to adopt wearable tech faster and pushed companies to collect health data to help diagnose COVID and other health concerns.

Am I willing to give up this private health data to get better treatment? To be notified if I am at high risk from a virus? To gain access to a vaccine? On the surface that sounds like a good trade, but in the wrong hands this data could lead to discrimination. Could it enable an in-depth analysis of my pre-existing conditions, allowing insurance companies to charge me more? That practice is currently illegal, but that doesn’t mean the protection will last forever or that there aren’t legal loopholes.
Another, more specific example that answers the second question is Apple’s attempt to combat child sexual abuse on iPhones. Apple and other major tech companies were under scrutiny for not doing enough to combat child sexual abuse. Apple recently announced that it had developed new software to “root out images of sexual abuse from iPhones”. On the surface, that seems like a great idea. Identify child sexual abusers, then bring these people to justice. The software also allows parents to set safety parameters on their child’s phone that can block sensitive pictures sent via text. Roughly, the detection works like this:
- Apple’s software scans photos on the device itself (before they are uploaded to iCloud Photos), computing a hash for each – a numeric fingerprint derived from the image, which Apple calls a “NeuralHash”
- These hashes are cross-referenced against a database of hashes of known child sexual abuse photos
- If an account passes a threshold of about 30 matches, a human reviewer is brought in to confirm
- If confirmed, that person’s private information is presented to the authorities
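The matching-and-threshold pipeline above can be sketched in a few lines of Python. This is an illustrative sketch under my own assumptions, not Apple’s implementation: a plain cryptographic hash stands in for Apple’s perceptual “NeuralHash” (so only exact duplicates would match here), and the function names and threshold constant are mine.

```python
import hashlib

# Per Apple's announcement, roughly 30 matches are required before
# a human reviewer is brought in.
MATCH_THRESHOLD = 30

def photo_hash(photo_bytes: bytes) -> str:
    """Stand-in fingerprint. A real perceptual hash (e.g. NeuralHash)
    would also match near-duplicates; SHA-256 matches exact bytes only."""
    return hashlib.sha256(photo_bytes).hexdigest()

def flag_for_review(photos: list[bytes], known_abuse_hashes: set[str]) -> bool:
    """Cross-reference each photo's hash against the known-image database;
    return True once matches reach the threshold that triggers human review."""
    matches = sum(1 for p in photos if photo_hash(p) in known_abuse_hashes)
    return matches >= MATCH_THRESHOLD
```

The key design point the sketch captures is that nothing is flagged on a single match; only an accumulation of matches against known material crosses the reporting threshold.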
Once this software ships in new versions of the iPhone’s operating system, it could be used in malicious ways. Could a government scan private citizens’ phones for images that criticize the government? What is actually considered my personal, private property? It reminds me of Pandora’s box – once it’s out there, you can’t take it back.
But, this technology could effectively reduce child sexual abuse. Am I willing to give up my privacy in order to help eliminate child sexual abuse? I think that is a loaded question, because the situation is far more complex.
Emerging technologies could usher in new advances in improving our healthcare and protecting vulnerable people, to name two examples. Companies, big and small, need to address how they are protecting consumer data while also maintaining a competitive advantage by collecting it. It is a delicate line to walk, and companies will need to tailor messaging in order to effectively convey what they are doing and why they are doing it.
Consumers need to take more time evaluating the trade-offs of providing or withholding data. This applies whether you are sharing data as a consumer or with an employer who wants to gather it. That evaluation also depends on transparency and trust.
I expect companies will experiment with data collection by letting consumers choose which categories of data to share. At the same time, as technology improves, I expect consumers to face increasingly complex decisions about whether the benefits justify sharing.
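What granular, opt-in sharing might look like can be sketched as a simple preferences object. This is purely hypothetical – the category names and defaults are my assumptions, not any company’s actual settings or API – but it shows the core idea: every category is off unless the consumer explicitly opts in.

```python
from dataclasses import dataclass

@dataclass
class SharingPreferences:
    """Hypothetical per-category consent flags; everything defaults to
    opt-out, so sharing requires an explicit choice by the consumer."""
    health_metrics: bool = False    # e.g. heart rate, O2 levels from a wearable
    location: bool = False
    purchase_history: bool = False

    def shared_categories(self) -> list[str]:
        """List only the categories the consumer has opted in to share."""
        return [name for name, opted_in in vars(self).items() if opted_in]

# A consumer opting in to health data only, e.g. for better treatment:
prefs = SharingPreferences(health_metrics=True)
```

A design like this also gives companies a concrete place to attach the transparency piece: each flag could link to a plain-language explanation of what is collected and why.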
There will come a point when I will ask whether it’s worth sharing personal data in the long run, and many people may feel differently. As for me, when I am presented with those wordy, legalese terms and conditions pop-ups, I may respond with: