Data Privacy: Initial Expectations for Emerging Technologies and Digital Business

The COVID disruption has clearly ushered in a wave of digital transformation for businesses; some have thrived and some have struggled. Big tech companies, meanwhile, have been growing at enormous speed for the past decade, and many lawmakers, activists, and others have been calling for regulation of giants like Google, Apple, and Facebook.

There are many battles between lawmakers and big tech, antitrust being at the forefront. However, I want to focus on data privacy because it presents complex dilemmas and it seems that Congress will not regulate it anytime soon.

First, companies have been profiting from selling personal data for years. Maybe we should cut out the middleman and sell it ourselves…?

There has been pushback from both lawmakers and consumers about the collection of this data and how it is used. Public pressure has pushed companies like Apple to better protect data privacy. Apple now aggressively informs consumers about how their data is collected and protected, and in 2020 it announced a policy change allowing users to opt out of app tracking. Google has even followed Apple's Safari and Mozilla's Firefox in limiting third-party cookies to better protect consumers' data.

With efforts stalled in Congress for specifically regulating data privacy, it seems public pressure has pushed these companies to improve the protection of consumers’ data.

(On a side note, whenever I read or hear that big tech is protecting our privacy, I am skeptical.)

Which brings me to a dilemma – what if giving up some of your private data meant you gained access to something of significant personal value? Or what if giving up your private data meant bettering society? One might argue that this happens already, but I believe that as technology advances, the decision will become more complex.

To answer the first question, I want to focus on wearable tech and personal health data as an example. More and more people are using wearable technology, with sales expected to increase 18.1% to $81.5 billion this year. Wearable tech has improved to better track health data such as blood pressure, O2 levels, and more. The tragedy and destruction caused by COVID may have persuaded people to adopt wearable tech faster and pushed companies to collect health data that could help detect COVID or other health concerns. Am I willing to give up this private health data to get better treatment? To be notified if I am at high risk from a virus? To gain access to a vaccine? On the surface that sounds like a good trade, but this data in the wrong hands could lead to discrimination. Could this data lead to an in-depth analysis of my pre-existing conditions, allowing insurance companies to charge me more? Currently that practice is illegal, but that doesn't mean the protection will last forever, or that there aren't legal loopholes.

Another more specific example that answers the second question is Apple’s attempt to combat child sexual abuse on iPhones. Apple and other major tech companies were under scrutiny for not doing enough to combat child sexual abuse. Apple recently announced that they had developed new software to “root out images of sexual abuse from iPhones”. On the surface, that seems like a great idea. Identify child sexual abusers, then bring these people to justice. This software also allows parents to set safety parameters on their child’s phone that could block sensitive pictures sent via text.

But, reading deeper into the “how” of this new software reveals some dangers. In short:

  • Apple's software scans photos on a user's phone itself (not just those already in iCloud) by computing a unique numeric fingerprint, or hash, from each image's contents
  • Those fingerprints are cross-referenced against a database of fingerprints of known child sexual abuse images
  • If there are more than 30 matches, an Apple employee reviews the flagged material to confirm
  • If confirmed, that person's private information is reported to the authorities
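
The matching flow described above can be sketched in a few lines of Python. This is only a toy illustration: the function names are my own, and I use an exact cryptographic hash as a stand-in for Apple's "NeuralHash," which is a perceptual hash designed to match visually similar images; the cryptographic protections Apple layers on top are omitted entirely.

```python
import hashlib

# Toy stand-in for a perceptual hash. A real system matches visually
# similar images; SHA-256 here only matches byte-identical files.
def image_fingerprint(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of known abuse images
# (in reality maintained by child-safety organizations and shipped
# to devices in an unreadable form).
KNOWN_BAD = {image_fingerprint(b"known-bad-image-%d" % i) for i in range(100)}

MATCH_THRESHOLD = 30  # matches required before human review kicks in

def scan_library(photos: list[bytes]) -> bool:
    """Return True if this photo library crosses the review threshold."""
    matches = sum(1 for p in photos if image_fingerprint(p) in KNOWN_BAD)
    return matches >= MATCH_THRESHOLD
```

The key design point is the threshold: a single accidental match never triggers review; only a pattern of 30 or more does.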

Once this software ships in new versions of the iPhone, it could be used in malicious ways. Could a government scan private citizens' phones for images that criticize the government? What is actually considered my personal, private property? It reminds me of Pandora's box – once it's out there, you can't take it back.

But, this technology could effectively reduce child sexual abuse. Am I willing to give up my privacy in order to help eliminate child sexual abuse? I think that is a loaded question, because the situation is far more complex.

Emerging technologies could usher in advances in healthcare and in protecting vulnerable people, to name two examples. Companies, big and small, need to address how they protect consumer data while maintaining the competitive advantage that comes from collecting it. It is a delicate line to walk, and companies will need to tailor their messaging to convey clearly what they are doing and why they are doing it.

Consumers need to take more time evaluating the trade-offs of providing, or not providing, data to companies, whether as customers or as employees whose employer wants to gather their data. This evaluation also depends on transparency and trust.

I expect companies will experiment with data collection by letting consumers choose, category by category, which data they opt to share. At the same time, as technology improves, I expect consumers to face increasingly complex decisions about sharing data based on the benefits offered.
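
One way to picture that kind of experiment is a per-category opt-in model, where every data category defaults to "do not share." The categories and names below are hypothetical, just a sketch of the idea:

```python
from dataclasses import dataclass, field

# Hypothetical consent model: each data category is an independent
# opt-in, so a consumer can share crash analytics but not location.
@dataclass
class ConsentPreferences:
    categories: dict = field(default_factory=lambda: {
        "analytics": False,    # app usage statistics
        "location": False,     # location history
        "health": False,       # wearable / health metrics
        "advertising": False,  # cross-app ad tracking
    })

    def allows(self, category: str) -> bool:
        # Default-deny: unknown or unset categories are never shared.
        return self.categories.get(category, False)

prefs = ConsentPreferences()
prefs.categories["health"] = True  # consumer opts in to one category only
```

The default-deny rule is the important part: anything the consumer has not explicitly opted into, including categories the company adds later, stays private.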

There will come a point when I ask whether sharing personal data is worth it in the long run, and many people may feel differently. As for me, when I am presented with those wordy, legalese terms-and-conditions pop-ups, I may respond with:


Comments

  1. Great post. I do think we’ve seen Big Tech care more about privacy in recent years, if only for the reasons you mention above. They are realizing that they need to give something in exchange for data collection. It’s the smaller companies I worry about more these days.

  2. Kanal Patel

    Loved the breakdown of how Apple is collecting the data to tackle child sex abuse. I have favored the EU’s laws around collecting personal data for many years now. While I understand the positive outcomes of having mass data collected and analyzed, I am also skeptical of all the wrong hands it could fall into and how it can have destructive impacts on people as well. It’s a very delicate balance for sure. We see it in the news with all the hacks and misuse of data. Here is a story I thought was really crazy: the breach of Vastaamo, Finland’s largest private psychotherapy center.

  3. Christina S

    My sister is a data privacy/information protection attorney, and sparked my interest in this subject years before it was on most people’s radar. Initially I thought a lot of the concerns she raised were futuristic paranoia, but now as tech increasingly infiltrates our lives and companies (and govts) seek to capitalize on the ever-important data, I see a lot of these cautionary tales playing out that had once seemed exaggerated or way too futuristic to be worrying about. I think the conundrum you raise is an interesting one – “what if giving up some of your private data meant you gained access to something of significant personal value? Or giving up your private data meant bettering society?” When phrased as such, the answers seem obvious, but as you assert these are loaded questions, and who defines the parameters? I hope to continue to explore this in greater detail throughout the course!

  4. I really enjoyed reading this post and it gave me a great deal to think about. On one hand I really appreciate how technology is improving our everyday lives; on the other hand, where should we draw the line? Should we give up our privacy altogether for a better society, or should we allow companies like Apple to collect all our private data for a safer society?
