The Personalization Paradox: Balancing Convenience and Privacy
It’s your friend’s thirtieth birthday next week, and you have been scouring the internet for over two hours now trying to find the perfect gift. Should you play it safe with a gift basket? You know she loves red wine and cheese, but what about that beautiful necklace you saw on… where did you see that again? You’ve gone through so many sites now you can no longer remember. Oh, and that bouquet of flowers was beautiful, maybe she would like that more than the wine!
In the end, you settle for the wine and cheese basket from the first website you browsed. With a deflated sigh you close your laptop, exhausted by the entire experience.
Does this sound familiar to you? This is a prime example of what is known to behavioral scientists as the paradox of choice, or choice overload: the overwhelming feeling associated with making a decision when faced with a large number of options.
Choice overload doesn’t just cause trouble for us while we’re stuck trying to make a decision. It also has negative consequences that persist after the choice has been made. Returning to our example above, imagine that the following day, you come across a customizable engraved bracelet that you know your friend would have loved, and your heart sinks. This disappointment is the sting of opportunity cost: your satisfaction with what you chose is dulled because you can’t stop comparing it to the alternatives you passed up.
It’s estimated that the average person makes 35,000 decisions a day. That’s an awful lot of choice overload and opportunity cost to deal with. Can modern-day personalization technology help reduce the burden of choice?
Personalization tech, explained
As the name suggests, personalization is the tailored experience a company creates for a customer, based on what it knows about that customer. In the age of big data, we’re all well acquainted with personalization: open any social media platform and your feed has been curated specifically for you, based on your interests and potentially even your current mood.
TikTok, in particular, is known for its use of powerful personalization algorithms to generate its trademark “for you” pages (colloquially known as FYPs), where users are fed a stream of content selected specifically for them. This level of personalization is behind the platform’s ability to suck users (such as myself) into a TikTok trance, often for hours at a time.
Over on Netflix, you can also find a percentage match for each movie based on what you’ve watched in the past, estimating the likelihood that you’ll enjoy a particular film or series. Similarly, on Spotify, you can find a “Based on your recent listening” section that recommends songs you may enjoy. This enhances the consumer experience and saves a vast amount of time that could be wasted trawling through the thousands of film or music options available on these streaming services.
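Streaming services don’t publish their exact formulas, but the core trick behind a “percentage match” can be sketched in a few lines of code. The toy example below (the genre axes and scores are invented for illustration, not Netflix’s real method) represents both you and a title as a handful of numbers and measures how closely they point in the same direction:

```python
from math import sqrt

def percentage_match(user_taste: list[float], title_features: list[float]) -> int:
    """Score a title against a user's taste profile (0-100) using
    cosine similarity between two feature vectors."""
    dot = sum(u * t for u, t in zip(user_taste, title_features))
    norms = sqrt(sum(u * u for u in user_taste)) * sqrt(sum(t * t for t in title_features))
    if norms == 0:
        return 0
    return round(max(0.0, dot / norms) * 100)

# Hypothetical axes: [comedy, drama, documentary], inferred from viewing history.
my_taste = [0.9, 0.3, 0.1]
new_release = [0.8, 0.4, 0.0]
print(f"{percentage_match(my_taste, new_release)}% match")  # -> 98% match
```

Real systems fold in far more signals (watch time, what similar users enjoyed, even time of day), but the principle is the same: you are represented as data, and the catalog is ranked against it.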
As someone who suffers from crippling indecisiveness, I rejoice at these features. I am what is known as a maximizer: someone who seeks to make the best possible decision, as opposed to a satisficer, who is content with a choice that is “good enough.”1 Having a narrowed-down selection to choose from saves me time and spares me the anguish of choice overload.
The big data driving personalization
In a world where time-rich is the new cash-rich, personalization technology is flourishing. But how does it work? In short, the answer is data mining. Every click of a button online can be tracked, and this information is extremely valuable to organizations that make money from selling (or advertising) products or services to consumers. I’m sure you are familiar with those pesky pop-ups asking for your consent to cookies: small pieces of information that websites store on your computer. Websites use cookies to remember and identify you, store your preferences, and create personalized content.2
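To make the mechanism concrete, here is a minimal sketch of the cookie round-trip using Python’s standard library (the cookie name and value are hypothetical):

```python
from http.cookies import SimpleCookie

# On your first visit, the site attaches an identifier to its response.
outgoing = SimpleCookie()
outgoing["visitor_id"] = "abc123"          # hypothetical tracking ID
outgoing["visitor_id"]["max-age"] = 86400  # persists for one day
outgoing["visitor_id"]["path"] = "/"
print(outgoing.output())  # emits a Set-Cookie header the browser will store

# On every later visit, your browser sends the cookie back, letting the
# site recognize you and build up a preference profile over time.
incoming = SimpleCookie()
incoming.load("visitor_id=abc123")
print(incoming["visitor_id"].value)  # -> abc123
```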
Social media platforms also store information about us, such as what content we interact with the most, what we like, and where we spend our time online, and they use this to deliver tailored content to our feeds. Open up my Instagram and you will find beautiful snaps of Balinese beaches or huts on the water in the Maldives; yes, you guessed it, I love to travel. Scroll through my TikTok and you will find endless hilarious cat clips, the latest exhibitions in London, and latte art. It’s as if TikTok knows I just bought a coffee machine.
Where do we draw the line?
For all the convenience and expediency that personalization can bring, the technology that undergirds it has also attracted its fair share of negative attention. As the capabilities of big data grow, it becomes easier and easier for tech products to cross the line from helpful to creepy.
Have you ever wondered whether your phone is listening to you in order to send you personalized adverts? Facebook CEO Mark Zuckerberg assured US senators during a five-hour congressional hearing that Facebook does not monitor conversations, and Instagram has denied similar allegations. The fact that such suspicions have become widespread enough to spark this level of response suggests that many users are at least slightly disturbed by how well their devices seem to know them.
Similar concerns have been raised about Google’s predictive text, which often seems to know what you want to say before you’ve even typed it. AI emotion recognition technology (ERT) has the power to know what emotion you are feeling even if you don’t have the words to express it. Although these tools have many beneficial use cases — for example, in medical settings, they can be used to detect which patients are experiencing the most discomfort — they can also infringe on the general population's privacy if they are ambiently monitoring us.3
Products that employ technologies like these for the purposes of hyper-personalization often end up falling into the so-called “creepiness ditch”: the point where an algorithm knowing what you need, precisely when you need it, stops feeling helpful and starts feeling unsettling. If you use technology often enough, your digital profile may even know you better than you know yourself: it knows your likes, your dislikes, and everything you’ve ever searched. This raises the question: where exactly do we draw the line between useful and invasive?
It is true that personalization technology has the power to save us time, but it also has the power to consume it — and companies often benefit financially from doing so. Far from enhancing our ability to make the best decisions for ourselves, potent and addictive software can erode our free will by trapping us in loops of instant gratification. In the worst cases, it can feel like companies use our digital profiles as something akin to voodoo dolls, enticing us to buy, click, like, or consume content without ever knowing that they are pulling on our behavioral levers to influence us.
It is extremely difficult to choose to leave these platforms, and even when I do, I find myself regularly fighting the urge to return to them. As of 2022, 47% of Americans consider themselves “addicted” to their phones, and 74% feel uneasy leaving their phones at home.4 Do the seconds saved by personalization technology really compensate for the potential hours spent scrolling through highly addictive personalized content?
Data security and ethical data management
All of this is further complicated by questions about the security of the vast amounts of personal data being collected from consumers. A US retail privacy survey carried out by Deloitte found that only 1 in 5 retail executives surveyed believes that they have done an adequate job of integrating their data privacy strategy with their overall corporate or business unit strategy.5 This is certainly less than reassuring. In the same survey, managers cited lack of funding, inadequate data management within their organizations, and lack of clear government regulations as challenges in implementing a consumer privacy strategy.
In Europe, the GDPR protects consumer data. In the US, by contrast, a clear federal mandate is lacking, and it is up to individual states to enact data privacy regulations, leading to confusion and inefficiency. The stakes are high when our digital profiles include information such as our addresses, geolocation data, medical and legal records, login credentials, and credit histories. Data security needs to be a central part of how organizations operate, not just a regulation they have to follow or a “nice to have.”
Making personalization more ethical (and less creepy)
Fortunately, we consumers have some sway here: by favoring companies that prioritize data security, we can push the market to align with our data privacy values. These consumer- and trust-focused organizations make it easy for people to manage their privacy, personalization, and data settings; are transparent about how data is used; empower employees to champion their clients’ data protection; and build robust cybersecurity measures into their infrastructure. Such companies can act as a model for more ethical data management.
Most companies adopt personalization tech in an opt-out manner, where the personalized features are the default. A key feature of human behavior is our tendency to stick with the default and go with the flow, a cognitive bias known as status quo bias. While the choice to opt out is ours to make, the friction of hunting down the relevant settings, combined with status quo bias, means the vast majority of us never turn personalization off. Switching to an opt-in setup offers a more ethical approach, as it forces people to make a conscious decision about which personalized settings work best for them.
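The difference is easy to see in code. Below is a sketch built around a hypothetical settings object; under the opt-in design, nothing is personalized until the user makes an explicit choice:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    personalized_feed: bool
    personalized_ads: bool

# Opt-out design: personalization is on from day one. Status quo bias
# and the friction of finding these switches keep most users here.
opt_out_defaults = PrivacySettings(personalized_feed=True, personalized_ads=True)

# Opt-in design: everything starts off, and features are only enabled
# after a conscious decision by the user.
opt_in_defaults = PrivacySettings(personalized_feed=False, personalized_ads=False)

def apply_user_choice(settings: PrivacySettings, wants_personalization: bool) -> None:
    # The user's explicit answer, not the company's default, decides.
    settings.personalized_feed = wants_personalization
    settings.personalized_ads = wants_personalization
```

Same product, same features; the only thing that changes is whose inaction decides.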
Another simple adjustment organizations can make to improve the ethics of personalization features, and foster trust from consumers, is informed consent. Informed consent with regard to data collection gives the user an overview of what data is being collected from them, where it is being used, and any risks associated with this, so they can make informed decisions about their own data. This should be presented in simple and concise terms, avoiding data science jargon and pages and pages of fine print. Informed consent regarding personalization features should appear within existing data management pop-ups.
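As a sketch of what that could look like in practice (the field names here are hypothetical, not any regulation’s required format), a consent prompt can be driven by a plain-language disclosure object, with the user’s answer recorded alongside it:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentDisclosure:
    """The plain-language summary shown before the user decides."""
    data_collected: list[str]
    used_for: list[str]
    risks: list[str]

@dataclass
class ConsentRecord:
    disclosure: ConsentDisclosure
    accepted: bool
    timestamp: str

disclosure = ConsentDisclosure(
    data_collected=["pages viewed", "likes", "search history"],
    used_for=["content recommendations", "targeted advertising"],
    risks=["your profile may be shared with advertising partners"],
)

# Nothing is personalized until `accepted` flips to True, and the exact
# disclosure the user saw is stored together with their decision.
record = ConsentRecord(
    disclosure=disclosure,
    accepted=False,
    timestamp=datetime.now(timezone.utc).isoformat(),
)
```

The particular fields matter less than the principle: the user decides against a disclosure they can actually read, and that decision is stored so it can be honored later.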
Ultimately, personal data should be treated as the property of its owner and protected with the same vigilance as any other asset. Companies should therefore always allow us access to our own data and inform us of where it is being used. Unfortunately, this is often not the case, and even where it is (such as the right of access under the GDPR), obtaining the information can be a lengthy and discouraging process.
To opt in or opt out? That is the question
To recap: personalization tech is extremely good at saving time, and time, after all, is our most valuable asset. It can alleviate the burden of choice associated with modern-day living. It can help build rapport between company and customer, making us feel like valued clients and enhancing our satisfaction. On the flip side, personalization technology can be used for predatory targeted advertising and can make platforms highly addictive.
As with many modern algorithms, the inner workings of personalization tech are often opaque, and can be intrusive. If we accept personalization for the sake of convenience in certain aspects of our lives, then we must also be aware of the more sinister purposes our data could serve in the future, such as hiking up our insurance premiums or even denying us loans or mortgages.
Perhaps those pesky pop-ups asking for our consent are not so bad after all, as they serve to protect our data privacy. Will you stop for a second before hitting “agree to all” next time you see one of these? Will you opt in or opt out of personalized features?
References
- Schwartz, B. (2016). The Paradox of Choice: Why More Is Less. Ecco Press.
- Hoffman, C. (2017). What is a Browser Cookie? How To Geek. https://www.howtogeek.com/119458/htg-explains-whats-a-browser-cookie/
- Fulmer, J. (2021, September 21). The Value of Emotion Recognition Technology. IT Business Edge. Retrieved March 15, 2022, from https://www.itbusinessedge.com/business-intelligence/value-emotion-recognition-technology/
- Wheelwright, T. (2021, April 21). Cell phone behavior survey: Are people addicted to their phones? Reviews.org. https://www.reviews.org/mobile/cell-phone-addiction/
- Consumer privacy in retail: The next regulatory and competitive frontier. (n.d.). Deloitte. Retrieved February 28, 2022, from https://www2.deloitte.com/content/dam/Deloitte/us/Documents/consumer-business/us-retail-privacy-survey-2019.pdf
About the Author
Eva McCarthy
Eva holds a Bachelor of Science in Mathematics and is currently pursuing a Master’s in Cognitive and Decision Science at University College London. She is a committee member of UCL’s Behavioral Innovations Society, a student community of behavioral scientists that aims to deliver positive and sustainable behavior change within UCL and beyond. She also works for Essentia Analytics, a behavioral data analytics service that helps investment managers make measurably better investment decisions. Standing at the precipice of major technological upheaval, she believes it is essential to apply behavioral science research to new technological advancements.