
Choice architecture and software design: how Facebook works against your privacy interests




Social media platforms have been around long enough for users to have gone through several phases of how they use them—perhaps once for travel photos, then later for neighborhood news; a run at political discussions, then updates about the kids, or professional posts for a new job. Changing how and what one shares is common, and there are plenty of articles online about how to lock down a Facebook profile or change the privacy of all one's posts. Every article hits the same wall, though: Facebook uses choice architecture to keep customers from managing their privacy efficiently.

Your decisions can be nudged for good

Choice architecture describes the design of the choices presented to consumers and the impact of that design on decision-making. The term was coined by Richard Thaler and Cass Sunstein in their 2008 book, Nudge: Improving Decisions about Health, Wealth, and Happiness, which lays out how we humans make predictable mistakes because of our regular use of cognitive "shortcuts": heuristics, fallacies, and social influences. When used for good, as Thaler and Sunstein advocate, "nudges" can improve outcomes for our finances (through automated retirement savings), healthcare, education, and more.

Facebook nudges your decisions toward its own benefit

Facebook uses choice architecture and "dark patterns" of user interface design to work against your privacy interests. It does so using rewards and punishments. For example:

  • To change the privacy of your past posts, you must change them one at a time. 
  • While you can hide posts from your timeline, they still appear in Facebook news feeds and search. 
  • Because of recent well-publicized hacks, you can no longer “view your profile” as someone else to check your privacy settings. 
  • But there’s an easy out: you can change all your posts to “Friends only.” 

The punishment for making a significant change to your post privacy (say, to you only) is your lost time: start from the first year you used the platform, go by months or years at a time, and change post privacy one post at a time. It's a mind-numbing, painful experience, tied to distractions (oh, I wonder what she's been up to?), past experiences, and past emotions. If one posted just twice a week for a few years, the likelihood of successfully changing 300+ posts is very, very low.

Instead, Facebook offers a reward: change all your posts at once and save hours of work! The catch is that the privacy setting becomes "Friends only," so you'll continue to share.

Facebook forces customers into painful decisions. Spend hours of mindless clicking? Delete one's account entirely? That's a painful act if friends, family, neighbors, or the kids' school groups continue to use the platform regularly. Alternatively, "just" keep sharing with Friends…

Freedom starts by becoming conscious of design manipulation

Manipulation is nothing new—"bait and switch" tactics have been around for as long as there have been markets. However, software—via smartphones and more—is now all around us, all the time, in our work and everyday lives. By becoming conscious of how software systems are designed to nudge and manipulate our decision-making, we can start down a path toward more conscious, more deliberate use. A great resource is the website Dark Patterns. Its Hall of Shame highlights the worst offenders and builds your ability to recognize manipulation.

There’s even a dark pattern named after Mark Zuckerberg: Privacy Zuckering, in which you are tricked into publicly sharing more information about yourself than you intended to.

Photo by Victoriano Izquierdo on Unsplash.