I want to use the most recent Meta AI training procedure as a prime example of Deceptive Design, something I’ve written about in past posts.
Meta has been sending notifications to some of its product users (e.g. on Facebook) informing them that their data will be used to train AI starting June 26, 2024. The notification looks something like this:
Right now it appears to apply to people in the UK and EU, where GDPR requires this extra step. There is currently no option in the US to opt out of Meta using your data to train its AI, though Meta says there will be eventual rollouts.
In order to receive this notification, you have to be logged in and inside the Facebook app itself. So if you happen not to be logged in, Meta will move forward without you.
Unlike an email or even a text message, a notification can be easily missed and is often fleeting. It pops up and can easily disappear. Using a notification to alert people about a vital policy change was intentional on Meta’s part.
Deceptive Design.
So while Meta is legally obligated to inform its users, it is using methods designed to draw the least attention possible, for all the insidious reasons you can imagine.
If you happen to catch that notification, it will give you a page like this:
It gives you only one button that says “Close.” There is no other button. If you click that Close button, you have essentially opted in to AI data training.
Think about that intentional design decision to make only one button, and one that opts you in. It’s a button that doesn’t say “Opt in,” but says “Close,” because the chance of us just clicking it out of habit when dismissing a popup window is extremely high. The designers knew this.
Deceptive Design.
If you want to opt out, you must go to the link in the text that says “Right to Object.” Note that they say you can object, not opt out. While they give you the right to object, they may not actually opt you out. There is a difference in language here.
When you click on the Right to Object link it leads you to this page:
It requires that you fill in your country of residence (presumably so they can skirt around local laws), your email, and finally the reason you want to object. We should be able to simply state something to the effect of “I don’t want it,” but even if we do, they might not honor our request. It is colonial culture for Meta to demand a reason and then decide for themselves whether it is justifiable.
Even though we should not be required to give them any specific reason, I recommend being as clear and straightforward as possible for this one. Explicitly state that you do not consent to your data being used to train Meta’s AI.
You will then receive an email response with Meta’s “decision.”
Another intentional design decision they made is that you have to do this separately in each of their apps. So while Meta shares data across its apps, it requires users to treat them as separate apps, each requiring its own opt-out action. (I’ve already covered why you should not use WhatsApp.)
Deceptive Design.
In Instagram you have to go to Settings > About > Privacy Policy. One would not first think to go to “About” to locate this information. Again, Deceptive Design. Once you click through, you will reach a page like this, where you can again find the “right to object” link leading to the same form as above. This option does not currently exist in the US.
Big tech companies use this type of Deceptive Design all the time, but so do smaller apps and platforms. They are intentionally lying and trying to trick you.
Deceptive Design is part of capitalist colonialism and has been used in many shapes and forms for centuries.
We must push back.
As always, if and when you can, opt out, opt out, opt out.