From HN comments:

Those categories are useful. But at the end of the day, dark patterns can be boiled down to "exploiting some asymmetry between the customer/user and the seller/software". The asymmetry is often one of knowledge, but sometimes it is simply the fact that the customer/user is a human being who can feel shame (see: confirmshaming), lose their patience (nagging), and so on.
Nagging in particular annoys me to no end, partly because I'm resilient to this sort of manipulation, and "caving in" is what releases the tension created by the nagging.
What's this one called? You do a video call on some service, like Messenger, WhatsApp, or Zoom. After the call, you get a popup asking you to rate the quality of the service on a 5-point scale. If you give it five stars, it says "thanks" and lets you go. If you pick any lower rating, it launches into the whole "help us do better, please fill out this form" spiel, which is obviously a lot more work.
It's a type of nagging, since it tries to wear you down into compliance. It also has elements of bait and switch (it turns a simple interaction into a more laborious one) and, depending on how it's worded, confirmshaming.
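To make the effort asymmetry concrete, here is a minimal TypeScript sketch of that flow. Every name in it is invented for illustration; it does not come from any real product's code.

```typescript
// Hypothetical sketch of the post-call rating popup described above.
// All names are made up; no real product's code is implied.

type RatingOutcome =
  | { kind: "done"; message: string }
  | { kind: "followUpForm"; fields: string[] };

function handleRating(stars: 1 | 2 | 3 | 4 | 5): RatingOutcome {
  if (stars === 5) {
    // The "satisfied" path costs the user a single click.
    return { kind: "done", message: "Thanks for your feedback!" };
  }
  // Every other rating routes the user into a laborious form,
  // which is the effort asymmetry the comment above points at.
  return {
    kind: "followUpForm",
    fields: ["What went wrong?", "Describe the issue", "May we contact you?"],
  };
}

console.log(handleRating(5)); // one click and you're done
console.log(handleRating(4)); // even four stars triggers the long form
```

The branch on the top rating is the whole pattern: praise is cheap to give, while any criticism is taxed with extra work.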
I think the #1 dark pattern is how cell phone makers collude with app makers to opportunistically (and illegally) turn the expensive devices that WE PAY FOR into bugs and telemetry devices that listen to our conversations, privately snitch on us, and sell that information to private companies and, for all we know, even individuals. The devices we buy secretly log our conversations all day long and turn them into text that can persist forever.
Ultimately, privacy Zuckering.
LinkedIn: presenting their "suggested" connections as if those people had already asked to connect with you, when in fact you are the one initiating the connection if you click the blue button.
[Another reply to the post-call rating question:] Bait and switch, perhaps?