There is a natural aversion to the new and unfamiliar, to the point where we don't just have a cognitive bias called status quo bias; there are entire conservative ideologies built around it. Now that reading has been part of our culture for centuries, the resistance Socrates felt toward it seems agnostophobic. In our age of technological innovation, there are plenty of fields, from gene therapies to AI, to feel that same fear of the unknown about.
Another kind of resistance that challenges UX is doxastic anxiety: choosing to remain ignorant because you fear how knowledge will change your reality. This can look like putting off a cancer screening to avoid bad news, or not looking into the horrible conditions of sweatshops so you don't have to feel guilty enjoying something you love to buy.
When resistance becomes harmful, use UX to gently set up a transparent onboarding phase
Most of the things we design are meant to facilitate communication. Language, math, science, music, art, etc. are all ways that we have tried to express ourselves and understand each other.
So what happens if you do the opposite?.....
What happens when things are meant to be unclear, confusing, and uncomfortable? This tactic is obfuscation: design that is meant to hurt, meant to make things hard. Who has the right to understand? What happens when ignorance is a weapon?
This refers to when someone can't understand or express themselves. This is when labels can actually be helpful: knowing you have a mental illness, its name, and its support groups, versus just thinking something is wrong with you and that you are the only one experiencing this defect. Labels can also be protection; naming a specific kind of abuse helps victims navigate toward resources, versus being scared and not knowing how to feel safe or where to find help.
Dark patterns are like 'black hat' hacking: deliberately abusing expertise to harm people. With user experience, there are a lot of new opportunities to manipulate users.
For example, in US healthcare, patients are processed like customers trapped in a monopoly. If you get lab work done, they can have you sign an 'open' contract agreeing to pay whatever insurance doesn't cover. But they can't charge whatever they want, nor can they make up thousands of dollars of late fees. If they do, you can take them to small claims court (you don't even need a lawyer) and force a 'reasonable' settlement.
The dark patterns that are being used in this example:
Fear: using scary threats on the bill to make it seem like you HAVE to pay their made-up fees. Emotions like fear or anger can put you into 'tunnel vision', so your mind focuses on solving the most important problem right in front of you. This is designed for survival: if you are about to get mauled by a bear, you won't be distracted by "did I leave the stove on..". But this survival mechanism can be weaponized, so a patient won't be distracted by "Can they DO this? ...just make up any price, like a blank check...". This is where UX becomes insidious.
Bureaucracy: making a process so tedious that most people will just pay the stupid fine rather than waste weeks dealing with the courts. Sometimes when they hand patients these contracts, the font is printed very small with no spacing between lines, to make sure no one reads or really understands what is written. Or they make you follow endless forms and paperwork: this one is due this week...... the next one is due 3 days from now but you need a notary......... the next one is due in a month but you can only get it by calling the branch office, which is only open every other Monday from 4:00 to 4:01 AM..... etc. Just to make sure that even responsible people struggle to keep up. And when the patient is sick, disabled, in unstable housing, working long hours, or raising children, how will they even have the time or energy to process all these steps? That's the trick: they won't. This is when UX is weaponized to stop people from trying, without being directly illegal.
Education and entitlement: any white-collar worker or fancy lawyer would know enough about contract design to get made-up fees removed. But people who just moved to this country and don't know the laws or English very well won't see these obvious red flags. Growing up in poverty or abuse also affects whether someone feels entitled to fight bad fees. This is how UX can be weaponized to exploit the 'didn't know you were supposed to know that' problem.
A straightforward prevention strategy is transparency.
Build contracts or processes so straightforward that the user can solve their own problems. The feeling of autonomy is empowering, and it is a side effect of well-designed UX.
This becomes even more important when these designs are implemented in industries that have huge impacts on people's lives, like healthcare and government policy.
Slip errors occur when you perform the right action on the wrong object. For example, watercolor artists have to be careful not to mix up their paint water with their drink; a slip error could be drinking the paint water or dipping a paintbrush in their drink.
Sometimes slip errors are creative, like when a jazz musician slips onto a different note than planned and makes something more interesting than they intended.
Omission errors occur when an action is left out. Some people make shopping lists so they don't forget an ingredient they need. It can be dangerous when a medication dose is forgotten.
But sometimes omission errors are natural evolutions. For example, in linguistics, elision (the omission of sounds) can start as an error, like an American child saying 'fou' instead of 'four'. This is perceived as an error, and the child might go through speech therapy. But the same omission is part of several English dialects, and some omissions are simply efficient: 'I don't know' becomes 'I dunno'.
A rule-based mistake is applying a good 'rule' to the wrong situation, like using an old manual on a new device and calibrating it wrong. It can also be more deliberate, like choosing to drive faster than the speed limit and getting into a crash.
A knowledge-based mistake occurs with inadequate knowledge or specialized training, like when a doctor treats for type 2 diabetes but later finds out it was type 1: the right treatment plan for the wrong disease.
This can also occur in unfamiliar situations: you are new to a city and drive correctly but arrive late because you didn't know that a main arterial road gets really congested. The locals know to avoid it in the mornings.
Humans make mistakes; it is unrealistic to expect otherwise. If you threaten people when they fail, they won't stop being human, but they will stop reporting honestly, making the system more unsafe for everyone.
One strategy pilots use is semi-anonymous reporting. They can report errors to the design/engineering team, and once the report is filed their name is removed. This way people are not afraid to warn about dangerous errors and mistakes.
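A minimal sketch of what that de-identification step could look like in software; the report fields and the file_report helper here are hypothetical, just to make the idea concrete:

```python
from dataclasses import dataclass, replace
from datetime import date

@dataclass(frozen=True)
class IncidentReport:
    reporter_name: str | None  # known only during intake
    aircraft: str
    description: str
    occurred_on: date

def file_report(report: IncidentReport) -> IncidentReport:
    """Strip the reporter's identity before the report is stored or shared.

    The design/engineering team still gets the full safety details,
    but nothing that could be used to punish the person who spoke up.
    """
    return replace(report, reporter_name=None)

# Intake keeps the name just long enough to ask follow-up questions,
# then files the anonymized version.
raw = IncidentReport("J. Doe", "N123AB", "Flaps lever confused with gear lever", date(2024, 5, 1))
stored = file_report(raw)
assert stored.reporter_name is None
```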
It is impossible to make a perfect system. The Swiss cheese model shows how layers of protection can lower the risk of an accident getting past every preventive measure. But these error-prevention models can also give a false sense of security: over time the protective layers decay (the cheese holes grow and multiply). The results can be devastating, as in oil spills or plane crashes.
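A back-of-the-envelope illustration of why layers help and why decay matters (the numbers are made up, and it assumes the layers fail independently, which real systems rarely guarantee):

```python
# Each number is the chance a hazard slips through that one layer (the "hole").
fresh_layers = [0.10, 0.05, 0.02]      # well-maintained defenses
decayed_layers = [0.30, 0.25, 0.20]    # the same defenses after years of neglect

def chance_of_accident(layers):
    """Probability a hazard passes through every layer, assuming independence."""
    risk = 1.0
    for hole in layers:
        risk *= hole
    return risk

print(chance_of_accident(fresh_layers))    # 0.0001 -> about 1 in 10,000
print(chance_of_accident(decayed_layers))  # 0.015  -> about 1 in 67
```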
In healthcare, there are drug administration 'rights': right drug, right dose, right patient... Even with these precautions, failures still occur in hospitals.
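A minimal sketch of those 'rights' as a checklist where any single mismatch blocks the action; the fields and values below are illustrative, not a full clinical list:

```python
def check_rights(order: dict, scanned: dict) -> list[str]:
    """Compare what was ordered against what is about to be given.

    Returns the list of 'rights' that failed; an empty list means proceed.
    """
    rights = ["patient", "drug", "dose", "route", "time"]
    return [r for r in rights if order.get(r) != scanned.get(r)]

order = {"patient": "MRN-1042", "drug": "metformin", "dose": "500 mg", "route": "oral", "time": "08:00"}
scanned = {"patient": "MRN-1042", "drug": "metformin", "dose": "850 mg", "route": "oral", "time": "08:00"}

failures = check_rights(order, scanned)
if failures:
    print("Do not administer, mismatched:", failures)  # ['dose']
```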
Mistakes can be an indication of systemic failure. For example, drug systems have so many warning 'pop-ups' that they cause alarm fatigue, to the point where clinicians just dismiss them without processing (a rule-based mistake). Then they miss an important change (a knowledge-based mistake).