Historically, deaf children in the US were banned from using sign language. The long-term effect was that they struggled to build abstract thought and complex conceptualization. Even when they were taught language later, the effects of the delay were permanent.
This denial of support for disability is still seen in the US education system. Children with undiagnosed learning disabilities can be further punished when they are denied scholarship opportunities and higher education that depend on standardized test scores.
Another form of language suppression is cutting people off from their culture: Native American children were punished for speaking their native languages, and children are still marked incorrect for using AAVE in school, even though it is a valid English dialect.
Language is a powerful way to connect with people; the ability to cut others off from it is even more powerful.
Usually writing systems develop slowly, over hundreds of years and through many hands. But there are some unique cases where a script was designed very deliberately. For example, the Korean and Cherokee scripts were designed to unify literacy efforts and directly support the needs of their communities.
This is happening right now in Africa, with new scripts built to better represent sounds in African languages that don't fit Roman characters well. For example, the Adinkra alphabet was designed in 2015 and the Luo script in 2009.
Reading is a very involved process. It takes years to connect all the different parts of the brain needed to process and decode symbols. Neurological errors can result in a variety of learning disorders that, left unsupported, can affect a child's literacy level.
If functional literacy is not met, quality of life suffers. In healthcare, it can affect someone's understanding of medication doses or medical consent. In law, not understanding contracts can land someone in jail or stuck with predatory loans. And now that most job opportunities are listed online, not being able to read is not an option.
Reading requires complex processes from multiple parts of the brain. There are a lot of opportunities for glitches, like dyslexia or language learning disorders.
Eyes follow a pattern of pause, jump, pause, jump. The pause is a fixation, when the eye takes in details to process. The jump to a new focus is a saccade. If a saccade skips a point of interest, the eye can regress back before jumping to the next spot.
Your focal point covers about 4-5 letters, roughly the width of your thumb at arm's length.
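A toy sketch of that rhythm, under loose assumptions: a 4-letter fixation window, 8-letter saccades, and a 10% regression chance. Real eye movements are far messier; this just makes the pause-jump pattern visible.

```python
# Toy model of reading as fixations and saccades. The window and jump sizes
# are loose assumptions from the text above, not measured values.
import random

def read(text: str, span: int = 4, saccade: int = 8, regress_p: float = 0.1):
    pos = 0
    while pos < len(text):
        print(f"fixate: {text[pos:pos + span]!r}")
        if random.random() < regress_p and pos >= saccade:
            pos -= saccade  # regression: jump back to a skipped spot
            print("  (regress)")
        else:
            pos += saccade  # saccade: jump forward to the next focus point

read("Eyes follow a pattern of pause jump, pause jump.")
```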
Abstracting vocalized thoughts into written symbols: how do you organize that?
There is no innate rule. Written languages have been structured in many directions. The languages you know affect your defaults in hierarchy composition.
What direction feels like the future, and what direction the past? For an English speaker, the future might feel like it is to the right, because that is the writing direction. An Arabic speaker might look to the left to find the future.
For example, take a presentation clicker with two buttons: which one should advance to the next slide? Preference is about a 50/50 split. Our logic for the 'natural' direction is subjective, so consistency matters more.
On game consoles from Japan, X used to mean cancel and O meant confirm, built off how the X and O symbols were already used there. In other countries, like the US, the meanings were switched; without that built-in association, the switch stuck. Now most countries have X as confirm and O as cancel.
How do you transport an idea? Signal theory explores the stages: having information, encoding it into a signal, releasing that signal into a noisy environment, then having a receiver that can catch the signal and decode it back into raw information that can be processed. The oldest system we know of is the genome, with mRNA carrying protein designs to ribosomes.
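A rough sketch of those stages in Python; the triple-repetition code and the 5% noise rate are illustrative assumptions, not a real protocol:

```python
# Signal-theory pipeline: encode -> noisy channel -> decode.
import random

def encode(message: str) -> list[int]:
    """Encode text into bits, repeating each bit 3x for redundancy."""
    bits = []
    for byte in message.encode("utf-8"):
        for i in range(8):
            bit = (byte >> (7 - i)) & 1
            bits.extend([bit, bit, bit])  # repetition code
    return bits

def noisy_channel(bits: list[int], flip_rate: float = 0.05) -> list[int]:
    """Randomly flip bits, standing in for a messy environment."""
    return [b ^ 1 if random.random() < flip_rate else b for b in bits]

def decode(bits: list[int]) -> str:
    """Majority-vote each bit triple, then rebuild bytes and text."""
    voted = [1 if sum(bits[i:i + 3]) >= 2 else 0 for i in range(0, len(bits), 3)]
    data = bytes(
        int("".join(map(str, voted[i:i + 8])), 2) for i in range(0, len(voted), 8)
    )
    return data.decode("utf-8", errors="replace")

print(decode(noisy_channel(encode("an idea"))))
# most runs print the message intact; raise flip_rate and it garbles
```

Strip out the repetition and single bit flips corrupt the text; the redundancy is what lets the signal survive the noisy environment.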
Chemical signals are very effective at transporting information quickly and accurately in messy environments. One that I like is the chemical cascade that starts when a cell receptor is triggered by a signal, setting off a chain of chemical responses that react to the new information.
Signal translation gets more complicated with new layers and time. For example, insects and flowers have developed very specific signals with each other, involving colors, smells, shapes, and timing, evolving together to cue each other about pollen status.
With animals, signals can transmit complicated ideas. Human language can encode nested signals:
You can tell from inflection, tone, or even an unusual word choice that I am signaling more meaning than the literal sentence. With a tactical misunderstanding, you can signal power dynamics through your overt interpretation of someone's message.
By choosing not to accept the signal of a polite request and pretending to take the words literally ('Can I go to the bathroom?' 'I don't know, can you?'), a power dynamic gets asserted over the person's biological need.
From misunderstandings to misremembering, humans have lots of opportunities for signal failure. How do you find an appropriate frequency for signal updates without patronizing your audience? What about something emotionally charged, like infant safety? How can a doctor better communicate about vaccines to a community that has lost its trust in healthcare? Can that trust be repaired?
How much of a message gets lost with each level of encoding? Below, cat the animal has been encoded into several layers of abstraction. What are the costs and benefits of each layer? Are they worth it?
For example, encoding cat into binary means that you can put it into a computer to run data simulations. But to reach that level of abstraction, you lose things along the way, like emotion and the tangible link of how soft its fur is.
To turn the animal cat into an idea, you lose its corporeal form, but you can still preserve the cute feelings. That cuteness can be translated into sound and written scripts, but it gets lost in binary encoding. The binary is powerful if you need to compute a lot of data, like a study of how feral cats damage the ecological environment (you can't really do that just by smelling the top of a cat's head). These deeper layers of encoding also handle time and distance better: I know the word cat and how it is spelled, but I will never see the 'cat' that first got called cat.
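A minimal sketch of that last encoding layer, assuming UTF-8 as the text encoding: the round trip to binary preserves the symbols exactly, and nothing else.

```python
# "cat" survives the trip to binary and back, but everything the symbols
# pointed to (fur, cuteness, a specific animal) is already gone.
word = "cat"

bits = " ".join(f"{byte:08b}" for byte in word.encode("utf-8"))
print(bits)  # 01100011 01100001 01110100

# Decoding recovers the exact symbols, and only the symbols.
recovered = bytes(int(b, 2) for b in bits.split()).decode("utf-8")
print(recovered)  # cat
```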
Humans are very good at pattern recognition and predicting trends. They also don't like incomplete patterns. With this circle, do you think that it is complete, or that there should be another rectangle?
Some people would say that there should be another green rectangle, not a yellow star or a red elephant. This compelling drive can lead to heuristic biases around incomplete concepts.
For example, with messy ideas like health, it is hard to agree on what foods we should eat and in what quantities. And for ideas with a lot of gaps and pain, it can be very hard for people to accept that there is no 'last rectangle'. After a terminal cancer diagnosis, when some charlatan says they have a cure that Western doctors are trying to keep hidden, that can be very tempting, especially if the oncologist had bad bedside manner, crushing your hope and trust in hospitals.
What if you fill in knowledge gaps with whatever fits? What if it is to justify harming someone, or to manipulate them? In conspiracies and cults, questions that don't have answers are packed with misinformation and lies until they feel almost believable, then protected with cognitive dissonance.
The Socratic method was designed as a strategy to combat this. One modern form is 'street epistemology': a line of questioning that forces someone to look at the gaps in their reasoning without the comfort of completeness.
Start by taking a belief and putting it on trial in your mind. One starting question is, 'How sure are you about this, from 0 to 100%?' This is a way of gently peeling off cognitive dissonance without confrontation.
A therapeutic application of the Socratic method is Cognitive Behavioral Therapy (CBT), a way for people to restructure internal beliefs that are causing harm.
If being right is irrelevant, this won't work. If someone wants to disrupt something or harm someone, they can change their words as needed. Proving them inaccurate or catching a logical fallacy won't change their drive; if the excuse breaks, they will just start saying the quiet part out loud.
General and old words have diverse meanings. A word like 'art' has been used by countless people for hundreds of years, so there are many different interpretations. Words in highly specialized fields, like 'cytotoxic', tend to hold their specific meanings longer.
Sometimes a term gets mooched off of for its reputation, like MLMs and scams borrowing scientific terms to leech off the credibility of scientific research standards.
When a topic is uncomfortable for a community, they add steps to distance themselves from it even when they have to confront it. Historically, disability has carried heavy stigma. In America, it used to be illegal to appear 'disabled' in public, and people still feel uncomfortable with disability. Words like 'idiot' or 'r*tard' have transferred from 'medical terms' to insults. Newer terms meant to step away from this stigma, like OCD or ADHD, have already begun to be stigmatized and devalued in the cultural zeitgeist.
Another example: in America, people avoid talking about death bluntly, using euphemisms like 'passed on' or 'no longer with us'. The same happens around 'gross' topics, like saying 'restroom' or 'powder room' instead of 'toilet'.
Cuneiform was one of the earliest attempts to encode ideas into symbols, starting out as a way to document exchanges and debts in early Mesopotamia. Writing developed again in Egyptian hieroglyphics, whose descendants led to the Greek and then Roman alphabets.
Developing written scripts started as a practical way to communicate ideas to other people, even across a time delay (like months or years). This is why pattern recognition and consistency matter more than designing the most 'logical' system. As long as people understand each other, it is good enough.
Some number systems did not have the concept of zero, but most systems settled on a base-10 breakdown. Later, with computers, binary became very common.
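A quick sketch of the same quantity written under different conventions; the Roman numeral converter is hand-rolled here as an illustration of a system with no zero:

```python
# The bases are conventions; the quantity doesn't change.
n = 2024

print(bin(n))  # 0b11111101000, base 2
print(oct(n))  # 0o3750, base 8
print(n)       # 2024, base 10
print(hex(n))  # 0x7e8, base 16

def to_roman(n: int) -> str:
    """Roman numerals: an additive system with no symbol for zero."""
    values = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
              (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
              (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    out = ""
    for value, symbol in values:
        while n >= value:
            out += symbol
            n -= value
    return out

print(to_roman(2024))  # MMXXIV
```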
When measuring temperature, how do you know where zero is? Like Kelvin, at the point where molecular vibration is at its minimum? Like Celsius, at the freezing point of water? Like Fahrenheit, at the freezing point of a salt-water brine? The narrative we use affects which calculations come easiest. It is like how music theory developed in the West is very good at analyzing Bach or Mozart, but becomes nearly incomprehensible when analyzing rhythms from other cultures.
For example, if you usually wear a jacket at 10 °C, it would not matter if Celsius never existed. You would just know that at 50 °F you still wear the jacket. Even if you lived in a society that never put temperature on a meaningful scale, you would still notice an uncomfortable gap between your core temperature and the outside weather. Even if you were not sentient, just a water molecule vibrating in the universe, you would still vibrate more slowly at 10° than at 100°. The relationship is real, but the labels are arbitrary; we are the ones who give them meaning. People will passionately defend their 'way' because it is linked to their interpretation of reality, and sometimes this leads to cognitive dissonance if someone is not willing to be wrong.
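A minimal sketch of that arbitrariness, using the standard conversions and the jacket threshold from the example above; three different zeros, one physical state:

```python
# Standard scale conversions; only the labels differ.
def c_to_f(c: float) -> float:
    return c * 9 / 5 + 32

def c_to_k(c: float) -> float:
    return c + 273.15

jacket_weather_c = 10.0
print(c_to_f(jacket_weather_c))  # 50.0 °F, same air, different zero
print(c_to_k(jacket_weather_c))  # 283.15 K, same air again
```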
When building or learning a new mental model, meaning will get contextualized into arbitrary relationships. To facilitate onboarding into a consistent standard, try building off framing models that are already in use.
'Zombies' sometimes refers to ideas that won't die, even when they have been disproved. This can take the shape of urban myths or bigotry. Sometimes it can take decades to unravel a bad idea rotting in the cultural zeitgeist. For example, it took several campaigns to get construction workers to wear safety equipment on sites, because safety precautions were linked to weakness. Right now, healthcare facilities are trying to de-link racial biases in cardiac emergencies that lead to poor outcomes for racialized minorities.
One of the biggest evolutions in typography was its integration with computers. When Apple put typeface choice into personal computers, people outside of design could start to play with type as art and as expression.
Although, for some reason, Apple introduced the typeface as the 'font'. 'Font' traditionally referred to a specific variation, like the bold or italic font, within the main typeface family.
Some languages don't have as many typefaces. The average Japanese font has around 8,000 to 16,000 glyphs; an English font, by comparison, has about 52. If a designer wants to make a new Japanese typeface with a bold font and a thin font, they need new character designs for 46 hiragana, 46 katakana, 26 romaji, and 2,000+ kanji with every font. With English, each font only needs the 26 letters in two cases.
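Back-of-the-envelope arithmetic for that workload, using the counts above (with the kanji set taken at the low-end 2,000 estimate):

```python
# Glyphs a designer must draw for two weights (bold + thin).
hiragana, katakana, romaji, kanji = 46, 46, 26, 2000
japanese_per_font = hiragana + katakana + romaji + kanji  # 2,118 glyphs
english_per_font = 26 * 2  # upper + lower case

weights = 2  # bold and thin
print(japanese_per_font * weights)  # 4236 drawings
print(english_per_font * weights)   # 104 drawings
```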
This volume difference affects how many typefaces can be made for different languages. Designers working in those languages have to work within these limits; for UX, that can mean fewer accessibility-focused typefaces to choose from.
When children are first learning how to write, they learn letters as clean Euclidean shapes with a symmetrical apex. But with deteriorating art programs in public schools, that is often all they learn.
In the second line, I show how you can push the apex of round letters up or down to make them more distinct. If you give 'b' a low apex and 'd' a high apex, it helps distinguish the letters even if they get flipped. This adds an information channel people can use for accurate symbol decoding, especially anyone with a learning disability or low vision.
These organic imperfections can decrease some aesthetic rhythms. Designers have to find a balance between aesthetic and accessible design.
A fun feature of writing development is the creativity and play. Back in the days of hieroglyphics and early Han characters, writers developed playful inside references with each other (which makes them a little hard for us to follow now without the background context).
As texting and social media have evolved, we are seeing rapid changes in shorthand and emoji references. This has also permeated how people talk to each other face to face.