The algorithm is turning on...
design + AI + your mood = empathy-based design
We can feel. Emotions play a big role in our lives, especially in our relationships with other people. We can sense what a companion is feeling at any given moment and adapt to their emotions. When they feel sad, we try to support them and avoid reminding them of the cause of their sadness. And when we sense that our companion isn't enjoying the conversation, we change the topic or stop talking. Our companions adapt to our emotions too, so the adaptation, like the dialogue itself, runs in both directions.
Right now, design has almost no dialogue with the human. Its behavior is based only on user data: your name, what you watched on YouTube, what you bought on Amazon, what you googled, and so on.
That data merely personalizes the content you are shown. But what if we humanized design, gave it the ability to hear our emotions, and allowed it to change content based on them?
In the future, design will communicate with our environment, with everything that surrounds us, with our feelings. Design will feel like a dialogue between two living humans. It will predict what we want before we realize it ourselves.
This experiment shows what could happen if design and content were driven by our emotions. AI recognizes your emotions and... the words, the mood of the text, and the colors change when you try to show an emotion.
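To make the mechanics concrete, here is a minimal sketch of the recognition step. It assumes the browser library face-api.js (a common choice for in-browser face analysis; the experiment itself may use a different model) and a hypothetical `/models` path for the pretrained weights:

```ts
import * as faceapi from 'face-api.js';

// Load a lightweight face detector plus the expression classifier.
// '/models' is an assumed path where the pretrained weights would be hosted.
async function loadModels(): Promise<void> {
  await faceapi.nets.tinyFaceDetector.loadFromUri('/models');
  await faceapi.nets.faceExpressionNet.loadFromUri('/models');
}

// Detect the strongest expression in the current video frame.
async function detectEmotion(video: HTMLVideoElement): Promise<string | null> {
  const result = await faceapi
    .detectSingleFace(video, new faceapi.TinyFaceDetectorOptions())
    .withFaceExpressions();
  if (!result) return null; // no face in frame
  // result.expressions maps each emotion to a probability; take the max.
  const [top] = Object.entries(result.expressions)
    .sort(([, a], [, b]) => b - a);
  return top[0];
}
```

face-api.js happens to label its seven expressions neutral, happy, sad, angry, fearful, disgusted, and surprised, which map one-to-one onto the emotions listed below.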
There are 7 emotions: happy, surprise, neutral, sad, fear, angry, and disgust. You will read the same manifesto, but I changed some words to make it emotional. Try to fake each of the 7 emotions and see what happens. The changes are highlighted.
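The swap itself can be driven by a plain lookup table keyed by emotion. Everything in this sketch is made up for illustration: the word pairs, the colors, and the `data-word` attribute are hypothetical, not the ones the experiment actually uses:

```ts
// Hypothetical per-emotion theme: a few word substitutions plus a palette.
// None of these values come from the real experiment; they only show how a
// detected emotion could drive both the copy and the colors of the page.
type Emotion =
  | 'happy' | 'surprise' | 'neutral' | 'sad'
  | 'fear' | 'angry' | 'disgust';

interface Theme {
  wordSwaps: Record<string, string>; // original word -> emotional variant
  background: string;
  text: string;
}

const themes: Record<Emotion, Theme> = {
  happy:    { wordSwaps: { good: 'wonderful' },  background: '#FFE66D', text: '#1A1A1A' },
  surprise: { wordSwaps: { new: 'unexpected' },  background: '#6DC5FF', text: '#1A1A1A' },
  neutral:  { wordSwaps: {},                     background: '#FFFFFF', text: '#1A1A1A' },
  sad:      { wordSwaps: { good: 'bearable' },   background: '#3A4A6B', text: '#E8E8E8' },
  fear:     { wordSwaps: { future: 'unknown' },  background: '#2B2B2B', text: '#C0C0C0' },
  angry:    { wordSwaps: { change: 'demand' },   background: '#B33939', text: '#FFFFFF' },
  disgust:  { wordSwaps: { content: 'sludge' },  background: '#6B8E23', text: '#F0F0F0' },
};

// Apply the theme for the detected emotion to the page.
function applyTheme(emotion: Emotion): void {
  const theme = themes[emotion];
  document.body.style.background = theme.background;
  document.body.style.color = theme.text;
  // Each highlighted span carries a data-word attribute naming its base word.
  for (const [word, variant] of Object.entries(theme.wordSwaps)) {
    document.querySelectorAll<HTMLElement>(`[data-word="${word}"]`)
      .forEach((el) => { el.textContent = variant; });
  }
}
```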
To try it, you must allow access to your camera. Don't be afraid: everything works on the client side, so the image from your camera stays with you, on your computer. It may not work on phones or in some browsers; you can check.
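For the curious, the permission prompt comes from `navigator.mediaDevices.getUserMedia`, the standard browser camera API. A sketch of the setup, with the stream kept entirely inside the page:

```ts
// Ask the browser for camera access and pipe the stream into a <video> tag.
// The stream lives entirely in the page; nothing here sends frames anywhere.
async function startCamera(video: HTMLVideoElement): Promise<void> {
  try {
    const stream = await navigator.mediaDevices.getUserMedia({ video: true });
    video.srcObject = stream;
    await video.play();
  } catch (err) {
    // The user declined, or the browser/device has no usable camera.
    console.warn('Camera unavailable:', err);
  }
}
```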