Now this is insane 🤯
Use AI to understand emotional expression and align technology with human well-being.
Let's dive deeper and see what's possible using hume.ai
🧵 A thread
In this thread, I'll go through some of the things Hume.ai is capable of doing.
These are all individual AI models that can be used standalone or strung together.
I think it's very fascinating what can be done with these tools today.
🤖 Speech Prosody AI
Discover over 25 patterns of tune, rhythm, and timbre that imbue everyday speech with complex, blended meanings.
🤖 Vocal Call Types AI
Explore vocal utterances by inferring probabilities of 67 descriptors, like 'laugh', 'sigh', 'shriek', 'oh', 'ahh', 'mhm', and more
🤖 FACS 2.0 AI
An improved, automated facial action coding system (FACS): measure 26 facial action units (AUs) and 29 other features with even less bias than traditional FACS
🤖 Facial Expressions AI
Differentiate 37 kinds of facial movement that are recognized as conveying distinct meanings, and the many ways they are blended together
🤖 Dynamic Reactions AI
Measure dynamic patterns in facial expression over time that are correlated with over 20 distinct reported emotions
🤖 Sentiment AI
Measure the distribution of possible sentiments expressed in a sentence, from negative to positive, including neutral or ambiguous
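To give a feel for what a "distribution of sentiments" means in practice, here's a minimal sketch that normalizes raw per-label scores into probabilities. The labels and score values are made up for illustration and are not Hume.ai's actual API or output format.

```python
import math

def softmax(scores):
    """Normalize raw scores into a probability distribution that sums to 1."""
    exps = [math.exp(s) for s in scores.values()]
    total = sum(exps)
    return {label: e / total for label, e in zip(scores, exps)}

# Hypothetical raw model scores for one sentence (not real Hume.ai output)
raw = {"negative": -1.2, "neutral": 0.3, "positive": 2.1}
dist = softmax(raw)
# The result is a distribution over sentiment labels; the top-scoring
# label gets the most probability mass, but the others stay visible,
# which is what lets you express ambiguity rather than a single label.
```

That "distribution, not a single label" shape is the useful part: an ambiguous sentence shows up as mass spread across labels instead of a confident misclassification.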
I'm really hyped about how we can improve @SensiveXyz with all of this: our improved journaling experience, journal indexing, and much more. These will be exciting times to build.
I'm going to see what else we can use to improve @sensivexyz
Exploring hume.ai as a first step.
If you know of other AI companies building developer tools in the well-being and mental health space, please send them my way.
Join 12,000+ of my closest online friends in getting my newsletter directly in your inbox.
linusekenstam.substack.com