What Is Emotional AI? A Business Definition
Emotional AI is technology that analyses observable signals — facial expressions, vocal patterns, text and physiological data — to identify and quantify human emotional states. In business, it measures what people actually feel, as distinct from what they say they feel. EchoDepth is an enterprise emotional AI platform using the FACS standard: 44 facial Action Units, calibrated across 14 cultural cohorts.
Emotional AI — also called affective computing — is a category of artificial intelligence that analyses the observable signals human beings produce when they experience emotion. These signals are not chosen or curated: facial muscle movements, vocal pitch and pace, hesitation patterns in speech, and linguistic structures in text. They are involuntary, consistent across individuals, and measurable with validated scientific methods.
The commercial case for emotional AI is rooted in a fundamental measurement gap. Organisations make critical decisions — about communications, sales strategy, hiring, investment — based on what people say they think and feel. Survey data, focus groups, and interviews all capture declared preference. What people actually feel, moment to moment, in response to specific stimuli, is a different signal entirely — and it is the signal that predicts behaviour.
The difference between emotional AI and sentiment analysis
Sentiment analysis classifies text as positive, negative or neutral based on word choice. It is useful for volume monitoring: tracking aggregate customer sentiment at scale. It cannot read tone, delivery, facial expression or the hesitation that precedes a critical statement. A CEO who says "we are confident in our guidance" with every vocal stress marker of suppressed uncertainty will score positive in sentiment analysis. Emotional AI measures the person, not the words.
Sentiment analysis reads words: the positive, negative or neutral classification of text content. It can tell you whether a document uses predominantly positive language.
Emotional AI reads people: the physiological signals that reveal actual emotional state. It can tell you whether the person delivering or receiving the message believes it.
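To make the gap concrete, here is a minimal sketch of lexicon-based sentiment scoring, the word-level classification described above. The lexicon and scoring rule are invented for illustration and are not EchoDepth code; the point is that the CEO's statement scores positive regardless of how it was delivered.

```python
# Minimal lexicon-based sentiment scorer. The word lists and scoring
# rule are invented for illustration; real systems are more elaborate,
# but share the same limitation: they only see the words.
POSITIVE = {"confident", "strong", "growth", "pleased"}
NEGATIVE = {"decline", "weak", "risk", "miss"}

def sentiment(text: str) -> str:
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

# Scores "positive" however the sentence was actually delivered:
print(sentiment("we are confident in our guidance"))  # -> positive
```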
How emotional AI works: the FACS standard
The scientific foundation of enterprise-grade emotional AI is the Facial Action Coding System (FACS), developed by psychologists Paul Ekman and Wallace Friesen at UCSF. FACS maps 44 specific facial muscle movements, called Action Units, to emotional states. These Action Units are involuntary at the micro-expression level, cross-culturally consistent for basic emotions, and measurable with high inter-rater reliability in validated research.
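As a rough illustration of how Action Unit combinations relate to emotion labels, the sketch below uses a few prototype combinations commonly cited in the FACS and EMFACS literature, such as AU6 (cheek raiser) plus AU12 (lip corner puller) for enjoyment. The matching logic is deliberately simplified and is not EchoDepth's implementation, which scores intensities and confidence rather than set membership.

```python
# Prototype AU combinations commonly cited in the FACS/EMFACS literature.
# Simplified for illustration; not EchoDepth's implementation.
PROTOTYPES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
}

def match_prototypes(observed_aus: set[int]) -> list[str]:
    """Return emotion labels whose full AU prototype is present in the observation."""
    return [label for label, aus in PROTOTYPES.items() if aus <= observed_aus]

print(match_prototypes({1, 6, 12}))  # -> ['happiness']
```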
EchoDepth applies FACS-standard Action Unit analysis to video recordings, combined with vocal pattern analysis (pitch, pace, hesitation, vocal stress) and linguistic emotional mapping in text transcripts. The result is a multimodal emotional signal — not a sentiment score, but a quantified, timestamped emotional state output with specific metrics for each channel.
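To picture what a "quantified, timestamped emotional state output" might look like, here is a hypothetical record shape for a single moment in a recording. The field names and values are assumptions for illustration, not EchoDepth's actual schema.

```python
from dataclasses import dataclass

@dataclass
class EmotionalSignalFrame:
    """One timestamped multimodal observation (illustrative schema only)."""
    timestamp_s: float             # offset into the recording, in seconds
    facial_aus: dict[int, float]   # Action Unit -> intensity (0.0-1.0)
    vocal_stress: float            # composite of pitch, pace and hesitation markers
    linguistic_valence: float      # emotional mapping of the transcript at this point
    confidence: float              # model confidence for this frame

frame = EmotionalSignalFrame(
    timestamp_s=312.4,
    facial_aus={4: 0.62, 7: 0.48},  # brow lowerer, lid tightener
    vocal_stress=0.71,
    linguistic_valence=-0.12,
    confidence=0.83,
)
print(frame.timestamp_s, frame.vocal_stress)
```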
The critical distinction: FACS-based emotional AI measures what is happening physiologically. Generic image-classification systems train on labelled datasets without scientific grounding and produce outputs that are both less accurate and less defensible. For any enterprise use case where outcomes affect individuals, the difference matters legally and ethically.
Where emotional AI adds measurable value in business
Pre-live earnings call analysis identifies credibility gaps and trust signal drops in rehearsal recordings — before results day, when revision is still possible.
Sales demo analysis identifies the exact moment buyer confidence drops, typically 30–60 seconds before the verbal objection that costs the deal (a simplified detection sketch follows this list).
Interviewer Consistency Scores measure whether panel members are responding consistently to candidates presenting similar evidence, making unconscious bias visible and auditable.
Pre-launch all-hands analysis detects resistance signals in leadership communications 4–8 weeks before they become visible behaviour.
Frame-by-frame emotional response analysis on campaign creative reveals what audiences actually feel — not what they say they feel in a focus group.
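The sales-demo item above rests on detecting a sustained fall in a buyer-confidence time series before it surfaces as a spoken objection. A minimal sketch of that kind of drop detection, with invented thresholds and synthetic data, might look like this:

```python
def detect_confidence_drops(series, window=5, threshold=0.25):
    """Flag timestamps where confidence falls by `threshold` vs the prior window mean.

    `series` is a list of (timestamp_s, confidence) pairs; the window size and
    threshold are invented for this sketch.
    """
    drops = []
    for i in range(window, len(series)):
        prev = sum(c for _, c in series[i - window:i]) / window
        t, current = series[i]
        if prev - current >= threshold:
            drops.append(t)
    return drops

# Synthetic demo data: confidence sags from t=150s, well before a spoken objection.
demo = [(t, 0.8) for t in range(0, 150, 10)] + [(t, 0.45) for t in range(150, 200, 10)]
print(detect_confidence_drops(demo))  # -> [150, 160]
```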
The accuracy question
Accuracy in emotional AI depends entirely on methodology. FACS-standard systems with cultural calibration produce repeatable, scientifically defensible outputs. Generic machine-learning classification systems trained without cultural calibration produce systematically biased outputs — a well-documented problem that disproportionately misclassifies expressions from non-Western demographic groups.
EchoDepth is calibrated across 14 cultural cohorts in 6 countries specifically to address this. The system does not classify emotions — it classifies Action Unit combinations statistically associated with emotional states, with documented confidence intervals. Outputs are data for human judgment, not automated verdicts.
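To show what "data for human judgment, not automated verdicts" can mean in practice, here is a hypothetical output shape: a state label statistically associated with an AU pattern, reported with its confidence interval and routed to a reviewer when the interval is wide. Field names, cohort label and thresholds are all invented for illustration.

```python
from typing import NamedTuple

class StateEstimate(NamedTuple):
    label: str        # emotional state statistically associated with the AU pattern
    ci_low: float     # lower bound of the confidence interval
    ci_high: float    # upper bound of the confidence interval
    cohort: str       # cultural calibration cohort applied

def needs_human_review(estimate: StateEstimate, max_width: float = 0.30) -> bool:
    """Wide intervals go to a reviewer rather than being acted on automatically."""
    return (estimate.ci_high - estimate.ci_low) > max_width

e = StateEstimate("suppressed uncertainty", ci_low=0.41, ci_high=0.79, cohort="UK-South")
print(needs_human_review(e))  # -> True (interval width 0.38 exceeds 0.30)
```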
Ethical and legal context
Emotional AI that analyses facial expressions constitutes biometric data processing under UK GDPR Article 9 — special category data requiring explicit informed consent, a Data Protection Impact Assessment, documented purpose of processing, and a signed Data Processing Agreement. EchoDepth operates under this framework for every deployment. Biometric data is never processed without explicit consent, and analysis is never used for automated decision-making without human oversight.
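As an operational sketch of that posture, a deployment might gate every biometric analysis behind the preconditions listed above. Everything below, from field names to the checks themselves, is a hypothetical illustration rather than EchoDepth's actual controls.

```python
from dataclasses import dataclass

@dataclass
class Engagement:
    explicit_consent: bool   # UK GDPR Article 9: explicit informed consent on file
    dpia_completed: bool     # Data Protection Impact Assessment documented
    dpa_signed: bool         # Data Processing Agreement executed
    purpose: str             # documented purpose of processing
    human_reviewer: str      # named owner; no fully automated decisions

def may_process_biometrics(e: Engagement) -> bool:
    """Every precondition must hold before any biometric analysis runs."""
    return all([e.explicit_consent, e.dpia_completed, e.dpa_signed,
                bool(e.purpose), bool(e.human_reviewer)])

e = Engagement(True, True, True, "earnings-call rehearsal analysis", "Head of IR")
assert may_process_biometrics(e)
```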
Frequently Asked Questions
What is emotional AI?
Emotional AI is technology that analyses observable signals — facial expressions, vocal patterns, text and physiological data — to identify and quantify human emotional states. It measures what people actually feel, as distinct from what they declare they feel in surveys or interviews.
How is emotional AI different from sentiment analysis?
Sentiment analysis classifies text as positive, negative or neutral. It reads words. Emotional AI analyses physiological signals — facial Action Units, vocal patterns, linguistic structure — to identify the actual emotional state of the person, regardless of what words they use. The two measure fundamentally different things.
Is emotional AI accurate?
Accuracy depends on methodology. FACS-standard systems with cultural calibration produce scientifically defensible outputs. Generic image-classification systems without cultural calibration produce systematically biased results. EchoDepth uses the FACS standard calibrated across 14 cultural cohorts in 6 countries.
Is emotional AI legal in the UK?
Yes, subject to UK GDPR compliance. Facial expression data is biometric data under UK GDPR Article 9 (special category), requiring explicit informed consent, a DPIA, documented purpose of processing, and a Data Processing Agreement. EchoDepth operates under this framework for every deployment.
What is EchoDepth?
EchoDepth is Cavefish's enterprise emotional AI platform. It analyses emotional signals in video, voice, text and images — generating Trust Scores, Credibility Signals, Resistance Indicators and Engagement Depth measurements. It uses 44 FACS Action Units calibrated across 14 cultural cohorts in 6 countries.
See EchoDepth applied to your content
Send us a recording, transcript or presentation. We return a scored EchoDepth communication signal analysis within 5 working days.
Request a Free Sample Analysis →