Future Life, Inc.

Micro-Expression Graphing


Your Emotional Footprint

The technology pioneered by Future Life maps the human Emotional Footprint using Facial Micro-Expressions of Emotion (FMEE) analysis to objectively measure emotional responses and reactions that are usually undetectable to the human eye. We call this technology Face2Face, and our initial focus is on post-traumatic stress disorder and credibility assessment with deception detection. Our findings demonstrate that Face2Face improves the accuracy of identifying mental health conditions over time by more than fifty percent compared with the rates currently reported by government clinicians.

By “Emotional Footprint” we mean an overview of a person’s predominant emotions at that point in their life. The Face2Face Emotional Footprint provides a window into an individual’s feelings on any given topic, a data set and graphic representation of how their emotions flow and interact with one another, and a record of which topics or experiences activate or “trigger” their emotions. The Emotional Footprint makes it possible to break out the data and examine how an individual responds to a particular question or set of questions in, say, a recruitment or job interview, a credibility assessment, or a combat mission debriefing, or to a particular stimulus picture or class of images, as sketched below.
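
To make the breakout idea concrete, the following is a minimal Python sketch of how per-topic emotion data could be grouped and summarized. The record fields (topic, emotion, intensity) and the simple averaging step are illustrative assumptions only, not the actual Face2Face data schema.

    # Hypothetical data model for Emotional Footprint samples; field names are assumptions.
    from dataclasses import dataclass
    from collections import defaultdict
    from statistics import mean

    @dataclass
    class EmotionSample:
        topic: str        # question, stimulus image, or interview segment
        emotion: str      # e.g. "fear", "anger", "sadness"
        intensity: float  # normalized response strength, 0.0 to 1.0

    def breakout_by_topic(samples):
        """Group samples by topic and average each emotion's intensity."""
        grouped = defaultdict(lambda: defaultdict(list))
        for s in samples:
            grouped[s.topic][s.emotion].append(s.intensity)
        return {topic: {emotion: mean(values) for emotion, values in emotions.items()}
                for topic, emotions in grouped.items()}

    # Example with made-up values: compare reactions in a debriefing to a neutral baseline.
    samples = [
        EmotionSample("combat debriefing", "fear", 0.62),
        EmotionSample("combat debriefing", "fear", 0.71),
        EmotionSample("neutral baseline", "fear", 0.08),
    ]
    print(breakout_by_topic(samples))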


Telehealth Technology

We show our emotions in many ways, such as our body language and vocal tones, but the most compelling is through our facial expressions.

There is a constant stream of involuntary non-verbal communication flowing between us on our faces. The expressions of emotion that we see on one another’s faces in typical interactions last between half a second and four seconds. These are classified as macro-expressions, and we know people can control them. For example, we often smile when we’re supposed to, not because we actually feel like smiling. Micro-expressions, by contrast, flash across the face in a fraction of a second; they are involuntary and far harder to suppress, which is why they are the focus of our analysis.

Face2Face uses state-of-the-art machine learning to build an Emotional Footprint of a subject. Our technology provides a real-time analysis of a subject’s emotional state across seven key emotions. It then computes a correlation matrix of those emotions, which allows a clinician to assess underlying conditions that may not be outwardly apparent.
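
As a hedged illustration of the correlation step, the short Python sketch below assumes the real-time analysis produces one intensity score per video frame for each of the seven emotions; numpy’s corrcoef then yields the emotion-to-emotion correlation matrix a clinician would review. The emotion labels, array shapes, and simulated data are assumptions, not the Face2Face implementation.

    import numpy as np

    # Illustrative emotion set; the actual seven emotions used by Face2Face may differ.
    EMOTIONS = ["anger", "contempt", "disgust", "fear", "happiness", "sadness", "surprise"]

    def emotion_correlation_matrix(scores):
        """scores: array of shape (7, n_frames), one row of intensities per emotion.
        Returns the 7x7 matrix of Pearson correlations between emotions."""
        if scores.shape[0] != len(EMOTIONS):
            raise ValueError("expected one row per emotion")
        return np.corrcoef(scores)

    # Example with simulated data: 300 frames of random intensity scores.
    rng = np.random.default_rng(0)
    frames = rng.random((len(EMOTIONS), 300))
    print(np.round(emotion_correlation_matrix(frames), 2))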

Face2Face maps the ever-changing human emotional landscape on the basis of hard data, and it has been proven accurate 999 times out of 1,000. Its assessments of an individual’s resilience, reactivity, and coping style are driven by comparing their responses to different topics against their own baseline (neutral) reactions, and by comparing their reactions to stimulus pictures against nationally based norms. Face2Face’s integrated neural-network machine learning is designed to continuously improve its accuracy in these areas.
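
The baseline comparison can be pictured as a simple standardization step: a subject’s responses to a topic are expressed in standard deviations above their own neutral baseline, and separately against a population norm. The Python sketch below is illustrative only; the numbers, the norm values, and the use of plain z-scores are assumptions rather than the Face2Face scoring method.

    import numpy as np

    def deviation_scores(responses, reference_mean, reference_std):
        """Express each response in standard deviations from a reference."""
        return (responses - reference_mean) / reference_std

    topic_responses = np.array([0.55, 0.61, 0.72])   # intensities on a probed topic (made up)
    own_baseline = np.array([0.10, 0.12, 0.09])      # the subject's neutral reactions (made up)
    norm_mean, norm_std = 0.30, 0.15                 # hypothetical nationally based norm

    vs_self = deviation_scores(topic_responses, own_baseline.mean(), own_baseline.std(ddof=1))
    vs_norm = deviation_scores(topic_responses, norm_mean, norm_std)
    print("vs. own baseline:", np.round(vs_self, 1))
    print("vs. national norm:", np.round(vs_norm, 1))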

Moreover, its machine learning is not limited to self-correction. It is designed to compile an ever-expanding database of emotional interactions and behaviors that it can detect or predict. As its store of data grows, it will be able to generate algorithms for almost any purpose.


National Suicide Prevention Lifeline: 988