iemotion

Matt Dobson

As we increasingly depend on digital technology for every aspect of our lives, a new smartphone app offers a window on our moods and emotions

Spike Jonze’s much-discussed movie ‘Her’ explores our future emotional relationship with our virtual helpers, the interfaces to the many different online activities we will depend on. Perhaps these new interfaces may also help us understand ourselves a little better, like the forthcoming app from Cambridge-based ei Technologies – ei stands for ‘emotionally intelligent’.

The company is developing an app that will be able to identify people’s moods from smartphone conversations, via the acoustics rather than the content of a conversation. Such a technology has obvious commercial uses in a world where we interact with computer voices for services such as banking. ‘In call centres,’ says CEO Matt Dobson, ‘it’s about understanding how satisfied my customers are. As a consumer you have a perception and that is driven by the modulation and tone in their voice.’

Engineer’s natural curiosity

Dobson’s background in healthcare, working for GlaxoSmithKline and Philips Electronics, gave him an interest in mental health, an area where this technology offers significant possibilities. ‘I really wanted to do something in the area of emotion recognition and mental health,’ says Dobson. Then a friend of his in Cambridge showed him an article; they looked at some technical papers and thought they could build something. ‘If you look at the mental health market it is one of the biggest needs, bigger than cancer and heart disease, yet has about a tenth of the funding.’ Dobson cites the media coverage of cricketer Jonathan Trott coming home from the Ashes tour, and of the CEO of Lloyds taking time off due to stress, as examples of greater public awareness of psychological issues.

Before Dobson did an MBA at Cambridge, his primary degree was in Mechanical Engineering at Bath – this grounding in science gave him a subtle head start. ‘Engineering is all about natural curiosity, not being afraid to tinker and play with stuff,’ says Dobson. ‘I am not an expert in this area but I know enough to ask the right, smart questions and can review a research paper and get a good idea what the limits of the possible are.’

Speech recognition

Starting up the venture, they needed expertise in the area of speech and language, and machine learning. So they called on Stephen Cox, a specialist in speech recognition and Professor of Computing Science at the University of East Anglia, who is now an adviser. 

The ‘empathetic algorithm’ is based around the idea that we can differentiate between emotions without necessarily knowing what the words mean – think of watching TV or films in a different language. ‘It’s about understanding what parts of the voice communicate emotions, acoustically what features betray emotion – we use probably 200 to 300 features in each section of speech we analyse.’ They gathered data to train the system, which then uses statistics to pick out the most probable emotion being expressed amongst all the other background and mechanical noise on the phone.
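The pipeline Dobson describes – summarising each stretch of speech as a few hundred acoustic features, then letting a statistically trained model pick the most probable emotion – can be sketched roughly as follows. This is a minimal illustration, not ei Technologies’ actual system; the feature set (MFCCs, pitch and loudness statistics via the librosa library) and the random forest classifier from scikit-learn are assumptions chosen for brevity.

```python
# Rough sketch of acoustic emotion classification (illustrative only).
# The handful of features below stands in for the ~200-300 the article mentions.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def acoustic_features(path):
    """Summarise one stretch of speech as a fixed-length feature vector."""
    y, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # timbre
    f0 = librosa.yin(y, fmin=60, fmax=400, sr=sr)         # pitch contour
    rms = librosa.feature.rms(y=y)                        # loudness
    # Mean and spread of each feature over time; word content is ignored.
    parts = [mfcc.mean(axis=1), mfcc.std(axis=1),
             [np.nanmean(f0), np.nanstd(f0)],
             [rms.mean(), rms.std()]]
    return np.concatenate([np.atleast_1d(p) for p in parts])

def train(paths, labels):
    """Fit a classifier on labelled recordings (e.g. 'happy', 'sad', 'angry')."""
    X = np.vstack([acoustic_features(p) for p in paths])
    return RandomForestClassifier(n_estimators=200).fit(X, labels)

def most_probable_emotion(model, path):
    """Return the emotion with the highest estimated probability for a recording."""
    probs = model.predict_proba([acoustic_features(path)])[0]
    return model.classes_[int(np.argmax(probs))]
```

In practice a system like the one described would also need to separate speech from the background and line noise of a phone call before extracting features; that step is omitted here for brevity.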

Emotional life-tracking

Soon, says Dobson, they will have a free app that can emotionally analyse the conversation we have just had, and users will be able to tweet the result to a Twitter page. ‘It will say “Matt had this conversation”. I can include your Twitter handle in there and it creates the dialogue between us and says “I had a happy conversation with John from Cubed”.’

But the next step, involving a kind of emotional life-tracking, is more complicated. ‘That is quite a sophisticated piece of software,’ says Dobson. The idea is that we will be able to cross-reference our emotional states with other bits of our data from other parts of our day. ‘How can we use this data to basically monitor and understand human behaviour?’ says Dobson. In monitoring people’s mental health, ‘if they are depressed, can we understand when and why they are depressed?’
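One simple way to picture the cross-referencing Dobson describes is to join per-conversation emotion labels against other time-stamped personal data. The sketch below, using pandas with made-up column names and example values, is purely illustrative and not part of ei Technologies’ product.

```python
# Illustrative only: aligning hypothetical per-call emotion labels with other
# time-stamped data (here, step counts) to look for patterns across a day.
import pandas as pd

calls = pd.DataFrame({
    "time": pd.to_datetime(["2014-03-01 09:10", "2014-03-01 18:40"]),
    "emotion": ["happy", "stressed"],
})
activity = pd.DataFrame({
    "time": pd.to_datetime(["2014-03-01 09:00", "2014-03-01 18:30"]),
    "steps_last_hour": [1200, 150],
})

# Match each call with the most recent activity reading before it.
timeline = pd.merge_asof(calls.sort_values("time"),
                         activity.sort_values("time"), on="time")
print(timeline)
```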
