Deep Technology.
Psycholinguistic AI.

Litha Group is a deep technology (deep tech) startup, building solutions grounded in years of research and development.


Our research is founded on a single underlying question: “Can better psychology and communication change the world for the better?”


In the beginning, our primary risk was a technical one, because our field of research was human psychology: you can’t beta test therapy conversations and simply pivot, leaving the user in a worse state than before.

As we have innovated, we have invented. Contemporary technologies struggled to give us what we needed, so our research and development in modern psycholinguistics has meant rewriting some of the technology rules as well.

Ongoing Invention & Innovation

Technology related to mobile phone apps, websites, and e-commerce services tends to be regarded as ‘shallow tech’. The time shallow tech takes to move from basic science to applicable technology is relatively short; in some cases, it is as simple as taking open-source code and producing a new user interface.

Our deep tech research through Litha Labs, which began in 2012, includes deconstructing applications offering psychology, therapy, and conversation (chatbots). Concurrently with the deconstruction phase, we carried out extensive research in fields such as cognition, linguistics, and artificial intelligence.

Because we take a ‘human first’ approach to developing our AI, we could see where open-source technologies were falling short. This meant that we have not only been developing applications to change the way the world thinks and communicates, but have also had to develop some of the underlying architecture ourselves.


Applied Psycholinguistic AI

While maintaining our work in deep technology, we continually scan the horizon. We see a future where AI is ethically embedded into personalised human experiences:

  • Emotional AI technology that dynamically learns from individual and collective social interactions
  • Human-like, person-centred conversation using both short and long context (see the sketch after this list)
  • Human-like memory that helps forecast and simulate future user behaviour
  • AI ethically embedded into the Metaverse
  • AI that operates as an individual’s private mental health platform (screening, counselling, therapy, visualisation, psychology development)
  • AI that identifies health traits (e.g., anxiety, stress, addiction, loneliness, depression, and degenerative brain conditions)
  • AI that proactively acts on what it finds (from referral to psychotherapy)
  • Psychotherapy that resolves the problem rather than the symptom
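
To make this vision a little more concrete, here is a minimal illustrative sketch of how an emotion-aware companion might combine short-term conversational context with longer-term memory of recurring feelings. It is not Litha Group’s implementation: the keyword lexicon, the Companion class, and its memory store are hypothetical stand-ins for real psycholinguistic models.

```python
# Illustrative sketch only: a toy emotion-aware companion that keeps
# short-term (current conversation) and long-term (recurring feelings) context.
# The keyword lexicon and Companion class are hypothetical placeholders,
# not Litha Group's actual system.
from dataclasses import dataclass, field

# Toy lexicon; a production system would use trained models, not keywords.
EMOTION_KEYWORDS = {
    "lonely": "loneliness",
    "alone": "loneliness",
    "anxious": "anxiety",
    "worried": "anxiety",
    "stressed": "stress",
    "sad": "sadness",
}


def detect_emotion(utterance: str) -> str | None:
    """Return a coarse emotion label if a known cue word appears."""
    for word in utterance.lower().split():
        label = EMOTION_KEYWORDS.get(word.strip(".,!?"))
        if label:
            return label
    return None


@dataclass
class Companion:
    short_context: list[str] = field(default_factory=list)     # current conversation
    long_memory: dict[str, int] = field(default_factory=dict)  # emotion counts over time

    def respond(self, utterance: str) -> str:
        self.short_context.append(utterance)
        emotion = detect_emotion(utterance)
        if emotion:
            # Long-term memory: remember how often this feeling recurs.
            self.long_memory[emotion] = self.long_memory.get(emotion, 0) + 1
            if self.long_memory[emotion] > 2:
                return (f"You've mentioned feeling {emotion} a few times now. "
                        "Would you like to talk about what's behind it?")
            return f"It sounds like you might be feeling {emotion}. I'm listening."
        return "Tell me more. I'm here."


if __name__ == "__main__":
    bot = Companion()
    print(bot.respond("I've been feeling really anxious about work."))
    print(bot.respond("Still anxious today, to be honest."))
```

In a real system the keyword lookup would be replaced by trained psycholinguistic models, and the long-term memory would be stored securely and privately, in line with the confidentiality point below.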


Imagine chatting with your personal virtual friend about absolutely anything. This free-ranging, multi-lingual companion understands your accent and slang, remembers what you said to it – and can put all of this into context.

Now how about your companion understanding you, listening to you, knowing when you’re feeling lonely, anxious, stressed, or sad, and being able to engage with you sympathetically?

More to the point, all of this is secure and confidential, and because it isn’t linked to the tech giants, it isn’t there to be monetised.