Yellow Messenger’s Robust NLP Engine – I

Robust Natural Language Processing

Today’s Feature Friday discusses Yellow Messenger’s proprietary natural language processing engine, the backbone of the conversational AI platform our global clientele relies on. But before we delve into the workings of our NLP engine, let’s first get our sideways dictionary down!

Deep Learning

Jasper, Joan and Jodi are three people who have performed the same routine tasks, in the same role, with the same qualifications, since the start of their careers. Yet one of them earns double what the other two make combined. A traditional approach would be to check for anomalies within the organisation. But we know the holistic way is to observe their behaviour outside the organisation as well. This complex, holistic approach mirrors Deep Learning.

Deep learning is a subset of machine learning in Artificial Intelligence that uses complex networks to process enormous amounts of unstructured data. Because deep learning usually works on unstructured or raw data, it needs more data to find the underlying patterns. Artificial Neural Networks, or ANNs, which are conceptually similar to the neural networks in our brains, have been found to work extremely well in these scenarios.
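To make that concrete, here is a minimal sketch of a small artificial neural network learning from raw text, using scikit-learn. Everything below (the utterances, the intent labels, the network size) is invented for illustration and says nothing about Yellow Messenger’s actual models.

```python
# A tiny feed-forward neural network on raw text: unstructured data is
# first converted into numeric features, then the network finds patterns.
from sklearn.neural_network import MLPClassifier
from sklearn.feature_extraction.text import TfidfVectorizer

# Hypothetical utterances and intent labels, invented for this sketch.
texts = [
    "I want to check my balance",
    "show my account balance",
    "block my credit card",
    "my card was stolen, block it",
]
labels = ["check_balance", "check_balance", "block_card", "block_card"]

# Turn raw text (unstructured data) into numeric features.
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(texts)

# A small network: one hidden layer of 16 units.
model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
model.fit(X, labels)

print(model.predict(vectorizer.transform(["what is my balance?"])))
# expected: ['check_balance']
```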

Machine Learning

Now, Jasper, Joan and Jodi have developed the habit of watching a movie on Netflix before bedtime. Suppose we have narrowed it down to five possible reasons and nothing besides:

  • Boredom
  • Free access
  • Distraction
  • A favourite actor or actress
  • It puts them to sleep

So we observed them on these points over a week to find the right answer. Since we knew which data points to look for, we didn’t need a complex strategy to understand this behaviour. This is simple Machine Learning.

Machine learning in artificial intelligence also processes enormous amounts of information, much like deep learning, but here it works on defined, structured data. The goal of both machine learning and deep learning is to find patterns in data that allow us to build models capable of predicting future behaviour.
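For contrast with the neural network above, here is a minimal sketch of classic machine learning on structured data, where the relevant features are known up front. The feature columns and rows are made up for illustration.

```python
# Classic ML on structured data: the columns are already defined,
# so a simple model suffices. All values here are invented.
from sklearn.tree import DecisionTreeClassifier

# Structured observations: [hours_slept, minutes_watched, favourite_actor_on]
X = [
    [5, 90, 0],
    [8, 30, 1],
    [4, 120, 0],
    [7, 45, 1],
]
y = ["boredom", "favourite_actor", "boredom", "favourite_actor"]

model = DecisionTreeClassifier(random_state=0)
model.fit(X, y)

print(model.predict([[5, 100, 0]]))  # -> ['boredom']
```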

An excellent real-world application of this lies in predictive healthcare. Certain healthcare institutions use machine learning and deep learning to analyse an individual’s bio profile and monitor their health regularly. This allows people to make lifestyle changes so they stay fit and reduce their chances of grave diseases like cancer and coronary heart disease. Insurance firms use this data to price healthcare insurance more accurately. This is just one example; the possibilities are endless.

Natural Language Processing

Did you know Jasper and Joan didn’t talk to each other for a month because they couldn’t understand each other? Silly. Jasper later took a course in Spanish so he could talk to Joan. However, Joan still seemed alien to him. He soon realised she was speaking a dialect from a small town near Barcelona. This was something he picked up from trying to ‘converse’ with her daily, not something he learnt in a class, although the class did build a foundation. Soon enough, he could tie certain words to emotions through her expressions and her tone. Now he knew which words would wipe the smile off her face. This is natural language processing.

Natural Language Processing, or NLP, is the intersection of linguistics and artificial intelligence. (It should not be confused with neuro-linguistic programming, which shares the acronym but nothing else.) Essentially, it is the processing of enormous amounts of ‘natural language’ data. Today, NLP is powered by deep learning, but in its early days it was rule-based and finite. With NLP, virtual assistants learn not only the behavioural patterns of individuals but their linguistic patterns too. The sophistication of this engine matures over time with more conversations.
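To see just how limited those early rule-based systems were, here is a toy sketch. The keyword rules are invented for illustration, and they immediately show why a finite set of rules fails on unseen phrasings.

```python
# A toy contrast: early rule-based NLP was a finite list of patterns.
# A learned model generalises from examples; rules do not.
def rule_based_intent(text: str) -> str:
    text = text.lower()
    if "balance" in text:
        return "check_balance"
    if "block" in text and "card" in text:
        return "block_card"
    return "unknown"

print(rule_based_intent("please block my card"))  # -> 'block_card'
print(rule_based_intent("my card went missing"))  # -> 'unknown': no rule matches
```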

Can't read, won't read?
Don't. Get a demo instead!

Drop us a mail at contact@yellowmessenger.com to watch our NLP engine in action.

Business Challenge

As mentioned, Deep Learning models work great when there’s a vast dataset; however, they suffer significantly with smaller datasets.

This becomes a bigger problem with chatbots, where:

  • The customer rarely has much data.
  • The chatbot may have many journeys/intents to detect.
  • Customers come from a wide range of domains: banking, retail, customer support, etc.

This means the chatbot has to understand what the user is trying to say, find the appropriate response (out of anywhere from a few dozen to a few hundred candidates), and do all of this by learning from a tiny amount of data. That defeats the purpose of a ‘self-learning’ system when there is no material to study, doesn’t it?
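One common way to cope with tiny training sets is to lean on a pretrained sentence encoder and classify by nearest neighbour. This is a general technique, not a description of Yellow Messenger’s internals; the model name below is a publicly available sentence-transformers checkpoint, and the utterances are invented.

```python
# Few-shot intent detection: one labelled example per intent,
# matched via embeddings from a pretrained sentence encoder.
from sentence_transformers import SentenceTransformer
from sklearn.neighbors import KNeighborsClassifier

encoder = SentenceTransformer("all-MiniLM-L6-v2")

# A single example per intent: the "tiny amount of data" scenario.
examples = ["track my order", "cancel my subscription", "talk to an agent"]
intents = ["track_order", "cancel_subscription", "human_handoff"]

clf = KNeighborsClassifier(n_neighbors=1)
clf.fit(encoder.encode(examples), intents)

print(clf.predict(encoder.encode(["where is my parcel?"])))
# expected: ['track_order']
```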

Yellow Messenger Solution

To solve this problem, we built an NLP engine that is, by far, the most sophisticated on the market.

1. Data Efficiency

Our proprietary NLP engine is highly efficient with the amount of data the user provides. For our sideways dictionary, think of it as gas mileage. And no, despite the racing metaphor, the F1 score has nothing to do with Formula 1: it measures the accuracy of a test as the harmonic mean of precision and recall (a quick sanity check of the metric follows the list below). We put three cars on the grid: Dialogflow by Google, LUIS by Microsoft, and us.

1. We measured the performance of these engines in three races, using datasets from three community-driven support ecosystems: AskUbuntu, Banking77 and StackExchange. We calculated the F1 score across 10 different runs, using datasets provided in two academic papers (references [1] and [2]).

2. Yellow Messenger’s NLP engine achieved better results in all stints.

3. To go one step further, we used random down-sampling to reduce the dataset by 50%. The YM NLP engine still outperformed Dialogflow by 28% and LUIS by 9% in this race.

4. The highest possible score is 1, which no engine has achieved; it is a difficult ceiling to reach.
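For readers who want to verify the metric itself, here is a quick sanity check using scikit-learn. The intent labels below are illustrative, not taken from our benchmark.

```python
# The F1 score is the harmonic mean of precision and recall.
# Labels here are invented for illustration.
from sklearn.metrics import f1_score

y_true = ["block_card", "check_balance", "block_card", "check_balance"]
y_pred = ["block_card", "check_balance", "check_balance", "check_balance"]

# Macro-averaged F1 treats every intent equally, which suits chatbots
# where some intents are much rarer than others.
print(f1_score(y_true, y_pred, average="macro"))  # -> 0.733...
```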


2. Domain Agnostic

Having a global clientele means understanding every client’s vision and mission, their communication style and brand persona. Certainly, this plays a role in deploying solutions. From a technical standpoint, however, it is not feasible to rebuild the engine to fit each client’s individual requirements, and performance should not vary drastically from one domain to another.

1. Since our NLP engine’s performance should not change drastically from domain to domain, we’ve fine-tuned our model on data provided by a variety of customers.
2. This allows us to strike a fine balance: keeping the model domain agnostic while giving our customers the flexibility to fine-tune for better performance with small datasets (a sketch of the idea follows this list).
3. This means clients can tweak performance to suit their requirements, effortlessly.
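Here is a minimal sketch of the pretrain-then-fine-tune idea from point 2, using a simple linear model with incremental updates as a stand-in for real transformer fine-tuning. All texts and labels are invented.

```python
# Stage 1: "pretrain" on pooled, multi-domain data.
# Stage 2: nudge the same model with one customer's tiny dataset.
import numpy as np
from sklearn.linear_model import SGDClassifier
from sklearn.feature_extraction.text import HashingVectorizer

vec = HashingVectorizer(n_features=2**12)
classes = np.array(["check_balance", "block_card"])

# Broad, multi-domain training data (illustrative).
base_texts = ["what's my balance", "freeze my card",
              "balance please", "block card now"]
base_labels = ["check_balance", "block_card", "check_balance", "block_card"]

model = SGDClassifier(loss="log_loss", random_state=0)
model.partial_fit(vec.transform(base_texts), base_labels, classes=classes)

# Fine-tune on one customer's small, domain-specific dataset.
domain_texts = ["my debit card is lost", "card stolen, please block"]
domain_labels = ["block_card", "block_card"]
model.partial_fit(vec.transform(domain_texts), domain_labels)

print(model.predict(vec.transform(["I lost my card"])))  # likely ['block_card']
```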

So what's in it for me, Jodi asked

Getting back to our sideways dictionary: have you figured out why the three were addicted to Netflix before bed? Because it’s a distraction! What drives the distraction is the recommendation system, which suggests the movies and TV shows they ‘prefer’. It picked up patterns from that one week of Jasper, Joan and Jodi’s viewing to suggest the right content tailored to their individual interests, providing an optimal brand experience. 75% of viewer activity is based on these suggestions. All thanks to AI.



Now you know why you can’t stop the binge, Jodi.


Similarly, our NLP engine picks up patterns with every conversation, which is why it gets smarter over time.

 

1. While developing the bot, the builder or developer doesn’t have to supply a lot of data.

2. Right off the bat, the chatbot starts with a high degree of accuracy, and thanks to YM’s self-learning loop, that accuracy only increases over time (a simplified sketch of such a loop follows).
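This is a simplified sketch of what a self-learning loop can look like, not our production pipeline: human-confirmed predictions flow back into the training pool, and the model is periodically retrained on the grown dataset.

```python
# A toy self-learning loop: confirmed conversations become training data.
from sklearn.linear_model import LogisticRegression
from sklearn.feature_extraction.text import TfidfVectorizer

# Seed data supplied by the bot builder (deliberately tiny).
texts = ["track my order", "cancel my order"]
labels = ["track_order", "cancel_order"]

vectorizer = TfidfVectorizer()
model = LogisticRegression()
model.fit(vectorizer.fit_transform(texts), labels)

def record_feedback(utterance: str, intent: str, confirmed: bool) -> None:
    """Append human-confirmed predictions to the training pool."""
    if confirmed:
        texts.append(utterance)
        labels.append(intent)

def retrain() -> None:
    """Periodically refit on the grown dataset."""
    model.fit(vectorizer.fit_transform(texts), labels)

record_feedback("where is my parcel", "track_order", confirmed=True)
retrain()  # the model now knows one more way to ask about an order
```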

What’s next for our NLP engine? Tackling the zero-shot learning problem: providing accurate results when no training data is available at all.
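For the curious, here is a hedged sketch of the general zero-shot technique, illustrating the problem space rather than our implementation: an utterance is matched against plain-language intent descriptions with a pretrained encoder, so no labelled examples are needed. The model name is a publicly available sentence-transformers checkpoint; the intents are invented.

```python
# Zero-shot intent detection: no labelled examples, only descriptions.
from sentence_transformers import SentenceTransformer, util

encoder = SentenceTransformer("all-MiniLM-L6-v2")

intent_descriptions = {
    "track_order": "the user wants to know where their order is",
    "cancel_order": "the user wants to cancel an order",
}

utterance = "my package still hasn't arrived"
scores = util.cos_sim(
    encoder.encode(utterance, convert_to_tensor=True),
    encoder.encode(list(intent_descriptions.values()), convert_to_tensor=True),
)
print(list(intent_descriptions)[scores.argmax().item()])  # -> 'track_order'
```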
