Chatbots

How to make your chatbot more human-like

The potential of chatbots is nothing new, and here at Bitext we have been talking about it for a while. We emphasize the importance of Natural Language Processing for overcoming the limitations that users and developers of bots currently face when trying to create human-like chatbots.

We have tested bots and bot-development platforms such as wit.ai, api.ai, and LUIS, and we detected some issues that do not yet seem to be fully solved. These issues are fundamental to human language, and our linguistic technology is a perfect match for solving them.

We decided to put our resources to work, and today we are proud to introduce our new demo chatbot so you can start getting an idea of what our platform can do.

The best way to put it to the test is to try the hard issues we detected while testing bots and platforms:

  • Negation: we realized that many bots do not understand negation in a phrase because they are built on a keyword approach. That makes it difficult for users to ask for something as simple as “I want a barbeque pizza with no pork”. Let’s see some examples:
    • “I want a barbeque pizza with no pork” (only negates pork).
    • “We don’t want any drinks” (negates the whole event).
    • “I’m not sure… I’ll take a beer” (doesn’t negate the main event).
  • Coordination: it is one of the most common elements in human speech, yet our research found that most relevant platforms do not support requests whose elements are joined by a coordinator. Our linguistic knowledge enables us to solve this issue.
    • “[[I want a Hawaiian pizza] and [my wife will have a Margherita]]” (two main events).
    • “I’ll have a Hawaiian [with [extra cheese] and [onion]]” (two changes in ingredients).
    • “I’ll take [[a Hawaiian [with [extra cheese] and [onion]]] and [a Margherita]]” (two pizzas, the first one with two ingredients).
  • Connection between different phrases: most chatbots have been designed following a tree model, so users cannot change their request mid-conversation and are forced to start over. As a solution, we propose using connectors, as in the following examples:
    • “I want a Margherita with onion… Moreover, add extra cheese” (adds info to the first sentence: adds an ingredient).
    • “I want a Hawaiian with extra pineapple. However, I prefer it with no ham” (amends the previous sentence: removes an ingredient).
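To make the negation issue above concrete, here is a toy sketch of why scope matters. It is not Bitext's actual technology (which relies on full syntactic parsing); it is a hypothetical, pattern-based illustration in which “with no X” negates only the ingredient X, while “don't want any” negates the whole event:

```python
import re

def interpret(utterance: str) -> dict:
    """Toy illustration of negation scope in pizza orders.

    "with no X"       -> negates only the ingredient X
    "don't want any"  -> negates the whole event
    (Hypothetical pattern-based logic; a keyword-only bot would
    treat both cases the same and get one of them wrong.)
    """
    text = utterance.lower()
    # Negation scoping over the whole event: cancel the order entirely.
    if re.search(r"\b(don'?t|do not) want any\b", text):
        return {"event": "order", "negated": True}
    # Otherwise the order stands; collect ingredients negated locally.
    order = {"event": "order", "negated": False, "exclude": []}
    for match in re.finditer(r"\bwith no (\w+)", text):
        order["exclude"].append(match.group(1))
    return order

# Negation attaches to "pork" only; the order itself is kept.
print(interpret("I want a barbeque pizza with no pork"))
# Negation covers the whole request; nothing is ordered.
print(interpret("We don't want any drinks"))
```

A keyword approach that merely spots “no”/“don't” cannot distinguish these two scopes, which is exactly the failure mode described above; a dependency parse attaches the negation to the correct node for free.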

If you want to start trying our demo chatbot, click here: 

