Machine Learning

How to solve data scarcity for AI

Data scarcity is one of the major bottlenecks preventing Artificial Intelligence (AI) projects from reaching production. The reason is simple: data, or the lack of it, is the number one reason why AI/Natural Language Understanding (NLU) projects fail. So the AI community is working hard to find a solution.


As a result, a wide range of solutions has emerged. These are the two main trends:

  • Data simulation via software: This approach uses advanced Machine Learning (ML) techniques, like Transfer Learning or Active Learning and other next-generation AI algorithms. The biggest issue here is probably that it is difficult to predict for which cases these techniques will or won’t work, so it takes multiple rounds of experimentation, evaluation and re-training, without any guarantee of significant improvement.
  • Manual data creation or labeling: A wide range of companies create data from scratch, most famously via Amazon Mechanical Turk. This approach produces customized data on demand. The main issue is how to scale it. It is also hard to edit and reuse the data for retraining or adjustment when results are not quite right.
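To make the first trend concrete, here is a minimal sketch of an Active Learning loop with uncertainty sampling. Everything in it is illustrative: the toy "model", the utterance pool, and the annotation step are invented stand-ins, not any specific library or Bitext's method. The point is the loop shape: the model asks a human to label only the examples it is least sure about, stretching a small labeling budget.

```python
import random

random.seed(0)

# Unlabeled pool: (text, hidden_true_label) pairs; the label is only
# revealed when a human "annotates" the example.
pool = [(f"utterance {i}", i % 2) for i in range(100)]

def predict_proba(text, labeled):
    """Toy stand-in for a classifier: P(class 1) from labels seen so far."""
    if not labeled:
        return 0.5
    ones = sum(lbl for _, lbl in labeled)
    return ones / len(labeled)

labeled = []
for _ in range(10):  # 10 annotation rounds, one query each
    # Uncertainty score: probability closest to 0.5 is most uncertain.
    scored = [(abs(predict_proba(t, labeled) - 0.5), t, y) for t, y in pool]
    scored.sort(key=lambda s: s[0])
    _, text, label = scored[0]          # most uncertain example
    labeled.append((text, label))       # human annotates it
    pool = [(t, y) for t, y in pool if t != text]

print(len(labeled), len(pool))  # → 10 90
```

In practice the model is retrained each round and the pool is real unlabeled text; the unpredictability mentioned above shows up exactly here, in whether each round of querying actually improves the model.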


As an intermediate path, a new trend is gaining traction: synthetic/artificial data generation. This approach “writes” the new data using software rather than manual effort. The data can even be produced with the required labeling already attached, using NLP technologies. This approach is promising because it merges the best of both worlds: the scalability of an automatic approach and the data transparency and explainability of a manual approach.
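A simple way to see how software can "write" pre-labeled data is template expansion. The sketch below is a toy assumption, not Bitext's actual pipeline (which relies on NLP/linguistic technology): hypothetical templates with slot placeholders are expanded into every slot combination, and each generated utterance carries its intent label from birth.

```python
import itertools

# Invented templates and slot values for two hypothetical chatbot intents.
templates = {
    "check_balance": [
        "what is my {account} balance",
        "show the balance of my {account} account",
    ],
    "transfer_money": [
        "send {amount} to my {account} account",
        "transfer {amount} into {account}",
    ],
}

slots = {
    "account": ["checking", "savings"],
    "amount": ["$50", "$200"],
}

def expand(template):
    """Yield the template filled with every combination of its slot values."""
    names = [n for n in slots if "{" + n + "}" in template]
    for values in itertools.product(*(slots[n] for n in names)):
        yield template.format(**dict(zip(names, values)))

dataset = [
    {"text": text, "intent": intent}   # labeled example, ready for training
    for intent, tpls in templates.items()
    for tpl in tpls
    for text in expand(tpl)
]

print(len(dataset))  # → 12 (each template times its slot combinations)
```

Because every example is generated from a known template and intent, the labels are correct by construction, and regenerating or extending the dataset is a matter of editing templates rather than re-annotating text by hand.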

At Bitext, we are working in this space, focused on HMI (Human Machine Interaction) and chatbots. You can download a test dataset to see how synthetic/artificial data works for your case.

For more information, visit www.bitext.com, and follow Bitext on Twitter or LinkedIn.
