Question Answering
Trying to answer questions about a large corpus of domain knowledge? You might want to read this guide.
What is Question Answering?
Question answering lets the AI automatically answer a question. Optionally, you can provide a context and have the AI use this context to answer the question.
For example, imagine you want to ask the following question:
How to bake some bread?
The AI might answer something like this:
1. Grease a clean, dry bread pan with butter.
2. Stir 1 envelope dry active yeast to 1/4 cup warm water in a large bowl. Let the yeast bloom for 10 minutes, or until dissolved, then add 1 teaspoon salt, 1 teaspoon honey, and 1/2 cup unsalted butter.
3. Add 3 1/2 cups strong flour and mix well, then wait to process your dough for 3 minutes.
4. Add an ounce of water and mix well. If the water was not already warm enough to make a dough (i.e., it was not warm to the touch) then add a warm to hot bowl of water to the machine. Stir continuously for 3 minutes.
5. Turn your dough out onto a clean sheet of plastic wrap and fold over.
6. Cover with another piece of plastic wrap and place in a warm corner of your kitchen.
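If you want to try this programmatically, here is a minimal sketch using the NLP Cloud Python client (nlpcloud). The model name and API token are placeholders, and it assumes a generative model that can answer an open-ended question without any context:

```python
import nlpcloud

# Placeholder model name and token: replace with a generative model from
# your NLP Cloud dashboard and your own API token.
client = nlpcloud.Client("<generative-model-name>", "<your_api_token>", gpu=True)

# No context is passed: the model answers from its general knowledge.
result = client.question("How to bake some bread?")
print(result["answer"])
```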
Now maybe you have your own specific data you want to give the AI and ask a question about (this is known as the "context"):
All NLP Cloud plans can be stopped anytime. You only pay for the time you used the service. In case of a downgrade, you will get a discount on your next invoice.
You might want to ask the following question:
When can plans be stopped?
And the answer would be:
Anytime
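Here is how this context-based question might look with the Python client. This is a sketch: the extractive model name (roberta-base-squad2) and the API token are placeholders you would replace with your own values:

```python
import nlpcloud

client = nlpcloud.Client("roberta-base-squad2", "<your_api_token>")

result = client.question(
    "When can plans be stopped?",
    context=(
        "All NLP Cloud plans can be stopped anytime. You only pay for the "
        "time you used the service. In case of a downgrade, you will get a "
        "discount on your next invoice."
    ),
)
print(result["answer"])  # e.g. "Anytime"
```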
Why Use Question Answering?
Question answering can be put to good use in the real world. Here are a couple of examples.
Contract Questions
Chatbots are used more and more every day, both to answer customer questions and questions from internal collaborators. Imagine that a customer asks a legal question about their contract. You could use a question answering model for that and pass the contract as the context.
Product Questions
Here is another chatbot-related example. Imagine that a collaborator has a technical question about a product. Why not provide them with a natural language interface and make their life easier?
Use GPU
Control whether you want to use the model on a GPU. Machine learning models run much faster on GPUs.
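With the Python client, GPU usage is typically requested with a flag when creating the client. The snippet below is a sketch with placeholder values:

```python
import nlpcloud

# gpu=True asks for a GPU-backed instance of the model (availability
# depends on your plan). Model name and token are placeholders.
client = nlpcloud.Client("<model-name>", "<your_api_token>", gpu=True)
```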
Context
The block of text that the model will use in order to find an answer to your question.
Language
AI models don't always work well with non-English languages.
We do our best to add non-English models when it's possible. See for example Fine-tuned LLaMA 3.1 405B, LLaMA 3 70B, Dolphin, ChatDolphin, XLM Roberta Large XNLI, Paraphrase Multilingual Mpnet Base V2, or spaCy. Unfortunately not all the models are good at handling non-English languages.
In order to solve this challenge, we developed a multilingual module that automatically translates your input into English, performs the actual NLP operation, and then translates the result back to your original language. It makes your requests a bit slower but often returns very good results.
Even models that natively understand non-English languages sometimes work better with the multilingual add-on.
Simply select your language from the list, and from then on you can write your input text in your own language!
This multilingual add-on is a free feature.
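With the Python client, the multilingual add-on is enabled by passing a lang parameter when creating the client. The sketch below uses French as an example; the exact language code format ("fr" here) and the model name are assumptions that may differ depending on your API version:

```python
import nlpcloud

# lang="fr" activates the multilingual add-on: the input is translated to
# English, processed, and the answer is translated back to French.
client = nlpcloud.Client("roberta-base-squad2", "<your_api_token>", lang="fr")

# "When can plans be stopped?" asked in French, with a French context.
result = client.question(
    "Quand les abonnements peuvent-ils être arrêtés ?",
    context=(
        "Tous les abonnements NLP Cloud peuvent être arrêtés à tout moment. "
        "Vous ne payez que pour le temps d'utilisation du service."
    ),
)
print(result["answer"])
```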