For large inputs (above 256 tokens), you will need to use the asynchronous mode; see the documentation for details.
You can automatically check whether a generated paraphrase stays too close to, or drifts too far from, the original by measuring semantic similarity.
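As a minimal sketch of that check, the toy function below scores lexical overlap with a bag-of-words cosine similarity: a score near 1.0 means the paraphrase reuses almost the same words (too close), while production systems would instead compare sentence embeddings (for example with a model like Paraphrase Multilingual Mpnet Base V2) to also catch paraphrases that drift too far in meaning. The function name and thresholds here are illustrative, not part of any API.

```python
from collections import Counter
from math import sqrt

def lexical_similarity(a: str, b: str) -> float:
    """Bag-of-words cosine similarity between two texts, from 0.0 to 1.0."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = sqrt(sum(c * c for c in va.values())) * sqrt(sum(c * c for c in vb.values()))
    return dot / norm if norm else 0.0

original = "The team released the new product last week."
candidate = "The team released the new product last week."
# Identical wording scores 1.0, i.e. the "paraphrase" is far too close:
print(lexical_similarity(original, candidate))
```

A real pipeline would reject candidates above an upper threshold (too close) and below a lower embedding-similarity threshold (too far), then regenerate.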
What is Paraphrasing and Why Use GPT?
Paraphrasing means generating new content that keeps the same meaning as the original, but expresses it with different words.
Performing simple paraphrasing by changing a couple of words is one thing; generating advanced paraphrasing that completely changes the sentence structure and vocabulary is another beast entirely! Modern models like GPT-3, GPT-J, and GPT-NeoX now make it possible to easily produce advanced, complex paraphrases that preserve the main meaning while using different wording.
GPT-J and GPT-NeoX are the most advanced open-source Natural Language Processing models as of this writing, and they are the best GPT-3 alternatives. These models are large enough to adapt to many situations and sound remarkably human. For advanced use cases, you can fine-tune GPT (train it on your own data), which is a great way to perform paraphrasing perfectly tailored to your industry.
Why Use Paraphrasing?
Marketing teams appreciate paraphrasing because it makes their work faster and less repetitive. Here are a few examples:
Creating marketing content can be long, tedious, and repetitive, so it is sometimes useful to get a hand from AI to boost productivity! Imagine you want to write a new blog article that partially says the same thing as a post you wrote earlier: you can paraphrase part of that content and supplement it with new original content.
Writing product descriptions is often repetitive. Some products are very similar, but you don't want to copy and paste the same description. Paraphrasing can really help here.
If you are creating ads on a regular basis, you might sometimes lack inspiration. Paraphrasing is your friend here.
This setting controls whether the model runs on a GPU. Machine learning models run much faster on GPUs.
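The device-selection logic behind such a setting can be sketched as below. The helper name is hypothetical; in a real PyTorch setup you would pass `torch.cuda.is_available()` as the second argument and hand the resulting device string to your model.

```python
def pick_device(use_gpu: bool, gpu_available: bool) -> str:
    """Choose where to run the model; "cuda:0" is PyTorch's name for the first GPU."""
    return "cuda:0" if use_gpu and gpu_available else "cpu"

# Requesting a GPU on a machine without one safely falls back to the CPU:
print(pick_device(use_gpu=True, gpu_available=False))  # prints "cpu"
```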
AI models don't always work well with non-English languages.
We do our best to add non-English models when possible. See for example Fine-tuned LLaMA 2 70B, Dolphin, ChatDolphin, XLM Roberta Large XNLI, Paraphrase Multilingual Mpnet Base V2, or spaCy. Unfortunately, not all models handle non-English languages well.
To solve this challenge, we developed a multilingual module that automatically translates your input into English, performs the actual NLP operation, and then translates the result back into your original language. This makes your requests a bit slower but often returns very good results.
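The translate, operate, translate-back pattern described above can be sketched as a small wrapper. The function and the toy stand-ins below are purely illustrative (a real module would call translation and paraphrasing models, not these stubs):

```python
def multilingual(text: str, lang: str, translate, operation) -> str:
    """Translate input to English, run the NLP operation, translate the result back."""
    english_in = translate(text, source=lang, target="en")
    english_out = operation(english_in)
    return translate(english_out, source="en", target=lang)

# Toy stand-ins so the wrapper can be demonstrated without real models:
def fake_translate(text, source, target):
    return f"[{target}] {text.split('] ', 1)[-1]}"

def fake_paraphrase(text):
    return text  # a real paraphraser would reword the English text here

print(multilingual("[fr] Bonjour", "fr", fake_translate, fake_paraphrase))
```

The extra round trip through the translation model is what makes multilingual requests slightly slower.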
Even models that natively understand non-English languages sometimes work better with the multilingual add-on.
Simply select your language from the list, and from then on you can write your input text in your own language!
This multilingual add-on is a free feature.