Hugging Face is well known for its great work on the Python Transformers library, and for its large repository of machine learning models. But it also provides an inference API and a fine-tuning platform called AutoTrain.
NLP Cloud's API and NLP Cloud's fine-tuning platform are direct competitors of Hugging Face's API and AutoTrain. Let's compare the pricing and features of these two providers!
First, it's worth noting that the NLP Cloud API can be tested for free on both a CPU and a GPU (thanks to the free plan and the pay-as-you-go plan, which offers 100k free tokens), while the Hugging Face API can only be tested for free on a CPU (thanks to their free plan). This is an important difference, since the most interesting Transformer-based AI models run much faster on a GPU. Some simply won't run at all without one.
In terms of plans, Hugging Face only proposes pay-as-you-go pricing (pricing based on your consumption), while NLP Cloud proposes both pre-paid plans and pay-as-you-go plans. Let's say you want to perform text classification on pieces of text containing around 5k words on average, at a rate of 15 requests per minute, on a GPU. Hugging Face's pricing is based on the number of characters, while NLP Cloud's is based on the number of tokens. 5k words are roughly equivalent to 15k characters and to 3,750 tokens. On NLP Cloud this will cost you $99/month by subscribing to the Starter GPU Plan, while on Hugging Face it will cost you 15k characters x 15 requests x 60 minutes x 24 hours x 31 days x $50 / 1M characters ≈ $500k/month (!!!).
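The arithmetic above can be sketched as a quick back-of-the-envelope script (using only the figures quoted in this scenario: 15k characters per request, 15 requests per minute, a 31-day month, $50 per million characters on the Hugging Face side, and the $99 Starter GPU Plan on the NLP Cloud side):

```python
# Back-of-the-envelope monthly cost comparison for the scenario above.
# All numbers are the illustrative assumptions from the text, not official rates.

CHARS_PER_REQUEST = 15_000        # ~5k words ~= 15k characters
REQUESTS_PER_MINUTE = 15
MINUTES_PER_MONTH = 60 * 24 * 31  # a 31-day month

# Hugging Face pay-as-you-go: priced per character ($50 per 1M characters on a GPU).
chars_per_month = CHARS_PER_REQUEST * REQUESTS_PER_MINUTE * MINUTES_PER_MONTH
hf_monthly_cost = chars_per_month / 1_000_000 * 50.0

# NLP Cloud: flat pre-paid Starter GPU Plan.
nlp_cloud_monthly_cost = 99.0

print(f"Hugging Face: ${hf_monthly_cost:,.0f}/month")  # $502,200/month
print(f"NLP Cloud:    ${nlp_cloud_monthly_cost:,.0f}/month")  # $99/month
```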
As you can see, the Hugging Face pay-as-you-go pricing is absolutely not suited to production use. Literally no one is going to pay such a price for text classification on a GPU...
As far as fine-tuning is concerned, it is not even possible to compare as Hugging Face's AutoTrain pricing is not public. We registered and tried their AutoTrain solution, but we were still unable to find any clear pricing...
The great thing about Hugging Face is that they host tons of AI models on their platform! However, that does not mean you can actually use these models. You can of course download them, but this is not the same as using them.
Only a very small fraction of Hugging Face's models are actually available for inference through their API. If you try to use a model that is not already loaded, you will either have to wait for several minutes, or simply get an error. A solution is to pin the models you want to use so they are always available, but in that case you have to pay an additional $5/month per model on a GPU.
On NLP Cloud we chose a different strategy: around 50 different AI models are available all the time. We select each model because we think it is the best one for a specific use case. For example, we chose Bart Large MNLI for classification, DistilBERT for sentiment analysis, GPT-J for intent detection, etc.
More importantly: the most advanced AI models like GPT-J are not available on the Hugging Face API, and cannot be fine-tuned on their AutoTrain platform, while you can easily use and fine-tune these large language models on NLP Cloud.
Hugging Face only offers support if you select their Lab or Enterprise plan.
NLP Cloud is completely different: we offer the best support we can to every customer, whether they are a free user, a small paying customer, or an Enterprise one. We believe that good support is critical when it comes to AI and natural language processing, because customers can have tons of interesting technical or business questions.
In our benchmarks, we noticed a lower latency on the NLP Cloud API for all the models we tested, whether on a CPU or a GPU.
Speed is critical for such a machine learning API, and the fact that NLP Cloud responds faster can make a big difference, depending on your business requirements.
As far as fine-tuning is concerned, we were unable to make a proper comparison for the moment, because most of the fine-tunings we launched on the Hugging Face AutoTrain platform failed without an explicit error message.
Users often compare NLP Cloud to the Hugging Face API and AutoTrain platform.
We do believe that the NLP Cloud API is much more interesting both from a pricing standpoint and from a performance standpoint.
We are also very proud to offer a high quality support to all our customers without distinction.
Would you like to give it a try? Test NLP Cloud here!
CTO at NLP Cloud