Hugging Face Hub
Extensive documentation and diverse models support AI-driven projects
What is our primary use case?
I work with various large language models for purposes such as medicine and law, where they are fine-tuned to specific requirements. I download LLMs from Hugging Face for these environments, using the platform to support AI-driven projects and deploy AI applications for local use, with a focus on local LLMs in real-world applications.
What is most valuable?
Hugging Face is valuable because it provides a single, comprehensive repository with thorough documentation and extensive datasets. It hosts nearly 400,000 open-source models that cover a wide variety of tasks, including text classification, token classification, text generation, and more. It serves as a foundational platform offering continuously updated resources, making it essential in the AI community.
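The task coverage described above can be sketched with the transformers pipeline API. This is a minimal illustration, not the reviewer's own code; the model identifiers are common example defaults, and any compatible Hub model could be substituted.

```python
# Minimal sketch: loading Hub models for different tasks with the
# transformers pipeline API. Model ids below are illustrative examples.
TASK_MODELS = {
    "text-classification": "distilbert-base-uncased-finetuned-sst-2-english",
    "token-classification": "dslim/bert-base-NER",
    "text-generation": "gpt2",
}

def load_task_pipeline(task: str):
    """Return a ready-to-use pipeline for one of the tasks listed above."""
    if task not in TASK_MODELS:
        raise ValueError(f"Unsupported task: {task}")
    from transformers import pipeline  # deferred: triggers a model download
    return pipeline(task, model=TASK_MODELS[task])

if __name__ == "__main__":
    classifier = load_task_pipeline("text-classification")
    print(classifier("Hugging Face makes model reuse easy."))
```

The import is deferred into the function so that merely loading the module does not download any weights.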
What needs improvement?
It is challenging to suggest specific improvements for Hugging Face, as their platform is already very well-organized and efficient. However, they could focus on cleaning up outdated models if they seem unnecessary and continue organizing more LLMs.
For how long have I used the solution?
I have been working with Hugging Face for about one and a half years.
What do I think about the stability of the solution?
Hugging Face is stable, provided the environment is controlled, and the user base is limited. The stability relies on the specific models and the data they're fed, which minimizes issues like hallucination.
What do I think about the scalability of the solution?
Hugging Face is quite scalable, especially in terms of upgrading models for better performance. There is flexibility in using models of varying sizes while keeping the application environment consistent.
How are customer service and support?
I have not needed to communicate with Hugging Face's technical support because they have extensive documentation available.
How would you rate customer service and support?
Neutral
Which solution did I use previously and why did I switch?
Before Hugging Face, I used Ollama due to its ease of use, but Hugging Face offers a wider range of models.
How was the initial setup?
The initial setup can be rated as a seven out of ten due to occasional issues during model deployment, which might require adjustments. Recent developments have made the process easier though.
What's my experience with pricing, setup cost, and licensing?
The pricing is reasonable. I use a pro account, which costs about $9 a month. This positions it in the middle of the cost scale.
Which other solutions did I evaluate?
Before choosing Hugging Face, I used Ollama for its ease of use, but it lacked the variety offered by Hugging Face.
What other advice do I have?
Overall, the platform is excellent. For any AI enthusiast, Hugging Face provides a broad array of open-source models and a solid foundation for building AI applications. Using an on-premises model helps manage errors in critical environments. I rate Hugging Face as an eight out of ten.
Accessible inference APIs drive personal project success for students
What is our primary use case?
This is a simple personal project, non-commercial. As a student, that's all I do.
What is most valuable?
The most valuable feature is the Inference API, since running inference on my local machine takes a long time.
What needs improvement?
Access to the models and datasets could be improved. Many interesting ones are restricted. It would be great if they provided access for students or non-professionals who just want to test things.
For how long have I used the solution?
I have been using this solution for about the last three or four months.
Which solution did I use previously and why did I switch?
I have used just TensorFlow and PyTorch. Nothing else.
What's my experience with pricing, setup cost, and licensing?
What other advice do I have?
I've been trying to implement some chatbots, and having free access to Hugging Face helped me a lot.
I use PyTorch and TensorFlow to implement other deep-learning models and access LLMs. Each of these tools has its own purpose: PyTorch and TensorFlow are used in deep-learning projects to train and fine-tune models at the deep-learning level, while Hugging Face is mainly for the transformers library and the LLM APIs. I cannot compare them directly. For me, it's about access to datasets and models.
I would rate this product nine out of ten.
Open-source, reliable, and easy to learn
What is our primary use case?
I had to train a model when I worked as a data scientist. We start from a pre-trained model and train it further on our custom data, so we can benefit from what the model has already learned from a huge amount of data.
What is most valuable?
Hugging Face provides open-source models, making it the best open-source and reliable solution. Currently, Hugging Face is the best place for exploring many models, and several of them can be used in real-life applications. For example, we can use a Hugging Face NER model to extract only specific kinds of words from a text.
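The NER usage mentioned above can be sketched as follows. The model id is an example, and `keep_labels` is a small hypothetical helper written for this illustration, not part of any library.

```python
# Sketch of the NER use described above: extract entities from text, then
# keep only selected entity types. Model id is an illustrative example.
def extract_entities(text: str, model: str = "dslim/bert-base-NER"):
    """Run a Hub NER model and return (word, entity_group) pairs."""
    from transformers import pipeline  # deferred: triggers a model download
    ner = pipeline("token-classification", model=model,
                   aggregation_strategy="simple")
    return [(e["word"], e["entity_group"]) for e in ner(text)]

def keep_labels(entities, labels):
    """Keep only entities whose label is in `labels`, e.g. {"PER", "ORG"}."""
    wanted = set(labels)
    return [(word, label) for word, label in entities if label in wanted]
```

Filtering after extraction is what lets you "accept only limited words" from a document, as the review puts it.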
What needs improvement?
Most people upload their pre-trained models on Hugging Face, but more details should be added about the models.
For how long have I used the solution?
I have been using Hugging Face for six months.
What do I think about the stability of the solution?
The solution provides good stability.
What do I think about the scalability of the solution?
Five people from our team totally depend on the Hugging Face model whenever the company gets a new project.
What's my experience with pricing, setup cost, and licensing?
Hugging Face is an open-source solution.
What other advice do I have?
The solution is deployed on the cloud in our organization. Hugging Face hosts many open-source models, like Meta's Llama and Google's Gemma, that perform very well. When someone puts their model on Hugging Face, they provide all the steps; we can follow those steps and train our model. This is the best thing I have seen from Hugging Face.
Several IT industries in India are unable to purchase models like ChatGPT. Hugging Face provides open-source models, making it the best open-source and reliable solution. I would recommend the solution to other users. Users can easily use Hugging Face after watching YouTube videos on how to use it. It is easy to learn to use Hugging Face.
Overall, I rate the solution an eight out of ten.
Available at a low cost
What is our primary use case?
We use the solution for fine-tuning, RAG, and LLMs.
What is most valuable?
The most important feature is secure LLM hosting, because there are so many LLMs available on the internet.
What needs improvement?
It can incorporate AI into its services.
For how long have I used the solution?
I have been using Hugging Face for six months.
What do I think about the stability of the solution?
It is stable.
What do I think about the scalability of the solution?
It is scalable.
How was the initial setup?
Deployment can be challenging, but it becomes more manageable with the right education or by watching a tutorial. Many data science students might find it difficult to use. They need to learn about LLMs.
Since we have learned, we can use it easily. It takes two to three hours to deploy.
What's my experience with pricing, setup cost, and licensing?
It has reasonable pricing, which is six dollars per month.
What other advice do I have?
Integration is very easy.
Overall, I rate the solution an eight out of ten.
Open-sourced, reliable, and enables organizations to fine-tune models for business requirements
What is our primary use case?
Hugging Face is a website that provides various open-source models. We use them to fine-tune models for our business. It is just like ChatGPT, but ChatGPT is a paid service: if we have to call its API, we must pay for it. However, Hugging Face hosts various open-source models, like Llama 2 and Llama 3, that provide similar functionality to ChatGPT. We use Llama 2 with 7 billion parameters to fine-tune on the data for our business.
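One common way to fine-tune an open model like Llama 2 on custom data is parameter-efficient fine-tuning with LoRA via the `peft` library. This is a hedged sketch of that approach, not the reviewer's actual pipeline: the settings are typical starting values, and the Llama weights are gated on the Hub (accepting Meta's license is assumed).

```python
# Sketch of LoRA fine-tuning setup with the peft library. Settings and model
# id are illustrative assumptions, not values from the review.
LORA_SETTINGS = {"r": 8, "lora_alpha": 16, "lora_dropout": 0.05,
                 "task_type": "CAUSAL_LM"}

def validate_lora(settings: dict) -> dict:
    """Basic sanity checks before building the LoRA config."""
    if settings["r"] <= 0 or settings["lora_alpha"] <= 0:
        raise ValueError("LoRA rank and alpha must be positive")
    return settings

def attach_lora(model_id: str = "meta-llama/Llama-2-7b-hf",
                settings: dict = LORA_SETTINGS):
    """Wrap a base causal LM with trainable LoRA adapters."""
    cfg = validate_lora(settings)
    from transformers import AutoModelForCausalLM  # deferred: large download
    from peft import LoraConfig, get_peft_model
    base = AutoModelForCausalLM.from_pretrained(model_id)
    return get_peft_model(base, LoraConfig(**cfg))
```

Only the adapter weights are trained, which is why this fits the review's point about reducing cost versus paid APIs.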
What is most valuable?
The tool is available for free. We use the product because it is beneficial for the company. It reduces cost. The product is reliable.
What needs improvement?
The solution must provide an efficient LLM. Facebook provides Llama 3, which gives results similar to ChatGPT. For now, Facebook is ChatGPT’s only competition. Hugging Face must provide a similar product.
For how long have I used the solution?
I have been using the solution for two to three months.
What do I think about the stability of the solution?
Facebook provides Llama 3; Hugging Face is just a pathway to it. We have not found any bugs in the last two months.
What do I think about the scalability of the solution?
Five AI engineers in our organization were using the solution.
How was the initial setup?
The installation is easy if the computer or laptop has good hardware, RAM, and NVIDIA graphics card. If a system has a low RAM, the installation will be difficult.
What's my experience with pricing, setup cost, and licensing?
We do not have to pay for the product.
Which other solutions did I evaluate?
Various closed-source models like ChatGPT charge us for every call we make. For example, if I make a call in ChatGPT, it will cost us $20. Hugging Face is an open-source model. It doesn’t charge anything. ChatGPT has better functionalities than other open-source tools. However, I think open-source products will increase their functionalities in the future and compete with OpenAI.
What other advice do I have?
I will recommend the solution to people. It is the only platform that provides open-source models. Once we understand the LLM, it will be easy to use the tool. The open-source community has limited resources. It is increasing, though.
Overall, I rate the solution a nine out of ten.
An open-source platform that has hundreds of packages for creating LLMs
What is our primary use case?
In my last project, I created an SQL chatbot to convert simple English requests to complex SQL queries. As you know, computers don't understand textual data, so we have to tokenize it. I used Hugging Face embeddings for that.
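The embedding step described above can be sketched as follows. The model id is a common example from the sentence-transformers family, not necessarily the one used in the project; `cosine` is plain stdlib math for comparing the resulting vectors.

```python
import math

# Sketch of the embedding step: turn text into dense vectors, then compare
# them by cosine similarity (useful for matching a request to SQL examples).
def cosine(a, b) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def embed(texts, model: str = "sentence-transformers/all-MiniLM-L6-v2"):
    """Encode a list of strings into dense vectors via sentence-transformers."""
    from sentence_transformers import SentenceTransformer  # deferred download
    return SentenceTransformer(model).encode(texts)
```

In a text-to-SQL chatbot, comparing the embedding of the user's English request against embeddings of schema descriptions or example queries is one way to ground the generation.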
What is most valuable?
The tool's most valuable feature is that it's open-source and has hundreds of packages already available. This makes it quite helpful for creating our LLMs.
What needs improvement?
I've worked on three projects using Hugging Face, and only once did we encounter a problem with the code. We had to use another open-source embedding from OpenAI to resolve it. Our team has three members: me, my colleague, and a team leader. We looked at the problem and resolved it.
The solution offers numerous modules that can be loaded onto personal machines or local servers for use in Python or other programming environments. However, the instructions on how to use these modules are not detailed enough.
For how long have I used the solution?
I have been using the product for two months.
How are customer service and support?
I haven't contacted the solution's support team yet.
How was the initial setup?
You can download the packages and connect them to an external source.
What's my experience with pricing, setup cost, and licensing?
The solution is open source.
What other advice do I have?
I'm learning generative AI, and there's a course on the DeepLearning.AI platform on which to learn AI with Hugging Face. That's where I learned about Hugging Face. I found it very easy to load the packages for Hugging Face to do our work, so I used it. Anyone with basic knowledge of coding can use it.
I rate the overall product an eight out of ten.
An open-source solution that helps to fine-tune large language models
What is our primary use case?
I use Hugging Face to fine-tune large language models. We take our client's use case and an open-source model already deployed, download the model artifacts, and fine-tune the models according to our specific use case.
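Downloading a model's artifacts for local fine-tuning, as described above, can be sketched with `snapshot_download` from `huggingface_hub`. The `valid_repo_id` helper is hypothetical, written only for this illustration, and the directory layout is an arbitrary choice.

```python
# Sketch: pull all files of a Hub repository to a local directory so the
# model can be fine-tuned offline. Paths and the helper are illustrative.
def valid_repo_id(repo_id: str) -> bool:
    """Accept 'model' or 'namespace/model' style ids with no empty parts."""
    parts = repo_id.split("/")
    return len(parts) in (1, 2) and all(parts)

def fetch_model(repo_id: str, local_dir: str = "./models") -> str:
    """Download a Hub repo's artifacts and return the local path."""
    if not valid_repo_id(repo_id):
        raise ValueError(f"Malformed repo id: {repo_id}")
    from huggingface_hub import snapshot_download  # deferred: needs network
    target = f"{local_dir}/{repo_id.replace('/', '__')}"
    return snapshot_download(repo_id=repo_id, local_dir=target)
```

Once the artifacts are local, fine-tuning proceeds against the downloaded path rather than the Hub.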
What is most valuable?
The tool's most valuable feature is that it shows trending models. All the new models, even Google's demo models, appear at the top. You can find all the open-source models in one place. You can use them directly and easily find their documentation. It's very simple to find documentation and write code. If you want to work with AI and machine learning, Hugging Face is a perfect place to start.
What needs improvement?
I believe Hugging Face has some room for improvement. There are some security issues. They provide code, but API tokens aren't indicated. Also, the documentation for particular models could use more explanation. But I think these things are improving daily. The main change I'd like to see is making the deployment of inference endpoints more customizable for users.
For how long have I used the solution?
I have been using the product for a year.
What do I think about the stability of the solution?
I think Hugging Face is a good, stable product. I don't see any major bugs or breakdowns. The entire company is working to bring all open-source libraries onto one platform. Many companies use it to deploy their large language models for generative AI. It's a good platform, and I don't hear many complaints about it.
What do I think about the scalability of the solution?
I estimate that this product will have around 20,000 to 30,000 users. It is revolutionary.
How are customer service and support?
We contact support through emails.
Which solution did I use previously and why did I switch?
We chose the solution because it helped us reduce costs. The same model would generate costs elsewhere.
How was the initial setup?
We have two deployment options: cloud and on-premises. On-premises means it's on-demand, and we have to monitor it. With cloud deployment, there's no need to watch for availability because it's always handled in the cloud. There should be no problems with cloud deployment. If we deploy on-premises, we have to monitor it ourselves. That's the main difference. We have both options available.
It's very easy to deploy an endpoint because there's already pre-built documentation. With just one click, you can directly load the knowledge handler. The challenging part is determining if the model suits our customized use case, which takes time. Once we're sure the model is right for our use case, it's straightforward.
What's my experience with pricing, setup cost, and licensing?
The tool is open source. The cost depends on what task you're doing; a large language model with around 12 billion parameters will cost more to run. On average, Hugging Face is open source, so you can download models to your local machine for free. For deployment, you can use any cloud service.
What other advice do I have?
You can start with it on a personal device. If you're planning to deploy, you might want to consider integrating Hugging Face with a cloud platform. This can help reduce charges, and the deployment will happen on the cloud platform.
If you're joining our team and using this tool for the first time, you'll need some experience deploying models. Hugging Face is one platform where you can deploy open-source models. You should have six or seven months of experience handling large language models. After that, you can learn the basic documentation in two or three days.
I rate it an eight out of ten.
An open-source application for prototyping with built-in libraries
What is our primary use case?
Hugging Face is an open-source desktop solution.
What is most valuable?
The solution is open-source. There are so many models available for usage, especially for prototyping. You can play around with text-to-text, text-to-image, and text-to-video. They have also provided the Inference API as part of the WebUI for a smaller model. You can play around with their website.
What needs improvement?
You could use Hugging Face with ecosystem libraries like Lambda. Hugging Face has moved beyond the Inference API for its free developer offering, but some ecosystem libraries are lagging and still use the Inference API. Perhaps Hugging Face could collaborate with these ecosystem library providers to ensure they update their offerings and give users access to the latest technology.
For how long have I used the solution?
I have been using Hugging Face for four to five months. We are using the latest version of the solution.
What do I think about the stability of the solution?
The Inference API and related services are rate-limited; it often returns internal server errors or no results at all, though there were no outright crashes. Secondly, while Hugging Face has made things easier for apps in production by offering libraries, much of the remaining work is left to the developers.
I rate the solution’s stability a seven out of ten.
What do I think about the scalability of the solution?
Hugging Face has not been built out for taking the app to production. They are offering prototype-level capabilities. We'll have to start consuming some managed offerings or build everything ourselves.
I rate the solution's scalability a six out of ten.
Which solution did I use previously and why did I switch?
I started using Hugging Face because I'm still prototyping. Other vendors offer fully managed services with significant costs to get code built out, whereas Hugging Face is free.
Alternatives like Vertex OpenAI and Azure OpenAI offer access to large language models, but most platforms are closed and restrict fine-tuning. This is where Hugging Face shines. Its open nature allows for fine-tuning of models, providing a significant advantage. Additionally, if data security is a concern, enterprises can deploy their own Hugging Face model as an endpoint or local instance, avoiding the need to send data to externally managed offerings. This flexibility and control over data makes Hugging Face a compelling choice for producing large language models.
How was the initial setup?
The initial setup is very easy and takes a few seconds to complete.
What's my experience with pricing, setup cost, and licensing?
There is no extra cost.
What other advice do I have?
Many advanced models are available on Hugging Face. The managed providers are working towards supporting these AI models and getting them to a stage where they are ready for use. They are offering ways for people to use them and coming up with options to productionize them, but some areas still need work.
Overall, I rate the solution a nine out of ten.
A comprehensive natural language processing ecosystem offering a diverse range of pre-trained models and a collaborative platform
What is most valuable?
My preferred aspects are natural language processing and question-answering. It aids us in efficiently discovering effective features and models. The ability to enlarge and tag faces has assisted me in finding effective and well-documented packages. I incorporate their favored methods and utilize various packages and formats in my work.
What needs improvement?
Implementing a cloud system to showcase historical data would be beneficial.
For how long have I used the solution?
I have been working with it for one year and a half.
What do I think about the stability of the solution?
They are ever-present, consistently providing us with packages, models, and languages that are perpetually helpful and stable. I would rate it eight out of ten.
What do I think about the scalability of the solution?
It is a scalable tool. However, it's important to reiterate that it's not the application itself but rather a means to scale up knowledge. I would rate it eight out of ten.
How was the initial setup?
No setup is required; it is a web service.
What's my experience with pricing, setup cost, and licensing?
There are different pricing models, with options for enterprise-level features. I recall seeing a fee of nine dollars, and there's also an enterprise option priced at twenty dollars per month.
What other advice do I have?
Overall, I would rate it nine out of ten.
Stable, easy to set up, and useful
What is our primary use case?
I mainly use it for machine learning and AI. It's for a large language model, like LLaMA.
How has it helped my organization?
Hugging Face has helped me in many ways. For example, I can check the leaderboard and see which model gives the best performance. I can also use the provided code to deploy and test a model. It has a lot of articles and papers where I can find what I need.
What is most valuable?
What I find the most valuable about Hugging Face is that I can check all the models on it and see which ones have the best performance without using another platform.
What needs improvement?
The area that needs improvement is the organization of the materials; it could be clearer and more systematic. It would be good if the layout were clear and we could search the models easily.
For how long have I used the solution?
I've been using Hugging Face for a little over a year.
What do I think about the stability of the solution?
When it comes to stability, I would give it a nine out of ten.
What do I think about the scalability of the solution?
It's a scalable solution. I would rate the scalability an eight out of ten. Approximately ten to twenty people use Hugging Face at our company. I try to use the solution as much as possible.
Which solution did I use previously and why did I switch?
I have previously used GitHub for codes and models. I still use it from time to time when I want to double-check something, but I use Hugging Face regularly.
How was the initial setup?
The ease of the initial setup is a nine out of ten. It only takes about ten minutes if you follow the instructions you find on Google.
What other advice do I have?
Hugging Face is the main hub for large language models and AIs. I would recommend it to anyone who's considering using it. Overall, I rate it a nine out of ten.