You can now talk to 1000s of AI models on Mac or Windows, thanks to this huge GPT4All 3.0 update
All are available offline — if your laptop can handle it
Locally running artificial intelligence has been big news recently, with the launch of Copilot+ PCs running Microsoft Copilot on-device and Apple's new Apple Intelligence bringing small language models to Mac and iOS.
There is a third, cross-platform solution from Nomic AI. It has just released GPT4All 3.0, a significant update to its AI platform that lets you chat with thousands of LLMs locally on your Mac, Linux, or Windows laptop.
With this new update, GPT4All is now a completely private experience that lets you chat with locally hosted versions of LLaMa, Mistral, Nous-Hermes, and more.
Other changes include a refreshed UI, the ability to interact with your own files and documents, and more control over how the chatbot responds.
What is GPT4All?
GPT4All, developed by Nomic AI, has quickly become one of the fastest-growing repositories on GitHub, boasting over 250,000 monthly active users and 65,000 stars on the platform.
This open-source project, launched in 2023, allows users to run large language models (LLMs) locally on their devices, ensuring that sensitive data never leaves their machine. The focus on privacy and security has made GPT4All a popular choice for both individual users and enterprises.
With its large community of users and developers, GPT4All is constantly being improved, with more than 100 unique contributors and over 30 releases since launch.
What's new in GPT4All v3.0?
GPT4All 3.0, launched in July 2024, brings several key improvements to the platform. It's now a completely private experience that runs entirely on your laptop, with its own dedicated UI.
It also comes with access to thousands of language models to choose from, and a new integration that lets you chat directly with your local files and documents.
Key updates include:
- Expanded Model Support: Users can now interact with thousands of AI models, including popular ones like LLaMa, Mistral, and Nous-Hermes. This vast selection allows for more tailored and versatile AI interactions.
- Enhanced Compatibility: GPT4All 3.0 fully supports Mac M Series chips, as well as AMD and NVIDIA GPUs, ensuring smooth performance across a wide range of hardware configurations.
- LocalDocs Integration: This feature lets users give their local LLM access to private and sensitive documents without requiring an internet connection, so data remains secure and private, a crucial consideration for many users.
- Customizable Chatbot Experience: Users can now fully customize their chatbot interactions with options for system prompts, temperature settings, context length, and batch size, so the AI can be tailored to specific needs and preferences (see the example below for what these settings control).
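To give a sense of what those settings control, here is a minimal sketch using Nomic's gpt4all Python bindings, which expose similar knobs to the desktop app. The model filename, prompt, and parameter values are illustrative assumptions rather than anything specified in the announcement, and exact parameter names may vary slightly between versions of the bindings.

```python
# A minimal sketch of the customization knobs via Nomic's gpt4all Python
# bindings (pip install gpt4all). The model filename and settings below are
# illustrative assumptions -- swap in any model from the GPT4All catalog.
from gpt4all import GPT4All

# Load (and download on first run) a quantized local model.
# n_ctx sets the context length; device="cpu" is the safe default,
# and a GPU can be selected on supported hardware.
model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf", device="cpu", n_ctx=2048)

# chat_session keeps conversation history and applies the system prompt.
with model.chat_session(system_prompt="You are a concise assistant."):
    reply = model.generate(
        "Why does running an LLM locally help with privacy?",
        max_tokens=200,  # cap on response length
        temp=0.7,        # temperature: lower is more focused, higher more creative
        n_batch=8,       # prompt batch size
    )
    print(reply)
```

In the desktop app itself, the same options appear in the model settings panel, so no code is needed to adjust them.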
“We are committed to building the premier desktop LLM experience for the everyday person,” said Nomic AI in a tweet announcing the 3.0 update. “At Nomic, we believe that anyone should be able to benefit from powerful generative AI,” they added.
More from Tom's Guide
- Apple WWDC 2024 announcements: Apple Intelligence, iOS 18, iPadOS, Siri 2.0 and more
- Apple's integration with ChatGPT is just the beginning — Google Gemini is coming and maybe an AI App Store
- Siri just got a huge boost with Apple Intelligence — here’s everything it can do now
Ritoban Mukherjee is a freelance journalist from West Bengal, India whose work on cloud storage, web hosting, and a range of other topics has been published on Tom's Guide, TechRadar, Creative Bloq, IT Pro, Gizmodo, Medium, and Mental Floss.