Running a ChatGPT-Quality Model Directly on Your Desktop

You can now run a chatbot that is nearly the quality of ChatGPT directly on your desktop PC or Mac. Open-source models are becoming a viable option for those who prioritize privacy or need to work offline.


Vicuna is an open-source model trained by teams from UC Berkeley, CMU, Stanford, MBZUAI, and UC San Diego. It is based on Meta's LLaMA and fine-tuned on approximately 70,000 user-shared conversations from ShareGPT.com. The team specifically trained it on multi-round conversations, which sets it apart from many other open-source models.

Although it is limited at solving math problems and performing advanced reasoning, it is quite impressive compared with other open-source models.

You can try Vicuna with GPT4All, an easy-to-use, cross-platform chatbot app. GPT4All lets you run a variety of large language models locally; no data is sent to the cloud because everything runs on your own machine. It is available for Windows, macOS, and Ubuntu.

You can download the GPT4All client from the GPT4All website.

Once installed, use the app's built-in downloader to fetch the model. Scroll down to vicuna-7b-1.1.1-q4_2, or, if you have a fast, modern CPU and enough RAM, try the larger vicuna-13b-1.1.1-q4_2.
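
If you prefer scripting over the GUI, GPT4All also ships Python bindings (`pip install gpt4all`) that run the same models locally. The sketch below is a minimal, illustrative example; the model filename and generation parameters are assumptions, and the exact names offered by the downloader may differ in your version:

```python
# Minimal sketch using the GPT4All Python bindings (pip install gpt4all).
# The model filename is illustrative; use one listed by your version of GPT4All.
from gpt4all import GPT4All

# Downloads the model on first use, then runs inference entirely on your machine.
model = GPT4All("ggml-vicuna-7b-1.1-q4_2.bin")

with model.chat_session():
    reply = model.generate(
        "Explain in two sentences what a locally run LLM is.",
        max_tokens=128,
    )
    print(reply)
```

Because inference runs on your CPU, generation speed depends mainly on your processor and available RAM, which is why the 13B model is only worth trying on a faster machine.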

Performance

The creators of Vicuna used GPT-4 as a judge to compare answers from multiple chatbots on a shared set of questions. Vicuna holds up well against both commercial and open-source models.

Response Comparison Assessed by GPT-4
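
To make the "GPT-4 as a judge" idea concrete, here is a rough sketch of that kind of comparison using the OpenAI Python client. The prompt wording and scoring rubric are assumptions for illustration only, not the Vicuna team's actual evaluation code:

```python
# Hedged sketch of an LLM-as-judge comparison: ask GPT-4 to score two answers
# to the same question. Requires OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

def judge(question: str, answer_a: str, answer_b: str) -> str:
    # Simple illustrative rubric; real evaluations use more careful prompts.
    prompt = (
        "You are an impartial judge. Rate each answer to the question below "
        "on a scale of 1-10 and briefly justify the scores.\n\n"
        f"Question: {question}\n\nAnswer A: {answer_a}\n\nAnswer B: {answer_b}"
    )
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(judge(
    "What causes the seasons on Earth?",
    "The tilt of Earth's axis relative to its orbital plane.",
    "The changing distance between the Earth and the Sun.",
))
```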

Vicuna demonstrates just how close open-source models are getting to commercial ones from OpenAI or Google.