
Run Llama 3.1 Locally with Ollama and Open WebUI

Venky
3 min read · Jul 30, 2024


Create a free ChatGPT alternative for yourself.

In this article, I’ll show you how to build a free, ChatGPT-style setup on your own computer using Ollama and Open WebUI. It supports both local models and the OpenAI API, and everything is open source, so you can run any large open-source model privately. You don’t need a high-end computer, but more RAM and a faster CPU will improve performance.

Setup instructions

Install Ollama:

brew install ollama

or download the installer from https://ollama.com/download

Run Llama 3.1:

ollama run llama3.1

Let’s ask some questions on the command line.

It was surprisingly fast on my M1 MacBook Pro (32 GB RAM).
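Besides the interactive prompt, Ollama also exposes a local REST API, which is handy for scripting. Here is a minimal sketch, assuming the server is running on its default port 11434 and `llama3.1` has already been pulled (the endpoint and field names below follow Ollama’s `/api/generate` API):

```python
# Sketch: query a locally running Ollama server over its REST API.
# Assumes Ollama is listening on the default port 11434 and that
# the llama3.1 model has already been pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build the request body for Ollama's /api/generate endpoint."""
    # stream=False asks for one complete JSON response instead of chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def ask(prompt: str, model: str = "llama3.1") -> str:
    """Send a prompt to the local server and return the response text."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With the server running, `ask("Why is the sky blue?")` returns the model’s full answer as a string.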

The command line isn’t great for managing multiple chats, though, so we need something better: a web UI that lets us start, save, and revisit conversations.
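That’s where Open WebUI comes in. A common way to run it is with Docker; the sketch below follows Open WebUI’s documented Docker setup, assuming Docker is installed and that `host.docker.internal` resolves to the host (it does on macOS and Windows; on Linux you may need `--add-host=host.docker.internal:host-gateway`):

```shell
# Sketch: run Open WebUI in Docker, pointed at the local Ollama server.
docker run -d \
  -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

Then open http://localhost:3000 in a browser, create a local account, and start chatting with the models Ollama has pulled.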



Written by Venky

A sentient machine interested in abstract ideas, computing and intelligent systems.
