How to use an open-source LLM locally?

Ollama

Install Llama locally using Ollama

Meta's official Llama documentation describes how to install and use the models.

Install Ollama

Ollama is a tool designed to enable users to run open-source large language models (LLMs) locally on their machines.

https://ollama.com/


Download and install Ollama from the official website.
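Once Ollama is installed, you can pull a model from the command line (for example, ollama pull llama3) and then talk to it over Ollama's local REST API. The sketch below is a minimal Python example, assuming Ollama is running on its default port (11434) and that the llama3 model has already been pulled; the model tag and prompt are only illustrations.

import requests  # plain HTTP client; Ollama exposes a local REST API

# Assumes Ollama is running locally on its default port (11434)
# and a model has been pulled, e.g. with: ollama pull llama3
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3",  # example model tag; use whichever model you pulled
    "prompt": "Explain in one sentence what a large language model is.",
    "stream": False,    # ask for the full answer as a single JSON object
}

response = requests.post(OLLAMA_URL, json=payload, timeout=300)
response.raise_for_status()
print(response.json()["response"])

If the request succeeds, the model's answer is printed to the terminal; Ollama also provides an /api/chat endpoint for multi-turn conversations.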

Install Docker

Open WebUI is most easily run as a Docker container, so install Docker first:

https://www.docker.com/

Install Open WebUI

Open WebUI is a self-hosted web interface that connects to your local Ollama instance:

https://docs.openwebui.com/

Once Open WebUI is running, open it in your browser to start using Llama through the UI.

You will have a ChatGPT-like UI.


LM Studio

LM Studio is a desktop application for downloading and running open-source LLMs, with a built-in chat interface and an optional local server:

https://lmstudio.ai
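LM Studio can also serve a loaded model through a local, OpenAI-compatible HTTP server. The sketch below is a minimal Python example, assuming the server is running on its default port (1234) and that a model is already loaded in the app; the model name in the request is a placeholder.

import requests  # LM Studio's local server speaks the OpenAI chat-completions format

# Assumes LM Studio's local server is running on its default port (1234)
# and a model has been loaded in the application.
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

payload = {
    "model": "local-model",  # placeholder; LM Studio serves whichever model is loaded
    "messages": [
        {"role": "user", "content": "Summarize in one sentence what Llama is."}
    ],
    "temperature": 0.7,
}

response = requests.post(LMSTUDIO_URL, json=payload, timeout=300)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])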


Online Versions

The installation process described above can be cumbersome. If you prefer a more straightforward way to try Llama, you can use the online services listed below.

DuckDuckGo AI Chat

DuckDuckGo, the well-known search engine, has launched an AI Chat feature that prioritizes privacy. The service is currently free and includes the Llama 3 70B model, making it an accessible way to try Llama without installing anything.

Alternatively, you can use an AI aggregator such as Poe.
