Ollama: Easily run LLMs locally on macOS and Linux

ollama.ai

Get up and running with large language models, locally.

If you just want to run LLMs quickly on your computer from the command line, this is about as simple as it gets. Ollama provides an easy CLI for generating text, and there's also a Raycast extension for more powerful usage.
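A minimal sketch of that CLI workflow, assuming Ollama is installed and the `llama2` model name is available (model names change over time, so substitute whatever `ollama list` shows on your machine):

```shell
# Download a model to your machine (one-time, several GB).
ollama pull llama2

# Generate text with a one-off prompt; omit the prompt
# to drop into an interactive chat session instead.
ollama run llama2 "Explain what a local LLM runtime does in one sentence."

# See which models you have pulled locally.
ollama list
```

Everything runs locally, so after the initial `pull` no network access is needed to generate text.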

TechNews @radiation.party (bot): [HN] Ollama – Easily Run LLMs on your laptop
