Ollama — Local LLM Inference

Ibrahim Zaman
2 min read · Nov 2, 2023


Large language models are on the rise, and Ollama makes interacting with them easy. But is it only for technical people? Simply no: it is for everyone. Ollama is a way to deploy models locally and utilize the power of LLMs with ease. This is a simple, sweet introduction (:

Is it cross-platform?

Of course! It is available to download and install on all three major operating systems: macOS, Linux (you know it is technically a kernel but it is generally called an OS; that one is for technical people and server folks), and Windows for the general public. Download it here: Download!
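If you are on Linux, there is also a one-line install script. As a sketch (this is the command from the current download page; confirm the URL there before running it):

curl -fsSL https://ollama.com/install.sh | sh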

What to do after installing Ollama?

Find a model of your choice here: Find Models!
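Once you have picked a model, you can fetch it ahead of time and check what is stored locally. A minimal sketch using the mistral model from the library:

# Download the model weights without starting a chat
ollama pull mistral

# List the models installed on this machine
ollama list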

I’ve chosen the Mistral 7B base model; you can select any of the variants listed under the tags below…
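Tags let you pin a specific variant or quantization instead of the default. A sketch of what that looks like (the tag names here are illustrative; check the model's Tags page for what is actually published):

# Run the instruct-tuned variant of Mistral
ollama run mistral:instruct

# Or pin the plain 7B tag explicitly
ollama run mistral:7b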

Let’s run Mistral 7B, which is one of the most popular models in the Ollama index:

ollama run mistral
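The first run downloads the weights and then drops you into an interactive prompt where you can chat with the model (type /bye to exit). While Ollama is running, it also serves a REST API on localhost port 11434, so other programs can use the model too. A minimal sketch of a generate request, per the API docs ("stream": false returns one JSON object instead of a token stream):

curl http://localhost:11434/api/generate -d '{
  "model": "mistral",
  "prompt": "Explain local LLM inference in one sentence.",
  "stream": false
}'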

I will complete this story later; I had written the whole article, but Medium didn’t save it and I lost my progress.
