Run DeepSeek-R1 AI on Raspberry Pi

Artificial intelligence (AI) is no longer confined to supercomputers and high-end servers. With advancements like the DeepSeek-R1 AI model and tools such as Ollama, you can now run sophisticated language models right from your Raspberry Pi. This guide will walk you through the process of setting up the DeepSeek-R1 model on your Raspberry Pi 💻

Prerequisites
  • Raspberry Pi: Preferably a 4B or higher model with at least 4GB of RAM for optimal performance.
  • MicroSD Card: At least 16GB in size, loaded with a fresh copy of either Ubuntu Server (headless) or Raspberry Pi OS.
  • SSH Access: Make sure you can access your Raspberry Pi remotely via SSH. You can also use VNC Viewer if that's preferable.
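Before installing anything, a quick sanity check can confirm your Pi meets the RAM recommendation above. A minimal sketch (reads total memory from /proc/meminfo; the 4096MB threshold matches the 4GB recommendation):

```shell
# Read total memory in kB from /proc/meminfo and convert to MB
total_kb=$(awk '/MemTotal/ {print $2}' /proc/meminfo)
total_mb=$((total_kb / 1024))

if [ "$total_mb" -lt 4096 ]; then
  echo "Warning: less than 4GB RAM detected (${total_mb}MB)"
else
  echo "RAM OK: ${total_mb}MB"
fi
```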
Update Your System

Ensure the system is up to date by running the following commands:

sudo apt update
sudo apt upgrade -y

Make sure your Pi is running a 64-bit OS:

uname -m

If this prints aarch64, you are good to go; otherwise you need to install a 64-bit version of the OS.
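The check above can be wrapped in a small script that prints a clear verdict, a sketch:

```shell
# Print a verdict based on the reported machine architecture
arch=$(uname -m)
case "$arch" in
  aarch64|arm64) echo "64-bit OS detected ($arch)" ;;
  *) echo "$arch detected - reflash with a 64-bit image before continuing" ;;
esac
```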

Enter Ollama

Ollama is an open-source app that lets users run, create, and share large language models (LLMs) on their own computers.

Install Curl

First, install curl if it isn't already present:

sudo apt install curl -y
Download and Run the Ollama Installer

Execute the following command to download and install Ollama:

curl -fsSL https://ollama.com/install.sh | sh

Verify that Ollama installed correctly by running:

ollama --version
Run DeepSeek-R1 Model

To download and start the DeepSeek-R1 model, run the following command:

ollama run deepseek-r1:8b

This command downloads a distilled variant of the model: a smaller, more efficient model trained to mimic the behaviour of the larger DeepSeek-R1 model.

Check this for more distilled models
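Smaller distilled variants are also published under the deepseek-r1 tag in the Ollama library; on a 4GB Pi the 1.5B variant is the more realistic choice. For example:

```shell
# Pull and run the smallest distilled variant (lightest on RAM)
ollama pull deepseek-r1:1.5b
ollama run deepseek-r1:1.5b
```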

Interact with the model

Once it's running, you can interact with the model by typing questions at the prompt. Type '/?' to see the available options and commands.
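Beyond the interactive prompt, Ollama also serves a local REST API on port 11434, which makes scripted interaction possible. A sketch (assumes the model has already been pulled; the prompt is just an example):

```shell
# Send a one-off generation request to the local Ollama API
curl -s http://localhost:11434/api/generate -d '{
  "model": "deepseek-r1:8b",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```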

Some notes -

The Raspberry Pi doesn't have an AMD or NVIDIA GPU, so all processing runs on the CPU.

There are options to hook up your Pi to an external graphics card, but that's a topic for another day.

And yes, it does hog some memory.
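To see exactly how much, you can watch memory usage from a second SSH session while the model is loaded:

```shell
# Human-readable summary of RAM and swap usage
free -h

# One-shot snapshot of the top processes by resource usage
top -b -n 1 | head -n 15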

Hope you learnt something new today and were able to run the hot new AI model on lean hardware!

Cheers 🍻

Tirthankar Kundu