Want to run AI models on your phone? You actually can! Powerful Android phones can now run Large Language Models (LLMs) like Llama 3 and DeepSeek-R1 locally, with no root required. That means faster responses, offline use, and your data stays private.
This guide shows you how to run LLMs locally on your Android using Ollama. It works on most Android phones with good processors. Let’s get started!
Get Termux: Your Android Terminal
First, you need Termux, a terminal emulator that gives your Android phone a Linux-style command line.
- Open your phone’s internet browser (like Chrome or Firefox).
- Go to the Termux GitHub releases page.
- Find and download the arm64-v8a APK. This works for most newer Android phones.
- After downloading, install Termux on your phone.
Update Termux
Now, let’s set up Termux:
- Open the Termux app.
- Give it permission to use your phone’s storage by typing this command and pressing Enter:
termux-setup-storage
- Choose a package mirror by typing this command:
termux-change-repo
- Then update Termux’s packages. Type this:
pkg upgrade
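If you’d rather run the updates non-interactively, here is a minimal sketch of the same step (the -y flag auto-confirms most prompts; termux-change-repo stays interactive, so it isn’t included here):
# Refresh package lists and upgrade everything, answering yes automatically
pkg update -y && pkg upgrade -y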
Add the TUR Repo: Get Ollama Easily
Next, we add the TUR (Termux User Repository), which packages Ollama for Termux.
- Install the TUR repo:
pkg install tur-repo
and press Enter.
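As an optional sanity check, you can confirm that the Ollama package is now visible to Termux (pkg search wraps apt search, so the output format may vary slightly between versions):
# Should list an "ollama" package once the TUR repo is active
pkg search ollama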
Install Ollama & Zellij
Now, install the main tools: Ollama and Zellij.
- Install Ollama:
pkg install ollama
and press Enter. Ollama is now installed!
- Install Zellij:
pkg install zellij
and press Enter. Zellij is a terminal multiplexer; it lets us keep several terminal tabs open inside Termux, which we’ll need in order to run the Ollama server and a chat session at the same time.
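To double-check both installs, you can print their versions; both tools accept a --version flag:
# Each command should print a version string if the install succeeded
ollama --version
zellij --version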
Disable the Phantom Process Killer
Android can stop apps running in the background to save battery. We need to stop this for Termux so our AI models keep running smoothly.
- Enable Developer Options: Go to your phone’s Settings > About device. Tap “Build number” 7 times. You’ll see a message saying “Developer options enabled.”
- Disable Restrictions: Go back to Settings and open Developer options. Look for “Disable child process restrictions” (the exact wording can vary by device and Android version) and turn it ON.
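If your phone doesn’t show that toggle, a commonly used workaround on Android 12 and newer (assuming you have ADB set up on a computer with USB debugging enabled, which is outside the scope of this guide) is to raise the phantom-process limit directly:
# Run from a computer via ADB; lets Termux keep more background child processes alive
adb shell device_config put activity_manager max_phantom_processes 2147483647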
Use Zellij To Manage Terminals Easily
Zellij lets us keep the Ollama server running in one tab while we chat with a model in another.
- Start Zellij: Type
zellij
in Termux and press Enter. When prompted, choose the default setup.
- Open New Tab: Press CTRL + T, then N, then Enter to open a new terminal tab.
- Switch Tabs: Press CTRL + T, then use the left/right arrow keys to pick a tab, then press Enter.
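Zellij can also run named sessions that you can detach from and come back to later; here is a minimal sketch (the session name “ollama” is just an example, and note that detaching is done inside Zellij with CTRL + O then D, whereas CTRL + Q ends the session):
zellij --session ollama    # start a named session
zellij list-sessions       # see which sessions exist
zellij attach ollama       # reattach to the named session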
Test DeepSeek-R1 & Llama 3.2
Finally, let’s run some AI models!
- Start Ollama Server: In Tab 1, type
ollama serve
and press Enter. This starts the Ollama server; keep this tab open so it stays running.
- Open New Tab: Switch to Tab 2 (CTRL + T, then the arrow keys).
- Install DeepSeek: In Tab 2, type
ollama run deepseek-r1:1.5b
and press Enter. This downloads and starts the DeepSeek-R1 model (the small 1.5B version, a good fit for phones).
- Talk to DeepSeek: Once it’s ready, you can type questions or prompts and press Enter. DeepSeek will respond!
- Stop Output: Press CTRL + C to stop DeepSeek from talking.
- Exit DeepSeek: Press CTRL + D to exit DeepSeek and go back to the bash prompt.
- Try Llama 3.2: Type
ollama run llama3.2
and press Enter to try the Llama 3.2 model.
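Because ollama serve also exposes a local REST API (on http://localhost:11434 by default), you can query a model from another tab without the interactive chat. Here’s a minimal sketch, assuming curl is installed (pkg install curl) and the llama3.2 model has already been pulled:
# Ask the local Ollama server a question and print the full, non-streamed reply
curl http://localhost:11434/api/generate -d '{"model": "llama3.2", "prompt": "Why is the sky blue?", "stream": false}'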
Quick Tips for Termux & Ollama
- Stop AI Output: CTRL + C
- Exit AI Model: CTRL + D
- Clear Screen: CTRL + L (works inside a model prompt or at the Termux shell)
- New Zellij Tab: CTRL + T then N then Enter
- Exit Zellij: CTRL + Q
Stop Ollama Server When Done
To stop the Ollama server:
- Exit Models & Zellij: Type
exit
in the model tabs, then press CTRL + C in Tab 1 to stop the server, then CTRL + Q to exit Zellij.
- Force Stop (If Needed): If you restart Termux and need to stop Ollama, type
ps aux | grep ollama
to find its ID, then type
kill [ID]
(replace [ID] with the actual number).
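If you’d rather not hunt for the ID by hand, a shorter alternative is to stop the server by name, assuming pgrep/pkill are available (on Termux they come from Android’s toybox or the procps package, pkg install procps):
# Stop any process whose command line matches "ollama serve"
pkill -f "ollama serve"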
Enjoy AI on Your Phone!
You’ve now got LLMs running on your Android! Try different prompts, explore other models on the Ollama website, and see what you can create. Local AI on your phone opens up a world of possibilities. Have fun experimenting!