How I deployed Jenbina into the cloud
- Matvei Kudelin
- Mar 16
- 2 min read
For Jenbina I am using Ollama, Llama 3.2, LangChain, and Streamlit, so I decided to use Docker to deploy the model server on the Oracle Cloud Free Tier and to publish the app on Streamlit Community Cloud.
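For reference, a minimal Python dependency set for this stack could look like the following (an illustrative requirements.txt; the exact packages and pinned versions in the Jenbina repo may differ):
# requirements.txt (illustrative)
streamlit
langchain
langchain-ollama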
Deploying Ollama LLM on Oracle Cloud Free Tier: A Complete Guide
1. Getting Oracle Cloud Free Tier
Sign up at Oracle Cloud Free Tier
During the trial period you get, free of charge:
VM.Standard.E5.Flex
4 OCPUs
24 GB RAM
200 GB storage
2. Setting Up Ubuntu Instance
In Oracle Cloud Console:
Navigate to Compute → Instances
Click “Create Instance”
Choose “Ubuntu 24.04” as the operating system
Select “VM.Standard.E5.Flex”
Configure with 4 OCPUs and 24 GB memory
Create and download SSH key pair
Connect to your instance:
ssh -i path/to/private_key ubuntu@your_instance_ip
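If SSH rejects the key with a permissions warning, tighten the key file permissions first and retry (the path and IP are the same placeholders as above):
# Restrict the private key to the current user, then connect
chmod 400 path/to/private_key
ssh -i path/to/private_key ubuntu@your_instance_ip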
3. Installing Docker on Ubuntu
# Update system
sudo apt update
sudo apt upgrade -y
# Install prerequisites
sudo apt install -y apt-transport-https ca-certificates curl software-properties-common
# Add Docker's GPG key
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg
# Add Docker repository
echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
# Install Docker
sudo apt update
sudo apt install -y docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin
# Add user to docker group
sudo usermod -aG docker $USER
newgrp docker
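Before moving on, it is worth confirming that Docker is installed and that the group change took effect (these are standard Docker checks, not specific to this setup):
# Verify Docker works without sudo
docker --version
docker run --rm hello-world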
4. Running Ollama
# Pull Ollama image
docker pull ollama/ollama
# Run Ollama container (named, with a volume so downloaded models persist)
docker run -d --name ollama -v ollama:/root/.ollama -p 11434:11434 ollama/ollama
# Pull specific model
docker exec -it ollama ollama pull llama3.2:3b-instruct-fp16
# Verify installation
docker ps
docker logs ollama
5. Testing Local URL
# Install network tools
sudo apt install net-tools
# Check if port is listening
netstat -tulpn | grep 11434
# Test local API
curl http://localhost:11434/api/tags
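Once the model pulled in step 4 has finished downloading, you can also send a small generation request to confirm it responds (the prompt here is just an example):
# Test a simple generation request against the local API
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2:3b-instruct-fp16",
  "prompt": "Say hello in one sentence.",
  "stream": false
}'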
6. Configuring Firewall (UFW)
# Enable UFW
sudo ufw enable
# Add necessary rules
sudo ufw allow 22/tcp # SSH
sudo ufw allow 11434/tcp # Ollama
# Verify rules
sudo ufw status numbered
7. Oracle Cloud Network Security
Navigate to Networking → Virtual Cloud Networks
Click on your VCN
Access Security Lists
Add Ingress Rule:
Source CIDR: 0.0.0.0/0
IP Protocol: TCP
Destination Port Range: 11434
Description: Ollama API
This is necessary because Oracle Cloud has two layers of firewall:
Instance level (UFW)
Network level (Security Lists)
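Once both layers allow the port, the API should be reachable from outside the instance; a quick check from your local machine (not from the VM):
# Run this on your local machine; your_instance_ip is the same placeholder as before
curl http://your_instance_ip:11434/api/tags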
8. Updating Your Application
# Update your application code to point at the remote Ollama instance
import os
from langchain_ollama import ChatOllama  # or langchain_community.chat_models on older LangChain versions

os.environ['OLLAMA_HOST'] = 'http://your_instance_ip:11434'

local_llm = 'llama3.2:3b-instruct-fp16'
llm = ChatOllama(
    model=local_llm,
    temperature=0,
    base_url='http://your_instance_ip:11434',
)
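As a quick sanity check (a minimal sketch, reusing the llm object above), you can invoke the remote model directly before wiring it into the rest of the LangChain pipeline:
# One-off call to the remote model; prints the reply text
response = llm.invoke("Reply with the single word: pong")
print(response.content)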
Security Considerations
Limit access to specific IPs in the Security Lists and in UFW (see the example after this list)
Use strong SSH keys
Keep system and Docker updated
Consider implementing API authentication
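For example, instead of leaving port 11434 open to 0.0.0.0/0, the UFW rule from step 6 can be scoped to a single trusted address, and the Security List ingress rule tightened to the same CIDR (the IP below is a documentation placeholder):
# Replace the open rule with one limited to a trusted IP
sudo ufw delete allow 11434/tcp
sudo ufw allow from 203.0.113.10 to any port 11434 proto tcp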
Troubleshooting
Check Docker container status: docker ps
View logs: docker logs ollama
Test connectivity: curl -v http://localhost:11434/api/tags
Verify port binding: netstat -tulpn | grep 11434
Check firewall status: sudo ufw status
9. Deploying the Streamlit App
I ran Streamlit locally with the following command in the terminal:
streamlit run core/app.py
After that, the demo page opens in the browser; from there I click "Deploy" and publish the app to Streamlit Community Cloud.
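When the app runs on Streamlit Community Cloud it still has to reach the Ollama instance on Oracle Cloud, so the base URL should come from configuration rather than localhost. A minimal sketch of how that could look (the OLLAMA_BASE_URL secret name and this exact wiring are my assumptions, not necessarily how Jenbina does it):
# Sketch of a Streamlit entry point that reads the remote Ollama URL from secrets
import streamlit as st
from langchain_ollama import ChatOllama

# OLLAMA_BASE_URL is an assumed secret set in the Streamlit Cloud dashboard,
# e.g. OLLAMA_BASE_URL = "http://your_instance_ip:11434"
llm = ChatOllama(
    model='llama3.2:3b-instruct-fp16',
    temperature=0,
    base_url=st.secrets['OLLAMA_BASE_URL'],
)

prompt = st.text_input('Ask the model something:')
if prompt:
    st.write(llm.invoke(prompt).content)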
The code of the application is located here: