# Security Update: Ollama Internal-Only Configuration
## Summary

The Ollama service is now internal-only and is no longer exposed to the host machine. This improves security by reducing the attack surface: the API is reachable only from other services on the Docker Compose network.
## Changes Made
### Before (Exposed)

```yaml
ollama:
  ports:
    - "11434:11434"  # ❌ Accessible from host and external network
```
### After (Internal Only)

```yaml
ollama:
  # No ports section - internal only ✓
  # Only accessible within the Docker network
```
## Verification
### ✓ Port Not Accessible from Host

```bash
$ nc -z -w 2 localhost 11434
# Connection refused (as expected)
```
### ✓ Accessible from Docker Services

```bash
$ docker-compose exec crawler python -c "import requests; requests.get('http://ollama:11434/api/tags')"
# Request succeeds
```
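Where a scripted check is preferred over `nc`, a minimal host-side sketch using only the Python standard library is shown below; the file name `host_port_check.py` is illustrative, not part of the project.

```python
# host_port_check.py - sketch: confirm Ollama's port is not bound on the host.
# Run on the host machine; uses only the standard library.
import socket


def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    exposed = port_open("localhost", 11434)
    print("Ollama port reachable from host:", exposed)
    # Expected result with the internal-only configuration: False
```

With the internal-only configuration this should print `False`; if it prints `True`, a `ports:` mapping is still present in `docker-compose.yml`.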
## Security Benefits

- **No External Access**: The Ollama API cannot be reached from outside the Docker network
- **Reduced Attack Surface**: The service is not exposed to potential external threats
- **Network Isolation**: Only services attached to the Docker Compose network can communicate with Ollama (illustrated in the sketch below)
- **No Port Conflicts**: Port 11434 is not bound to the host machine
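The network-isolation point can be illustrated with a short sketch: Docker's embedded DNS resolves the `ollama` service name only for containers attached to the same Compose network. The file name `dns_check.py` is hypothetical.

```python
# dns_check.py - sketch: the "ollama" hostname resolves only inside the
# Compose network (via Docker's embedded DNS). Run it in both places:
#   docker-compose exec crawler python dns_check.py   -> resolves
#   python dns_check.py (on the host)                 -> fails
import socket

try:
    ip = socket.gethostbyname("ollama")
    print(f"'ollama' resolves to {ip} - running inside the Compose network")
except socket.gaierror:
    print("'ollama' does not resolve - running outside the Compose network")
```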
## Impact on Usage

### No Change for Normal Operations ✓

- The crawler service works normally
- Translation and summarization work as before (see the sketch after this list)
- All Docker Compose services can access Ollama
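As a concrete illustration of why in-network services are unaffected, the sketch below shows a summarization call made from inside the Compose network, reaching Ollama by its service name. The `llama3` model name and the prompt are assumptions for the example, not taken from the project code.

```python
# Sketch: calling Ollama from a service inside the Compose network.
# The real crawler code may differ; model and prompt are placeholders.
import requests

OLLAMA_URL = "http://ollama:11434"  # service name resolves via Docker DNS


def summarize(text: str, model: str = "llama3") -> str:
    """Ask Ollama for a short summary via the /api/generate endpoint."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json={
            "model": model,
            "prompt": f"Summarize the following text in two sentences:\n\n{text}",
            "stream": False,  # return one JSON object instead of a stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]
```

Nothing in this call depends on the port being published to the host: `http://ollama:11434` resolves through Docker's internal DNS either way, which is why removing the `ports:` mapping is transparent to in-network services.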
## Testing from Host Machine

Since Ollama is internal-only, tests must be run from inside the Docker network:

```bash
# ✓ Test from inside a container
docker-compose exec crawler python crawler_service.py 1

# ✓ Check Ollama status
docker-compose exec crawler python -c "import requests; print(requests.get('http://ollama:11434/api/tags').json())"

# ✓ Check logs
docker-compose logs ollama
```
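If the one-liners become unwieldy, a small health-check script can be run the same way. This is a sketch that assumes the file is copied or mounted into the crawler container; the name `check_ollama.py` is illustrative.

```python
# check_ollama.py - sketch of an in-network health check. Run it with:
#   docker-compose exec crawler python check_ollama.py
import sys

import requests

try:
    resp = requests.get("http://ollama:11434/api/tags", timeout=5)
    resp.raise_for_status()
except requests.RequestException as exc:
    print(f"Ollama is not reachable from this container: {exc}")
    sys.exit(1)

models = [m["name"] for m in resp.json().get("models", [])]
print("Ollama is reachable; installed models:", models or "none")
```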
## If You Need External Access (Development Only)

For development or debugging, you can temporarily expose Ollama:
### Option 1: SSH Port Forward

```bash
# Forward the port through SSH (when accessing a remote server)
ssh -L 11434:localhost:11434 user@server
```

Note that this only works if Ollama is bound to localhost on the server (see Option 3); with no ports published at all, nothing listens on the server's port 11434 to forward.
### Option 2: Temporary Docker Exec

```bash
# Run commands from inside the network
docker-compose exec crawler curl http://ollama:11434/api/tags
```
### Option 3: Modify docker-compose.yml (Not Recommended)

```yaml
ollama:
  ports:
    - "127.0.0.1:11434:11434"  # Only localhost, not all interfaces
```
## Documentation Updated

- ✓ `docker-compose.yml` - Removed port exposure
- ✓ `docs/OLLAMA_SETUP.md` - Updated testing instructions
- ✓ `docs/SECURITY_NOTES.md` - Added security documentation
- ✓ `test-ollama-setup.sh` - Updated to test from inside the network
- ✓ `QUICK_START_GPU.md` - Updated API testing examples
## Testing

All functionality has been verified:

- ✓ Ollama is not accessible from the host
- ✓ Ollama is accessible from the crawler service
- ✓ Translation works correctly
- ✓ Summarization works correctly
- ✓ All tests pass
## Rollback (If Needed)

If you need to expose Ollama again:

```yaml
# In docker-compose.yml
ollama:
  ports:
    - "11434:11434"  # or "127.0.0.1:11434:11434" for localhost only
```

Then restart:

```bash
docker-compose up -d ollama
```
## Recommendation

Keep Ollama internal-only for production deployments. This is the most secure configuration and is sufficient for normal operations.

Only expose Ollama if you have a specific need for external access, and always bind to `127.0.0.1` (localhost only), never `0.0.0.0` (all interfaces).