2025-11-11 17:40:29 +01:00
parent 901e8166cd
commit 75a6973a49
11 changed files with 1028 additions and 21 deletions

SECURITY_UPDATE.md Normal file

@@ -0,0 +1,125 @@
# Security Update: Ollama Internal-Only Configuration
## Summary
The Ollama service has been configured as **internal-only**: it is no longer exposed on the host machine. This improves security by reducing the attack surface.
## Changes Made
### Before (Exposed)
```yaml
ollama:
ports:
- "11434:11434" # ❌ Accessible from host and external network
```
### After (Internal Only)
```yaml
ollama:
# No ports section - internal only ✓
# Only accessible within Docker network
```
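For context, here is a minimal sketch of what the surrounding service definitions might look like after the change. The network name `backend` and the `OLLAMA_HOST` variable are illustrative assumptions, not taken from this repository's compose file:
```yaml
services:
  ollama:
    image: ollama/ollama
    # No ports: section, so nothing is published on the host
    networks:
      - backend            # assumed network name

  crawler:
    build: .
    environment:
      # Assumed variable: the crawler reaches Ollama via Docker's internal DNS
      - OLLAMA_HOST=http://ollama:11434
    networks:
      - backend

networks:
  backend: {}
```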
## Verification
### ✓ Port Not Accessible from Host
```bash
$ nc -z -w 2 localhost 11434
# Connection refused (as expected)
```
### ✓ Accessible from Docker Services
```bash
$ docker-compose exec crawler python -c "import requests; requests.get('http://ollama:11434/api/tags')"
# ✓ No exception raised: Ollama is reachable via its Docker service name
```
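The same in-network access applies to generation requests, not just the model listing. A hedged example using Ollama's `/api/generate` endpoint (the model name `llama3` and the prompt are placeholders, not taken from this project):
```bash
# Run a one-off generation request from inside the crawler container
docker-compose exec crawler curl -s http://ollama:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Summarize: Docker networks isolate services.", "stream": false}'
```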
## Security Benefits
1. **No External Access**: Ollama API cannot be accessed from outside Docker network
2. **Reduced Attack Surface**: Service is not exposed to potential external threats
3. **Network Isolation**: Only authorized Docker Compose services can communicate with Ollama
4. **No Port Conflicts**: Port 11434 is not bound to host machine
## Impact on Usage
### No Change for Normal Operations ✓
- Crawler service works normally
- Translation and summarization work as before
- All Docker Compose services can access Ollama
### Testing from Host Machine
Since Ollama is internal-only, you must test from inside the Docker network:
```bash
# ✓ Test from inside a container
docker-compose exec crawler python crawler_service.py 1
# ✓ Check Ollama status
docker-compose exec crawler python -c "import requests; print(requests.get('http://ollama:11434/api/tags').json())"
# ✓ Check logs
docker-compose logs ollama
```
### If You Need External Access (Development Only)
For development or debugging, you can still reach Ollama in one of the following ways:
**Option 1: SSH Port Forward**
```bash
# Forward your local port 11434 to the server's localhost:11434
# (only useful if Ollama is also bound to 127.0.0.1 on the server, e.g. via Option 3)
ssh -L 11434:localhost:11434 user@server
```
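With the tunnel open (and the loopback binding from Option 3 in place on the server), the API can then be queried from your local machine:
```bash
# Runs on your local machine while the SSH tunnel is active
curl -s http://localhost:11434/api/tags
```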
**Option 2: Temporary Docker Exec**
```bash
# Run commands from inside network
docker-compose exec crawler curl http://ollama:11434/api/tags
```
**Option 3: Modify docker-compose.yml (Not Recommended)**
```yaml
ollama:
ports:
- "127.0.0.1:11434:11434" # Only localhost, not all interfaces
```
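If you do apply this, recreate the service and confirm the mapping is limited to the loopback interface. The commands below assume the compose snippet shown above:
```bash
docker-compose up -d ollama

# Should print 127.0.0.1:11434, confirming the loopback-only binding
docker-compose port ollama 11434

# Should respond from the host, but only via localhost
curl -s http://127.0.0.1:11434/api/tags
```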
## Documentation Updated
- ✓ docker-compose.yml - Removed port exposure
- ✓ docs/OLLAMA_SETUP.md - Updated testing instructions
- ✓ docs/SECURITY_NOTES.md - Added security documentation
- ✓ test-ollama-setup.sh - Updated to test from inside network
- ✓ QUICK_START_GPU.md - Updated API testing examples
## Testing
All functionality has been verified (a sketch of the corresponding checks follows the list):
- ✓ Ollama not accessible from host
- ✓ Ollama accessible from crawler service
- ✓ Translation works correctly
- ✓ Summarization works correctly
- ✓ All tests pass
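A rough sketch of how these checks could be scripted, in the spirit of `test-ollama-setup.sh` (this is a hedged illustration, not the script's actual contents):
```bash
#!/usr/bin/env bash
set -e

echo "1) Ollama must NOT be reachable from the host"
if nc -z -w 2 localhost 11434; then
  echo "FAIL: port 11434 is exposed on the host"; exit 1
fi

echo "2) Ollama must be reachable from the crawler service"
docker-compose exec -T crawler curl -sf http://ollama:11434/api/tags > /dev/null

echo "3) End-to-end smoke test (translation/summarization)"
docker-compose exec -T crawler python crawler_service.py 1

echo "All checks passed"
```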
## Rollback (If Needed)
If you need to expose Ollama again:
```yaml
# In docker-compose.yml
ollama:
ports:
- "11434:11434" # or "127.0.0.1:11434:11434" for localhost only
```
Then restart:
```bash
docker-compose up -d ollama
```
## Recommendation
**Keep Ollama internal-only** for production deployments. This is the most secure configuration and sufficient for normal operations.
Only expose Ollama if you have a specific need for external access, and always bind to `127.0.0.1` (localhost only), never `0.0.0.0` (all interfaces).