This project includes an integrated Ollama service for AI-powered summarization.

## Docker Compose Setup (Recommended)
The docker-compose.yml includes an Ollama service that automatically:

- Runs Ollama server (internal only, not exposed to host)
- Pulls the phi3:latest model on first startup
- Persists model data in a Docker volume
- Supports GPU acceleration (NVIDIA GPUs)
- Only accessible by other Docker Compose services for security

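The bullets above correspond to a compose file along these lines. This is only a sketch: the service names (`ollama`, `ollama-setup`), image tag, volume name, and GPU stanza are assumptions, not the project's actual docker-compose.yml:

```yaml
services:
  ollama:
    image: ollama/ollama:latest
    # No "ports:" mapping - the API is reachable only on the internal
    # compose network, as http://ollama:11434
    volumes:
      - ollama-data:/root/.ollama   # persists pulled models across restarts
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia        # NVIDIA GPU acceleration
              count: all
              capabilities: [gpu]

  ollama-setup:
    # One-shot helper that pulls the model on first startup
    image: ollama/ollama:latest
    entrypoint: ["ollama"]
    command: ["pull", "phi3:latest"]
    environment:
      - OLLAMA_HOST=ollama:11434
    depends_on:
      - ollama

volumes:
  ollama-data:
```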
### GPU Support
```bash
# Check Ollama service logs
docker-compose logs -f ollama

# Check model setup logs
docker-compose logs ollama-setup

# Verify Ollama is running (from inside a container)
docker-compose exec crawler curl http://ollama:11434/api/tags
```
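Since `/api/tags` returns a JSON object with a `models` list, the same check can be done programmatically. A minimal Python sketch — the `has_model` helper and the sample response below are illustrative, not part of the project:

```python
import json

def has_model(tags_json: str, name: str) -> bool:
    """Return True if `name` appears in an Ollama /api/tags response."""
    models = json.loads(tags_json).get("models", [])
    return any(m.get("name") == name for m in models)

# Illustrative /api/tags response (shape follows Ollama's API; values are made up)
sample = '{"models": [{"name": "phi3:latest", "size": 2300000000}]}'
print(has_model(sample, "phi3:latest"))  # True
```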
### First Time Setup

If you prefer to run Ollama directly on your host machine:

### Basic API Test
```bash
# Test Ollama API from inside a container
docker-compose exec crawler curl -s http://ollama:11434/api/generate -d '{
  "model": "phi3:latest",
  "prompt": "Translate to English: Guten Morgen",
  "stream": false
}'
```
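The same request can be issued from application code. A minimal Python sketch using only the standard library — the `build_payload` and `generate` helpers are hypothetical, and the `http://ollama:11434` base URL assumes the caller runs on the compose network:

```python
import json
from urllib.request import Request, urlopen

def build_payload(prompt: str, model: str = "phi3:latest") -> bytes:
    """Encode a non-streaming /api/generate request body."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def generate(prompt: str, base_url: str = "http://ollama:11434") -> str:
    """POST to /api/generate and return the generated text ("response" field)."""
    req = Request(f"{base_url}/api/generate", data=build_payload(prompt),
                  headers={"Content-Type": "application/json"})
    with urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With `"stream": false`, Ollama returns a single JSON object whose `response` field holds the full completion, instead of a stream of partial chunks.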