# Deployment Guide
## Quick Start

```bash
# 1. Clone repository
git clone <repository-url>
cd munich-news

# 2. Configure environment
cp backend/.env.example backend/.env
# Edit backend/.env with your settings

# 3. Start system
docker-compose up -d

# 4. View logs
docker-compose logs -f
```
## Environment Configuration

### Required Settings

Edit `backend/.env`:

```bash
# Email (Required)
SMTP_SERVER=smtp.gmail.com
SMTP_PORT=587
EMAIL_USER=your-email@gmail.com
EMAIL_PASSWORD=your-app-password

# MongoDB (Optional - defaults provided)
MONGODB_URI=mongodb://localhost:27017/

# Tracking (Optional)
TRACKING_ENABLED=true
TRACKING_API_URL=http://localhost:5001
```
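Before starting the stack, it can help to confirm the required values are actually set. A minimal Python check (a sketch, not part of the repo; the variable names match the settings above):

```python
import os

REQUIRED = ["SMTP_SERVER", "SMTP_PORT", "EMAIL_USER", "EMAIL_PASSWORD"]

def missing_settings(env=os.environ):
    """Return the names of required settings that are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]

# Usage:
# missing = missing_settings()
# if missing:
#     raise SystemExit("Missing required settings: " + ", ".join(missing))
```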
### Optional Settings

```bash
# Newsletter
NEWSLETTER_MAX_ARTICLES=10
NEWSLETTER_HOURS_LOOKBACK=24

# Ollama AI
OLLAMA_ENABLED=true
OLLAMA_BASE_URL=http://127.0.0.1:11434
OLLAMA_MODEL=phi3:latest

# Tracking
TRACKING_DATA_RETENTION_DAYS=90
```
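If Ollama is enabled, you can verify the configured endpoint is reachable via its `/api/tags` endpoint, which lists installed models. A stdlib-only sketch (the helper names are hypothetical, not from the repo):

```python
import json
import urllib.request

def tags_url(base_url):
    """Build the Ollama model-list endpoint from the configured base URL."""
    return base_url.rstrip("/") + "/api/tags"

def list_models(base_url="http://127.0.0.1:11434"):
    """Return the names of models the Ollama server has installed."""
    with urllib.request.urlopen(tags_url(base_url), timeout=5) as resp:
        data = json.load(resp)
    return [model["name"] for model in data.get("models", [])]

# Usage (requires a running Ollama server):
# print(list_models())
```

If `OLLAMA_MODEL` (here `phi3:latest`) is not in the returned list, pull it with `ollama pull phi3:latest` before enabling the integration.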
## Production Deployment

### 1. Set MongoDB Password

```bash
export MONGO_PASSWORD=your-secure-password
docker-compose up -d
```

### 2. Use HTTPS for Tracking

Update `backend/.env`:

```bash
TRACKING_API_URL=https://yourdomain.com
```
### 3. Configure Log Rotation

Add to `docker-compose.yml`:

```yaml
services:
  crawler:
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "3"
```
### 4. Set Up Backups

Add a cron entry on the host:

```bash
# Daily MongoDB backup at 03:00
0 3 * * * docker exec munich-news-mongodb mongodump --out=/data/backup/$(date +\%Y\%m\%d)
```
### 5. Enable Backend API

Uncomment the backend service in `docker-compose.yml`:

```yaml
  backend:
    build:
      context: ./backend
    ports:
      - "5001:5001"
    # ... rest of config
```
## Schedule Configuration

### Change Crawler Time

Edit `news_crawler/scheduled_crawler.py`:

```python
schedule.every().day.at("06:00").do(run_crawler)  # Change time
```

### Change Sender Time

Edit `news_sender/scheduled_sender.py`:

```python
schedule.every().day.at("07:00").do(run_sender)  # Change time
```

Rebuild after changes:

```bash
docker-compose up -d --build
```
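Both services use the `schedule` library, which in effect just waits until the next occurrence of the configured time each day. The underlying arithmetic can be sketched with the standard library (hypothetical helper, not from the repo):

```python
from datetime import datetime, timedelta

def seconds_until(run_at, now):
    """Seconds from `now` until the next daily occurrence of HH:MM `run_at`."""
    hour, minute = map(int, run_at.split(":"))
    target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if target <= now:
        target += timedelta(days=1)  # time already passed today -> run tomorrow
    return (target - now).total_seconds()
```

For example, at 05:00 a crawler scheduled for `"06:00"` is one hour (3600 seconds) away; at 07:00 it is 23 hours away. Keeping the sender an hour after the crawler, as in the defaults above, gives the crawl time to finish before the newsletter goes out.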
## Database Setup

### Add RSS Feeds

```bash
mongosh munich_news
```

```javascript
db.rss_feeds.insertMany([
  {
    name: "Süddeutsche Zeitung München",
    url: "https://www.sueddeutsche.de/muenchen/rss",
    active: true
  },
  {
    name: "Merkur München",
    url: "https://www.merkur.de/lokales/muenchen/rss/feed.rss",
    active: true
  }
])
```
### Add Subscribers

```bash
mongosh munich_news
```

```javascript
db.subscribers.insertMany([
  {
    email: "user1@example.com",
    active: true,
    tracking_enabled: true,
    subscribed_at: new Date()
  },
  {
    email: "user2@example.com",
    active: true,
    tracking_enabled: true,
    subscribed_at: new Date()
  }
])
```
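If you add subscribers from Python (e.g. via pymongo) rather than mongosh, the same document shape can be built with a small helper (a sketch; the field names mirror the example above, the helper itself is hypothetical):

```python
from datetime import datetime, timezone

def make_subscriber(email, tracking=True):
    """Build a subscriber document matching the shape inserted above."""
    email = email.strip().lower()
    if "@" not in email:
        raise ValueError("not an email address: %r" % email)
    return {
        "email": email,
        "active": True,
        "tracking_enabled": tracking,
        "subscribed_at": datetime.now(timezone.utc),
    }

# Insert with pymongo (assumes the default MONGODB_URI shown earlier):
# from pymongo import MongoClient
# MongoClient("mongodb://localhost:27017/").munich_news.subscribers.insert_one(
#     make_subscriber("user3@example.com"))
```

Normalizing the address to lowercase before insert avoids duplicate subscriptions that differ only in case.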
## Monitoring

### Check Container Status

```bash
docker-compose ps
```

### View Logs

```bash
# All services
docker-compose logs -f

# Specific service
docker-compose logs -f crawler
docker-compose logs -f sender
```

### Check Database

```bash
mongosh munich_news
```

```javascript
// Count articles
db.articles.countDocuments()

// Count active subscribers
db.subscribers.countDocuments({ active: true })

// View the five most recently crawled articles
db.articles.find().sort({ crawled_at: -1 }).limit(5)
```
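The freshness check can also be scripted, e.g. to warn when the crawler has produced nothing recently (a sketch; the query builder is hypothetical, the field name matches the collection above):

```python
from datetime import datetime, timedelta, timezone

def recent_articles_query(hours=24):
    """MongoDB filter for articles crawled within the last `hours` hours."""
    cutoff = datetime.now(timezone.utc) - timedelta(hours=hours)
    return {"crawled_at": {"$gt": cutoff}}

# With pymongo (assumes the default MONGODB_URI shown earlier):
# from pymongo import MongoClient
# db = MongoClient("mongodb://localhost:27017/").munich_news
# if db.articles.count_documents(recent_articles_query()) == 0:
#     print("WARNING: no articles crawled in the last 24 h")
```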
## Troubleshooting

### Containers Won't Start

```bash
# Check logs
docker-compose logs

# Rebuild
docker-compose up -d --build

# Reset everything (removes volumes, including the database)
docker-compose down -v
docker-compose up -d
```

### Crawler Not Finding Articles

```bash
# Check active RSS feeds
mongosh munich_news --eval "db.rss_feeds.find({ active: true })"

# Test manually
docker-compose exec crawler python crawler_service.py 5
```

### Newsletter Not Sending

```bash
# Send a test email
docker-compose exec sender python sender_service.py test your-email@example.com

# Check SMTP config is loaded
docker-compose exec sender python -c "from sender_service import Config; print(Config.SMTP_SERVER)"
```
## Maintenance

### Update System

```bash
git pull
docker-compose up -d --build
```

### Backup Database

```bash
docker exec munich-news-mongodb mongodump --out=/data/backup
```

### Clean Old Data

```bash
mongosh munich_news
```

```javascript
// Delete articles older than 90 days
db.articles.deleteMany({
  crawled_at: { $lt: new Date(Date.now() - 90*24*60*60*1000) }
})
```
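The 90-day cutoff above is plain date arithmetic and can be derived from a setting instead of hard-coded. A Python sketch (the `ARTICLE_RETENTION_DAYS` variable and helper are hypothetical, shown only to illustrate the calculation):

```python
import os
from datetime import datetime, timedelta, timezone

def retention_cutoff(now, days=None):
    """Timestamp before which articles are eligible for deletion."""
    if days is None:
        # Hypothetical setting; defaults to the 90 days used above.
        days = int(os.environ.get("ARTICLE_RETENTION_DAYS", "90"))
    return now - timedelta(days=days)

# Equivalent pymongo filter to the mongosh command above:
# {"crawled_at": {"$lt": retention_cutoff(datetime.now(timezone.utc))}}
```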
## Security Checklist
- Set strong MongoDB password
- Use HTTPS for tracking URLs
- Secure SMTP credentials
- Enable firewall rules
- Set up log rotation
- Configure backups
- Monitor for failures
- Keep dependencies updated