# ⚡ Quick Start Guide
Get Munich News Daily running in 5 minutes!

## 📋 Prerequisites

- **Docker** & **Docker Compose** installed
- **4GB+ RAM** (for AI models)
- *(Optional)* NVIDIA GPU for faster processing
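Not sure whether Docker is ready? A quick check (both commands should print a version string):

```bash
# If either command fails, install that tool before continuing
docker --version
docker-compose --version
```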
## 🚀 Setup Steps
### 1. Configure Environment

```bash
cp backend/.env.example backend/.env
nano backend/.env
```

**Required:** Update `SMTP_SERVER`, `EMAIL_USER`, and `EMAIL_PASSWORD`.
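For orientation, here is a sketch of what those mail settings might look like, with placeholder values only; the full variable list lives in `backend/.env.example`:

```bash
# backend/.env (placeholder values; substitute your own provider's details)
SMTP_SERVER=smtp.gmail.com
EMAIL_USER=newsletter@example.com
EMAIL_PASSWORD=your-app-password  # for Gmail, an app password, not the account password
```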
### 2. Start the System

```bash
# Auto-detects GPU capabilities and starts the services
./start-with-gpu.sh

# Watch installation progress (the first run downloads ~2GB of AI models)
docker-compose logs -f ollama-setup
```
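Once the script returns, a quick way to confirm the containers came up (service names are whatever your `docker-compose.yml` defines):

```bash
# Every service should report an "Up" state
docker-compose ps
```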
### 3. Add News Sources

```bash
# Connect to the database
docker-compose exec mongodb mongosh munich_news
```

```javascript
// Paste this into the mongo shell:
db.rss_feeds.insertMany([
  {
    name: "Süddeutsche Zeitung München",
    url: "https://www.sueddeutsche.de/muenchen/rss",
    active: true
  },
  {
    name: "Merkur München",
    url: "https://www.merkur.de/lokales/muenchen/rss/feed.rss",
    active: true
  }
])
```
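Still in the shell, you can sanity-check that both feeds were stored:

```javascript
// Should list the two feeds you just inserted
db.rss_feeds.find({ active: true })
```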
### 4. Add Yourself as a Subscriber

```javascript
// Still in the mongo shell:
db.subscribers.insertOne({
  email: "your-email@example.com",
  active: true,
  tracking_enabled: true,
  subscribed_at: new Date()
})
```
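To confirm the document landed and leave the shell:

```javascript
// Verify the subscriber record, then quit mongosh
db.subscribers.findOne({ email: "your-email@example.com" })
exit
```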
### 5. Verify Installation

```bash
# 1. Run the crawler manually to fetch news
docker-compose exec crawler python crawler_service.py 5

# 2. Send a test email to yourself
docker-compose exec sender python sender_service.py test your-email@example.com
```
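If you want to peek at what the crawler stored, a one-liner like the following works; the collection name `articles` is a guess here, so adjust it to whatever your crawler actually writes:

```bash
# Count stored articles (collection name is an assumption)
docker-compose exec mongodb mongosh munich_news --eval "db.articles.countDocuments()"
```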
## 🎮 Dashboard Access

Once running, access the services:

- **Dashboard**: [http://localhost:3000](http://localhost:3000)
- **API**: [http://localhost:5001](http://localhost:5001)
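A quick reachability check from the host; the exact responses depend on the frontend and backend routes, so any answer at all is a good sign:

```bash
# Both ports should respond (even a 404 means the service is listening)
curl -I http://localhost:3000
curl -I http://localhost:5001
```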
## ⏭️ What's Next?

The system is now fully automated:

1. **6:00 AM**: Crawls news and generates AI summaries.
2. **7:00 AM**: Sends the daily newsletter.
### Useful Commands

```bash
# Stop everything
docker-compose down

# View logs for a service
docker-compose logs -f crawler

# Update code & rebuild
docker-compose up -d --build
```