diff --git a/QUICKSTART.md b/QUICKSTART.md
index 40486d4..7bfdc45 100644
--- a/QUICKSTART.md
+++ b/QUICKSTART.md
@@ -1,56 +1,36 @@
-# Quick Start Guide
+# ⚡ Quick Start Guide
 
 Get Munich News Daily running in 5 minutes!
 
-## Prerequisites
+## 📋 Prerequisites
+- **Docker** & **Docker Compose** installed
+- **4GB+ RAM** (for AI models)
+- *(Optional)* NVIDIA GPU for faster processing
 
-- Docker & Docker Compose installed
-- 4GB+ RAM (for Ollama AI models)
-- (Optional) NVIDIA GPU for 5-10x faster AI processing
-
-## Setup
+## 🚀 Setup Steps
 
 ### 1. Configure Environment
-
 ```bash
-# Copy example environment file
 cp backend/.env.example backend/.env
-
-# Edit with your settings (required: email configuration)
 nano backend/.env
 ```
+**Required:** Update `SMTP_SERVER`, `EMAIL_USER`, and `EMAIL_PASSWORD`.
 
-**Minimum required settings:**
-```env
-SMTP_SERVER=smtp.gmail.com
-SMTP_PORT=587
-EMAIL_USER=your-email@gmail.com
-EMAIL_PASSWORD=your-app-password
-```
-
-### 2. Start System
-
+### 2. Start the System
 ```bash
-# Option 1: Auto-detect GPU and start (recommended)
+# Auto-detects GPU capabilities and starts all services
 ./start-with-gpu.sh
 
-# Option 2: Start without GPU
-docker-compose up -d
-
-# View logs
-docker-compose logs -f
-
-# Wait for Ollama model download (first time only, ~2-5 minutes)
+# Watch installation progress (first-time model download, ~2GB)
docker-compose logs -f ollama-setup
 ```
 
-**Note:** First startup downloads the phi3:latest AI model (2.2GB). This happens automatically.
-
-### 3. Add RSS Feeds
-
+### 3. Add News Sources
 ```bash
-mongosh munich_news
+# Connect to the database
+docker-compose exec mongodb mongosh munich_news
 
+# Paste this into the mongo shell:
 db.rss_feeds.insertMany([
   {
     name: "Süddeutsche Zeitung München",
@@ -65,11 +45,9 @@ db.rss_feeds.insertMany([
 ])
 ```
 
-### 4. Add Subscribers
-
+### 4. Add Yourself as a Subscriber
 ```bash
-mongosh munich_news
-
+# Still in the mongo shell:
 db.subscribers.insertOne({
   email: "your-email@example.com",
   active: true,
@@ -78,90 +56,35 @@ db.subscribers.insertOne({
 })
 ```
 
-### 5. Test It
-
+### 5. Verify the Installation
 ```bash
-# Test crawler
+# 1. Run the crawler manually to fetch news
 docker-compose exec crawler python crawler_service.py 5
 
-# Test newsletter
+# 2. Send a test email to yourself
 docker-compose exec sender python sender_service.py test your-email@example.com
 ```
 
-## What Happens Next?
+## 🎮 Dashboard Access
 
-The system will automatically:
-- **Backend API**: Runs continuously at http://localhost:5001 for tracking and analytics
-- **6:00 AM Berlin time**: Crawl news articles
-- **7:00 AM Berlin time**: Send newsletter to subscribers
+Once running, access the services:
+- **Dashboard**: [http://localhost:3000](http://localhost:3000)
+- **API**: [http://localhost:5001](http://localhost:5001)
 
-## View Results
+## ⭐️ What's Next?
+The system is now fully automated (times are Berlin time):
+1. **6:00 AM**: Crawls news and generates AI summaries.
+2. **7:00 AM**: Sends the daily newsletter.
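+
+To confirm that both schedulers picked up their run times, you can grep the startup logs for the next scheduled run:
+
+```bash
+docker-compose logs crawler | grep "Next scheduled run"
+docker-compose logs sender | grep "Next scheduled run"
+```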
+
+### Useful Commands
 ```bash
-# Check articles
-mongosh munich_news
-db.articles.find().sort({ crawled_at: -1 }).limit(5)
-
-# Check logs
-docker-compose logs -f crawler
-docker-compose logs -f sender
-```
-
-## Common Commands
-
-```bash
-# Stop system
+# Stop everything
 docker-compose down
 
-# Restart system
-docker-compose restart
+# View logs for a service
+docker-compose logs -f crawler
 
-# View logs
-docker-compose logs -f
-
-# Rebuild after changes
+# Update code & rebuild
 docker-compose up -d --build
 ```
-
-## New Features
-
-### GPU Acceleration (5-10x Faster)
-Enable GPU support for faster AI processing:
-```bash
-./check-gpu.sh       # Check if GPU is available
-./start-with-gpu.sh  # Start with GPU support
-```
-See [docs/GPU_SETUP.md](docs/GPU_SETUP.md) for details.
-
-### Send Newsletter to All Subscribers
-```bash
-# Send newsletter to all active subscribers
-curl -X POST http://localhost:5001/api/admin/send-newsletter \
-  -H "Content-Type: application/json" \
-  -d '{"max_articles": 10}'
-```
-
-### Security Features
-- ✅ Only Backend API exposed (port 5001)
-- ✅ MongoDB internal-only (secure)
-- ✅ Ollama internal-only (secure)
-- ✅ All services communicate via internal Docker network
-
-## Need Help?
-
-- **Documentation Index**: [docs/INDEX.md](docs/INDEX.md)
-- **GPU Setup**: [docs/GPU_SETUP.md](docs/GPU_SETUP.md)
-- **API Reference**: [docs/ADMIN_API.md](docs/ADMIN_API.md)
-- **Security Guide**: [docs/SECURITY_NOTES.md](docs/SECURITY_NOTES.md)
-- **Full Documentation**: [README.md](README.md)
-
-## Next Steps
-
-1. ✅ **Enable GPU acceleration** - [docs/GPU_SETUP.md](docs/GPU_SETUP.md)
-2. Set up tracking API (optional)
-3. Customize newsletter template
-4. Add more RSS feeds
-5. Monitor engagement metrics
-6. Review security settings - [docs/SECURITY_NOTES.md](docs/SECURITY_NOTES.md)
-
-That's it! Your automated news system is running. 🎉
diff --git a/README.md b/README.md
index aabdb2a..9e04ac2 100644
--- a/README.md
+++ b/README.md
@@ -1,460 +1,193 @@
 # Munich News Daily - Automated Newsletter System
 
-A fully automated news aggregation and newsletter system that crawls Munich news sources, generates AI summaries, and sends daily newsletters with engagement tracking.
+A fully automated news aggregation system that crawls Munich news sources, generates AI-powered summaries, tracks local transport disruptions, and delivers personalized daily newsletters.
+
+![Munich News Daily](https://via.placeholder.com/800x400?text=Munich+News+Daily+Dashboard)
 
 ## ✨ Key Features
 
-- **🤖 AI-Powered Clustering** - Automatically detects duplicate stories from different sources
-- **📰 Neutral Summaries** - Combines multiple perspectives into balanced coverage
-- **🎯 Smart Prioritization** - Shows most important stories first (multi-source coverage)
-- **🎨 Personalized Newsletters** - AI-powered content recommendations based on user interests
-- **📊 Engagement Tracking** - Open rates, click tracking, and analytics
-- **⚡ GPU Acceleration** - 5-10x faster AI processing with GPU support
-- **🔒 GDPR Compliant** - Privacy-first with data retention controls
-
-**🚀 NEW:** GPU acceleration support for 5-10x faster AI processing! See [docs/GPU_SETUP.md](docs/GPU_SETUP.md)
+- **🤖 AI-Powered Clustering** - Automatically detects duplicate stories and groups related articles using ChromaDB vector search (see the sketch after this list).
+- **📝 Neutral Summaries** - Generates balanced, multi-perspective summaries using local LLMs (Ollama).
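+- **🚇 Transport Updates** - Real-time tracking of Munich public transport (MVG) disruptions.
+- **🎯 Smart Prioritization** - Ranks stories based on relevance and user preferences.
+- **🎨 Personalized Newsletters** - AI-powered content recommendations tailored to each subscriber's interests.
+- **📊 Engagement Analytics** - Detailed tracking of open rates, click-throughs, and user interests.
+- **⚡ GPU Acceleration** - Integrated support for NVIDIA GPUs for faster AI processing.
+- **🔒 Privacy First** - GDPR-compliant with automatic data retention policies and anonymization.
+
+The clustering idea in miniature — a minimal sketch assuming the `chromadb` Python package and its default embedding model (the article texts and IDs below are invented; the real crawler pipeline is more involved):
+
+```python
+import chromadb
+
+client = chromadb.Client()                       # in-memory instance for this demo
+articles = client.create_collection("articles")  # hypothetical collection name
+
+# Index a few article texts; Chroma embeds them automatically
+articles.add(
+    ids=["sz-101", "merkur-7", "br24-3"],
+    documents=[
+        "S-Bahn trunk line closed between Laim and Ostbahnhof after a signal failure",
+        "Signal fault halts S-Bahn trains in central Munich during morning rush hour",
+        "New bike lanes approved for Schwabing",
+    ],
+)
+
+# Stories whose embeddings sit close together are candidates for one cluster
+result = articles.query(query_texts=["S-Bahn signal failure in Munich"], n_results=2)
+print(result["ids"])  # expect the two S-Bahn stories to rank first
+```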
 
 ## 🚀 Quick Start
 
+For a detailed 5-minute setup guide, see [QUICKSTART.md](QUICKSTART.md).
+
 ```bash
 # 1. Configure environment
 cp backend/.env.example backend/.env
 # Edit backend/.env with your email settings
 
-# 2. Start everything
-docker-compose up -d
+# 2. Start everything (auto-detects GPU)
+./start-with-gpu.sh
 
-# 3. View logs
-docker-compose logs -f
+# Questions?
+# See logs: docker-compose logs -f
 ```
 
-That's it! The system will automatically:
-- **Frontend**: Web interface and admin dashboard (http://localhost:3000)
-- **Backend API**: Runs continuously for tracking and analytics (http://localhost:5001)
-- **6:00 AM Berlin time**: Crawl news articles and generate summaries
-- **7:00 AM Berlin time**: Send newsletter to all subscribers
+The system will run automatically (times are Berlin time):
+1. **6:00 AM**: Crawl news & transport updates.
+2. **6:30 AM**: Generate AI summaries & clusters.
+3. **7:00 AM**: Send personalized newsletters.
 
-### Access Points
-
-- **Newsletter Page**: http://localhost:3000
-- **Admin Dashboard**: http://localhost:3000/admin.html
-- **Backend API**: http://localhost:5001
-
-📖 **New to the project?** See [QUICKSTART.md](QUICKSTART.md) for a detailed 5-minute setup guide.
-
-🚀 **GPU Acceleration:** Enable 5-10x faster AI processing with [GPU Setup Guide](docs/GPU_SETUP.md)
-
-## 📋 System Overview
-
-```
-6:00 AM → News Crawler
-   ↓
-   Fetches articles from RSS feeds
-   Extracts full content
-   Generates AI summaries
-   Saves to MongoDB
-   ↓
-7:00 AM → Newsletter Sender
-   ↓
-   Waits for crawler to finish
-   Fetches today's articles
-   Generates newsletter with tracking
-   Sends to all subscribers
-   ↓
-   ✅ Done! Repeat tomorrow
-```
-
-## 🏗️ Architecture
-
-### Components
+## 📋 System Architecture
 
+The system is built as a set of microservices orchestrated by Docker Compose.
 
+```mermaid
+graph TD
+    User[Subscribers] -->|Email| Sender[Newsletter Sender]
+    User -->|Web| Frontend[React Frontend]
+    Frontend -->|API| Backend[Backend API]
+
+    subgraph "Core Services"
+        Crawler[News Crawler]
+        Transport[Transport Crawler]
+        Sender
+        Backend
+    end
+
+    subgraph "Data & AI"
+        Mongo[(MongoDB)]
+        Redis[(Redis)]
+        Chroma[(ChromaDB)]
+        Ollama[Ollama AI]
+    end
+
+    Crawler -->|Save| Mongo
+    Crawler -->|Embeddings| Chroma
+    Crawler -->|Summarize| Ollama
+
+    Transport -->|Save| Mongo
+
+    Sender -->|Read| Mongo
+    Sender -->|Track| Backend
+
+    Backend -->|Read/Write| Mongo
+    Backend -->|Cache| Redis
+```
+
+### Core Components
+
+| Service | Description | Port |
+|---------|-------------|------|
+| **Frontend** | React-based user dashboard and admin interface. | 3000 |
+| **Backend API** | Flask API for tracking, analytics, and management. | 5001 |
+| **News Crawler** | Fetches RSS feeds, extracts content, and runs AI clustering. | - |
+| **Transport Crawler** | Monitors MVG (Munich Transport) for delays and disruptions. | - |
+| **Newsletter Sender** | Manages subscribers, generates templates, and sends emails. | - |
+| **Ollama** | Local LLM runner for on-premise AI (Phi-3, Llama 3, etc.). | - |
+| **ChromaDB** | Vector database for semantic search and article clustering. | - |
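+
+To see which of these services are running, and which ports they publish, at any time:
+
+```bash
+docker-compose ps
+```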
 
-- **Ollama**: AI service for summarization and translation (internal only, GPU-accelerated)
-- **MongoDB**: Data storage (articles, subscribers, tracking) (internal only)
-- **Backend API**: Flask API for tracking and analytics (port 5001 - only exposed service)
-- **News Crawler**: Automated RSS feed crawler with AI summarization (internal only)
-- **Newsletter Sender**: Automated email sender with tracking (internal only)
-- **Frontend**: React dashboard (optional)
+## 📂 Project Structure
 
-### Technology Stack
+```text
+munich-news/
+├── backend/            # Flask API for tracking & analytics
+├── frontend/           # React dashboard & admin UI
+├── news_crawler/       # RSS fetcher & AI summarizer service
+├── news_sender/        # Email generation & dispatch service
+├── transport_crawler/  # MVG transport disruption monitor
+├── docker-compose.yml  # Main service orchestration
+└── docs/               # Detailed documentation
+```
 
-- Python 3.11
-- MongoDB 7.0
-- Ollama (phi3:latest model for AI)
-- Docker & Docker Compose
-- Flask (API)
-- Schedule (automation)
-- Jinja2 (email templates)
+## 🛠️ Installation & Setup
 
-## 📦 Installation
-
-### Prerequisites
-
-- Docker & Docker Compose
-- 4GB+ RAM (for Ollama AI models)
-- (Optional) NVIDIA GPU for 5-10x faster AI processing
-
-### Setup
-
-1. **Clone the repository**
-   ```bash
-   git clone
-   cd munich-news
-   ```
-
-2. **Configure environment**
-   ```bash
-   cp backend/.env.example backend/.env
-   # Edit backend/.env with your settings
-   ```
-
-3. **Configure Ollama (AI features)**
-   ```bash
-   # Option 1: Use integrated Docker Compose Ollama (recommended)
-   ./configure-ollama.sh
-   # Select option 1
-
-   # Option 2: Use external Ollama server
-   # Install from https://ollama.ai/download
-   # Then run: ollama pull phi3:latest
-   ```
-
-4. **Start the system**
-   ```bash
-   # Auto-detect GPU and start (recommended)
-   ./start-with-gpu.sh
-
-   # Or start manually
-   docker-compose up -d
-
-   # First time: Wait for Ollama model download (2-5 minutes)
-   docker-compose logs -f ollama-setup
-   ```
-
-📖 **For detailed Ollama setup & GPU acceleration:** See [docs/OLLAMA_SETUP.md](docs/OLLAMA_SETUP.md)
-
-💡 **To change AI model:** Edit `OLLAMA_MODEL` in `.env`, then run `./pull-ollama-model.sh`. See [docs/CHANGING_AI_MODEL.md](docs/CHANGING_AI_MODEL.md)
+1. **Clone the repository**
+   ```bash
+   git clone https://github.com/yourusername/munich-news.git
+   cd munich-news
+   ```
+
+2. **Environment Configuration**
+   ```bash
+   cp backend/.env.example backend/.env
+   nano backend/.env
+   ```
+   *Critical settings:* `SMTP_SERVER`, `EMAIL_USER`, `EMAIL_PASSWORD`.
+
+3. **Start the System**
+   ```bash
+   # Recommended: helper script (handles GPU & model setup)
+   ./start-with-gpu.sh
+
+   # Alternative: standard Docker Compose
+   docker-compose up -d
+   ```
+
+4. **Initial Setup (First Run)**
+   * The system needs to download the AI model (approx. 2GB).
+   * Watch progress: `docker-compose logs -f ollama-setup`
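+
+Once the download finishes, you can confirm the model is available (assuming the Ollama service is named `ollama` in `docker-compose.yml`):
+
+```bash
+docker-compose exec ollama ollama list
+```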
 
 ## ⚙️ Configuration
 
-Edit `backend/.env`:
+Key configuration options in `backend/.env`:
 
-```env
-# MongoDB
-MONGODB_URI=mongodb://localhost:27017/
-
-# Email (SMTP)
-SMTP_SERVER=smtp.gmail.com
-SMTP_PORT=587
-EMAIL_USER=your-email@gmail.com
-EMAIL_PASSWORD=your-app-password
-
-# Newsletter
-NEWSLETTER_MAX_ARTICLES=10
-NEWSLETTER_HOURS_LOOKBACK=24
-
-# Tracking
-TRACKING_ENABLED=true
-TRACKING_API_URL=http://localhost:5001
-TRACKING_DATA_RETENTION_DAYS=90
-
-# Ollama (AI Summarization)
-OLLAMA_ENABLED=true
-OLLAMA_BASE_URL=http://127.0.0.1:11434
-OLLAMA_MODEL=phi3:latest
-```
+| Category | Variable | Description |
+|----------|----------|-------------|
+| **Email** | `SMTP_SERVER` | SMTP server (e.g., smtp.gmail.com) |
+| | `EMAIL_USER` | Your sending email address |
+| | `EMAIL_PASSWORD` | Password or app password for that account |
+| **AI** | `OLLAMA_MODEL` | Model to use (default: phi3:latest) |
+| **Schedule** | `CRAWLER_TIME` | Time to start crawling (e.g., "06:00") |
+| | `SENDER_TIME` | Time to send emails (e.g., "07:00") |
 
-## 📊 Usage
-
-### View Logs
-
-```bash
-# All services
-docker-compose logs -f
-
-# Specific service
-docker-compose logs -f crawler
-docker-compose logs -f sender
-docker-compose logs -f mongodb
-```
-
-### Manual Operations
+## 📊 Usage & Monitoring
 
+### Access Points
+* **Web Dashboard**: [http://localhost:3000](http://localhost:3000) (or your configured domain)
+* **API**: [http://localhost:5001](http://localhost:5001)
+
+### Useful Commands
+
+**View Logs**
+```bash
+docker-compose logs -f [service_name]
+# e.g., docker-compose logs -f crawler
+```
+
+**Manual Trigger**
 ```bash
-# Run crawler manually
+# Run the news crawler immediately
 docker-compose exec crawler python crawler_service.py 10
 
-# Send test newsletter
-docker-compose exec sender python sender_service.py test your-email@example.com
+# Run the transport crawler immediately
+docker-compose exec transport-crawler python transport_service.py
 
-# Preview newsletter
-docker-compose exec sender python sender_service.py preview
+# Send a test newsletter
+docker-compose exec sender python sender_service.py test user@example.com
 ```
 
-### Database Access
-
+**Database Access**
 ```bash
 # Connect to MongoDB
 docker-compose exec mongodb mongosh munich_news
-
-# View articles
-db.articles.find().sort({ crawled_at: -1 }).limit(5).pretty()
-
-# View subscribers
-db.subscribers.find({ active: true }).pretty()
-
-# View tracking data
-db.newsletter_sends.find().sort({ created_at: -1 }).limit(10).pretty()
 ```
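+
+Once connected, a few handy queries for inspecting recent data (the same collections used throughout this README):
+
+```bash
+db.articles.find().sort({ crawled_at: -1 }).limit(5).pretty()   # latest articles
+db.subscribers.find({ active: true }).pretty()                  # active subscribers
+```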
 
-## 🔧 Management
-
-### Add RSS Feeds
-
-```bash
-mongosh munich_news
-
-db.rss_feeds.insertOne({
-  name: "Source Name",
-  url: "https://example.com/rss",
-  active: true
-})
-```
+## 🌐 Production Deployment (Traefik)
+
+This project is configured to work with **Traefik** as a reverse proxy.
+The `docker-compose.yml` includes labels for:
+- `news.dongho.kim` (Frontend)
+- `news-api.dongho.kim` (Backend)
+
+To use this locally, add these hostnames to your `/etc/hosts`:
+```text
+127.0.0.1 news.dongho.kim news-api.dongho.kim
+```
+
+For production, ensure your Traefik proxy network is named `proxy` or update the `docker-compose.yml` accordingly.
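+
+For orientation, Traefik routing labels of this kind generally look like the sketch below (illustrative only — not copied from this repository's `docker-compose.yml`; the service and router names are placeholders):
+
+```yaml
+services:
+  frontend:
+    labels:
+      - "traefik.enable=true"
+      - "traefik.http.routers.frontend.rule=Host(`news.dongho.kim`)"
+    networks:
+      - proxy   # must match your Traefik proxy network name
+
+networks:
+  proxy:
+    external: true
+```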
 
-### Add Subscribers
-
-```bash
-mongosh munich_news
-
-db.subscribers.insertOne({
-  email: "user@example.com",
-  active: true,
-  tracking_enabled: true,
-  subscribed_at: new Date()
-})
-```
-
-### View Analytics
-
-```bash
-# Newsletter metrics
-curl http://localhost:5001/api/analytics/newsletter/2024-01-15
-
-# Article performance
-curl http://localhost:5001/api/analytics/article/https://example.com/article
-
-# Subscriber activity
-curl http://localhost:5001/api/analytics/subscriber/user@example.com
-```
-
-## ⏰ Schedule Configuration
-
-### Change Crawler Time (default: 6:00 AM)
-
-Edit `news_crawler/scheduled_crawler.py`:
-```python
-schedule.every().day.at("06:00").do(run_crawler)  # Change time
-```
-
-### Change Sender Time (default: 7:00 AM)
-
-Edit `news_sender/scheduled_sender.py`:
-```python
-schedule.every().day.at("07:00").do(run_sender)  # Change time
-```
-
-After changes:
-```bash
-docker-compose up -d --build
-```
-
-## 📈 Monitoring
-
-### Container Status
-
-```bash
-docker-compose ps
-```
-
-### Check Next Scheduled Runs
-
-```bash
-# Crawler
-docker-compose logs crawler | grep "Next scheduled run"
-
-# Sender
-docker-compose logs sender | grep "Next scheduled run"
-```
-
-### Engagement Metrics
-
-```bash
-mongosh munich_news
-
-// Open rate
-var sent = db.newsletter_sends.countDocuments({ newsletter_id: "2024-01-15" })
-var opened = db.newsletter_sends.countDocuments({ newsletter_id: "2024-01-15", opened: true })
-print("Open Rate: " + ((opened / sent) * 100).toFixed(2) + "%")
-
-// Click rate
-var clicks = db.link_clicks.countDocuments({ newsletter_id: "2024-01-15" })
-print("Click Rate: " + ((clicks / sent) * 100).toFixed(2) + "%")
-```
-
-## 🐛 Troubleshooting
-
-### Crawler Not Finding Articles
-
-```bash
-# Check RSS feeds
-mongosh munich_news --eval "db.rss_feeds.find({ active: true })"
-
-# Test manually
-docker-compose exec crawler python crawler_service.py 5
-```
-
-### Newsletter Not Sending
-
-```bash
-# Check email config
-docker-compose exec sender python -c "from sender_service import Config; print(Config.SMTP_SERVER)"
-
-# Test email
-docker-compose exec sender python sender_service.py test your-email@example.com
-```
-
-### Containers Not Starting
-
-```bash
-# Check logs
-docker-compose logs
-
-# Rebuild
-docker-compose up -d --build
-
-# Reset everything
-docker-compose down -v
-docker-compose up -d
-```
-
-## 🔐 Privacy & Compliance
-
-### GDPR Features
-
-- **Data Retention**: Automatic anonymization after 90 days
-- **Opt-Out**: Subscribers can disable tracking
-- **Data Deletion**: Full data removal on request
-- **Transparency**: Privacy notice in all emails
-
-### Privacy Endpoints
-
-```bash
-# Delete subscriber data
-curl -X DELETE http://localhost:5001/api/tracking/subscriber/user@example.com
-
-# Anonymize old data
-curl -X POST http://localhost:5001/api/tracking/anonymize
-
-# Opt out of tracking
-curl -X POST http://localhost:5001/api/tracking/subscriber/user@example.com/opt-out
-```
-
-## 📚 Documentation
-
-### Getting Started
-- **[QUICKSTART.md](QUICKSTART.md)** - 5-minute setup guide
-- **[CONTRIBUTING.md](CONTRIBUTING.md)** - Contribution guidelines
-
-### Core Features
-- **[docs/AI_NEWS_AGGREGATION.md](docs/AI_NEWS_AGGREGATION.md)** - AI-powered clustering & neutral summaries
-- **[docs/PERSONALIZATION.md](docs/PERSONALIZATION.md)** - Personalized newsletter system
-- **[docs/PERSONALIZATION_COMPLETE.md](docs/PERSONALIZATION_COMPLETE.md)** - Personalization implementation guide
-- **[docs/FEATURES.md](docs/FEATURES.md)** - Complete feature list
-- **[docs/API.md](docs/API.md)** - API endpoints reference
-
-### Technical Documentation
-- **[docs/ARCHITECTURE.md](docs/ARCHITECTURE.md)** - System architecture
-- **[docs/SETUP.md](docs/SETUP.md)** - Detailed setup guide
-- **[docs/OLLAMA_SETUP.md](docs/OLLAMA_SETUP.md)** - AI/Ollama configuration
-- **[docs/GPU_SETUP.md](docs/GPU_SETUP.md)** - GPU acceleration setup
-- **[docs/DEPLOYMENT.md](docs/DEPLOYMENT.md)** - Production deployment
-- **[docs/SECURITY.md](docs/SECURITY.md)** - Security best practices
-- **[docs/REFERENCE.md](docs/REFERENCE.md)** - Complete reference
-- **[docs/DATABASE_SCHEMA.md](docs/DATABASE_SCHEMA.md)** - Database structure
-- **[docs/BACKEND_STRUCTURE.md](docs/BACKEND_STRUCTURE.md)** - Backend organization
-
-### Component Documentation
-- **[docs/CRAWLER_HOW_IT_WORKS.md](docs/CRAWLER_HOW_IT_WORKS.md)** - Crawler internals
-- **[docs/EXTRACTION_STRATEGIES.md](docs/EXTRACTION_STRATEGIES.md)** - Content extraction
-- **[docs/RSS_URL_EXTRACTION.md](docs/RSS_URL_EXTRACTION.md)** - RSS parsing
-
-## 🧪 Testing
-
-All test files are organized in the `tests/` directory:
-
-```bash
-# Run crawler tests
-docker-compose exec crawler python tests/crawler/test_crawler.py
-
-# Run sender tests
-docker-compose exec sender python tests/sender/test_tracking_integration.py
-
-# Run backend tests
-docker-compose exec backend python tests/backend/test_tracking.py
-
-# Test personalization system (all 4 phases)
-docker exec munich-news-local-backend python test_personalization_system.py
-```
-
-## 🚀 Production Deployment
-
-### Environment Setup
-
-1. Update `backend/.env` with production values
-2. Set strong MongoDB password
-3. Use HTTPS for tracking URLs
-4. Configure proper SMTP server
-
-### Security
-
-```bash
-# Use production compose file
-docker-compose -f docker-compose.prod.yml up -d
-
-# Set MongoDB password
-export MONGO_PASSWORD=your-secure-password
-```
-
-### Monitoring
-
-- Set up log rotation
-- Configure health checks
-- Set up alerts for failures
-- Monitor database size
-
-## 📚 Documentation
-
-Complete documentation available in the [docs/](docs/) directory:
-
-- **[Documentation Index](docs/INDEX.md)** - Complete documentation guide
-- **[GPU Setup](docs/GPU_SETUP.md)** - 5-10x faster with GPU acceleration
-- **[Admin API](docs/ADMIN_API.md)** - API endpoints reference
-- **[Security Guide](docs/SECURITY_NOTES.md)** - Security best practices
-- **[System Architecture](docs/SYSTEM_ARCHITECTURE.md)** - Technical overview
-
-## 📝 License
-
-[Your License Here]
 
 ## 🤝 Contributing
 
-Contributions welcome! Please read [CONTRIBUTING.md](CONTRIBUTING.md) first.
+We welcome contributions! Please check [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines.
 
-## 📧 Support
+## 📄 License
 
-For issues or questions, please open a GitHub issue.
-
----
-
-**Built with ❤️ for Munich News Daily**
+MIT License - see [LICENSE](LICENSE) for details.