# Project Structure
```
munich-news/
├── backend/                      # Backend API and services
│   ├── routes/                   # API routes
│   ├── services/                 # Business logic
│   ├── .env.example              # Environment template
│   ├── app.py                    # Flask application
│   ├── config.py                 # Configuration
│   └── database.py               # MongoDB connection
├── news_crawler/                 # News crawler service
│   ├── Dockerfile                # Crawler container
│   ├── crawler_service.py        # Main crawler logic
│   ├── scheduled_crawler.py      # Scheduler (6 AM)
│   ├── rss_utils.py              # RSS parsing utilities
│   └── requirements.txt          # Python dependencies
├── news_sender/                  # Newsletter sender service
│   ├── Dockerfile                # Sender container
│   ├── sender_service.py         # Main sender logic
│   ├── scheduled_sender.py       # Scheduler (7 AM)
│   ├── tracking_integration.py   # Email tracking
│   ├── newsletter_template.html  # Email template
│   └── requirements.txt          # Python dependencies
├── frontend/                     # React dashboard (optional)
│   ├── src/                      # React components
│   ├── public/                   # Static files
│   └── package.json              # Node dependencies
├── tests/                        # All test files
│   ├── crawler/                  # Crawler tests
│   ├── sender/                   # Sender tests
│   └── backend/                  # Backend tests
├── docs/                         # Documentation
│   ├── ARCHITECTURE.md           # System architecture
│   ├── DEPLOYMENT.md             # Deployment guide
│   ├── API.md                    # API reference
│   ├── DATABASE_SCHEMA.md        # Database structure
│   ├── BACKEND_STRUCTURE.md      # Backend organization
│   ├── CRAWLER_HOW_IT_WORKS.md   # Crawler internals
│   ├── EXTRACTION_STRATEGIES.md  # Content extraction
│   └── RSS_URL_EXTRACTION.md     # RSS parsing
├── .kiro/                        # Kiro IDE configuration
│   └── specs/                    # Feature specifications
├── docker-compose.yml            # Docker orchestration
├── README.md                     # Main documentation
├── QUICKSTART.md                 # 5-minute setup guide
├── CONTRIBUTING.md               # Contribution guidelines
├── .gitignore                    # Git ignore rules
└── .dockerignore                 # Docker ignore rules
```
## Key Files
### Configuration
- `backend/.env` - Environment variables (create from `.env.example`)
- `docker-compose.yml` - Docker services configuration
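
The authoritative variable names live in `backend/.env.example`. As a sketch only, a filled-in `.env` tends to look something like this (every key below is a hypothetical placeholder, not a confirmed project key):

```
# Hypothetical keys for illustration; copy the real names from backend/.env.example
MONGODB_URI=mongodb://mongodb:27017/munich_news
SMTP_HOST=smtp.example.com
SMTP_PORT=587
SMTP_USER=newsletter@example.com
SMTP_PASSWORD=change-me
```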
### Entry Points
- `news_crawler/scheduled_crawler.py` - Crawler scheduler (6 AM)
- `news_sender/scheduled_sender.py` - Sender scheduler (7 AM)
- `backend/app.py` - Backend API server
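
Both schedulers follow the same fire-daily-at-a-fixed-time pattern. A minimal sketch of what `scheduled_crawler.py` could look like, assuming the `schedule` library and a `run_crawl()` entry function in `crawler_service.py` (both are assumptions for illustration, not confirmed internals):

```python
import time

import schedule

from crawler_service import run_crawl  # hypothetical entry function


def job():
    # Catch errors so one failed crawl doesn't kill the scheduler loop
    try:
        run_crawl()
    except Exception as exc:
        print(f"Crawl failed: {exc}")


# Run once a day at 06:00, container-local time
schedule.every().day.at("06:00").do(job)

while True:
    schedule.run_pending()
    time.sleep(60)  # poll the schedule once a minute
```

Under the same assumptions, `scheduled_sender.py` would differ only in the time (`"07:00"`) and the function it calls.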
### Documentation
- `README.md` - Main project documentation
- `QUICKSTART.md` - Quick setup guide
- `docs/` - Detailed documentation
### Tests
- `tests/crawler/` - Crawler test files
- `tests/sender/` - Sender test files
- `tests/backend/` - Backend test files
## Docker Services
When you run `docker-compose up -d`, these services start:
1. **mongodb** - Database (port 27017)
2. **crawler** - News crawler (scheduled for 6 AM)
3. **sender** - Newsletter sender (scheduled for 7 AM)
4. **backend** - API server (port 5001, optional)
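
A trimmed-down sketch of how `docker-compose.yml` might wire these together; the service names match the list above, while the image tag, env wiring, and build contexts are assumptions for illustration:

```yaml
services:
  mongodb:
    image: mongo:7          # assumed image/tag
    ports:
      - "27017:27017"

  crawler:
    build: ./news_crawler
    env_file: backend/.env
    depends_on:
      - mongodb

  sender:
    build: ./news_sender
    env_file: backend/.env
    depends_on:
      - mongodb

  backend:
    build: ./backend
    env_file: backend/.env
    ports:
      - "5001:5001"
    depends_on:
      - mongodb
```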
## Data Flow
```
RSS Feeds → Crawler → MongoDB → Sender → Subscribers
                         │
                    Backend API
                         │
                     Analytics
```
## Development Workflow
1. Edit code in the relevant service directory
2. Rebuild containers: `docker-compose up -d --build`
3. View logs: `docker-compose logs -f`
4. Run tests: `docker-compose exec <service> python tests/...`
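
For example, one iteration on the crawler might look like this (the test file name is hypothetical; use the actual files in `tests/crawler/`):

```bash
# Rebuild and restart only the crawler after a code change
docker-compose up -d --build crawler

# Follow its logs to confirm the scheduler registered the 6 AM job
docker-compose logs -f crawler

# Run a crawler test inside the container (hypothetical file name)
docker-compose exec crawler python tests/crawler/test_rss_utils.py
```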
## Adding New Features
1. Create spec in `.kiro/specs/`
2. Implement in appropriate directory
3. Add tests in `tests/`
4. Update documentation in `docs/`
5. Submit pull request
## Clean Architecture
- **Separation of Concerns**: Each service has its own directory
- **Centralized Configuration**: All config in `backend/.env`
- **Organized Tests**: All tests in `tests/` directory
- **Clear Documentation**: All docs in `docs/` directory
- **Single Entry Point**: One `docker-compose.yml` file
This structure makes the project:
- ✅ Easy to navigate
- ✅ Simple to deploy
- ✅ Clear to understand
- ✅ Maintainable long-term