Analyze GitHub repositories and generate insights using AI.
- Setup Environment
```bash
# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt
```
- Start Services (in this exact order)

```bash
# 1. Start infrastructure services first (detached, so the shell stays free)
docker-compose up -d

# 2. Start the API server
uvicorn backend.api.app:app --reload --log-level=info
```
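Before sending requests, it helps to wait until the API is actually accepting connections. A minimal readiness poll, assuming the default uvicorn address (`http://localhost:8000`) from the command above:

```python
import time
import urllib.request

def wait_until_ready(probe, timeout=30.0, interval=0.5):
    """Poll `probe` (a zero-argument callable returning True once the
    service answers) until it succeeds or `timeout` seconds elapse."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if probe():
            return True
        time.sleep(interval)
    return False

def api_is_up(url="http://localhost:8000/docs"):
    """True if the API answers with a non-error status."""
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            return resp.status < 400
    except OSError:
        return False

# wait_until_ready(api_is_up)  # requires the services above to be running
```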
- Initialize Repository
```bash
# Index a repository for analysis
curl -X POST http://localhost:8000/repos/init \
  -H "Content-Type: application/json" \
  -d '{"owner": "openai", "repo": "tiktoken"}'

curl -X POST http://localhost:8000/repos/init \
  -H "Content-Type: application/json" \
  -d '{"owner": "Tialo", "repo": "githubXplainer"}'
```
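The same initialization can be scripted. A minimal sketch using only the standard library, assuming the endpoint and default address shown in the curl commands above:

```python
import json
import urllib.request

BASE = "http://localhost:8000"  # default uvicorn address from the commands above

def init_repo_request(owner, repo, base=BASE):
    """Build the POST /repos/init request shown above."""
    body = json.dumps({"owner": owner, "repo": repo}).encode()
    return urllib.request.Request(
        f"{base}/repos/init",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending requires the services to be running:
# with urllib.request.urlopen(init_repo_request("openai", "tiktoken")) as resp:
#     print(resp.status, resp.read().decode())
```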
```bash
# Initialize the Elasticsearch index
curl -X POST http://localhost:8000/elasticsearch/init

# Clear the Elasticsearch index
curl -X POST http://localhost:8000/elasticsearch/clear
```
```bash
# Semantic search over an indexed repository
curl -X POST http://localhost:8000/search/faiss \
  -H "Content-Type: application/json" \
  -d '{"query": "what were the last bugfixes", "owner": "openai", "name": "tiktoken"}'
```
```bash
# Remove a repository
curl -X DELETE http://localhost:8000/repos/delete \
  -H "Content-Type: application/json" \
  -d '{"owner": "Tialo", "repo": "githubXplainer"}'
```
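The search and delete calls follow the same pattern; a hypothetical standard-library helper (endpoint paths and payloads are taken from the curl commands above):

```python
import json
import urllib.request

BASE = "http://localhost:8000"  # default address from the commands above

def api_request(method, path, payload=None, base=BASE):
    """Build a JSON request mirroring the curl commands above."""
    data = json.dumps(payload).encode() if payload is not None else None
    headers = {"Content-Type": "application/json"} if data else {}
    return urllib.request.Request(base + path, data=data, headers=headers, method=method)

# Semantic search over an indexed repository:
search = api_request("POST", "/search/faiss",
                     {"query": "what were the last bugfixes",
                      "owner": "openai", "name": "tiktoken"})

# Remove a repository:
delete = api_request("DELETE", "/repos/delete",
                     {"owner": "Tialo", "repo": "githubXplainer"})

# urllib.request.urlopen(search), etc., require the services to be running.
```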
```bash
# Stop services and drop volumes
docker-compose down -v
```

```bash
# Format code
poetry run black backend/
poetry run isort backend/

# Run tests
poetry run pytest
```
- Database issues: check `docker-compose ps` and database credentials
- GitHub API: verify the token in `.env` and rate limits
- Service errors: check `logs/app.log` for details
- RQ worker: verify the Redis connection and check RQ logs with `python -m backend.tasks.worker --loglevel=debug`
Browse the interactive OpenAPI docs at http://localhost:8000/docs
Access the Flower dashboard at http://localhost:5555 to monitor:
- Task progress and history
- Worker status
- Real-time statistics
- Task graphs and metrics
Access the Kafka UI dashboard at http://localhost:8080 to monitor:
- Topic management and browsing
- Consumer groups
- Message browsing
- Cluster state
- Performance metrics
The application uses Kafka for message queuing with two main topics:
- `readme`: for processing README file changes
- `commit`: for processing commit information
```typescript
import { KafkaService, TOPICS } from './kafka/KafkaService';

// Initialize service
const kafkaService = new KafkaService();
await kafkaService.initialize();

// Create a consumer
const consumer = await kafkaService.createConsumer('my-group');

// Subscribe to topics
await kafkaService.subscribeToTopic(consumer, TOPICS.README, async (message) => {
  console.log('Received README update:', message);
});

// Publish a message
await kafkaService.publishMessage(TOPICS.COMMIT, {
  id: '123',
  message: 'Initial commit'
});
```