
GitHub Xplainer

Analyze GitHub repositories and generate insights using AI.

Quick Start

  1. Setup Environment
# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt
  2. Start Services (in this exact order)
# 1. Start infrastructure services first (add -d to run them in the background)
docker-compose up

# 2. Start the API server
uvicorn backend.api.app:app --reload --log-level=info
  3. Initialize Repository
# Index an example repository
curl -X POST http://localhost:8000/repos/init \
-H "Content-Type: application/json" \
-d '{"owner": "openai", "repo": "tiktoken"}'

# Or index this repository
curl -X POST http://localhost:8000/repos/init \
-H "Content-Type: application/json" \
-d '{"owner": "Tialo", "repo": "githubXplainer"}'
# Initialize the Elasticsearch index
curl -X POST http://localhost:8000/elasticsearch/init

# Drop the Elasticsearch index
curl -X POST http://localhost:8000/elasticsearch/clear

# Run a FAISS-backed semantic search over the indexed repository
curl -X POST http://localhost:8000/search/faiss \
-H "Content-Type: application/json" \
-d '{"query": "what were the last bugfixes", "owner": "openai", "name": "tiktoken"}'

# Remove an indexed repository
curl -X DELETE http://localhost:8000/repos/delete \
-H "Content-Type: application/json" \
-d '{"owner": "Tialo", "repo": "githubXplainer"}'

# Stop all services and drop volumes
docker-compose down -v

Development

# Format code
poetry run black backend/
poetry run isort backend/

# Run tests
poetry run pytest

Troubleshooting

  • Database issues: check docker-compose ps and verify the database credentials
  • GitHub API: verify the token in .env and check your rate limits
  • Service errors: check logs/app.log for details
  • RQ worker: verify the Redis connection (see the sketch below) and check RQ logs with python -m backend.tasks.worker --loglevel=debug
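
A quick way to check the Redis connection from Python (a minimal sketch, assuming Redis listens on its default localhost:6379; adjust to match your docker-compose mapping):

import redis  # pip install redis

# Ping the Redis instance the RQ worker depends on
r = redis.Redis(host="localhost", port=6379)
print(r.ping())  # True means the worker can reach Redis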

API Documentation

Browse the OpenAPI docs at http://localhost:8000/docs.
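
Every endpoint shown with curl above can also be called from Python; a minimal sketch using the requests library (the payload mirrors the repository-init example above, and the response shape depends on the endpoint):

import requests  # pip install requests

# Kick off indexing for a repository via the API
resp = requests.post(
    "http://localhost:8000/repos/init",
    json={"owner": "openai", "repo": "tiktoken"},
)
resp.raise_for_status()
print(resp.json())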

Monitoring

RQ Monitoring

Access the Flower dashboard at http://localhost:5555 to monitor:

  • Task progress and history
  • Worker status
  • Real-time statistics
  • Task graphs and metrics

Kafka Monitoring

Access the Kafka UI dashboard at http://localhost:8080 to monitor:

  • Topic management and browsing
  • Consumer groups
  • Message browsing
  • Cluster state
  • Performance metrics

Kafka Setup

The application uses Kafka for message queuing with two main topics:

  • readme: For processing README file changes
  • commit: For processing commit information
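
For the Python side, a minimal sketch of producing to and consuming from these topics, assuming the kafka-python package and a broker on localhost:9092 (both are assumptions; the TypeScript example below uses the project's KafkaService wrapper):

import json
from kafka import KafkaProducer, KafkaConsumer  # pip install kafka-python

# Publish a commit event to the "commit" topic
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("commit", {"id": "123", "message": "Initial commit"})
producer.flush()

# Consume README-change events from the "readme" topic
consumer = KafkaConsumer(
    "readme",
    bootstrap_servers="localhost:9092",
    group_id="my-group",
    auto_offset_reset="earliest",
)
for record in consumer:
    print("Received README update:", record.value)
    break  # stop after one message in this sketch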

Usage Example

import { KafkaService, TOPICS } from './kafka/KafkaService';

// Initialize service
const kafkaService = new KafkaService();
await kafkaService.initialize();

// Create a consumer
const consumer = await kafkaService.createConsumer('my-group');

// Subscribe to topics
await kafkaService.subscribeToTopic(consumer, TOPICS.README, async (message) => {
  console.log('Received README update:', message);
});

// Publish a message
await kafkaService.publishMessage(TOPICS.COMMIT, {
  id: '123',
  message: 'Initial commit'
});
