Muse AR is an innovative mobile application developed for the KU Hackfest organized by Kathmandu University. It uses Augmented Reality (AR) to provide real-time information about scanned artifacts. The application employs a YOLO model trained on custom datasets for artifact recognition and retrieves detailed information dynamically by scraping Wikipedia and YouTube. Additionally, it includes voice output and translation support for accessibility. A complementary web platform delivers general information about the museums.
## Table of Contents

- Features
- Technology Stack
- Usage
- Screenshots
- Contributing
- License
## Features

- Artifact Recognition: Uses a YOLO model trained on custom datasets to identify artifacts in real time (see the inference sketch after this list).
- Dynamic Information Retrieval: Scrapes Wikipedia and YouTube for live information about artifacts.
- Voice Support: Provides audio narration for artifact information.
- Translation: Offers multi-language support for translated descriptions.
- AR Integration: Delivers an immersive user experience with real-time artifact tracking.
- Web Platform: A React-based web portal for general information about artifacts.
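As a rough illustration of the recognition step, the sketch below runs inference with the Ultralytics YOLO package. The weights file `artifacts.pt` and the `recognize` helper are hypothetical names for illustration; the project's actual training and inference setup is not detailed in this README.

```python
# Minimal sketch of YOLO-based artifact recognition, assuming the
# Ultralytics package; the weights file and helper are hypothetical.
from ultralytics import YOLO

model = YOLO("artifacts.pt")  # custom-trained weights (hypothetical name)

def recognize(frame_path: str) -> list[tuple[str, float]]:
    """Return (label, confidence) pairs for artifacts detected in an image."""
    results = model(frame_path)  # run inference on one camera frame
    detections = []
    for box in results[0].boxes:
        label = model.names[int(box.cls)]  # class index -> artifact name
        detections.append((label, float(box.conf)))
    return detections

print(recognize("frame.jpg"))
```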
## Technology Stack

### Mobile Application
- Framework: Unity
- AR Library: Vuforia

### Backend
- Language: Python
- Framework: FastAPI (see the endpoint sketch after this list)
- Model: YOLO (Object Detection)
- Dataset: Custom-trained and labelled datasets for artifact recognition

### Web Platform
- Frontend: React.js
- Backend: Python
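To make the backend concrete, here is a minimal sketch of a FastAPI endpoint that fetches live artifact information from Wikipedia's public REST summary API. The route, function names, and use of `requests` are illustrative assumptions, not Muse AR's actual API; the YouTube lookup is omitted since the YouTube Data API requires a key.

```python
# Minimal sketch of a backend endpoint that fetches live artifact
# information from Wikipedia. The route and names are hypothetical,
# not Muse AR's actual API.
from urllib.parse import quote

import requests
from fastapi import FastAPI, HTTPException

app = FastAPI()

WIKI_SUMMARY = "https://en.wikipedia.org/api/rest_v1/page/summary/"

@app.get("/artifact/{name}")
def artifact_info(name: str) -> dict:
    """Look up a short Wikipedia summary for a recognized artifact."""
    resp = requests.get(WIKI_SUMMARY + quote(name), timeout=10)
    if resp.status_code != 200:
        raise HTTPException(status_code=404, detail="No article found")
    data = resp.json()
    return {
        "title": data.get("title", name),
        "summary": data.get("extract", ""),
        "source": data.get("content_urls", {}).get("desktop", {}).get("page", ""),
    }
```

Run it locally with `uvicorn main:app --reload` and query, for example, `/artifact/Swayambhunath`.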
## Usage

1. Open the mobile application and allow camera access for AR functionality.
2. Point the device's camera at any artifact.
3. Muse AR will:
   - Recognize the artifact using the YOLO model.
   - Provide pre-recorded details and live dynamic information from Wikipedia and YouTube.
   - Narrate the details and offer translation if selected (a narration sketch follows these steps).
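To illustrate the narration and translation features, here is a minimal sketch assuming the gTTS and deep-translator packages; the libraries Muse AR actually uses for voice output and translation are not specified in this README.

```python
# Minimal sketch of translated narration, assuming the gTTS and
# deep-translator packages (not necessarily what Muse AR uses).
from deep_translator import GoogleTranslator
from gtts import gTTS

def narrate(text: str, lang: str = "en", out_file: str = "narration.mp3") -> str:
    """Translate an artifact description and save spoken audio to a file."""
    if lang != "en":
        text = GoogleTranslator(source="en", target=lang).translate(text)
    gTTS(text=text, lang=lang).save(out_file)  # lang must be gTTS-supported
    return out_file

# Example: narrate a description in Nepali.
narrate("This statue dates to the twelfth century.", lang="ne")
```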
To learn more about Muse AR, view our detailed presentation here.
Check out our YouTube video here.
- Built at KU Hackfest 2024