Welcome to WhatsNext, an AI text generator powered by LSTM models trained on the New York Times comments dataset. The project lets users generate predictive text from a partial input, aiding writers, content creators, and conversational interfaces.
- Python 3.x
- pip
Clone this repository to your local machine using the following command:
git clone https://github.com/your-username/whatsnext.git
Navigate to the project directory:
cd whatsnext
Install the required Python packages:
pip install -r requirements.txt
- Prepare Dataset: Before running the model, prepare your dataset as a CSV file containing text data. You can use the New York Times comments dataset or any other text dataset of your choice (a data-preparation sketch follows this list).
- Train Model: Use the provided Jupyter notebook or Python script to train the LSTM model on your dataset (a model-building sketch follows the hyperparameter list below).
- Generate Text: Once the model is trained, you can use it to generate text by providing a partial sentence or phrase as input (a generation sketch appears after the model details).
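Below is a minimal data-preparation sketch, assuming a Kaggle NYT comments CSV with a `commentBody` column (the file name and column name are assumptions; adjust them to your data). It tokenizes the text and builds padded n-gram sequences for next-word prediction:

```python
# Minimal data-preparation sketch. File and column names are assumptions;
# adjust them to match your CSV.
import pandas as pd
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

df = pd.read_csv("CommentsApril2017.csv")        # hypothetical file name
texts = df["commentBody"].astype(str).tolist()   # column name assumed

tokenizer = Tokenizer()
tokenizer.fit_on_texts(texts)
vocab_size = len(tokenizer.word_index) + 1

# Every prefix of length >= 2 becomes one training example:
# the last token is the label, the preceding tokens are the input.
sequences = []
for line in texts:
    token_ids = tokenizer.texts_to_sequences([line])[0]
    for i in range(2, len(token_ids) + 1):
        sequences.append(token_ids[:i])

max_len = max(len(s) for s in sequences)
sequences = pad_sequences(sequences, maxlen=max_len, padding="pre")
X, y = sequences[:, :-1], sequences[:, -1]
```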
- TensorFlow Version: 2.15.0
- Model Architecture: LSTM
- Vocabulary Size: 17
- Embedding Size: 10
- LSTM Units: 256
- Dropout Rate: 0.5
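As a sketch of how these hyperparameters could translate into a Keras model (the exact architecture in the notebook may differ; `vocab_size`, `X`, and `y` come from the data-preparation sketch above, and the epoch count is illustrative):

```python
import tensorflow as tf

# Embedding size, LSTM units, and dropout rate follow the values listed above;
# vocab_size comes from the tokenizer (the README lists 17 for its demo run).
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=vocab_size, output_dim=10),
    tf.keras.layers.LSTM(256),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(vocab_size, activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy",
              optimizer="adam",
              metrics=["accuracy"])
model.fit(X, y, epochs=50, verbose=1)  # epoch count is illustrative
```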
The model was trained on the New York Times comments dataset available on Kaggle.
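Finally, a minimal sketch of the generation step, assuming the `model`, `tokenizer`, and `max_len` produced above; the `generate_text` helper is hypothetical and simply performs greedy next-word prediction:

```python
import numpy as np
from tensorflow.keras.preprocessing.sequence import pad_sequences

def generate_text(seed_text, next_words, model, tokenizer, max_len):
    """Greedily predict `next_words` tokens and append them to `seed_text`."""
    for _ in range(next_words):
        token_ids = tokenizer.texts_to_sequences([seed_text])[0]
        token_ids = pad_sequences([token_ids], maxlen=max_len - 1, padding="pre")
        predicted_id = int(np.argmax(model.predict(token_ids, verbose=0), axis=-1)[0])
        seed_text += " " + tokenizer.index_word.get(predicted_id, "")
    return seed_text.strip()

print(generate_text("the president", next_words=5,
                    model=model, tokenizer=tokenizer, max_len=max_len))
```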
TODO:
- Add the demo video
- Add code snippets with outputs for each section