
WhatsNext: AI Text Generator


Introduction

Welcome to WhatsNext, an AI text generator powered by an LSTM model trained on the New York Times comments dataset. Given a partial sentence or phrase, it predicts the words most likely to come next, which can be useful for writers, content creators, and conversational interfaces.

Setup

Prerequisites

  • Python 3.x
  • pip

Clone Repository

Clone this repository to your local machine using the following command:

git clone https://github.com/your-username/whatsnext.git

Installation

Navigate to the project directory:

cd whatsnext

Install the required Python packages:

pip install -r requirements.txt

Usage

  1. Prepare Dataset: Before training the model, prepare your dataset as a CSV file containing text data. You can use the New York Times comments dataset or any other text dataset of your choice.

  2. Train Model: Use the provided Jupyter notebook or Python script to train the LSTM model on your dataset.

  3. Generate Text: Once the model is trained, you can use it to generate text by providing a partial sentence or phrase as input.
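
The three steps above can be sketched in miniature. The snippet below shows how step 1 typically prepares text for next-word prediction: each line is tokenized to integer ids, every prefix becomes one training example, and inputs are left-padded to a common length. This is a plain-Python sketch of the standard approach; the function names are illustrative, not taken from this repository.

```python
def build_sequences(corpus):
    """Turn each line of text into prefix -> next-word training sequences."""
    vocab = {}

    def idx(word):
        # Assign ids starting at 1; 0 is reserved for padding.
        if word not in vocab:
            vocab[word] = len(vocab) + 1
        return vocab[word]

    sequences = []
    for line in corpus:
        tokens = [idx(w) for w in line.lower().split()]
        # Every prefix of length >= 2 becomes one training example.
        for i in range(2, len(tokens) + 1):
            sequences.append(tokens[:i])
    return sequences, vocab

def pad(seq, length):
    """Left-pad with zeros so all inputs share one length."""
    return [0] * (length - len(seq)) + seq

corpus = ["the model predicts the next word"]
sequences, vocab = build_sequences(corpus)
max_len = max(len(s) for s in sequences)
X = [pad(s[:-1], max_len - 1) for s in sequences]  # model inputs
y = [s[-1] for s in sequences]                     # next-word labels
```

Step 3 then inverts this: the trained model scores every word in the vocabulary for a padded seed sequence, the highest-probability id is appended, and the loop repeats for as many words as desired.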

Model Details

  • TensorFlow Version: 2.15.0
  • Model Architecture: LSTM
  • Vocabulary Size: 17
  • Embedding Size: 10
  • LSTM Units: 256
  • Dropout Rate: 0.5
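
Wiring the hyperparameters above together, the model likely looks something like the Keras sketch below. The sequence length (5), optimizer, and loss are assumptions not stated in the README; the actual notebook may differ.

```python
import tensorflow as tf  # README states TF 2.15.0

VOCAB_SIZE = 17    # from Model Details above
EMBED_DIM = 10
LSTM_UNITS = 256
DROPOUT = 0.5
SEQ_LEN = 5        # assumption: input sequence length is not given

model = tf.keras.Sequential([
    tf.keras.Input(shape=(SEQ_LEN,)),
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),
    tf.keras.layers.LSTM(LSTM_UNITS),
    tf.keras.layers.Dropout(DROPOUT),
    # One softmax score per vocabulary word (next-word probabilities).
    tf.keras.layers.Dense(VOCAB_SIZE, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

With integer labels, `sparse_categorical_crossentropy` avoids one-hot encoding the targets; if the notebook one-hot encodes them, `categorical_crossentropy` would be used instead.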

Model Architecture

(model architecture diagram)

Code and Outputs

Example Code Snippet

(example code screenshot)

Example Output

(example output screenshot)

Dataset Used

The model was trained on the New York Times comments dataset available on Kaggle.

TODO

  • Add a demo video
  • Add code snippets with outputs for each section