In MT-BERT we reproduce a neural language understanding model based on the paper by Liu et al. (2019). The model implements a Multi-Task Deep Neural Network (MT-DNN) for learning representations across multiple NLU tasks. MT-DNN extends the model proposed by Liu et al. (2015) by incorporating a pre-trained bidirectional transformer language model, known as BERT.
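The core idea is a single shared BERT encoder topped by one lightweight, task-specific output layer per NLU task. The sketch below illustrates this with PyTorch and the Hugging Face transformers library; the class and parameter names (MultiTaskBERT, task_num_labels) are illustrative assumptions, not the repo's actual model.py.

```python
import torch.nn as nn
from transformers import BertModel

class MultiTaskBERT(nn.Module):
    """Illustrative sketch: a shared BERT encoder with one head per task."""

    def __init__(self, task_num_labels, model_name="bert-base-uncased"):
        super().__init__()
        # Shared across all tasks: gradients from every task update it.
        self.encoder = BertModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Task-specific: one linear classification head per task name.
        self.heads = nn.ModuleDict(
            {task: nn.Linear(hidden, n) for task, n in task_num_labels.items()}
        )

    def forward(self, task, input_ids, attention_mask=None):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        # Sentence-level tasks read the pooled [CLS] representation.
        return self.heads[task](out.pooler_output)
```

A joint training loop then samples mini-batches from the different task datasets and routes each batch through the matching head, so the encoder's weights are updated by all tasks while each head stays task-specific.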
More details about the project are available in the presentation.
The original implementation is available at the repo.
To get a local copy up and running, follow these simple steps. The project provides a Pipfile that can be managed with pipenv. Installing pipenv is strongly encouraged to avoid dependency/reproducibility problems.
- Install pipenv
pip install pipenv
- Clone the repo
git clone https://gitlab.com/reddeadrecovery/mt-bert
- Install Python dependencies
pipenv install
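Once installation completes, activate the environment with pipenv shell, or prefix individual commands with pipenv run.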
Here's a brief description of every file in the repo:

- model.py: Model definition
- task.py: Task dataset preprocessing and definition
- train_glue.py: Training file for multi-task training on GLUE
- fine_tune_task.py: Fine-tuning, domain adaptation and single-task training file
- utils.py: Utility functions
There is also an executable Jupyter notebook: train.ipynb.
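As a usage sketch (assuming the scripts' defaults suffice; check each file's argument parser for the exact options), multi-task GLUE training and single-task fine-tuning could be launched with:

pipenv run python train_glue.py
pipenv run python fine_tune_task.py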
Machine Learning course held by Professor Paolo Frasconi - Computer Engineering Master's Degree @ University of Florence