This repository has been archived by the owner on Nov 16, 2023. It is now read-only.
Hi, I am using one of the notebooks for sequence classification, fine-tuning a BERT variant. I noticed that the notebook instructions conclude when the model finishes training. I'd like to ask if there's a straightforward way to save the fine-tuned model locally and load it on another machine for inference on new data.
I have already managed to convert the txt/csv data to a DataFrame, convert the DataFrame to a dataset, and then create the dataloader for inference. However, I am unable to load the trained model. What I did to save the fine-tuned model is: classifier.save_model("./trained_bert_base_classifier.bin")
and I tried loading it (unsuccessfully) using the transformers AutoModel, torch.load, and Transformer.load_model() from utils.
I would really appreciate some help on how to properly save and load a fine-tuned model using the existing recipes.
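For context, the data side I mentioned (txt/csv → DataFrame → dataset → dataloader) looks roughly like the sketch below. The column names and the fake fixed-length token ids are my own placeholders, not the notebook's actual `Processor` output, which would produce real tokenized inputs:

```python
import pandas as pd
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical rows standing in for the txt/csv contents.
df = pd.DataFrame({
    "text": ["good movie", "bad plot", "great acting"],
    "label": [1, 0, 1],
})

# In the real recipe the Processor tokenizes the text; here we fabricate
# fixed-length integer ids just to show the df -> dataset -> dataloader flow.
input_ids = torch.randint(0, 100, (len(df), 8))
labels = torch.tensor(df["label"].values)

dataset = TensorDataset(input_ids, labels)
loader = DataLoader(dataset, batch_size=2, shuffle=False)

for batch_ids, batch_labels in loader:
    print(batch_ids.shape, batch_labels.shape)
```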
To answer my own question, what I did to load the finetuned checkpoint is the following:
from utils_nlp.models.transformers.sequence_classification import Processor, SequenceClassifier
MODEL_NAME = "bert-base-cased"
CACHE_DIR = '/local/path/where/base/model/will/be/downloaded/'
model = SequenceClassifier(model_name=MODEL_NAME, cache_dir=CACHE_DIR, num_labels=3)
model.load_model("trained/pytorch_model.bin")
The downside of this approach is that on every fresh run the model is first instantiated from the headless base BERT, which needs to be downloaded or already exist in CACHE_DIR; only then can load_model restore the fine-tuned weights.
Is there a more elegant method that avoids downloading the base BERT?