Language Models within the AdaptNLP library

class LMFineTuner[source]

LMFineTuner(model_name_or_path:Union[str, HFModelResult]='bert-base-cased')

A Language Model Fine Tuner object with which you can set language model configurations, then train and evaluate a language model

Usage:

>>> finetuner = adaptnlp.LMFineTuner()
>>> finetuner.train()

Parameters:

  • model_name_or_path - The model checkpoint for weights initialization. Pass None to train a model from scratch.
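
Since model_name_or_path is typed Union[str, HFModelResult], a caller can pass either a plain checkpoint string or a model-search result, and the fine-tuner only needs to normalize it to a string. A minimal sketch of that pattern (the HFModelResult stand-in below is a hypothetical dataclass for illustration, not adaptnlp's actual class):

```python
from dataclasses import dataclass
from typing import Union

@dataclass
class HFModelResult:
    # Hypothetical stand-in for adaptnlp's HFModelResult; assumes it
    # exposes the model's Hugging Face Hub id as `name`.
    name: str

def resolve_checkpoint(model_name_or_path: Union[str, HFModelResult]) -> str:
    """Normalize the Union parameter to a plain checkpoint string."""
    if isinstance(model_name_or_path, HFModelResult):
        return model_name_or_path.name
    return model_name_or_path

print(resolve_checkpoint("bert-base-cased"))
print(resolve_checkpoint(HFModelResult("bert-base-cased")))
```

Either call resolves to the same checkpoint id, so downstream loading code never has to branch on the input type.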

Usage Examples:

Sample code from: 20b_tutorial.fine_tuning_manual.ipynb (View Notebook for more context)

ft_configs = {
              "train_data_file": train_data_file,      # path to the training text file
              "eval_data_file": eval_data_file,        # path to the evaluation text file
              "model_type": "bert",
              "model_name_or_path": "bert-base-cased",
              "mlm": True,                             # train with the masked language modeling objective
              "mlm_probability": 0.15,                 # fraction of tokens to mask
              "config_name": None,                     # defaults to model_name_or_path
              "tokenizer_name": None,                  # defaults to model_name_or_path
              "cache_dir": None,                       # where downloaded models are cached
              "block_size": -1,                        # -1 uses the tokenizer's maximum input length
              "no_cuda": False,                        # set True to force CPU training
              "overwrite_cache": False,                # overwrite cached tokenized datasets
              "seed": 42,                              # random seed for reproducibility
              "fp16": False,                           # enable mixed-precision training
              "fp16_opt_level": "O1",                  # Apex AMP optimization level
              "local_rank": -1,                        # -1 disables distributed training
             }
finetuner = LMFineTuner(**ft_configs)
finetuner.freeze()
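
With "mlm": True and "mlm_probability": 0.15, tokens are randomly selected for masking and the model is trained to predict the originals. The sketch below illustrates that selection step in plain Python; it is a simplification for intuition only (the real masked-LM collator in Hugging Face transformers also sometimes substitutes random tokens or leaves selected tokens unchanged):

```python
import random

def mask_tokens(tokens, mlm_probability=0.15, mask_token="[MASK]", seed=42):
    """Illustrative MLM masking: each token is independently selected
    with probability mlm_probability; selected tokens are replaced with
    the mask token and become the prediction targets."""
    rng = random.Random(seed)
    inputs, labels = [], []
    for tok in tokens:
        if rng.random() < mlm_probability:
            inputs.append(mask_token)
            labels.append(tok)    # the model must recover the original token
        else:
            inputs.append(tok)
            labels.append(None)   # unmasked positions are ignored in the loss
    return inputs, labels

tokens = "the quick brown fox jumps over the lazy dog".split()
inputs, labels = mask_tokens(tokens)
```

On average about 15% of positions carry a label; the loss is computed only at those positions, which is what makes the objective self-supervised.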