Splitters
Functions designed for splitting your data
To write your own, you should make a function that returns two L's of indices (plain lists work as well).
For example, if I have a dataset of 5 items, we start with [0,1,2,3,4]. If I wanted to write a split function that puts the first three items in train and the last two in validation, I could write it as:
def split_func(idxs): return L(idxs[:3]), L(idxs[3:])
And we can see it work:
split_func([0,1,2,3,4])
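Since plain lists are accepted too, the splitter above can be sketched without fastcore installed, swapping L for built-in lists:

```python
# Sketch of the splitter above using plain lists instead of fastcore's L
# (the docs note plain lists work as well)
def split_func(idxs):
    # first three indices -> train, last two -> validation
    return idxs[:3], idxs[3:]

train, valid = split_func([0, 1, 2, 3, 4])
print(train, valid)  # [0, 1, 2] [3, 4]
```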
Since fastai is a very lightweight framework that is easily approachable and incorporates state-of-the-art ideas, AdaptNLP bridges the gap between HuggingFace and fastai, allowing you to train with the fastai framework through the Tuner classes.
The constructor of the AdaptiveTuner class has an optional expose_fastai_api parameter. When set to True, the Tuner inherits fastai's Learner, so every attribute of the Learner is available to you. This is only recommended for those very familiar with the fastai API.
Otherwise, you have access to six functions in each class:
tune
lr_find
predict
save
load
export
All task fine-tuners should inherit from AdaptiveTuner, provide good defaults, and override any specifics as dictated by the task.
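As an illustration of that pattern, a task tuner might look like the sketch below. The names, signatures, and the stand-in base class here are hypothetical, not AdaptNLP's actual API; it only shows the "good defaults that the caller can override" idea:

```python
# Hypothetical sketch: a task-specific tuner inheriting a stand-in
# AdaptiveTuner. The real AdaptiveTuner lives in adaptnlp and has a
# different constructor; this only illustrates the subclassing pattern.
class AdaptiveTuner:
    def __init__(self, expose_fastai_api=False, **kwargs):
        self.expose_fastai_api = expose_fastai_api
        self.config = kwargs  # keyword arguments stored as plain config


class SequenceClassificationTuner(AdaptiveTuner):
    # good task-specific defaults that the caller can still override
    def __init__(self, model_name="bert-base-uncased", num_labels=2, **kwargs):
        super().__init__(model_name=model_name, num_labels=num_labels, **kwargs)


tuner = SequenceClassificationTuner(num_labels=4)
print(tuner.config["num_labels"])  # 4
```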