AdaptNLP's internal `AdaptiveModel` class, along with the `fastai` internals patched for prediction
DataLoader.one_batch
[source]

DataLoader.one_batch()

Patched functionality that grabs one batch of data from the DataLoader and deletes its iterator.

Returns:
- torch.Tensor
  A batch of data
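The patched behavior can be pictured in plain Python. This is an illustrative approximation only, not fastai's actual implementation, which operates on its own DataLoader class:

```python
def one_batch(dl):
    """Grab a single batch from an iterable and discard the iterator.

    Illustrative sketch of the patched DataLoader.one_batch behavior.
    """
    it = iter(dl)     # open a fresh iterator over the DataLoader
    batch = next(it)  # pull exactly one batch
    del it            # delete the iterator so no iteration state lingers
    return batch

# With a toy "DataLoader" that yields lists of items:
batches = [[1, 2], [3, 4]]
print(one_batch(batches))  # → [1, 2]
```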
GatherPredsCallback.after_validate
[source]

GatherPredsCallback.after_validate()

Patched functionality that does nothing.
class CudaCallback
[source]

CudaCallback(device: str = None) :: Callback

Move data to a CUDA device.

Parameters:
- device : str, optional
  A device to move the data to, such as 'cuda:0' or 'cpu'
class AdaptiveModel
[source]

AdaptiveModel() :: ABC

An abstract base class (ABC) that AdaptNLP's task models inherit from.
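Since this class is an ABC, concrete models must subclass it and implement its abstract interface before they can be instantiated. A minimal stdlib-only sketch of that pattern follows; the class and method names here are illustrative, not AdaptNLP's actual hierarchy:

```python
from abc import ABC, abstractmethod

class AdaptiveModelSketch(ABC):
    """Illustrative ABC mirroring the AdaptiveModel pattern."""

    @abstractmethod
    def predict(self, text, mini_batch_size=32, **kwargs):
        """Run inference on the model."""

class EchoModel(AdaptiveModelSketch):
    """A trivial concrete subclass that 'predicts' by tagging its input."""

    def predict(self, text, mini_batch_size=32, **kwargs):
        texts = [text] if isinstance(text, str) else list(text)
        return [f"predicted:{t}" for t in texts]

model = EchoModel()
print(model.predict("hello"))  # → ['predicted:hello']
```

Attempting `AdaptiveModelSketch()` directly raises a `TypeError`, which is how the ABC enforces the interface on subclasses.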
AdaptiveModel.set_model
[source]

AdaptiveModel.set_model(model)

Sets the model in `_learn`.

Parameters:
- model
  A PyTorch model
AdaptiveModel.set_as_dict
[source]

AdaptiveModel.set_as_dict(as_dict: bool = False)

Sets as_dict in `_learn`.

Parameters:
- as_dict : bool, optional
  Whether to return the inputs as a dictionary when predicting or training
AdaptiveModel.set_device
[source]

AdaptiveModel.set_device(device: str = 'cpu')

Sets the device for the `CudaCallback` in `_learn`.

Parameters:
- device : str, optional
  A device for the `CudaCallback`, such as 'cuda:0' or 'cpu'
AdaptiveModel.get_preds
[source]

AdaptiveModel.get_preds(dl=None, cbs=[])

Get raw predictions on dl with cbs applied. For basic inference, cbs should include any Callbacks needed to perform general inference.

Parameters:
- dl : optional
  An iterable DataLoader or DataLoader-like object
- cbs : list, optional
  Optional fastai `Callbacks`
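The role of cbs can be pictured as hooks that fire around each prediction batch. The following is a generic, stdlib-only sketch of that callback pattern, not fastai's actual Callback machinery; all names here are illustrative:

```python
class Callback:
    """Minimal callback with before/after-batch hooks."""
    def before_batch(self, batch): pass
    def after_batch(self, batch, preds): pass

class LogSizes(Callback):
    """Records the size of each batch it sees."""
    def __init__(self):
        self.sizes = []
    def before_batch(self, batch):
        self.sizes.append(len(batch))

def get_preds(dl, cbs=()):
    """Run a toy inference loop, invoking each callback per batch."""
    preds = []
    for batch in dl:
        for cb in cbs:
            cb.before_batch(batch)
        out = [x * 2 for x in batch]  # stand-in for the model forward pass
        for cb in cbs:
            cb.after_batch(batch, out)
        preds.extend(out)
    return preds

logger = LogSizes()
print(get_preds([[1, 2], [3]], cbs=[logger]))  # → [2, 4, 6]
print(logger.sizes)                            # → [2, 1]
```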
AdaptiveModel.load
[source]

AdaptiveModel.load(model_name_or_path: Union[str, Path])

Loads a model into the AdaptiveModel object; acts as an alternative constructor.

Parameters:
- model_name_or_path : Union[str, Path]
  A model file location to use
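The "alternative constructor" idiom that load follows is typically a classmethod that builds and returns an instance rather than mutating an existing one. A hedged, stdlib-only sketch of the idiom (class name and fields are illustrative, not AdaptNLP's implementation):

```python
from pathlib import Path
from typing import Union

class ModelSketch:
    """Illustrative model with a `load` alternative constructor."""

    def __init__(self, name: str):
        self.name = name

    @classmethod
    def load(cls, model_name_or_path: Union[str, Path]) -> "ModelSketch":
        # Accept either a model name (str) or a filesystem path,
        # then construct and return the instance from it.
        return cls(name=str(model_name_or_path))

m = ModelSketch.load(Path("bert-base-cased"))
print(m.name)  # → 'bert-base-cased'
```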
AdaptiveModel.predict
[source]

AdaptiveModel.predict(text: Union[List[Sentence], Sentence, List[str], str], mini_batch_size: int = 32, **kwargs)

Run inference on the model.

Parameters:
- text : Union[List[flair.data.Sentence], flair.data.Sentence, List[str], str]
  Some text to predict on
- mini_batch_size : int, optional
  A batch size for when a list of texts is passed in
- kwargs
  Optional keyword arguments

Returns:
- List[flair.data.Sentence]
  A list of predicted sentences
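mini_batch_size only comes into play when a list of texts is passed: the list is split into consecutive chunks of that size before inference. A plain-Python sketch of the chunking step (illustrative only; AdaptNLP's actual batching runs through a DataLoader):

```python
def mini_batches(texts, mini_batch_size=32):
    """Split a list of texts into consecutive mini-batches of at most
    mini_batch_size items; the final batch may be smaller."""
    return [texts[i:i + mini_batch_size]
            for i in range(0, len(texts), mini_batch_size)]

texts = [f"sentence {i}" for i in range(5)]
print(mini_batches(texts, mini_batch_size=2))
# → [['sentence 0', 'sentence 1'], ['sentence 2', 'sentence 3'], ['sentence 4']]
```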