Sequence Classification API for Transformers and Flair

class TransformersSequenceClassifier[source]

TransformersSequenceClassifier(tokenizer:PreTrainedTokenizer, model:PreTrainedModel) :: AdaptiveModel

Adaptive model for Transformers' sequence classification models

Usage:

>>> classifier = TransformersSequenceClassifier.load('transformers-sc-model')
>>> classifier.predict(text='Example text', mini_batch_size=32)

Parameters:

  • tokenizer - A tokenizer object from Hugging Face's transformers and tokenizers libraries
  • model - A Transformers sequence classification model
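Because the constructor takes the tokenizer and model directly, the classifier can also be built by hand from a Hugging Face checkpoint. A minimal sketch, assuming any AutoModelForSequenceClassification checkpoint works here:

from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load a tokenizer/model pair from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("nlptown/bert-base-multilingual-uncased-sentiment")
model = AutoModelForSequenceClassification.from_pretrained("nlptown/bert-base-multilingual-uncased-sentiment")

# Construct the adaptive classifier directly from the pair
classifier = TransformersSequenceClassifier(tokenizer=tokenizer, model=model)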

Usage Examples:

Sample code from: 06_sequence_classification.ipynb (View Notebook for more context)

example_text = "This didn't work at all"

classifier = TransformersSequenceClassifier.load("nlptown/bert-base-multilingual-uncased-sentiment")

sentences = classifier.predict(text=example_text, mini_batch_size=1)

TransformersSequenceClassifier.load[source]

TransformersSequenceClassifier.load(model_name_or_path:Union[HFModelResult, str])

Class method for loading and constructing this classifier

  • model_name_or_path - A key string for one of Transformers' pre-trained sequence classification models, or an HFModelResult

Usage Examples:

Sample code from: 06_sequence_classification.ipynb (View Notebook for more context)

example_text = "This didn't work at all"

classifier = TransformersSequenceClassifier.load("nlptown/bert-base-multilingual-uncased-sentiment")

sentences = classifier.predict(text=example_text, mini_batch_size=1)
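Since load accepts an HFModelResult as well as a plain string, a model found through a hub search can be passed straight in. A minimal sketch, reusing the HFModelHub search pattern from the EasySequenceClassifier example below:

hub = HFModelHub()
# Search the Hugging Face Hub and load the first matching result
model = hub.search_model_by_name("nlptown/bert-base", user_uploaded=True)[0]
classifier = TransformersSequenceClassifier.load(model)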

TransformersSequenceClassifier.predict[source]

TransformersSequenceClassifier.predict(text:Union[List[Sentence], Sentence, List[str], str], mini_batch_size:int=32, **kwargs)

Predict method for running inference using the pre-trained sequence classifier model

  • text - String, list of strings, sentences, or list of sentences to run inference on
  • mini_batch_size - Mini batch size
  • **kwargs - (Optional) Keyword arguments for the Transformers classifier

Usage Examples:

Sample code from: 06_sequence_classification.ipynb (View Notebook for more context)

example_text = "This didn't work at all"

classifier = TransformersSequenceClassifier.load("nlptown/bert-base-multilingual-uncased-sentiment")

sentences = classifier.predict(text=example_text, mini_batch_size=1)
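predict also accepts a list of strings and returns Flair Sentence objects, so predicted labels can be read with get_labels(). A minimal sketch, assuming the outputs mirror the Flair examples below:

texts = ["This didn't work at all", "I loved every minute of it"]
sentences = classifier.predict(text=texts, mini_batch_size=2)
for sentence in sentences:
    # Each Sentence carries its predicted label(s) and confidence scores
    print(sentence.get_labels())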

class FlairSequenceClassifier[source]

FlairSequenceClassifier(model_name_or_path:str) :: AdaptiveModel

Adaptive model for Flair's sequence classifier

Usage:

>>> classifier = FlairSequenceClassifier.load('sentiment')
>>> classifier.predict(text='Example text', mini_batch_size=32)

Parameters:

  • model_name_or_path - A key string for one of Flair's pre-trained sequence classification models

Usage Examples:

Sample code from: 06_sequence_classification.ipynb (View Notebook for more context)

example_text = "This didn't work at all"


classifier = FlairSequenceClassifier.load('sentiment')

sentences = classifier.predict(text=example_text, mini_batch_size=1)

pred = sentences[0].get_labels()[0]

test_eq(pred.value, 'NEGATIVE')
test_close(pred.score, 0.999, 1e-3)

FlairSequenceClassifier.load[source]

FlairSequenceClassifier.load(model_name_or_path:Union[HFModelResult, FlairModelResult, str])

Class method for loading and constructing this classifier

  • model_name_or_path - A key string for one of Flair's pre-trained sequence classification models, an HFModelResult, or a FlairModelResult

Usage Examples:

Sample code from: 06_sequence_classification.ipynb (View Notebook for more context)

example_text = "This didn't work at all"


classifier = FlairSequenceClassifier.load('sentiment')

sentences = classifier.predict(text=example_text, mini_batch_size=1)

pred = sentences[0].get_labels()[0]

test_eq(pred.value, 'NEGATIVE')
test_close(pred.score, 0.999, 1e-3)

FlairSequenceClassifier.predict[source]

FlairSequenceClassifier.predict(text:Union[List[Sentence], Sentence, List[str], str], mini_batch_size:int=32, **kwargs)

Predict method for running inference using the pre-trained sequence classifier model

  • text - String, list of strings, sentences, or list of sentences to run inference on
  • mini_batch_size - Mini batch size
  • **kwargs - (Optional) Keyword arguments for the Flair classifier

Usage Examples:

Sample code from: 06_sequence_classification.ipynb (View Notebook for more context)

example_text = "This didn't work at all"


classifier = FlairSequenceClassifier.load('sentiment')

sentences = classifier.predict(text=example_text, mini_batch_size=1)

pred = sentences[0].get_labels()[0]

test_eq(pred.value, 'NEGATIVE')
test_close(pred.score, 0.999, 1e-3)

class EasySequenceClassifier[source]

EasySequenceClassifier()

Sequence classification models

Usage:

>>> classifier = EasySequenceClassifier()
>>> classifier.tag_text(text='text you want to label', model_name_or_path='en-sentiment')

Usage Examples:

Sample code from: 06_sequence_classification.ipynb (View Notebook for more context)

example_text = "This didn't work at all"

hub = HFModelHub()
model = hub.search_model_by_name("nlptown/bert-base", user_uploaded=True)[0]
classifier = EasySequenceClassifier()
sentences = classifier.tag_text(text=example_text,
                                model_name_or_path=model,
                                mini_batch_size=1)

# preds and truth_lbls are defined in earlier notebook cells; see the notebook for context
for pred, truth in zip(preds, truth_lbls):
    test_eq(pred.value, truth.value)
    test_close(pred.score, truth.score, 1e-4)

EasySequenceClassifier.tag_text[source]

EasySequenceClassifier.tag_text(text:Union[List[Sentence], Sentence, List[str], str], model_name_or_path:Union[str, FlairModelResult, HFModelResult]='en-sentiment', mini_batch_size:int=32, **kwargs)

Tags a text sequence with labels the sequence classification models have been trained on

  • text - String, list of strings, Sentence, or list of Sentences to be classified
  • model_name_or_path - The model name key or model path
  • mini_batch_size - The mini batch size for running inference
  • **kwargs - (Optional) Keyword arguments for Flair's TextClassifier.predict() method
  • return - A list of Flair Sentence objects

Usage Examples:

Sample code from: 06_sequence_classification.ipynb (View Notebook for more context)

example_text = "This didn't work at all"

hub = HFModelHub()
model = hub.search_model_by_name("nlptown/bert-base", user_uploaded=True)[0]
classifier = EasySequenceClassifier()
sentences = classifier.tag_text(text=example_text,
                                model_name_or_path=model,
                                mini_batch_size=1)

# preds and truth_lbls are defined in earlier notebook cells; see the notebook for context
for pred, truth in zip(preds, truth_lbls):
    test_eq(pred.value, truth.value)
    test_close(pred.score, truth.score, 1e-4)
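model_name_or_path can also be a plain Flair key string such as the 'en-sentiment' default. A minimal sketch, mirroring the docstring usage above:

classifier = EasySequenceClassifier()
sentences = classifier.tag_text(text="This didn't work at all",
                                model_name_or_path='en-sentiment',
                                mini_batch_size=1)
# Inspect the top predicted label and its confidence
print(sentences[0].get_labels()[0])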

EasySequenceClassifier.tag_all[source]

EasySequenceClassifier.tag_all(text:Union[List[Sentence], Sentence, List[str], str], mini_batch_size:int=32, **kwargs)

Tags text with all labels from all sequence classification models

  • text - Text input; it can be a string or any of Flair's Sentence input formats
  • mini_batch_size - The mini batch size for running inference
  • **kwargs - (Optional) Keyword Arguments for Flair's TextClassifier.predict() method params
  • return - A list of Flair Sentence objects
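The notebook has no standalone tag_all sample; a minimal sketch, assuming tag_all runs every model previously loaded through tag_text over the same input:

classifier = EasySequenceClassifier()
example_text = "This didn't work at all"

# Tagging once with each model loads it into the classifier
classifier.tag_text(text=example_text, model_name_or_path='en-sentiment')
classifier.tag_text(text=example_text,
                    model_name_or_path='nlptown/bert-base-multilingual-uncased-sentiment')

# Run all loaded models and collect every label on the returned Sentences
sentences = classifier.tag_all(text=example_text, mini_batch_size=1)
print(sentences[0].get_labels())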

EasySequenceClassifier.release_model[source]

EasySequenceClassifier.release_model(model_name_or_path:str)

Unloads model_name_or_path from the classifier and empties the CUDA memory cache

Per the PyTorch documentation for torch.cuda.empty_cache(), some cached memory may remain

  • model_name_or_path - The model name or key path that you want to unload and release memory from
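A minimal sketch of releasing a model after inference, assuming the same key used with tag_text identifies the model here:

classifier = EasySequenceClassifier()
classifier.tag_text(text="This didn't work at all",
                    model_name_or_path='nlptown/bert-base-multilingual-uncased-sentiment')

# Unload the model and free (most of) the cached CUDA memory
classifier.release_model(model_name_or_path='nlptown/bert-base-multilingual-uncased-sentiment')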