Performing summarization within the AdaptNLP library

class TransformersSummarizer[source]

TransformersSummarizer(tokenizer:PreTrainedTokenizer, model:PreTrainedModel) :: AdaptiveModel

Adaptive model for Transformers' Conditional Generation or Language Models (Transformers' T5 and BART conditional generation models have a language modeling head)

Usage:

>>> summarizer = TransformersSummarizer.load("transformers-summarizer-model")
>>> summarizer.predict(text="Example text", mini_batch_size=32)

Parameters:

  • tokenizer - A tokenizer object from Hugging Face's transformers and tokenizers libraries
  • model - A Transformers Conditional Generation (BART or T5) or Language model
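The mini_batch_size argument to predict() controls how many inputs are run through the model per forward pass. As an illustration of the idea (this is a hypothetical helper, not part of AdaptNLP's API), splitting a list of texts into mini-batches might look like:

```python
# Illustrative sketch of mini-batching; chunk() is a hypothetical helper,
# not an AdaptNLP function.
from typing import Iterator, List

def chunk(texts: List[str], mini_batch_size: int) -> Iterator[List[str]]:
    """Yield successive mini-batches of at most `mini_batch_size` texts."""
    for start in range(0, len(texts), mini_batch_size):
        yield texts[start:start + mini_batch_size]

# 70 inputs with mini_batch_size=32 -> batches of 32, 32, and 6
batches = list(chunk([f"doc {i}" for i in range(70)], mini_batch_size=32))
```

Larger mini-batches amortize per-call overhead on a GPU at the cost of memory; the last batch may be smaller than mini_batch_size.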

class EasySummarizer[source]

EasySummarizer()

Summarization Module

Usage:

>>> summarizer = EasySummarizer()
>>> summarizer.summarize(text="Summarize this text", model_name_or_path="t5-small")

Usage Examples:

Sample code from: 07a_tutorial.summarization.ipynb (View Notebook for more context)

summarizer = EasySummarizer()