Text Generation API

class TransformersTextGenerator[source]

TransformersTextGenerator(tokenizer:PreTrainedTokenizer, model:PreTrainedModel) :: AdaptiveModel

Adaptive model for Transformers language models

Usage:

>>> generator = TransformersTextGenerator.load("gpt2")
>>> generator.generate(text="Example text", mini_batch_size=32)

Parameters:

  • tokenizer - A tokenizer object from Hugging Face's transformers library
  • model - A transformers Language model
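
If you already have a tokenizer and model in memory, the constructor signature above suggests you can pass them in directly rather than calling load. A minimal sketch, assuming the class is importable from the library's top level and that the constructor accepts objects loaded with the Hugging Face auto classes:

from transformers import AutoTokenizer, AutoModelForCausalLM
from adaptnlp import TransformersTextGenerator  # import path assumed

# Load a causal language model and its tokenizer from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Build the adaptive generator directly from the loaded objects
generator = TransformersTextGenerator(tokenizer, model)

# Generate in mini-batches, as in the usage example above
results = generator.generate(text="Example text", mini_batch_size=32)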

class EasyTextGenerator[source]

EasyTextGenerator()

Text Generation Module

Usage:

>>> generator = EasyTextGenerator()
>>> generator.generate(text="generate from this text", num_tokens_to_produce=50)

Usage Examples:

Sample code from: 09a_tutorial.easy_text_generator.ipynb (View Notebook for more context)

generator = EasyTextGenerator()
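
A minimal sketch of a follow-up call, assuming generate mirrors the usage shown above; the return value's exact structure isn't documented here, so it is simply printed:

# Produce 50 new tokens from the prompt, as in the usage example above
generated = generator.generate(
    text="generate from this text",
    num_tokens_to_produce=50,
)
print(generated)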