AdaptNLP is a Python package that allows users, from beginner Python coders to experienced machine learning engineers, to leverage state-of-the-art Natural Language Processing (NLP) models and training techniques in one easy-to-use package.
Utilizing fastai with HuggingFace's Transformers library and Humboldt University of Berlin's Flair library, AdaptNLP provides machine learning researchers and scientists a modular and adaptive approach to a variety of NLP tasks, simplifying what it takes to train, perform inference with, and deploy NLP-based models and microservices.
While quick inference functionality exists, such as the `pipeline` API in `transformers`, it is not always as flexible or as fast as needed. AdaptNLP's `Easy*` inference modules tend to be slightly faster than the `pipeline` interface (and at a bare minimum the same speed), while also providing the user with simple, intuitive returns that leave out any unneeded clutter.
Along with this, the integration of the fastai library means the code needed to train or run inference on your models has a completely modular API through the fastai `Callback` system. Rather than needing to write your entire torch loop, if a model needs anything special, a `Callback` can be written in fewer than 10 lines of code to achieve that specific functionality.
Finally, when training your model, fastai is at the forefront of bringing best practices for achieving state-of-the-art training into a library, with new research methodologies heavily tested before integration. As such, AdaptNLP fully supports training with the One-Cycle policy, and using new optimizer combinations such as the Ranger optimizer with cosine-annealed training, through simple one-line fitting functions.
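To make the One-Cycle policy concrete, here is a small illustrative sketch of the learning-rate shape it produces: a warmup followed by a cosine anneal down to a very small rate. This is a simplified approximation for intuition only, not fastai's actual implementation, and the parameter names (`div_start`, `div_end`, `warmup`) are made up for this example:

```python
import math

def one_cycle_lr(pct, lr_max=1e-3, div_start=25.0, div_end=1e5, warmup=0.25):
    """Illustrative One-Cycle learning-rate schedule.

    pct: fraction of training completed, in [0, 1].
    Warms up from lr_max/div_start to lr_max over the first `warmup`
    fraction, then anneals down to lr_max/div_end with a cosine curve.
    """
    def cos_interp(a, b, frac):
        # Cosine interpolation from a (frac=0) to b (frac=1)
        return b + (a - b) * (1 + math.cos(math.pi * frac)) / 2

    if pct < warmup:
        return cos_interp(lr_max / div_start, lr_max, pct / warmup)
    return cos_interp(lr_max, lr_max / div_end, (pct - warmup) / (1 - warmup))
```

In fastai itself you never write this schedule by hand; it is exposed as a one-line fitting call on a `Learner`.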
To install any development builds, please follow the directions below to install directly from git:
Stable Master Branch

The master branch is generally not updated much except for hotfixes and new releases. To install it, please use:

pip install git+https://github.com/Novetta/adaptnlp

Developmental Branch

To install the developmental branch, please use:

pip install git+https://github.com/Novetta/adaptnlp@dev
There are actively updated Docker images hosted on Novetta's DockerHub.
The guide to each tag is as follows:
- latest: This is the latest pypi release and installs a complete package that is CUDA capable
- dev: These are occasionally built developmental builds at certain stages. They are built from the dev branch and are generally stable
- *api: The API builds are for the REST-API
To pull and run any AdaptNLP image immediately, you can run:
docker run -itp 8888:8888 novetta/adaptnlp:TAG
Replace TAG with any of the aforementioned tags.
Then navigate to localhost:8888/lab to access the notebook container.
The AdaptNLP library is built with nbdev, so any documentation page you find (including this one!) can be directly run as a Jupyter Notebook. Each page at the top includes an "Open in Colab" button as well that will open the notebook in Google Colaboratory to allow for immediate access to the code.
The documentation is split into six sections, each with a specific purpose:
This group contains quick access to the homepage, an overview of the AdaptNLP Cookbooks, and a guide on how to contribute.
These contain the relevant documentation for the `AdaptiveModel` class, the HuggingFace Hub model search integration, and the `Result` class that the various inference APIs return.
This section contains the module documentation for the inference framework, the tuning framework, as well as the utilities and foundations for the AdaptNLP library.
These two sections provide quick access to single-use recipes for starting any AdaptNLP project for a particular task, with easy-to-use code designed for that specific use case. There are currently over 13 different tutorials available, with more coming soon.
This section provides directions on how to use the AdaptNLP REST API for deploying your models quickly with FastAPI.
AdaptNLP is built on the nbdev framework. To run all tests, please do the following:
pip install nbverbose
git clone https://github.com/Novetta/adaptnlp
cd adaptnlp
pip install -e .
nbdev_test_nbs
This will run every notebook and ensure that all tests have passed. Please see the nbdev documentation for more information about it.