spaCy meets Transformers: Fine-tune BERT, XLNet and GPT-2
Explosion
AUGUST 1, 2019
Natural Language Processing (NLP) systems face a problem known as the “knowledge acquisition bottleneck”, and transformers and transfer learning have become key tools for addressing it. In this post we introduce our new wrapping library, spacy-transformers, which provides support for fine-tuning transformer models such as BERT, XLNet and GPT-2 via spaCy’s standard nlp.update training API.