Transformers: State-of-the-Art Natural Language Processing
Citations: 7.7K
References: 25
Year: 2020
Venue: EMNLP 2020 (Conference on Empirical Methods in Natural Language Processing: System Demonstrations)
Authors: Thomas Wolf, Alexander Rush, et al.
Keywords: Engineering, Language Learning, Language Processing, Text Mining, Natural Language Processing, Applied Linguistics, Syntax, Computational Linguistics, Language Engineering, Grammar, Empirical Methods, Corpus Analysis, Language Studies, Machine Translation, Natural Language, NLP Task, Language Technology, Knowledge Discovery, Computer Science, Semantic Parsing, Data-driven Learning, Linguistics
Recent progress in natural language processing has been driven by advances in both model architecture and model pretraining. Transformer architectures have facilitated building higher-capacity models and pretraining has made it possible to effectively utilize this capacity for a wide variety of tasks. Transformers is an open-source library with the goal of opening up these advances to the wider machine learning community. The library consists of carefully engineered state-of-the-art Transformer architectures under a unified API. Backing this library is a curated collection of pretrained models made by and available for the community. Transformers is designed to be extensible by researchers, simple for practitioners, and fast and robust in industrial deployments. The library is available at https://github.com/huggingface/transformers.
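As a rough illustration of the unified API the abstract describes (a sketch, not code from the paper): loading any supported architecture reduces to a pair of `from_pretrained` calls keyed by a checkpoint name. The snippet below assumes the `transformers` and `torch` packages are installed and uses a public sentiment-analysis checkpoint purely as an example; any hub checkpoint with a sequence-classification head works the same way.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# Example checkpoint from the public model hub; swapping in another
# checkpoint name loads a different architecture through the same API.
checkpoint = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

# Tokenize a sentence and run a forward pass without gradient tracking.
inputs = tokenizer("Transformers makes state-of-the-art NLP accessible.",
                   return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to its human-readable label.
print(model.config.id2label[logits.argmax(dim=-1).item()])
```

The same two-call pattern covers the other task heads (e.g. `AutoModelForQuestionAnswering`), which is what makes the API uniform across architectures.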