Publication | Open Access
What Language Model to Train if You Have One Million GPU Hours?
Year: 2022 · Citations: 32 · References: 53 · Venue: Findings of the Association for Computational Linguistics: EMNLP 2022
Teven Le Scao, Thomas Wang, Daniel Hesslow, Stas Bekman, M Saiful Bari, Stella Biderman, Hady Elsahar, Niklas Muennighoff, Jason Phang, Ofir Press, Colin Raffel, Victor Sanh, Sheng Shen, Lintang Sutawika, Jaesung Tae, Zheng Xin Yong, Julien Launay, Iz Beltagy. Findings of the Association for Computational Linguistics: EMNLP 2022. 2022.