Transformers in Natural Language Processing - Traitement du Langage Parlé
Book chapter, 2023

François Yvon

Abstract

This chapter presents an overview of the state of the art in natural language processing, focusing on one specific computational architecture, the Transformer model, which plays a central role in a wide range of applications. This architecture condenses many advances in neural learning methods and can be exploited in many ways: to learn representations for linguistic entities; to generate coherent utterances and answer questions; and to perform utterance transformations, such as automatic translation. These different facets of the architecture are presented in turn, which also allows us to discuss its limitations.
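Since the chapter centers on the Transformer architecture, a minimal sketch of its core operation, scaled dot-product attention, may help fix ideas. This is an illustrative NumPy implementation, not code from the chapter; the function and variable names are the sketch's own.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    # Each output row is a convex combination of the value vectors,
    # weighted by the (scaled) similarity between queries and keys.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)
    return weights @ V, weights

# Toy example: 3 query positions attending over 4 key/value positions,
# all with dimension 8 (sizes chosen arbitrarily for illustration).
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, weights = scaled_dot_product_attention(Q, K, V)
```

In a full Transformer this operation is applied in parallel over several "heads" and interleaved with feed-forward layers, but the weighted-sum-of-values view above is the essential mechanism behind the representation learning and generation abilities the abstract mentions.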
Main file: Transformers-en.pdf (980.98 KB). Origin: files produced by the author(s).

Dates and versions

hal-04224531 , version 1 (02-10-2023)

Cite

François Yvon. Transformers in Natural Language Processing. Mohamed Chetouani; Virginia Dignum; Paul Lukowicz; Carles Sierra. Human-Centered Artificial Intelligence. Advanced Lectures, 13500, Springer International Publishing, pp.81-105, 2023, Lecture Notes in Computer Science, 978-3-031-24348-6. ⟨10.1007/978-3-031-24349-3_6⟩. ⟨hal-04224531⟩