
  • Pre-trained Language Models Integrating Media
    Gerhard Paaß; Sven Giesselbach
    ISBN: 978-3-031-23190-2
    2023, Edition 1, Open Access
    • Offers an overview of pre-trained language models such as BERT, GPT, and sequence-to-sequence Transformers
    • Explains the key techniques used to improve the performance of pre-trained models
    • Presents advanced pre-trained models for a broad range of NLP tasks
    • This book is open access, meaning you have free and unlimited access to it
