| Tag | Ind1 | Ind2 | Content |
|---|---|---|---|
| 000 | | | 04073nam a22005415i 4500 |
| 001 | | | 978-3-031-02173-2 |
| 003 | | | DE-He213 |
| 005 | | | 20240730163828.0 |
| 007 | | | cr nn 008mamaa |
| 008 | | | 220601s2020 sz \| s \|\|\|\| 0\|eng d |
| 020 | | | _a9783031021732 _9978-3-031-02173-2 |
| 024 | 7 | | _a10.1007/978-3-031-02173-2 _2doi |
| 050 | | 4 | _aQ334-342 |
| 050 | | 4 | _aTA347.A78 |
| 072 | | 7 | _aUYQ _2bicssc |
| 072 | | 7 | _aCOM004000 _2bisacsh |
| 072 | | 7 | _aUYQ _2thema |
| 082 | 0 | 4 | _a006.3 _223 |
| 100 | 1 | | _aNarayan, Shashi. _eauthor. _4aut _4http://id.loc.gov/vocabulary/relators/aut _980700 |
| 245 | 1 | 0 | _aDeep Learning Approaches to Text Production _h[electronic resource] / _cby Shashi Narayan, Claire Gardent. |
| 250 | | | _a1st ed. 2020. |
| 264 | | 1 | _aCham : _bSpringer International Publishing : _bImprint: Springer, _c2020. |
| 300 | | | _aXXIV, 175 p. _bonline resource. |
| 336 | | | _atext _btxt _2rdacontent |
| 337 | | | _acomputer _bc _2rdamedia |
| 338 | | | _aonline resource _bcr _2rdacarrier |
| 347 | | | _atext file _bPDF _2rda |
| 490 | 1 | | _aSynthesis Lectures on Human Language Technologies, _x1947-4059 |
| 505 | 0 | | _aList of Figures -- List of Tables -- Preface -- Introduction -- Pre-Neural Approaches -- Deep Learning Frameworks -- Generating Better Text -- Building Better Input Representations -- Modelling Task-Specific Communication Goals -- Data Sets and Challenges -- Conclusion -- Bibliography -- Authors' Biographies. |
| 520 | | | _aText production has many applications. It is used, for instance, to generate dialogue turns from dialogue moves, verbalise the content of knowledge bases, or generate English sentences from rich linguistic representations, such as dependency trees or abstract meaning representations. Text production is also at work in text-to-text transformations such as sentence compression, sentence fusion, paraphrasing, sentence (or text) simplification, and text summarisation. This book offers an overview of the fundamentals of neural models for text production. In particular, we elaborate on three main aspects of neural approaches to text production: how sequential decoders learn to generate adequate text, how encoders learn to produce better input representations, and how neural generators account for task-specific objectives. Indeed, each text-production task raises a slightly different challenge (e.g., how to take the dialogue context into account when producing a dialogue turn, how to detect and merge relevant information when summarising a text, or how to produce a well-formed text that correctly captures the information contained in some input data in the case of data-to-text generation). We outline the constraints specific to some of these tasks and examine how existing neural models account for them. More generally, this book considers text-to-text, meaning-to-text, and data-to-text transformations. It aims to provide the audience with a basic knowledge of neural approaches to text production and a roadmap to get them started with the related work. The book is mainly targeted at researchers, graduate students, and industry practitioners interested in text production from different forms of input. |
| 650 | | 0 | _aArtificial intelligence. _93407 |
| 650 | | 0 | _aNatural language processing (Computer science). _94741 |
| 650 | | 0 | _aComputational linguistics. _96146 |
| 650 | 1 | 4 | _aArtificial Intelligence. _93407 |
| 650 | 2 | 4 | _aNatural Language Processing (NLP). _931587 |
| 650 | 2 | 4 | _aComputational Linguistics. _96146 |
| 700 | 1 | | _aGardent, Claire. _eauthor. _4aut _4http://id.loc.gov/vocabulary/relators/aut _980701 |
| 710 | 2 | | _aSpringerLink (Online service) _980702 |
| 773 | 0 | | _tSpringer Nature eBook |
| 776 | 0 | 8 | _iPrinted edition: _z9783031001840 |
| 776 | 0 | 8 | _iPrinted edition: _z9783031010453 |
| 776 | 0 | 8 | _iPrinted edition: _z9783031033018 |
| 830 | | 0 | _aSynthesis Lectures on Human Language Technologies, _x1947-4059 _980703 |
| 856 | 4 | 0 | _uhttps://doi.org/10.1007/978-3-031-02173-2 |
| 912 | | | _aZDB-2-SXSC |
| 942 | | | _cEBK |
| 999 | | | _c85017 _d85017 |
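
The 520 abstract above centres on encoder-decoder ("sequence-to-sequence") models, in which an encoder builds a representation of the input and a sequential decoder emits the output text one token at a time. Below is a minimal sketch of that general pattern, assuming PyTorch; the code is not from the book, and the module sizes, class names, and toy vocabulary are all illustrative assumptions.

```python
# Minimal encoder-decoder sketch: an encoder summarises the input sequence,
# a sequential decoder generates output tokens one step at a time.
import torch
import torch.nn as nn

VOCAB = ["<pad>", "<bos>", "<eos>", "the", "cat", "sat"]  # toy vocabulary
PAD, BOS, EOS = 0, 1, 2

class Seq2Seq(nn.Module):
    def __init__(self, vocab_size: int, hidden: int = 32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden, padding_idx=PAD)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    @torch.no_grad()
    def greedy_decode(self, src: torch.Tensor, max_len: int = 10) -> list[int]:
        # Encode the whole input; the final hidden state seeds the decoder.
        _, state = self.encoder(self.embed(src))
        tok = torch.tensor([[BOS]])
        result = []
        for _ in range(max_len):
            # One decoding step: embed the previous token, advance the GRU
            # state, and pick the highest-scoring next token (greedy search).
            step, state = self.decoder(self.embed(tok), state)
            tok = self.out(step[:, -1]).argmax(dim=-1, keepdim=True)
            if tok.item() == EOS:
                break
            result.append(tok.item())
        return result

model = Seq2Seq(len(VOCAB))
src = torch.tensor([[3, 4, 5]])  # "the cat sat", batch of one
print([VOCAB[i] for i in model.greedy_decode(src)])
```

Greedy search, as in `greedy_decode`, is only the simplest decoding strategy; since the toy model is untrained, it emits arbitrary tokens until a training loop (not shown) shapes its output distribution.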