Neural Wikipedian: generating textual summaries from knowledge base triples
Vougiouklis, Pavlos, Elsahar, Hady, Kaffee, Lucie-Aimée, Gravier, Christophe, Laforest, Frederique, Hare, Jonathon and Simperl, Elena (2018) Neural Wikipedian: generating textual summaries from knowledge base triples. Journal of Web Semantics, 52-53, 1-15. (doi:10.1016/j.websem.2018.07.002).
Abstract
Most people need textual or visual interfaces in order to make sense of Semantic Web data. In this paper, we investigate the problem of generating natural language summaries for Semantic Web data using neural networks. Our end-to-end trainable architecture encodes the information from a set of triples into a vector of fixed dimensionality and generates a textual summary by conditioning the output on the encoded vector. We explore a set of different approaches that enable our models to verbalise entities from the input set of triples in the generated text. Our systems are trained and evaluated on two corpora of loosely aligned Wikipedia snippets with triples from DBpedia and Wikidata, with promising results.
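For orientation, the following is a minimal PyTorch sketch of the kind of architecture the abstract describes: each (subject, predicate, object) triple is embedded, the triple embeddings are pooled into one fixed-dimensional vector, and a recurrent decoder conditioned on that vector generates the summary token by token. All class names, dimensions, and the mean-pooling and LSTM choices are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class TripleEncoder(nn.Module):
    """Embeds a set of (subject, predicate, object) triples and pools them
    into one fixed-dimensional vector (a sketch, not the paper's exact model)."""
    def __init__(self, vocab_size, emb_dim, hidden_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # One triple = concatenated subject/predicate/object embeddings.
        self.proj = nn.Linear(3 * emb_dim, hidden_dim)

    def forward(self, triples):
        # triples: (batch, n_triples, 3) integer ids for s, p, o
        e = self.embed(triples)          # (batch, n_triples, 3, emb_dim)
        e = e.flatten(2)                 # concatenate s/p/o embeddings
        h = torch.tanh(self.proj(e))     # (batch, n_triples, hidden_dim)
        return h.mean(dim=1)             # pool to a fixed-size vector per set

class SummaryDecoder(nn.Module):
    """LSTM language model whose initial hidden state is the encoded triple
    vector, so every generated token is conditioned on the input triples."""
    def __init__(self, vocab_size, emb_dim, hidden_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens, enc_vec):
        # tokens: (batch, seq_len) summary ids; enc_vec: (batch, hidden_dim)
        h0 = enc_vec.unsqueeze(0)        # (1, batch, hidden_dim)
        c0 = torch.zeros_like(h0)
        y, _ = self.lstm(self.embed(tokens), (h0, c0))
        return self.out(y)               # per-step logits over the vocabulary

A model like this would be trained end to end with cross-entropy over the summary tokens, e.g. logits = dec(summary_tokens, enc(triple_ids)). The entity-verbalisation mechanisms the abstract mentions would require an additional component (such as a copy or surface-form layer) not shown in this sketch.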
Text: Accepted Manuscript
More information
Submitted date: 29 November 2017
Accepted/In Press date: 27 July 2018
e-pub ahead of print date: 30 July 2018
Published date: October 2018
Identifiers
Local EPrints ID: 422672
URI: http://eprints.soton.ac.uk/id/eprint/422672
ISSN: 1570-8268
PURE UUID: 51cc3f49-a4d6-41a7-b28b-6224bb04bfb1
Catalogue record
Date deposited: 30 Jul 2018 16:30
Last modified: 16 Mar 2024 06:56
Contributors
Author: Pavlos Vougiouklis
Author: Hady Elsahar
Author: Lucie-Aimée Kaffee
Author: Christophe Gravier
Author: Frederique Laforest
Author: Jonathon Hare
Author: Elena Simperl