Open Access Journal

ISSN : 2394-2320 (Online)

International Journal of Engineering Research in Computer Science and Engineering (IJERCSE)

Monthly Journal for Computer Science and Engineering

Text Summarization of Data by Using the Recursive Iteration Technique

Authors: Vijayalaxmi M H¹, Rajeshwari R²

Date of Publication: 25th April 2018

Abstract: Neural sequence-to-sequence models have provided a viable new approach to abstractive text summarization (meaning they are not restricted to simply selecting and rearranging passages from the original text). However, these models have two shortcomings: they are liable to reproduce factual details inaccurately, and they tend to repeat themselves. In this work we propose a recursive iteration technique by which the accuracy of data summarization can be increased. The results are achieved by applying multiple levels of summarization without losing any important information present in the document. We apply our models to news articles that need to be summarized.
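The recursive, multi-level summarization idea described above can be sketched in code. The following is a minimal illustration only, assuming a simple word-frequency extractive summarizer as a stand-in for the neural sequence-to-sequence model; the names summarize_once, recursive_summarize and news_article.txt are hypothetical and not taken from the paper.

```python
import re
from collections import Counter

def summarize_once(text, keep_ratio=0.5):
    """One level of summarization: score sentences by word frequency and
    keep the top-scoring fraction, preserving original sentence order.
    (A stand-in for the neural summarizer; any summarizer could be plugged in.)"""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    if len(sentences) <= 1:
        return text
    freq = Counter(re.findall(r'\w+', text.lower()))
    scored = [(sum(freq[w] for w in re.findall(r'\w+', s.lower())), i, s)
              for i, s in enumerate(sentences)]
    keep = max(1, int(len(sentences) * keep_ratio))
    top = sorted(scored, reverse=True)[:keep]                 # highest-scoring sentences
    return ' '.join(s for _, _, s in sorted(top, key=lambda t: t[1]))  # restore order

def recursive_summarize(text, max_sentences=3, max_levels=5):
    """Apply summarization in multiple levels (recursively) until the summary
    fits within max_sentences or the level budget is exhausted."""
    for _ in range(max_levels):
        if len(re.split(r'(?<=[.!?])\s+', text.strip())) <= max_sentences:
            break
        text = summarize_once(text)
    return text

if __name__ == '__main__':
    article = open('news_article.txt').read()   # hypothetical input document
    print(recursive_summarize(article))
```

Each level keeps roughly half of the remaining sentences, so the recursion terminates quickly while retaining the highest-ranked content at every pass, which is the multi-level behaviour the abstract describes.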

References:

    1. Takase, S., Suzuki, J., Okazaki, N., Hirao, T. and Nagata, M., 2016. Neural headline generation on abstract meaning representation. In Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing (EMNLP).
    2. See, A., Liu, P.J. and Manning, C.D., 2017. Get to the point: Summarization with pointer-generator networks. arXiv preprint arXiv:1704.04368.
    3. Bahdanau, D., Cho, K. and Bengio, Y., 2014. Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473.
    4. Chen, Q., Zhu, X., Ling, Z., Wei, S. and Jiang, H., 2016. Distraction-based neural networks for document summarization. arXiv preprint arXiv:1610.08462.
    5. Cheung, J.C.K. and Penn, G., 2014. Unsupervised sentence enhancement for automatic summarization. In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP) (pp. 775-786).
    6. Chopra, S., Auli, M. and Rush, A.M., 2016. Abstractive sentence summarization with attentive recurrent neural networks. In Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (pp. 93-98).
    7. Gu, J., Lu, Z., Li, H. and Li, V.O., 2016. Incorporating copying mechanism in sequence-to-sequence learning. arXiv preprint arXiv:1603.06393.
