Open Access Journal

ISSN: 2394-2320 (Online)

International Journal of Engineering Research in Computer Science and Engineering (IJERCSE)

Monthly Journal for Computer Science and Engineering

Application of Deep Learning Techniques to Natural Language Processing

Authors: Sonam Gandotra, Bhavna Arora

Date of Publication: 17th October 2017

Abstract: Deep learning refers to artificial neural networks comprising multiple layers. Deep learning algorithms automate the extraction of abstract features by composing simple representations of the raw data at lower levels into complex representations at higher levels. These algorithms have improved state-of-the-art results and brought new insights to existing data. This paper discusses deep neural architectures and their application to natural language processing (NLP). It also evaluates the various approaches used to train on data. Classifications of deep neural networks form an integral part of the paper: on the basis of architecture, and of how information flows from the input layer to the output layer via the hidden layers, deep neural networks are broadly classified and elaborated. Brief comparisons of the techniques used in deep neural networks, evaluated on various parameters, are also presented.
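To make the layer-wise composition of representations described above concrete, the sketch below (not from the paper itself; a minimal NumPy illustration with hypothetical layer sizes and randomly initialized weights) shows a forward pass through a small feedforward network, where each hidden layer builds a more abstract representation from the output of the layer before it.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, w, b):
    """One dense layer: affine map followed by a ReLU nonlinearity."""
    return np.maximum(0.0, x @ w + b)

# Hypothetical sizes: 300 raw input features (e.g. an averaged word vector),
# two hidden layers that build progressively more abstract representations,
# and a final layer producing scores over 4 output classes.
sizes = [300, 128, 64, 4]
weights = [rng.normal(0, 0.1, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

x = rng.normal(size=(1, 300))          # one raw input example
h = x
for w, b in zip(weights[:-1], biases[:-1]):
    h = layer(h, w, b)                 # each layer re-composes the previous layer's features

logits = h @ weights[-1] + biases[-1]  # output layer: raw class scores, no ReLU
probs = np.exp(logits - logits.max())  # numerically stable softmax
probs /= probs.sum()
print(probs)                           # class probabilities for the input
```

In a trained network the weights would be learned by backpropagation rather than drawn at random; the point here is only the structure: raw input at the bottom, successive hidden transformations, and a task-specific output at the top.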

