Affiliation:
1. Vellore Institute of Technology, India
Abstract
The growing volume of data in diverse formats from varied sources has given rise to a new paradigm in the digital world: Big Data. This article proposes sl-LSTM (sequence-labelling LSTM), a neural network architecture that builds on the effectiveness of standard LSTM models for sequence-labelling tasks. It is a bidirectional LSTM trained with stochastic gradient descent that combines two features of existing LSTM variants: coupled input-forget gates, which reduce computational complexity, and peephole connections, which allow all gates to inspect the current cell state. The model is tested on several datasets, and the results show that integrating these neural network features can further improve the efficiency of identifying sensitive information in Big Data.
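A minimal sketch of the cell update the abstract describes, combining coupled input-forget gates with peephole connections. The function and parameter names (`sl_lstm_step`, `Wf`, `Wo`, `Wc`, `pf`, `po`) are illustrative assumptions, not the paper's exact parameterization:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sl_lstm_step(x, h_prev, c_prev, params):
    """One step of an LSTM cell with coupled input-forget gates and
    peephole connections (hypothetical parameterization).

    Coupling sets the input gate to 1 - forget gate, so only one of the
    two gates is computed, reducing parameters and computation.
    Peepholes let gate pre-activations also see the cell state.
    """
    Wf, Wo, Wc = params["Wf"], params["Wo"], params["Wc"]
    pf, po = params["pf"], params["po"]
    z = np.concatenate([x, h_prev])          # joint input + recurrent vector
    f = sigmoid(Wf @ z + pf * c_prev)        # forget gate, peephole on c_prev
    g = np.tanh(Wc @ z)                      # candidate cell content
    c = f * c_prev + (1.0 - f) * g           # coupled input gate: i = 1 - f
    o = sigmoid(Wo @ z + po * c)             # output gate, peephole on current c
    h = o * np.tanh(c)
    return h, c
```

In a bidirectional setup, one such cell would scan the token sequence left-to-right and a second cell right-to-left, with the two hidden states concatenated per position before the labelling layer.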
Subject
Computer Networks and Communications
Cited by: 5 articles.