Author:
Suganya S., Selvamuthukumaran S., Swaminathan S., Kalaichelvi V.
Abstract
Hadoop is a widely used analytical tool for analyzing large volumes of data, on the order of exabytes and zettabytes. The Hadoop Distributed File System (HDFS) is a well-known storage system that provides distributed storage on commodity hardware. Since Big Data includes highly sensitive information such as electronic health records and financial data, controlling access to and securing big data plays a vital role. The bottom-most layer of security can be provided by encryption. Because the stored data is highly unstructured, wide in variety, and arrives at an unpredictable velocity, traditionally used ciphers such as AES, RSA, and RC4 do not work efficiently on Big Data. The solution must account for at-risk sensitive data and for the variety of data, which requires intelligence to classify it. When data blocks are stored in data nodes, 4 bits, called sensitivity bits, are added to each block: two bits indicate the level of sensitivity and two more bits indicate the type of data. These 4 bits are used by the map and reduce functions to encrypt only the required data blocks, rather than all of them, in a time-efficient manner using the available commodity hardware.
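The sketch below is not the authors' implementation; it is a minimal standalone Java illustration of the 4-bit tagging idea described in the abstract, assuming the upper two bits encode the sensitivity level and the lower two bits encode the data type. The level and type names and the rule for which levels trigger encryption are assumptions made for illustration only.

public class SensitivityTag {

    // Hypothetical two-bit sensitivity levels (00..11); names are assumed.
    public static final int LEVEL_PUBLIC   = 0b00;
    public static final int LEVEL_INTERNAL = 0b01;
    public static final int LEVEL_PRIVATE  = 0b10;
    public static final int LEVEL_CRITICAL = 0b11;

    // Hypothetical two-bit data-type codes (00..11); names are assumed.
    public static final int TYPE_TEXT      = 0b00;
    public static final int TYPE_HEALTH    = 0b01;
    public static final int TYPE_FINANCIAL = 0b10;
    public static final int TYPE_OTHER     = 0b11;

    /** Pack level (2 bits) and type (2 bits) into a single 4-bit tag. */
    public static int encode(int level, int type) {
        return ((level & 0b11) << 2) | (type & 0b11);
    }

    /** Upper two bits of the tag: sensitivity level. */
    public static int level(int tag) {
        return (tag >> 2) & 0b11;
    }

    /** Lower two bits of the tag: data type. */
    public static int type(int tag) {
        return tag & 0b11;
    }

    /**
     * Illustrative selection rule (an assumption, not the paper's policy):
     * only blocks tagged at or above LEVEL_PRIVATE are routed to the
     * encryption path; other blocks are stored unencrypted.
     */
    public static boolean requiresEncryption(int tag) {
        return level(tag) >= LEVEL_PRIVATE;
    }

    public static void main(String[] args) {
        int tag = encode(LEVEL_CRITICAL, TYPE_HEALTH);
        System.out.printf("tag=%4s level=%d type=%d encrypt=%b%n",
                Integer.toBinaryString(tag), level(tag), type(tag),
                requiresEncryption(tag));
    }
}

In an actual HDFS/MapReduce deployment such a tag would be attached to each stored block, and the map and reduce tasks would consult it to decide which blocks to pass through the cipher, which is what allows only the sensitive subset of the data to incur the encryption cost.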
Subject
General Physics and Astronomy
Cited by
1 article.