Research Article Open Access
Data de-duplication is a data compression technique that eliminates repetitive data and is widely used in cloud computing architectures. From a privacy and security point of view, convergent encryption has been used to encrypt data before outsourcing. However, previous systems suffer from the limitations of convergent encryption; the proposed system applies cryptographic tuning techniques to make the encryption more secure and flexible. Data de-duplication prevents the storage of repetitive blocks by placing a pointer to the existing block instead. Access control is built into the application, giving the data owner the freedom to select which users may access a published file. The integrity of data outsourced to the cloud is managed by hash calculation of the content, following the proof-of-ownership model. The proposed system calculates the hash of the content on the source and destination sides and requests the hash from the cloud side to detect tampering of data. The expected analysis shows an improvement in execution time and development cost.
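The interplay of convergent encryption, pointer-based de-duplication, and hash-based integrity checking described above can be sketched as follows. This is a minimal illustrative model, not the authors' implementation: the `DedupStore` class and `toy_encrypt` keystream cipher are hypothetical stand-ins (the toy cipher is not cryptographically secure), chosen only to show why deriving the key from the content itself lets a cloud store de-duplicate encrypted blocks.

```python
import hashlib


def convergent_key(block: bytes) -> bytes:
    # Convergent encryption: the key is derived from the content itself,
    # so identical plaintext blocks always produce identical ciphertexts,
    # which is what makes de-duplication over encrypted data possible.
    return hashlib.sha256(block).digest()


def toy_encrypt(block: bytes, key: bytes) -> bytes:
    # Toy SHA-256 keystream cipher (illustration only, NOT secure).
    # XOR with the keystream, so the same function also decrypts.
    stream = bytearray()
    counter = 0
    while len(stream) < len(block):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(block, stream))


class DedupStore:
    """In-memory stand-in for cloud block storage with de-duplication."""

    def __init__(self):
        self.blocks = {}  # ciphertext hash -> ciphertext
        self.refs = {}    # ciphertext hash -> reference count ("pointer")

    def put(self, plaintext_block: bytes) -> str:
        key = convergent_key(plaintext_block)
        ct = toy_encrypt(plaintext_block, key)
        tag = hashlib.sha256(ct).hexdigest()
        if tag in self.blocks:
            # Duplicate block: store only a pointer (reference),
            # not a second copy of the data.
            self.refs[tag] += 1
        else:
            self.blocks[tag] = ct
            self.refs[tag] = 1
        return tag

    def verify(self, tag: str) -> bool:
        # Integrity check in the spirit of proof-of-ownership: recompute
        # the hash of the stored ciphertext and compare it with the tag
        # the client holds, to detect tampering on the cloud side.
        return (tag in self.blocks
                and hashlib.sha256(self.blocks[tag]).hexdigest() == tag)
```

Uploading the same block twice stores one ciphertext with a reference count of two, and `verify` lets either side confirm the stored data still matches its hash.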
Author(s): Naziya Tabassum, Roshani B. Talmale
De-duplication, authorized duplicate check, confidentiality, public cloud.