Blockchain Based Data Integrity Verification for Large-Scale IoT Data
Verifying the integrity of large-scale IoT data in cloud storage safely and efficiently
has become a hot research topic as Internet of Things applications continue to expand.
Traditional data integrity verification methods generally use encryption
techniques to protect data in the cloud, relying on trusted Third Party Auditors
(TPAs). Blockchain based data integrity schemes can successfully avoid the trust
problem of TPAs; however, they suffer from large computational and
communication overhead. To address these issues, we propose a Blockchain
and Bilinear mapping based Data Integrity Scheme (BB-DIS) for large-scale IoT
data. In our BB-DIS, IoT data is sliced into shards and homomorphic verifiable tags
(HVTs) are generated for sampling verification. Data integrity is verified using the
properties of bilinear mapping, with verification carried out in the form of blockchain
transactions. Performance analysis of BB-DIS, covering feasibility, security,
dynamicity and complexity, is also discussed in detail. A prototype system of BB-DIS
is then presented to illustrate how to implement our verification scheme.
Experimental results based on Hyperledger Fabric demonstrate that the proposed
verification scheme significantly improves the efficiency of integrity verification for
large-scale IoT data without the need for TPAs.
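To make the bilinear-mapping idea concrete, the following is a minimal single-shard sketch of how an HVT can be generated and checked with a pairing. It assumes the JPBC library (it.unisa.dia.gas.jpbc) and an illustrative pairing parameter file "a.properties"; the class name, variable names and the single-shard simplification are ours and are not part of BB-DIS itself.

// Minimal single-shard sketch of HVT generation and bilinear-map verification,
// assuming the JPBC library and a Type-A pairing parameter file "a.properties".
import it.unisa.dia.gas.jpbc.Element;
import it.unisa.dia.gas.jpbc.Pairing;
import it.unisa.dia.gas.plaf.jpbc.pairing.PairingFactory;

import java.nio.charset.StandardCharsets;

public class HvtSketch {
    public static void main(String[] args) {
        Pairing pairing = PairingFactory.getPairing("a.properties"); // illustrative params file

        // Key pair: secret x in Zr, public key pk = g^x in G1.
        Element g  = pairing.getG1().newRandomElement().getImmutable();
        Element u  = pairing.getG1().newRandomElement().getImmutable();
        Element x  = pairing.getZr().newRandomElement().getImmutable();
        Element pk = g.powZn(x).getImmutable();

        // One IoT data shard, mapped to an exponent m in Zr.
        byte[] shard = "iot-shard-0".getBytes(StandardCharsets.UTF_8);
        Element m = pairing.getZr().newElementFromHash(shard, 0, shard.length).getImmutable();

        // Homomorphic verifiable tag: sigma = (H(index) * u^m)^x.
        byte[] index = "0".getBytes(StandardCharsets.UTF_8);
        Element hIdx  = pairing.getG1().newElementFromHash(index, 0, index.length).getImmutable();
        Element sigma = hIdx.duplicate().mul(u.duplicate().powZn(m)).powZn(x).getImmutable();

        // Verifier checks e(sigma, g) == e(H(index) * u^m, pk) without knowing x.
        Element lhs = pairing.pairing(sigma, g);
        Element rhs = pairing.pairing(hIdx.duplicate().mul(u.duplicate().powZn(m)), pk);
        System.out.println("integrity verified: " + lhs.isEqual(rhs));
    }
}

Because HVTs are homomorphic, the tags of many challenged shards can be aggregated into a single check of this form, which is what keeps sampling verification cheap for large-scale IoT data.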
EXISTING SYSTEM
Sebé et al. [14] split the original data files into blocks to reduce the
computational cost. Later, Ateniese et al. [4] first proposed the PDP scheme, which
used Rivest-Shamir-Adleman (RSA) signatures. The model generated probabilistic
proofs of possession by sampling random sets of blocks from the server, which
drastically reduced I/O costs. The client maintained a constant amount of metadata to
verify the proof. Its protocol was defined for static files and could not handle
dynamic data storage without introducing security vulnerabilities. This problem was
addressed in [15], which nevertheless did not support fully dynamic data operations.
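As a rough illustration of this sampling idea (the class and numbers below are ours, not from [4]), the verifier only needs to challenge a small random subset of block indices rather than the whole file:

// A rough sketch of probabilistic sampling in PDP-style schemes: the verifier
// challenges a small random subset of block indices instead of the whole file.
import java.security.SecureRandom;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class SamplingChallenge {
    // Pick 'challenged' distinct block indices out of 'totalBlocks' uniformly at random.
    public static List<Integer> sample(int totalBlocks, int challenged, SecureRandom rng) {
        List<Integer> indices = new ArrayList<>();
        for (int i = 0; i < totalBlocks; i++) {
            indices.add(i);
        }
        Collections.shuffle(indices, rng);
        return indices.subList(0, challenged);
    }

    public static void main(String[] args) {
        // E.g. challenging roughly 460 of 10,000 blocks detects a 1% corruption with
        // about 99% probability, which is why sampling drastically reduces I/O cost.
        List<Integer> challenge = sample(10_000, 460, new SecureRandom());
        System.out.println("challenged indices: " + challenge.subList(0, 10) + " ...");
    }
}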
Wang et al. [16] proposed a new challenge and response protocol, which used the
Merkle hash tree to ensure the correctness of the data block and introduced an
independent TPA, instead of the user, to execute the verification operation and alleviate
the user's burden. Juels and Kaliski [6] first proposed a sentinel-based POR model, which
added some "sentinel" data blocks to the stored data at random and used the erasure
code to detect distorted data and downgrade them to storage with undefined quality
of service. Shacham and Waters [17] used Boneh-Lynn-Shacham (BLS) signature
mechanism to generate homomorphic verifiable tags (HVTs) based on Ateniese et al.'s
research, which reduced communication overhead while supporting public auditing.
But it could not guarantee users’ data privacy. Wang et al. [5] used the linear
characteristics of erasure codes to achieve partial dynamic operations. Chen and
Curtmola [18] used Cauchy Reed-Solomon linear coding to preprocess data to
improve the recovery speed of erroneous data, but the computational cost was still
very large.
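For reference, the Merkle-hash-tree construction used in [16] (and later in [10]) can be sketched as follows: the verifier stores only the root hash, and any block can be checked against it with a logarithmic number of sibling hashes. The class below is a minimal illustration, not code from the cited works.

// Minimal Merkle root computation over a list of data blocks using SHA-256.
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.ArrayList;
import java.util.List;

public class MerkleRoot {
    public static byte[] root(List<byte[]> blocks) throws Exception {
        MessageDigest sha256 = MessageDigest.getInstance("SHA-256");
        // Leaf level: hash every data block.
        List<byte[]> level = new ArrayList<>();
        for (byte[] block : blocks) {
            level.add(sha256.digest(block));
        }
        // Repeatedly hash pairs of nodes until a single root remains.
        while (level.size() > 1) {
            List<byte[]> next = new ArrayList<>();
            for (int i = 0; i < level.size(); i += 2) {
                byte[] left = level.get(i);
                byte[] right = (i + 1 < level.size()) ? level.get(i + 1) : left; // duplicate last node if odd
                sha256.update(left);
                sha256.update(right);
                next.add(sha256.digest());
            }
            level = next;
        }
        return level.get(0);
    }

    public static void main(String[] args) throws Exception {
        List<byte[]> blocks = List.of(
                "block-0".getBytes(StandardCharsets.UTF_8),
                "block-1".getBytes(StandardCharsets.UTF_8),
                "block-2".getBytes(StandardCharsets.UTF_8));
        System.out.printf("root = %x%n", new java.math.BigInteger(1, root(blocks)));
    }
}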
To prevent TPAs from leaking private data [19], Wang et al. [7] proposed a
data integrity verification mechanism based on a public-key homomorphic
authenticator and random masking to achieve privacy protection in public cloud systems.
To reduce the energy consumption in wireless sensor networks (WSNs) [20],
Othman et al. [21] adopted symmetric-key homomorphic encryption to protect data
privacy and combined it with homomorphic signatures to check the integrity of aggregated
data. Zhu et al. [22] reduced the computational overhead of the hash function in
the signature process [23] and used the random masking technique to preserve data
privacy.
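The random-masking idea behind [7] and [22] can be sketched as follows: before sending the aggregated combination of challenged blocks to the auditor, the prover blinds it with a fresh random value. The snippet shows only the blinding step; the commitment to the mask and the pairing-based check of [7] are omitted, and all names and numbers are illustrative.

// Minimal sketch of random masking: the prover blinds the aggregated block
// combination so the auditor never learns the raw data.
import java.math.BigInteger;
import java.security.SecureRandom;

public class MaskedProof {
    public static void main(String[] args) {
        SecureRandom rng = new SecureRandom();
        BigInteger p = BigInteger.probablePrime(256, rng);   // illustrative group order

        // Challenged blocks m_i and auditor-chosen coefficients nu_i.
        BigInteger[] m  = { BigInteger.valueOf(42), BigInteger.valueOf(7), BigInteger.valueOf(1001) };
        BigInteger[] nu = { BigInteger.valueOf(3),  BigInteger.valueOf(5), BigInteger.valueOf(2) };

        // Unmasked aggregate: mu' = sum(nu_i * m_i) mod p (would leak information about blocks).
        BigInteger muPrime = BigInteger.ZERO;
        for (int i = 0; i < m.length; i++) {
            muPrime = muPrime.add(nu[i].multiply(m[i])).mod(p);
        }

        // Masked proof: mu = r + mu' mod p, with r chosen fresh per challenge; the
        // matching commitment to r is verified through the pairing equation (omitted here).
        BigInteger r  = new BigInteger(p.bitLength() - 1, rng);
        BigInteger mu = r.add(muPrime).mod(p);

        System.out.println("masked proof mu = " + mu);
    }
}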
Liu et al. [9] proposed a blockchain based approach for IoT data integrity service.
This solution performed integrity verification without relying on any TPAs in a
dynamic IoT environment. However, the speed of uploading IoT data and the size of
the verified data need to be improved. Yue et al. [10] proposed a blockchain based
P2P cloud storage data integrity verification framework. They used Merkle tree for
data integrity verification, and analyzed system performance under different Merkle
tree structures. Liang et al. [32] proposed a decentralized and trusted cloud data
provenance scheme to verify data security. The provenance auditor verifies provenance
data through information in the block. Wang et al. [11] proposed a decentralized
model to solve the single point of trust problem in the traditional data auditing
service model by collective trust. The protocol allows users to trace the
history of their data.
Disadvantages
In existing systems, smart contracts are not used in the verification process, and no
verification algorithm is defined according to the verification protocol to verify the
authentication metadata. As a result, there is no counterpart to the setup stage,
challenge stage and verification stage in which the SC-Verification algorithm is applied.
Proposed System
We propose a Blockchain and Bilinear mapping based Data Integrity Scheme (BB-DIS)
for large-scale IoT data in cloud storage. The main contributions of this paper are
listed as follows:
• A blockchain based data integrity verification framework is proposed for large-scale
IoT data. An associated series of protocols, together with verification algorithms and
performance analysis, is also presented in detail.
• A prototype system is built with an edge computing processor in the vicinity of the
IoT devices to preprocess the large-scale IoT data so that communication cost and
computation burden can be reduced significantly.
• Multiple simulation experiments are conducted on Hyperledger Fabric. A comparative
analysis of computational and communication overhead between BB-DIS and other baseline
schemes is given. Various sampling strategies are introduced, and an optimized sampling
verification scheme is finally recommended.
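As a hedged sketch (not the paper's actual chaincode), the setup, challenge and verification stages of an SC-Verification style smart contract could be organized as Hyperledger Fabric transactions roughly as follows, assuming the fabric-chaincode-java contract API; the contract name, key layout and the placeholder equality test standing in for the real pairing check are all ours.

// Hedged chaincode skeleton mapping the setup, challenge and verification stages
// onto Hyperledger Fabric transactions (fabric-chaincode-java contract API assumed).
import org.hyperledger.fabric.contract.Context;
import org.hyperledger.fabric.contract.ContractInterface;
import org.hyperledger.fabric.contract.annotation.Contract;
import org.hyperledger.fabric.contract.annotation.Default;
import org.hyperledger.fabric.contract.annotation.Transaction;
import org.hyperledger.fabric.shim.ChaincodeStub;

@Contract(name = "ScVerificationSketch")
@Default
public final class ScVerificationSketch implements ContractInterface {

    // Setup stage: the data owner records the HVT (authentication metadata) of each shard.
    @Transaction()
    public void setup(Context ctx, String shardId, String hvtHex) {
        ctx.getStub().putStringState("tag:" + shardId, hvtHex);
    }

    // Challenge stage: the verifier records which shards are being challenged.
    @Transaction()
    public void challenge(Context ctx, String challengeId, String shardIdsCsv) {
        ctx.getStub().putStringState("chal:" + challengeId, shardIdsCsv);
    }

    // Verification stage: compare the proof submitted by the storage side with the
    // stored metadata; the bilinear pairing check would replace this placeholder test.
    @Transaction()
    public boolean verify(Context ctx, String shardId, String proofHex) {
        ChaincodeStub stub = ctx.getStub();
        String expected = stub.getStringState("tag:" + shardId);
        return expected != null && !expected.isEmpty() && expected.equals(proofHex);
    }
}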
Advantages
• The proposed system uses smart contracts in the verification process, so integrity
verification is performed without relying on any TPAs.
• Shard-level sampling with homomorphic verifiable tags (HVTs) significantly reduces
computational and communication overhead.
• Edge preprocessing of the large-scale IoT data further lowers the communication cost
and computation burden.
SYSTEM REQUIREMENTS
Software Requirements:
Operating System - Windows XP
Coding Language - Java/J2EE (JSP, Servlet)
Front End - J2EE
Back End - MySQL