Affiliation:
1. Department of Computer Engineering and Information Technology, Amirkabir University of Technology, Tehran, Iran
2. Internet of Things (IoT) Lab, Department of Engineering and Architecture, University of Parma, Parma, Italy
Abstract
The Internet of Things (IoT) integrates various technologies, including wireless sensor networks, edge computing, and cloud computing, to support a wide range of applications such as environmental monitoring and disaster surveillance. In these applications, IoT devices operate with limited resources in terms of battery, communication bandwidth, processing, and memory capacities. In this context, load balancing, fault tolerance, and energy and memory efficiency are among the most important issues related to data dissemination in IoT networks. To cope with these issues, two main approaches have been proposed in the literature: data-centric storage and distributed data storage. Both approaches suffer from data loss due to memory and/or energy depletion at the storage nodes. Although several techniques have been proposed to overcome these problems, existing solutions typically address one issue at a time. In this article, we propose a cross-layer optimization approach that increases memory and energy efficiency while supporting load balancing. The resulting optimization problem is a mixed-integer nonlinear programming problem, which we solve using a genetic algorithm. Moreover, we integrate data-centric storage features into distributed data storage mechanisms and present a novel heuristic approach, denoted as Collaborative Memory and Energy Management, to solve the underlying optimization problem. We also propose analytical and simulation frameworks for performance evaluation. Our results show that the proposed method outperforms existing approaches in various IoT scenarios.
Subject
Computer Networks and Communications, General Engineering