DSpace@MIT

Collaborative duty cycling strategies in energy harvesting sensor networks

Author(s)
Long, James; Buyukozturk, Oral
Download
Accepted version (4.181 MB)

Terms of use
Creative Commons Attribution-NonCommercial-ShareAlike 4.0 (http://creativecommons.org/licenses/by-nc-sa/4.0/)
Abstract
Energy harvesting wireless sensor networks are a promising solution for low-cost, long-lasting civil monitoring applications, but energy consumption must be carefully managed for these systems to provide maximal utility. Many common civil applications of these networks are fundamentally concerned with detecting and analyzing infrequently occurring events. To conserve energy in these situations, a subset of nodes in the network can assume active duty, listening for events of interest, while the remaining nodes enter a low-power sleep mode to conserve battery. However, judicious planning of the sequence of active-node assignments is needed to ensure that as many nodes as possible can be reached upon detection of an event, and that the system remains functional during periods of low energy harvesting. In this article, we propose a novel reinforcement learning (RL) agent, which acts as a centralized power manager for this system. We develop a comprehensive simulation environment to emulate the behavior of an energy harvesting sensor network, accounting for spatially varying energy harvesting capabilities and wireless connectivity. We then train the proposed RL agent to learn optimal node selection strategies through interaction with the simulation environment. The behavior and performance of these strategies are tested on real, unseen solar energy data to demonstrate the efficacy of the method. The deep RL agent is shown to outperform baseline approaches on both seen and unseen data.
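The duty-cycling setup the abstract describes can be illustrated with a toy simulation: nodes with spatially varying harvest rates, a fixed-size active subset per step, and a policy choosing which nodes wake. The sketch below uses a simple greedy battery-based baseline, not the authors' RL agent; all class names and parameter values here are hypothetical, chosen only to make the idea concrete.

```python
import random

class Node:
    """A sensor node with a finite battery, recharged by harvested energy.

    Hypothetical model: active (listening) nodes drain much faster than
    sleeping ones, and harvested energy varies randomly around a mean
    harvest_rate that differs per node (spatially varying harvesting).
    """
    def __init__(self, capacity=100.0, harvest_rate=1.0):
        self.capacity = capacity
        self.harvest_rate = harvest_rate
        self.battery = capacity / 2

    def step(self, active):
        drain = 5.0 if active else 0.2  # listening costs far more than sleeping
        harvested = random.uniform(0.0, 2.0 * self.harvest_rate)
        self.battery = min(self.capacity, max(0.0, self.battery - drain + harvested))

def greedy_duty_cycle(nodes, k, steps):
    """Baseline policy: each step, wake the k nodes with the fullest batteries."""
    for _ in range(steps):
        ranked = sorted(range(len(nodes)), key=lambda i: nodes[i].battery, reverse=True)
        active = set(ranked[:k])
        for i, node in enumerate(nodes):
            node.step(i in active)
    return [n.battery for n in nodes]

random.seed(0)
# Four nodes with different mean harvest rates, two active at a time.
nodes = [Node(harvest_rate=r) for r in (0.5, 1.0, 1.5, 2.0)]
batteries = greedy_duty_cycle(nodes, k=2, steps=50)
```

An RL power manager as proposed in the article would replace the greedy ranking with a learned node-selection policy, trained against a richer simulator that also models wireless connectivity.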
Date issued
2019-12
URI
https://hdl.handle.net/1721.1/126602
Department
Massachusetts Institute of Technology. Department of Civil and Environmental Engineering
Journal
Computer-Aided Civil and Infrastructure Engineering
Publisher
Wiley
Citation
Long, James and Oral Buyukozturk. "Collaborative duty cycling strategies in energy harvesting sensor networks." Computer-Aided Civil and Infrastructure Engineering 35, 6 (December 2019): 534-548. © 2019 Wiley
Version: Author's final manuscript
ISSN
1093-9687 (print)
1467-8667 (online)

Collections
  • MIT Open Access Articles

Content created by the MIT Libraries, CC BY-NC unless otherwise noted. Notify us about copyright concerns.