
dc.contributor.author	Long, James
dc.contributor.author	Buyukozturk, Oral
dc.date.accessioned	2020-08-14T22:43:39Z
dc.date.available	2020-08-14T22:43:39Z
dc.date.issued	2019-12
dc.identifier.issn	1093-9687
dc.identifier.issn	1467-8667
dc.identifier.uri	https://hdl.handle.net/1721.1/126602
dc.description.abstract	Energy harvesting wireless sensor networks are a promising solution for low-cost, long-lasting civil monitoring applications. However, managing energy consumption is critical to ensuring that these systems provide maximal utility. Many common civil applications of these networks are fundamentally concerned with detecting and analyzing infrequently occurring events. To conserve energy in these situations, a subset of nodes in the network can assume active duty, listening for events of interest, while the remaining nodes enter low-power sleep mode to conserve battery. However, judicious planning of the sequence of active-node assignments is needed to ensure that as many nodes as possible can be reached upon the detection of an event, and that the system remains capable during periods of low energy harvesting. In this article, we propose a novel reinforcement learning (RL) agent, which acts as a centralized power manager for this system. We develop a comprehensive simulation environment to emulate the behavior of an energy harvesting sensor network, accounting for spatially varying energy harvesting capabilities and wireless connectivity. We then train the proposed RL agent to learn optimal node-selection strategies through interaction with the simulation environment. The behavior and performance of these strategies are tested on real, unseen solar energy data to demonstrate the efficacy of the method. The deep RL agent is shown to outperform baseline approaches on both seen and unseen data.	en_US
dc.language.iso	en
dc.publisher	Wiley	en_US
dc.relation.isversionof	http://dx.doi.org/10.1111/mice.12522	en_US
dc.rights	Creative Commons Attribution-Noncommercial-Share Alike	en_US
dc.rights.uri	http://creativecommons.org/licenses/by-nc-sa/4.0/	en_US
dc.source	Prof. Buyukozturk via Elizabeth Soergel	en_US
dc.title	Collaborative duty cycling strategies in energy harvesting sensor networks	en_US
dc.type	Article	en_US
dc.identifier.citation	Long, James and Oral Buyukozturk. "Collaborative duty cycling strategies in energy harvesting sensor networks." Computer-Aided Civil and Infrastructure Engineering 35, 6 (December 2019): 534-548 © 2019 Wiley	en_US
dc.contributor.department	Massachusetts Institute of Technology. Department of Civil and Environmental Engineering	en_US
dc.relation.journal	Computer-Aided Civil and Infrastructure Engineering	en_US
dc.eprint.version	Author's final manuscript	en_US
dc.type.uri	http://purl.org/eprint/type/JournalArticle	en_US
eprint.status	http://purl.org/eprint/status/PeerReviewed	en_US
dc.date.updated	2020-08-14T12:25:47Z
dspace.date.submission	2020-08-14T12:25:50Z
mit.journal.volume	35	en_US
mit.journal.issue	6	en_US
mit.license	OPEN_ACCESS_POLICY
mit.metadata.status	Complete
