Computation-Communication Trade-offs and Sensor Selection in Real-time Estimation for Processing Networks

Author(s)
Ballotta, Luca; Schenato, Luca; Carlone, Luca
Terms of use
Creative Commons Attribution-Noncommercial-Share Alike http://creativecommons.org/licenses/by-nc-sa/4.0/
Abstract
Recent advances in hardware accelerators and edge computing are enabling substantial processing to be performed at each node (e.g., robots, sensors) of a networked system. Local processing typically enables data compression and may help mitigate measurement noise, but it is usually slower than processing at a central computer (i.e., it entails a larger computational delay). Moreover, while nodes can process the data in parallel, the computation at the central computer is sequential in nature. On the other hand, if a node decides to send raw data to a central computer for processing, it incurs a communication delay. This leads to a fundamental communication-computation trade-off, where each node has to decide on the optimal amount of local preprocessing in order to maximize the network performance. Here we consider the case where the network is in charge of estimating the state of a dynamical system and provide three key contributions. First, we provide a rigorous problem formulation for optimal real-time estimation in processing networks, in the presence of communication and computation delays. Second, we develop analytical results for the case of a homogeneous network (where all sensors have the same computation) that monitors a continuous-time scalar linear system. In particular, we show how to compute the optimal amount of local preprocessing to minimize the estimation error, and prove that sending raw data is in general suboptimal in the presence of communication delays. Third, we consider the realistic case of a heterogeneous network that monitors a discrete-time multivariate linear system and provide practical algorithms (i) to decide on a suitable preprocessing at each node, and (ii) to select a sensor subset when computational constraints make using all sensors suboptimal. Numerical simulations show that selecting the sensors is crucial: the more may not be the merrier. Moreover, we show that if the nodes apply the preprocessing policy suggested by our algorithms, they can largely improve the network estimation performance.
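The trade-off described in the abstract can be made concrete with a small numerical sketch. The Python snippet below is a toy illustration, not the paper's model or algorithms: it assumes a scalar unstable system monitored through a single sensor, where a hypothetical preprocessing level p adds computation delay but shrinks both the transmitted payload (hence the communication delay) and the effective measurement noise. All parameter values and the delay/noise models are invented for illustration.

```python
import numpy as np

# Toy model (all values assumed, for illustration only): scalar unstable
# dynamics dx = a*x dt + dw with process-noise intensity q, observed
# through a single sensor with raw measurement-noise intensity r_raw.
a, q = 0.5, 1.0
r_raw = 4.0          # noise intensity of the raw measurement (assumed)
tau_comm_raw = 1.0   # communication delay for raw data (assumed)

def delay_and_noise(p):
    """Hypothetical effect of preprocessing level p >= 0 (p = 0: send raw data).

    Computation delay grows linearly with p, while compression shrinks
    both the payload (communication delay) and the measurement noise.
    """
    tau = p + tau_comm_raw / (1.0 + p)  # total delay = computation + communication
    r = r_raw / (1.0 + p)               # effective measurement-noise intensity
    return tau, r

def prediction_error_variance(p):
    """Steady-state error of a Kalman filter fed measurements delayed by tau,
    which must predict the state open-loop over the delay."""
    tau, r = delay_and_noise(p)
    # Steady-state filtering variance from the scalar Riccati equation
    # 2*a*P + q - P**2/r = 0, i.e. P = r*(a + sqrt(a**2 + q/r)).
    P = r * (a + np.sqrt(a**2 + q / r))
    # Open-loop prediction over the delay inflates the variance.
    return np.exp(2 * a * tau) * P + (np.exp(2 * a * tau) - 1.0) * q / (2 * a)

# Sweep the preprocessing level and compare against sending raw data.
grid = np.linspace(0.0, 2.0, 201)
errors = [prediction_error_variance(p) for p in grid]
best = grid[int(np.argmin(errors))]
print(f"raw data (p = 0.00): error variance {prediction_error_variance(0.0):.2f}")
print(f"best preprocessing (p = {best:.2f}): error variance {min(errors):.2f}")
```

With these (assumed) parameters the error variance is minimized at an interior preprocessing level: sending raw data (p = 0) is suboptimal because of the communication delay, but too much preprocessing also hurts because the computation delay dominates, in the spirit of the paper's second contribution.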
Date issued
2020
URI
https://hdl.handle.net/1721.1/134433
Department
Massachusetts Institute of Technology. Laboratory for Information and Decision Systems
Journal
IEEE Transactions on Network Science and Engineering
Publisher
Institute of Electrical and Electronics Engineers (IEEE)

Collections
  • MIT Open Access Articles
