Operations Research - Master's degree
http://hdl.handle.net/1721.1/7720
2017-01-22T01:53:33Z
Predicting performance using galvanic skin response
http://hdl.handle.net/1721.1/105086
Mundell, Lee Carter
The rapid growth of the availability of wearable biosensors has created the opportunity for using physiological signals to measure worker performance. An important question is how to use such signals to not just measure, but actually predict worker performance on a task under stressful and potentially high risk conditions. Here we show that the biological signal known as galvanic skin response (GSR) allows such a prediction. We conduct an experiment where subjects answer arithmetic questions under low and high stress conditions while having their GSR monitored. Using only the GSR measured under low stress conditions, we are able to predict which subjects will perform well under high stress conditions with a median accuracy of 75%. If we try to make similar predictions without using any biometric signals, the median accuracy is 50%. Our results suggest that performance in high stress conditions can be predicted using signals obtained from GSR sensors in low stress conditions.
Thesis: S.M., Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2016.; Cataloged from PDF version of thesis.; Includes bibliographical references (pages 49-52).
2016-01-01T00:00:00Z
Think global, act local when estimating a sparse precision matrix
http://hdl.handle.net/1721.1/105001
Lee, Peter Alexander
Substantial progress has been made in the estimation of sparse high-dimensional precision matrices from scant datasets. This is important because precision matrices underpin common tasks such as regression, discriminant analysis, and portfolio optimization. However, few good algorithms for this task exist outside the space of L1-penalized optimization approaches such as GLASSO. This thesis introduces LGM, a new algorithm for the estimation of sparse high-dimensional precision matrices. Using the framework of probabilistic graphical models, the algorithm performs robust covariance estimation to generate potentials for small cliques and fuses the local structures to form a sparse yet globally robust model of the entire distribution. Identification of appropriate local structures is done through stochastic discrete optimization. The algorithm is implemented in Matlab and benchmarked against competitor algorithms on an array of synthetic datasets. Simulation results suggest that LGM may outperform GLASSO when model sparsity is especially important and when variables in the dataset belong to a number of closely related (if unknown) groups.
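The GLASSO baseline the abstract compares against can be illustrated in a few lines: estimate a sparse precision matrix from few samples drawn from a known sparse Gaussian model. This is a minimal sketch of the L1-penalized approach only (not of LGM, whose clique-fusion procedure is described in the thesis itself); the chain-structured model, sample size, and penalty value are assumptions chosen for illustration.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Ground truth: a chain-structured sparse precision matrix, so only
# neighboring variables are conditionally dependent.
p = 10
true_precision = np.eye(p)
for i in range(p - 1):
    true_precision[i, i + 1] = true_precision[i + 1, i] = 0.4
cov = np.linalg.inv(true_precision)

# "Scant" data: only 60 samples for a 10-dimensional model.
X = rng.multivariate_normal(np.zeros(p), cov, size=60)

# GLASSO: maximum-likelihood covariance estimation with an L1 penalty
# on the precision matrix, which drives entries to exactly zero.
model = GraphicalLasso(alpha=0.1).fit(X)
est = model.precision_

# Fraction of off-diagonal entries estimated as (numerically) zero.
sparsity = np.mean(np.abs(est[np.triu_indices(p, k=1)]) < 1e-8)
print(round(float(sparsity), 2))
```

The penalty weight `alpha` controls the sparsity/fit trade-off; the abstract's point is that when sparsity matters and variables cluster into related groups, a locally structured estimator may do better than this single global penalty.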
Thesis: S.M., Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2016.; This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.; Cataloged from student-submitted PDF version of thesis.; Includes bibliographical references (pages 99-100).
2016-01-01T00:00:00Z
Interacting with users in social networks : the follow-back problem
http://hdl.handle.net/1721.1/105000
Rajagopalan, Krishnan, S.M. Sloan School of Management
An agent wants to form a connection with a predetermined set of target users over social media. Because forming a connection is known as "following" in social networks such as Twitter, we refer to this as the follow-back problem. The targets and their friends form a directed graph which we refer to as the "friends graph." The agent's goal is to get the targets to follow it, and it is allowed to interact with the targets and their friends. To understand what features impact the probability of an interaction resulting in a follow-back, we conduct an empirical analysis of several thousand interactions in Twitter. We build a model of the follow-back probabilities based upon this analysis which incorporates features such as the friend and follower count of the target and the neighborhood overlap of the target with the agent. We find optimal policies for simple network topologies such as directed acyclic graphs. For arbitrary directed graphs we develop integer programming heuristics that employ network centrality measures and a graph score we define as the follow-back score. We show that these heuristic policies perform well in simulation on a real Twitter network.
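A centrality-based heuristic of the kind the abstract mentions can be sketched on a toy friends graph. This is a hypothetical illustration: plain in-degree stands in for the thesis's network centrality measures and its follow-back score, whose exact definitions are given in the thesis, and the graph below is invented.

```python
# Toy directed "friends graph": an edge u -> v means u follows v.
friends_graph = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["d"],
    "d": [],
    "e": ["c", "d"],
}

def in_degree(graph):
    """Count incoming edges (followers) for every node."""
    deg = {u: 0 for u in graph}
    for u, outs in graph.items():
        for v in outs:
            deg[v] += 1
    return deg

targets = {"c", "d"}

# Heuristic policy: interact first with high in-degree targets, using
# follower count as a crude proxy for how valuable a follow-back is.
deg = in_degree(friends_graph)
ranked = sorted(targets, key=lambda u: deg[u], reverse=True)
print(ranked)  # → ['c', 'd']
```

In the thesis the ranking feeds an integer program over the whole graph and the follow-back probabilities come from the fitted empirical model, not from degree alone.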
Thesis: S.M., Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2016.; This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.; Cataloged from student-submitted PDF version of thesis.; Includes bibliographical references (pages 69-71).
2016-01-01T00:00:00Z
Internet of Things and anomaly detection for the iron ore mining industry
http://hdl.handle.net/1721.1/104999
Saroufim, Carl Elie
In the context of a world flooded with data, the Internet of Things (IoT) is exploding. This thesis considers the problem of applying IoT technology to reduce costs in the iron ore mining industry, to compensate for the slump in iron ore prices observed over the past years. More specifically, we focus on improving output quality in a data-driven iron ore concentration factory. In this plant, mined iron ore goes through a series of complex physical and chemical transformations that increase the concentration of iron and reduce the concentration of impurities such as silica. In this thesis, we develop an IoT infrastructure comprising machines, a network of sensors, a database, a random forest prediction model, an algorithm for dynamically adjusting its cutoff parameter, and a predictive maintenance algorithm. It can preventively detect, and in some cases correct, poor-quality events in the iron ore concentration factory, improving overall quality and decreasing costs. The random forest model was selected from among several anomaly detection techniques. On an independent test data set, it detects 90% of the poor-quality events with an AUC of about 0.92 and a false positive rate of 23.02%, lowered by the dynamic cutoff algorithm. These methods can be applied to any factory in any industry, provided it has a good sensor infrastructure delivering sufficiently precise and frequent data.
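The cutoff-adjustment idea can be sketched as follows: train a random forest classifier on sensor features, then lower the probability cutoff until a target detection rate for poor-quality events is reached, and read off the resulting false positive rate. The synthetic data, feature count, and target recall below are assumptions for illustration, not the plant's actual data or the thesis's exact dynamic algorithm.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic "sensor" data: 5 features, with a poor-quality label that
# depends on the first two features plus noise.
n = 2000
X = rng.normal(size=(n, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=n) > 1.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
scores = clf.predict_proba(X_te)[:, 1]

# Sweep the cutoff downward until at least 90% of true poor-quality
# events are flagged (the detection rate the thesis targets).
target_recall = 0.9
for t in np.sort(np.unique(scores))[::-1]:
    flagged = scores >= t
    recall = flagged[y_te == 1].mean()
    if recall >= target_recall:
        cutoff = t
        break

# The false positive rate is the price paid for that detection rate.
fpr = (scores >= cutoff)[y_te == 0].mean()
print(recall >= 0.9, 0.0 <= fpr <= 1.0)
```

In production, the thesis's dynamic version re-tunes this cutoff over time as the data distribution shifts, rather than fixing it once on a held-out set.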
Thesis: S.M., Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2016.; This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.; Cataloged from student-submitted PDF version of thesis.; Includes bibliographical references (pages 176-182).
2016-01-01T00:00:00Z