Show simple item record

dc.contributor.author  Kruse, Jakob Adrian
dc.contributor.author  Ciechanowski, Leon
dc.contributor.author  Dupuis, Ambre
dc.contributor.author  Vazquez, Ignacio
dc.contributor.author  Gloor, Peter A.
dc.date.accessioned  2024-03-27T16:07:44Z
dc.date.available  2024-03-27T16:07:44Z
dc.date.issued  2024-03-16
dc.identifier.issn  1424-8220
dc.identifier.uri  https://hdl.handle.net/1721.1/153952
dc.description.abstract  Recent advances in artificial intelligence combined with behavioral sciences have led to the development of cutting-edge tools for recognizing human emotions based on text, video, audio, and physiological data. However, these data sources are expensive, intrusive, and regulated, unlike plants, which have been shown to be sensitive to human steps and sounds. A methodology to use plants as human emotion detectors is proposed. Electrical signals from plants were tracked and labeled based on video data. The labeled data were then used for classification, and the MLP, biLSTM, MFCC-CNN, MFCC-ResNet, Random Forest, 1-Dimensional CNN, and biLSTM (without windowing) models were tuned using a grid search algorithm with cross-validation. Finally, the best-parameterized models were trained and used on the test set for classification. The performance of this methodology was measured via a case study with 54 participants who watched an emotionally charged video; as ground truth, their facial emotions were simultaneously measured using facial emotion analysis. The Random Forest model showed the best performance, particularly in recognizing high-arousal emotions, achieving an overall weighted accuracy of 55.2% and high weighted recall for emotions such as fear (61.0%) and happiness (60.4%). The MFCC-ResNet model offered reasonably balanced results, with an accuracy of 0.318 and a recall of 0.324; it recognized fear and anger with 75% and 50% recall, respectively. Thus, using plants as an emotion recognition tool seems worth investigating, as it addresses both cost and privacy concerns.  en_US
dc.publisher  MDPI AG  en_US
dc.relation.isversionof  10.3390/s24061917  en_US
dc.rights  Creative Commons Attribution  en_US
dc.rights.uri  https://creativecommons.org/licenses/by/4.0/  en_US
dc.source  Multidisciplinary Digital Publishing Institute  en_US
dc.subject  Electrical and Electronic Engineering  en_US
dc.subject  Biochemistry  en_US
dc.subject  Instrumentation  en_US
dc.subject  Atomic and Molecular Physics, and Optics  en_US
dc.subject  Analytical Chemistry  en_US
dc.title  Leveraging the Sensitivity of Plants with Deep Learning to Recognize Human Emotions  en_US
dc.type  Article  en_US
dc.identifier.citation  Sensors 24 (6): 1917 (2024)  en_US
dc.contributor.department  Massachusetts Institute of Technology. Center for Collective Intelligence
dc.relation.journal  Sensors  en_US
dc.identifier.mitlicense  PUBLISHER_CC
dc.eprint.version  Final published version  en_US
dc.type.uri  http://purl.org/eprint/type/JournalArticle  en_US
eprint.status  http://purl.org/eprint/status/PeerReviewed  en_US
dc.date.updated  2024-03-27T13:15:50Z
dspace.date.submission  2024-03-27T13:15:50Z
mit.journal.volume  24  en_US
mit.journal.issue  6  en_US
mit.license  PUBLISHER_CC
mit.metadata.status  Authority Work and Publication Information Needed  en_US
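The abstract describes tuning each model with a grid search and cross-validation, then evaluating the best-parameterized model on a held-out test set. A minimal sketch of that step for the Random Forest case, using scikit-learn; the feature matrix, labels, and hyperparameter grid below are hypothetical stand-ins, not the paper's data or settings:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(0)

# Stand-in for windowed plant-signal features, labeled via facial
# emotion analysis (3 toy emotion classes).
X = rng.normal(size=(300, 16))    # 300 windows, 16 features each
y = rng.integers(0, 3, size=300)  # toy labels: 0, 1, 2

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Grid search over a small (illustrative) hyperparameter grid,
# scored with 5-fold cross-validation on the training set.
grid = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [50, 100], "max_depth": [None, 10]},
    cv=5,
    scoring="accuracy",
)
grid.fit(X_train, y_train)

# The best-parameterized model is then evaluated on the test set.
test_acc = grid.best_estimator_.score(X_test, y_test)
print(grid.best_params_, round(test_acc, 3))
```

In the study itself this tuning was run per model family (MLP, biLSTM, MFCC-CNN, MFCC-ResNet, Random Forest, 1D CNN); the sketch only illustrates the grid-search-then-test pattern.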

