Machine learning approaches for large scale classification of produce
Author(s)
Gupta, Otkrist; Das, Anshuman Jyothi; Hellerstein, Joshua K.; Raskar, Ramesh
Abstract
The analysis and identification of different attributes of produce such as taxonomy, vendor, and organic nature is vital to verifying product authenticity in a distribution network. Though a variety of analysis techniques have been studied in the past, we present a novel data-centric approach to classifying produce attributes. We employed visible and near infrared (NIR) spectroscopy on over 75,000 samples across several fruit and vegetable varieties. This yielded 0.90-0.98 and 0.98-0.99 classification accuracies for taxonomy and farmer classes, respectively. The most significant factors in the visible spectrum were variations in the produce color due to chlorophyll and anthocyanins. In the infrared spectrum, we observed that the varying water and sugar content levels were critical to obtaining high classification accuracies. High quality spectral data along with an optimal tuning of hyperparameters in the support vector machine (SVM) was also key to achieving high classification accuracies. In addition to demonstrating exceptional accuracies on test data, we explored insights behind the classifications, and identified the highest performing approaches using cross validation. We presented data collection guidelines, experimental design parameters, and machine learning optimization parameters for the replication of studies involving large sample sizes. ©2018 The Author(s).
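The abstract describes tuning SVM hyperparameters and selecting the best approach by cross-validation on spectral data. The following is a minimal sketch of that kind of pipeline, not the authors' code: it uses scikit-learn's `GridSearchCV` with an RBF-kernel SVM, and synthetic Gaussian-peak "spectra" stand in for the paper's visible/NIR measurements (the two hypothetical classes differ only in their absorption-peak wavelength).

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
wavelengths = np.linspace(400, 1000, 120)  # visible + NIR band, in nm

def make_spectra(peak_nm, n):
    # One synthetic spectrum per row: a Gaussian absorption peak plus noise.
    base = np.exp(-((wavelengths - peak_nm) ** 2) / (2 * 40.0 ** 2))
    return base + rng.normal(0, 0.05, size=(n, wavelengths.size))

# Two hypothetical produce classes with different absorption peaks
# (e.g. chlorophyll- vs anthocyanin-dominated color, per the abstract).
X = np.vstack([make_spectra(550, 200), make_spectra(680, 200)])
y = np.array([0] * 200 + [1] * 200)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

# Grid-search the RBF-SVM hyperparameters with 5-fold cross-validation,
# standardizing each wavelength channel inside the pipeline.
grid = GridSearchCV(
    make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    {"svc__C": [0.1, 1, 10], "svc__gamma": ["scale", 0.01]},
    cv=5,
)
grid.fit(X_tr, y_tr)
test_acc = grid.score(X_te, y_te)
print(f"best params: {grid.best_params_}, held-out accuracy: {test_acc:.2f}")
```

On synthetic data this well separated, the tuned SVM classifies nearly perfectly; the paper's 0.90-0.99 accuracies on real produce suggest its spectral classes are comparably distinguishable once hyperparameters are optimized.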
Date issued
2018-03
Department
Massachusetts Institute of Technology. Media Laboratory; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Journal
Scientific Reports
Publisher
Springer Nature
Citation
Gupta, Otkrist et al. "Machine learning approaches for large scale classification of produce." Scientific Reports 8, 1 (March 2018): 5226. doi:10.1038/s41598-018-23394-3 ©2018 The Author(s).
Version: Final published version
ISSN
2045-2322