Show simple item record

dc.contributor.author: Biggs, Max
dc.contributor.author: Perakis, Georgia
dc.date.accessioned: 2025-11-18T18:51:01Z
dc.date.available: 2025-11-18T18:51:01Z
dc.date.issued: 2025-05-29
dc.identifier.uri: https://hdl.handle.net/1721.1/163756
dc.description.abstract: We focus on modeling the relationship between an input feature vector and the predicted outcome of a trained decision tree using mixed-integer optimization. This can be used in many practical applications where a decision tree or a tree ensemble is incorporated into an optimization problem to model the predicted outcomes of a decision. We propose novel tight mixed-integer optimization formulations for this problem. Existing formulations can be shown to have linear relaxations with fractional extreme points, even for the simple case of modeling a single decision tree, or to require a very large number of constraints, which leads to slow solve times in practice. A formulation we propose, based on a projected union-of-polyhedra approach, is ideal (i.e., the extreme points of the linear relaxation are integer when required) for a single decision tree. Although the formulation is generally not ideal for tree ensembles, it typically has fewer extreme points, leading to faster solve times. We also study formulations with a binary representation of the feature vector and present multiple approaches to tighten existing formulations. We show that fractional extreme points are removed when multiple splits are on the same feature. At an extreme, we prove that this yields an ideal formulation for a tree ensemble modeling a one-dimensional feature vector. Building on this result, we also show that these additional constraints give significantly tighter linear relaxations when the feature vector is low dimensional. (en_US)
dc.publisher: Springer US (en_US)
dc.relation.isversionof: https://doi.org/10.1007/s10994-025-06771-8 (en_US)
dc.rights: Creative Commons Attribution (en_US)
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/ (en_US)
dc.source: Springer US (en_US)
dc.title: Tight mixed-integer optimization formulations for prescriptive trees (en_US)
dc.type: Article (en_US)
dc.identifier.citation: Biggs, M., Perakis, G. Tight mixed-integer optimization formulations for prescriptive trees. Mach Learn 114, 156 (2025). (en_US)
dc.contributor.department: Sloan School of Management (en_US)
dc.relation.journal: Machine Learning (en_US)
dc.identifier.mitlicense: PUBLISHER_CC
dc.eprint.version: Final published version (en_US)
dc.type.uri: http://purl.org/eprint/type/JournalArticle (en_US)
eprint.status: http://purl.org/eprint/status/PeerReviewed (en_US)
dc.date.updated: 2025-07-18T15:31:44Z
dc.language.rfc3066: en
dc.rights.holder: The Author(s)
dspace.embargo.terms: N
dspace.date.submission: 2025-07-18T15:31:44Z
mit.journal.volume: 114 (en_US)
mit.license: PUBLISHER_CC
mit.metadata.status: Authority Work and Publication Information Needed (en_US)
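The abstract above describes encoding a trained decision tree's predicted outcome with binary variables inside a mixed-integer optimization model. As an illustrative sketch only (not the paper's projected union-of-polyhedra formulation), the standard starting point that such tight formulations refine assigns one binary indicator per leaf, requires exactly one indicator to be active, and forces the active leaf to be consistent with the feature vector's split outcomes. The depth-2 tree and all names below are hypothetical:

```python
# Each leaf is a root-to-leaf path: a list of (feature_index, threshold,
# goes_left) split conditions, plus the leaf's predicted value.
LEAVES = [
    ([(0, 0.5, True), (1, 0.3, True)], 10.0),   # x0 <= 0.5 and x1 <= 0.3
    ([(0, 0.5, True), (1, 0.3, False)], 20.0),  # x0 <= 0.5 and x1 >  0.3
    ([(0, 0.5, False)], 30.0),                  # x0 >  0.5
]

def consistent(conditions, x):
    """Check whether x satisfies every split condition on a leaf's path."""
    return all((x[j] <= b) == left for j, b, left in conditions)

def predict_via_indicators(x):
    """Evaluate the tree through its leaf-indicator representation.

    In a MIO model this corresponds to: z_l binary, sum_l z_l = 1, and
    linking constraints (big-M in the basic formulation; tighter in
    improved ones) forcing z_l = 0 when x violates leaf l's path.
    The predicted outcome is then sum_l value_l * z_l.
    """
    z = [1 if consistent(conds, x) else 0 for conds, _ in LEAVES]
    assert sum(z) == 1  # the leaves partition the feature space
    return sum(zl * val for zl, (_, val) in zip(z, LEAVES))
```

The strength of the linear relaxation of those linking constraints (how many fractional extreme points it admits) is exactly what distinguishes the tight formulations studied in the article from this basic encoding.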

