
dc.contributor.advisor: Isola, Phillip
dc.contributor.author: Chai, Lucy
dc.date.accessioned: 2023-11-02T20:05:09Z
dc.date.available: 2023-11-02T20:05:09Z
dc.date.issued: 2023-09
dc.date.submitted: 2023-09-21T14:26:11.823Z
dc.identifier.uri: https://hdl.handle.net/1721.1/152643
dc.description.abstract: Image synthesis has developed at an unprecedented pace over the past few years, giving us new abilities to create synthetic yet photorealistic content. Typically, unconditional synthesis takes in a tensor of random numbers as input and produces a randomly generated image that mimics real-world content, with little to no way of controlling the result. The work contained in this thesis explores two avenues of obtaining controllable content from image generative models using emergent and designed priors. Emergent priors leverage the capabilities of a pre-trained generator to infer how the world operates, simply by training on large quantities of data. On the other hand, designed priors use built-in constraints to enforce desired properties about the world. Using emergent priors, we can control content by discovering factors of variation and compositional properties in the latent space of synthesis models. We further add coordinate information and camera inputs as designed controls to generate continuous-resolution and 3D-consistent imagery.
dc.publisher: Massachusetts Institute of Technology
dc.rights: In Copyright - Educational Use Permitted
dc.rights: Copyright retained by author(s)
dc.rights.uri: https://rightsstatements.org/page/InC-EDU/1.0/
dc.title: Controlling Image Synthesis with Emergent and Designed Priors
dc.type: Thesis
dc.description.degree: Ph.D.
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
mit.thesis.degree: Doctoral
thesis.degree.name: Doctor of Philosophy
