Show simple item record

dc.contributor.advisor    Richard Holton and Rae Langton.    en_US
dc.contributor.author    Dougherty, Tom (Tom J.)    en_US
dc.contributor.other    Massachusetts Institute of Technology. Dept. of Linguistics and Philosophy.    en_US
dc.date.accessioned    2011-04-25T15:53:56Z
dc.date.available    2011-04-25T15:53:56Z
dc.date.copyright    2010    en_US
dc.date.issued    2010    en_US
dc.identifier.uri    http://hdl.handle.net/1721.1/62407
dc.description    Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Linguistics and Philosophy, 2010.    en_US
dc.description    Cataloged from PDF version of thesis.    en_US
dc.description    Includes bibliographical references (p. 54-56).    en_US
dc.description.abstract    If you can help someone without detriment to your own interests or anyone else's, then it is clear that you ought to do so. But things are not always so easy. When there is a conflict of interest, you have to decide what to do. Whom should you help when you cannot help everyone? How much should you sacrifice to help others? May you expose people to risks when helping them? This dissertation addresses aspects of these questions. You ought to save a larger group of people rather than a distinct smaller group of people, all else equal. Why? Chapter 1, "Rational Numbers," offers an explanation. Its two parts can be roughly summarized as follows. First, you are morally required to want each person's survival for its own sake. Second, if you have these ends, you are rationally required to achieve as many of them as possible. Chapter 2, "Ambition and Altruism in the Dynamic Moral Life," poses a puzzle. We would like an account of beneficence to be moderately demanding, and yet still to require you to be ambitious with your altruism. How can these diverging desiderata be simultaneously met? Drawing on empirical work, the chapter defends the following solution: beneficence requires you to develop morally, and to increase how much you give over time. Chapter 3, "Chancy Charity and Aggregative Altruism," argues that two initially attractive claims are inconsistent. First, you must save someone's life rather than cure the headaches of many. Second, you may take a small risk of someone's death when curing this person's headache. Since we cannot hold both of these claims, we are in danger of lacking an explanation of some common intuitions about risk and the priority of serious needs. A candidate explanation is considered but criticized.    en_US
dc.description.statementofresponsibility    by Tom Dougherty.    en_US
dc.format.extent    56 p.    en_US
dc.language.iso    eng    en_US
dc.publisher    Massachusetts Institute of Technology    en_US
dc.rights    M.I.T. theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. See provided URL for inquiries about permission.    en_US
dc.rights.uri    http://dspace.mit.edu/handle/1721.1/7582    en_US
dc.subject    Linguistics and Philosophy.    en_US
dc.title    Help! not just anybody : essays on altruism and conflicts of interest    en_US
dc.title.alternative    Essays on altruism and conflicts of interest    en_US
dc.type    Thesis    en_US
dc.description.degree    Ph.D.    en_US
dc.contributor.department    Massachusetts Institute of Technology. Department of Linguistics and Philosophy
dc.identifier.oclc    710844064    en_US

