Show simple item record

dc.contributor.author: Cummings, M. L.
dc.date.accessioned: 2014-09-24T18:48:54Z
dc.date.available: 2014-09-24T18:48:54Z
dc.date.issued: 2006
dc.identifier.uri: http://hdl.handle.net/1721.1/90321
dc.description.abstract: When the human element is introduced into decision support system design, entirely new layers of social and ethical issues emerge but are not always recognized as such. This paper discusses those ethical and social impact issues specific to decision support systems and highlights areas that interface designers should consider during design, with an emphasis on military applications. Because of the inherent complexity of socio-technical systems, decision support systems are particularly vulnerable to certain potential ethical pitfalls that encompass automation and accountability issues. If computer systems diminish a user's sense of moral agency and responsibility, an erosion of accountability could result. In addition, these problems are exacerbated when an interface is perceived as a legitimate authority. I argue that when developing human–computer interfaces for decision support systems that have the ability to harm people, the possibility exists that a moral buffer, a form of psychological distancing, is created which allows people to ethically distance themselves from their actions. (en_US)
dc.publisher: Journal of Technology Studies (en_US)
dc.subject: decision support system design (en_US)
dc.subject: ethics (en_US)
dc.subject: human computer interfaces (en_US)
dc.title: Automation and Accountability in Decision Support System Interface Design (en_US)
dc.type: Article (en_US)



This item appears in the following Collection(s)

  • HAL Reports
    Technical Reports Series - Humans and Automation Laboratory
