NeverMind : an interface for human memory augmentation

Author(s)
Rosello, Oscar (Rosello Gil)
Download
Full printable version (806.5 kB)
Alternative title
Never mind : an interface for human memory augmentation
Interface for human memory augmentation
An interface for human memory augmentation
Other Contributors
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science.
Advisor
Terry Knight, Patrick H. Winston, and Pattie Maes.
Terms of use
MIT theses are protected by copyright. They may be viewed, downloaded, or printed from this source but further reproduction or distribution in any format is prohibited without written permission. http://dspace.mit.edu/handle/1721.1/7582
Abstract
If we are to understand human-level intelligence, we need to understand how memories are encoded, stored, and retrieved. In this thesis, I take a step towards that understanding by focusing on a high-level interpretation of the relationship between episodic memory formation and spatial navigation. Building on this biologically inspired process, I focus on the implementation of NeverMind, an augmented reality (AR) interface designed to help people memorize effectively. Early experiments with a prototype of NeverMind suggest that long-term recall accuracy for sequences of items nearly triples compared with paper-based memorization tasks. In this thesis, I suggest that we can trigger episodic memory for tasks we normally associate with semantic memory by using interfaces that passively stimulate the hippocampus, the entorhinal cortex, and the neocortex. Inspired by the methods used by memory champions, NeverMind facilitates memory encoding by promoting hippocampal activation and task-specific neural firing. NeverMind pairs spatial navigation with visual cues to make memorization tasks effective and enjoyable. The contributions of this thesis are twofold. First, I developed NeverMind, a tool that facilitates memorization from a single exposure by biasing our minds into using episodic memory. When studying, we tend to rely on semantic memory and encoding through repetition; by using augmented reality interfaces, however, we can change how the brain encodes information and memorize content long term from a single exposure, making a memory-champion technique accessible to anyone. Second, I provide an open-source platform for researchers to conduct high-level experiments on episodic memory and spatial navigation. In this thesis I suggest that digital user interfaces can be used as a tool to gather insights into how human memory works.
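The pairing the abstract describes (an ordered sequence of items attached to spatial waypoints and recalled by walking the route, in the style of a memory palace) can be illustrated with a small data structure. The sketch below is not code from the thesis or its open-source platform; it is a minimal Python illustration in which the Locus and MemoryPalace names, the coordinates, and the scoring method are all hypothetical, and the AR rendering layer is omitted entirely.

from dataclasses import dataclass, field

@dataclass
class Locus:
    """A spatial waypoint along a walked route in the memory palace (hypothetical)."""
    name: str
    position: tuple[float, float, float]  # 3-D coordinates in the AR scene

@dataclass
class MemoryPalace:
    """Pairs an ordered list of loci with the items to be memorized at each one."""
    loci: list[Locus]
    items: list[str] = field(default_factory=list)

    def place_items(self, items: list[str]) -> None:
        """Attach one item per locus, in route order."""
        if len(items) > len(self.loci):
            raise ValueError("more items than loci in the palace")
        self.items = list(items)

    def recall_score(self, answers: list[str]) -> float:
        """Fraction of items recalled at the correct waypoint, in route order."""
        if not self.items:
            return 0.0
        correct = sum(given == expected for given, expected in zip(answers, self.items))
        return correct / len(self.items)

# Hypothetical example: a three-stop route through a familiar space.
palace = MemoryPalace(loci=[
    Locus("front door", (0.0, 0.0, 0.0)),
    Locus("staircase", (2.0, 0.0, 1.0)),
    Locus("kitchen", (4.0, 0.0, 2.0)),
])
palace.place_items(["apples", "stamps", "batteries"])
print(palace.recall_score(["apples", "stamps", "candles"]))  # prints 0.666...

In this sketch the recall score is simply the fraction of items reproduced at the correct waypoint, which loosely mirrors the sequence-recall measure the abstract reports; it is meant only to make the item-to-locus pairing concrete.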
Description
Thesis: S.M. in Architecture Studies - Design and Computation, Massachusetts Institute of Technology, Department of Architecture, 2017.
 
Thesis: S.M., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2017.
 
Cataloged from PDF version of thesis. "June 2017."
 
Includes bibliographical references (pages [67]-70).
 
Date issued
2017
URI
http://hdl.handle.net/1721.1/111494
Department
Massachusetts Institute of Technology. Department of Architecture; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Massachusetts Institute of Technology
Keywords
Architecture; Electrical Engineering and Computer Science

Collections
  • Graduate Theses
