Open Access Dissertation
Doctor of Philosophy (PhD)
Artificial Intelligence and Robotics | Cognitive Neuroscience | Robotics
In the book "On Intelligence", Hawkins states that intelligence should be measured by the capacity to memorize and predict patterns. I further suggest that the ability to predict action consequences based on perception and memory is essential for robots to demonstrate intelligent behaviors in unstructured environments. However, traditional approaches generally represent action and perception separately: computer vision modules recognize objects, and planners execute actions based on labels and poses. I propose here a more integrated approach in which action and perception are combined in a memory model, so that a sequence of actions can be planned based on predicted action outcomes. In this framework, hierarchical visual features based on convolutional neural networks are introduced to capture the essential affordances. Features at different levels of the hierarchy are associated with robot controllers of corresponding kinematic subchains to support manipulation. Through learning from demonstration, both actions and informative features in the memory model can be learned efficiently. As more demonstrations are recorded and more interactions are observed, the robot becomes more capable of predicting the consequences of actions, and is thus better at planning sequences of actions to solve tasks under different circumstances.
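The memory model the abstract describes can be sketched in miniature: perceptual features at different hierarchy levels are associated with controllers of corresponding kinematic subchains, and observed (feature, action, outcome) interactions are memorized so that action consequences can be predicted for planning. All names below (`SUBCHAIN_FOR_LEVEL`, `MemoryModel`, the nearest-neighbor prediction rule) are illustrative assumptions for this sketch, not the dissertation's actual implementation.

```python
from math import dist

# Hypothetical association of feature-hierarchy levels with kinematic
# subchains, mirroring the idea that features at different hierarchies
# drive controllers of corresponding parts of the robot.
SUBCHAIN_FOR_LEVEL = {
    "low": "wrist_and_fingers",   # fine-grained features -> end-effector
    "mid": "arm",                 # part-level features -> arm motion
    "high": "base_and_torso",     # object-level features -> gross positioning
}

class MemoryModel:
    """Stores observed (feature, action) -> outcome interactions."""

    def __init__(self):
        self.episodes = []  # list of (feature_vec, action, outcome_vec)

    def record(self, feature, action, outcome):
        """Memorize one observed interaction."""
        self.episodes.append((feature, action, outcome))

    def predict(self, feature, action):
        """Predict the outcome of `action` from the nearest remembered
        episode with the same action, or None if nothing is remembered."""
        candidates = [e for e in self.episodes if e[1] == action]
        if not candidates:
            return None
        best = min(candidates, key=lambda e: dist(e[0], feature))
        return best[2]

# Toy usage: two remembered grasp attempts, then a prediction for a
# new percept that resembles the successful one.
memory = MemoryModel()
memory.record((0.1, 0.9), "grasp", (1.0, 0.0))  # grasp succeeded
memory.record((0.8, 0.2), "grasp", (0.0, 1.0))  # grasp failed
print(memory.predict((0.2, 0.8), "grasp"))       # -> (1.0, 0.0)
```

A planner built on such a model would score candidate action sequences by chaining these predicted outcomes; as more interactions are recorded, the predictions cover more of the state space, which is the sense in which more demonstrations make the robot better at planning.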
Ku, Li Yang, "Integration of Robotic Perception, Action, and Memory" (2018). Doctoral Dissertations. 1364.