Please use this identifier to cite or link to this item: http://hdl.handle.net/10071/5356
Full metadata record
dc.contributor.author: Jardim, David
dc.contributor.author: Oliveira, Sancho
dc.contributor.author: Nunes, Luís
dc.date.accessioned: 2013-07-30T14:04:49Z
dc.date.available: 2013-07-30T14:04:49Z
dc.date.issued: 2013-07-30
dc.identifier.uri: http://hdl.handle.net/10071/5356
dc.description.abstract: In this paper we present a method that allows an agent to discover and create temporal abstractions autonomously. Our method is based on the idea that, to reach the goal, the agent must pass through relevant states that we interpret as subgoals. To detect useful subgoals, our method computes intersections between several paths leading to the goal. Our research focused on domains widely used in the study of temporal abstractions, and we used several versions of the room-to-room navigation problem. We found that, in the problems tested, an agent can learn more rapidly by automatically discovering subgoals and creating abstractions.
dc.language.iso: eng
dc.rights: restrictedAccess
dc.subject: Autonomous Agents
dc.subject: Machine Learning
dc.subject: Reinforcement Learning
dc.subject: Sub-goals
dc.title: Hierarchical Reinforcement Learning: Learning Sub-goals and State-Abstraction
dc.type: conferenceObject
dc.event.title: Workshop on Intelligent Systems and Application (WISA 2011), 6ª Conferência Ibérica de Sistemas e Tecnologias de Informação (CISTI'2011)
dc.event.type: Workshop
dc.event.location: Chaves, Portugal
dc.event.date: 2011
dc.pagination: Vol. II, pp. 245-248
dc.publicationstatus: Published
dc.peerreviewed: Yes
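
The abstract above describes discovering subgoals by intersecting several paths that lead to the goal. As a rough illustration of that idea, here is a minimal sketch in Python. It is not the authors' implementation: the two-room layout, the use of BFS shortest paths with random tie-breaking as stand-ins for the agent's learned trajectories, and all names and thresholds below are assumptions made for the sake of a self-contained example.

    # Minimal sketch (not the authors' code): states shared by (nearly) all
    # paths to the goal are flagged as candidate subgoals, e.g. a doorway.
    import random
    from collections import Counter, deque

    def shortest_path(walls, start, goal, width, height):
        """BFS shortest path on a grid; neighbor order is shuffled so that
        repeated calls can return different (equally short) paths."""
        prev = {start: None}
        queue = deque([start])
        while queue:
            x, y = queue.popleft()
            if (x, y) == goal:
                break
            neighbors = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
            random.shuffle(neighbors)
            for nxt in neighbors:
                if (0 <= nxt[0] < width and 0 <= nxt[1] < height
                        and nxt not in walls and nxt not in prev):
                    prev[nxt] = (x, y)
                    queue.append(nxt)
        if goal not in prev:
            return None
        path, state = [], goal
        while state is not None:
            path.append(state)
            state = prev[state]
        return path[::-1]

    def subgoal_candidates(paths, threshold=0.9):
        """States appearing in at least `threshold` of the paths: the
        'intersection' that marks them as candidate subgoals."""
        counts = Counter(s for path in paths for s in set(path))
        return {s for s, c in counts.items() if c >= threshold * len(paths)}

    if __name__ == "__main__":
        WIDTH, HEIGHT = 11, 5
        # Two rooms joined by a single doorway at (5, 2).
        walls = {(5, y) for y in range(HEIGHT)} - {(5, 2)}
        goal = (10, 4)
        left_room = [(x, y) for x in range(5) for y in range(HEIGHT)]
        paths = [shortest_path(walls, random.choice(left_room), goal,
                               WIDTH, HEIGHT) for _ in range(30)]
        candidates = subgoal_candidates(paths) - {goal}
        print(sorted(candidates))  # the doorway (5, 2) lies on every path

In the paper itself the intersected paths would come from the agent's own successful trajectories gathered during reinforcement learning, and the discovered subgoals would seed the temporal abstractions; the BFS stand-in here merely keeps the sketch runnable without a learning loop.
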
Appears in Collections: CTI-CRI - Comunicações a conferências internacionais

Files in This Item:
File: HRL Short Paper.pdf (Restricted Access)
Size: 321.61 kB
Format: Adobe PDF


