"Enhancing Intelligent Agents with Episoic Memories"

30
Enhancing intelligent agents with episodic memories Dan Tecuci d [email protected] Cognitive Systems Institute Weekly Meeting Sep 8, 2016

Transcript of "Enhancing Intelligent Agents with Episoic Memories"

Enhancing intelligent agents with episodic memories

Dan Tecuci, [email protected]

Cognitive Systems Institute Weekly Meeting

Sep 8, 2016


Outline

• Motivation

– Human Memory

– Why is episodic memory needed in a cognitive system?

• Approach

– Generic Episodic Memory Module

– Requirements

– A proposed implementation

• Evaluation

• Conclusions & Discussion

WHY DO WE NEED MEMORY?

“Those who cannot remember the past are condemned to repeat it.”

George Santayana

• Remembering is an essential characteristic of intelligence

• Humans can

– recall their past experience

– use memories to

– solve similar problems

– avoid unwanted behavior

– recognize plans and infer other people’s goals

– remember their own goals and track progress

• Memory and intelligence go hand in hand

Why it’s important to remember the past


• Experience

– important knowledge source

– mostly unused in current systems

• Importance of experience grows with

– Complexity of task

– Life-expectancy of system

• Eager approach (generalize & discard) – machine learning

– Assumes all value can be extracted up-front

• Lazy approach (store for now, use later)

– Defers (part of) learning till later

The Role of Memory in a Cognitive System


Benefits of Using Stored Memories

• Memory enables a system to:

– improve performance

– solve problems faster by adapting previous solutions

– improve competence

– informed search

– perform additional tasks

– avoid and detect failures

– monitor long-term goals

– reflect on the past


Human Memory - Episodic vs. Semantic

• Differences

– concrete vs. abstract

– dated vs. timeless

– personal vs. general

• Similarities

– knowledge is acquired through senses

– automatic retention

– retrieval triggered by stimuli, automatic


Episodic Memory Functions

• Encoding

– activation (when to store an episode)

– salient feature selection (what to store in an episode)

– cue selection (what features to use as cues)

• Storage

– how to maintain an episode in memory (forgetting)

• Retrieval

– cue construction

– matching

– recall

– recollective experience

A GENERIC EPISODIC MEMORY MODULE

• Characteristics:

– Generic

– Same memory - different apps

– Can be used for various tasks and domains

– Separate from application

– Interface through API

– Store complex experience (e.g. temporal, graph-based)

• Memory function:

– Return most relevant prior episodes

• Advantages:

– focus on memory organization

– reduce complexity of overall system

PROPOSAL: Generic Memory Module


• Accuracy in retrieval - retrieve memories relevant to the situation at hand

• Scalability - accommodate a large number of episodes without a significant decrease in performance

• Efficiency - efficient storage and retrieval (both in space and time)

• Content addressability - memories should be addressable by their content

• Flexible matching - recognize prior situations even if they only partially match the current one

General Memory Requirements


• Conceptual representation for generic events

• Domain-independent storage/retrieval algorithms

• Flexible interface

Challenges


• Episode – the unit of storage

– captures a complex event with temporal extent

– represented as conceptual graphs (sets of S-P-O triples)

– uses an ontology for concept representation

• Episodes are divided into three dimensions

– context = setting of the episode

– contents = ordered set of events

– outcome = evaluation of the episode’s effects

• What constitutes an episode is decided by the application

Episode Representation

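The three-dimensional episode structure can be sketched as a small data type. This is an illustrative Python sketch, not the module's actual API; the class and method names are my assumptions, and plain tuples stand in for the conceptual-graph triples.

```python
from dataclasses import dataclass

@dataclass
class Episode:
    """One unit of storage: a complex event with temporal extent,
    split into the three dimensions described above. Triples are
    (subject, predicate, object) tuples standing in for the
    conceptual-graph representation."""
    context: list    # setting of the episode
    contents: list   # ordered list of events, each a list of triples
    outcome: list    # evaluation of the episode's effects

    def features(self, dimension):
        """Surface features of one dimension, usable as retrieval cues."""
        triples = {
            "context": self.context,
            "contents": [t for event in self.contents for t in event],
            "outcome": self.outcome,
        }[dimension]
        return {element for t in triples for element in t}

# Abridged version of the shell-planning episode shown in the example slide.
ep = Episode(
    context=[("user", "wants", "move-perl-scripts")],
    contents=[[("step1", "isa", "find")], [("step2", "isa", "mv")]],
    outcome=[("episode", "result", "Success")],
)
print(ep.features("contents"))
```

Keeping each dimension separate is what lets retrieval later address the episode by context, contents, or outcome independently.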


Planning Episode Example

Context: “move all perl scripts to the linux folder”

Contents:

sh> find -name linux

./code/linux

sh> find -name *.pl

./code/accessor.pl

./code/constructor.pl

./code/gang/dwarf/aml.pl

sh> mv ./bin/gang/set/convert.pl ./code/accessor.pl ./code/constructor.pl ./code/gang/dwarf/aml.pl ./code/linux

Outcome: “Success”


Using Stored Episodes

• Episodes should be multifunctional

– Same episode can be used for different purposes

• E.g. Planning Episode = [plan goal, plan steps, plan outcome]

• Retrieval can be done on each dimension

– on context (plan goal) → planning

– on contents (plan steps) → plan recognition, outcome prediction

– on outcome → root cause analysis


Memory Implementation - Storage

• Episodes stored unchanged (no generalization)

• Indexing

– separate index on each dimension (context, contents, outcome)

– shallow indexing (only feature types, no structure)

• Forgetting [AISB-10]
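A minimal sketch of this storage scheme (illustrative Python; all names are my assumptions): episodes are kept unchanged, while a shallow inverted index per dimension maps surface feature types to episode ids.

```python
from collections import defaultdict

class EpisodicStore:
    """Stores episodes unchanged; indexes only surface feature types
    (no structure), separately for each of the three dimensions."""

    DIMENSIONS = ("context", "contents", "outcome")

    def __init__(self):
        self.episodes = []  # episodes kept verbatim, no generalization
        self.index = {d: defaultdict(set) for d in self.DIMENSIONS}

    def store(self, episode):
        """episode: dict mapping dimension -> set of feature types."""
        eid = len(self.episodes)
        self.episodes.append(episode)
        for dim in self.DIMENSIONS:
            for feature in episode.get(dim, ()):
                self.index[dim][feature].add(eid)
        return eid

    def candidates(self, dimension, cues):
        """Fast, high-recall lookup: any episode sharing a cue."""
        hits = set()
        for cue in cues:
            hits |= self.index[dimension].get(cue, set())
        return hits

store = EpisodicStore()
store.store({"context": {"move", "script"}, "contents": {"find", "mv"},
             "outcome": {"success"}})
store.store({"context": {"copy", "image"}, "contents": {"cp"},
             "outcome": {"failure"}})
print(store.candidates("contents", {"mv"}))  # only the first episode
```

Because the index records feature types rather than structure, storage stays cheap and lookup stays fast; structural comparison is deferred to the deep matching stage at retrieval time.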


Memory Implementation–Retrieval

• Shallow indexing then deep semantic matching

– Shallow indexing: compute surface-level similarity

– Goal: Reduce pool of candidates

– Fast, high recall, low precision

– Deep semantic matching [Yeh-06]

– Goal: Resolve structural mismatches

– Slow, high precision

– Uses taxonomic knowledge and transformation rules to resolve mismatches

• Given a new stimulus and an episode, computes:

– similarities and differences

– quantitative and qualitative

• store(episode)

• retrieve(stimulus, dimension)

– Returns:

– most similar prior episodes on that dimension

– match score

– how they matched the stimulus

– how they differ from the stimulus

– mappings from stimulus to episode

– An incremental version supports recognizing sequences of events

Memory API
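The two-stage retrieval behind retrieve(stimulus, dimension) might look like the following sketch. The function names, the threshold value, and the exact-overlap stand-in for deep matching are all assumptions made here; the actual deep matcher uses taxonomic knowledge and transformation rules [Yeh-06].

```python
def shallow_score(cues, features):
    """Stage 1: surface-level similarity (fast, high recall, low precision)."""
    return len(cues & features) / len(cues) if cues else 0.0

def deep_match(stimulus, features):
    """Stage 2 stand-in: the real matcher resolves structural
    mismatches; here we just report similarities and differences."""
    shared = stimulus & features
    return {"score": len(shared),
            "similarities": shared,
            "differences": features - stimulus}

def retrieve(stimulus, memory, top_k=3, threshold=0.3):
    """memory: list of (episode_id, feature_set) on one dimension.
    Returns the most similar episodes, their match score, and how
    they matched or differed from the stimulus."""
    # shallow pass prunes the candidate pool...
    pool = [(eid, feats) for eid, feats in memory
            if shallow_score(stimulus, feats) >= threshold]
    # ...then the slow, high-precision pass ranks the survivors
    ranked = sorted(((eid, deep_match(stimulus, feats)) for eid, feats in pool),
                    key=lambda pair: pair[1]["score"], reverse=True)
    return ranked[:top_k]

memory = [(0, {"find", "mv", "perl"}), (1, {"cp", "image"})]
results = retrieve({"mv", "perl", "linux"}, memory)
print(results[0])  # episode 0 with its match details
```

The point of the two stages is cost: the expensive semantic comparison only ever runs on the small pool the cheap surface pass lets through.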


Goal:

– make predictions after each observation

Applicable to:

– plan recognition, dialog understanding

Idea:

– segment episodes based on temporal links,

– recognize individual pieces

– then aggregate into episodes

– Confidence of recognized episode = a combination of:

– confidence in recognition of individual pieces

– the order in which they were observed

Incremental Recognition


Incremental Recognition Algorithm

initialize candidates

loop:
    observe next action
    new-candidates ← retrieve(current-action)
    forall episode in new-candidates do
        if episode not in candidates then
            synchronize-candidate(episode, prior-actions)
    forall candidate in candidates do
        candidate-match ← match(current-action, candidate)
        candidate ← update-candidate(candidate-match)
    result ← sort(candidates)
    make-available(first-n(N, result))
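The algorithm above can be made concrete with a simplified, runnable sketch. Two simplifying assumptions are mine: episodes are flat action lists, and confidence is just the fraction of an episode's actions seen so far (the original also weighs the order in which pieces were observed).

```python
def recognize_incrementally(observations, library, top_n=2):
    """Yield ranked candidate episodes after every observed action.
    library: episode name -> list of actions."""
    candidates = {}  # episode name -> number of matched actions
    for step, action in enumerate(observations):
        # retrieve episodes that mention the current action
        for name, actions in library.items():
            if action in actions and name not in candidates:
                # synchronize the new candidate against prior actions
                candidates[name] = sum(1 for a in observations[:step]
                                       if a in actions)
        # match the current action against every live candidate
        for name in candidates:
            if action in library[name]:
                candidates[name] += 1
        # make the best candidates available after each observation
        ranked = sorted(candidates.items(),
                        key=lambda kv: kv[1] / len(library[kv[0]]),
                        reverse=True)
        yield ranked[:top_n]

library = {"make-coffee": ["boil", "grind", "pour"],
           "make-tea": ["boil", "steep", "pour"]}
for prediction in recognize_incrementally(["boil", "grind"], library):
    print(prediction)
```

After "boil" both plans are equally plausible; after "grind" the ranking commits to "make-coffee", which is the early-prediction behavior the slide asks for.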

EXPERIMENTAL EVALUATION

“In God we trust, all others bring data”

W. Edwards Deming


Experimental Evaluation

Evaluated on three tasks [Tecuci-Diss]

– Memory-based planning: initial state + goal → plan

– Episodic-based goal recognition: plan → goal schema

– Memory-based question answering: question → answer

• Measured

– Task performance (precision, recall)

– Memory performance (retrieval time, memory overhead)

• Same representation across tasks


Memory-Based Planning

• Problem

– Given: initial state, goal state, operators

– Find: sequence of operators that changes the initial state into the goal state

• Solution:

– search (restricted: hierarchical, skeletal)

• Memory-based planning

– reuse and adapt past experience


Episodic-based Plan Recognition

• Plan Recognition problem:

– predict goals, intentions, future actions from observed actions

– keyhole, intended

• Desired characteristics

– incremental, early predictions

– extensible plan library

• Approaches:

– deductive, abductive, probabilistic, case-based


Memory-Based Problem Solving

• Problem

– Given: KB, complex question

– Find: correct model in the KB that answers the question, and explain the answer

• Ex: A car starts from rest and reaches 28 m/s in 2 s. What distance does it cover?

• Questions = scenario + query

• Classical solution: search – but KB size and model complexity make it infeasible or incomplete

• Memory provides fast access to relevant models
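As a check on the example question (worked arithmetic added here, not on the original slide), constant-acceleration kinematics gives:

```latex
a = \frac{\Delta v}{\Delta t} = \frac{28\,\text{m/s}}{2\,\text{s}} = 14\,\text{m/s}^2,
\qquad
d = \tfrac{1}{2} a t^2 = \tfrac{1}{2}\cdot 14 \cdot 2^2 = 28\,\text{m}
```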


Summary of Evaluation Results

• Accuracy – same as

– exhaustive search (planning, problem solving)

– statistical approaches (plan recognition)

• Scalability

– retrieval time not proportional to memory size

• Speeds up problem solving

• Multifunctional memory structure

Watson + Memory

• Application domains: complex tasks, temporal aspect

– Dialog

– Remembering what was said

– Goal detection

– Prediction

– Robotics

– Complex behaviors

– Virtual agent

– Prediction

• Memory as a Service ?


Summary

• The need for memory in cognitive systems

• Separation of memory from system

– Generic, reusable memory module

– Adds episodic memory functionality to the system

• Requirements

• Implementation satisfying requirements

• Evaluation

– planning, plan recognition, problem solving

• [Yeh-06] Yeh, P. "Flexible Semantic Matching of Rich Knowledge Structures." PhD Diss., UT Austin, 2006.

• [Tecuci-Flairs-09] Tecuci, D.; Porter, B. "Memory-Based Goal Schema Recognition." FLAIRS 2009.

• [Tecuci-AAAI-06] "Using an Episodic Memory Module for Pattern Capture and Recognition." AAAI 2006.

• [Tecuci-Diss] Tecuci, D. "A Generic Memory Module for Events." PhD Diss., Univ. of Texas, 2007.

• [ICBO] Palla et al. "A Metadata Approach to Querying Multiple Biomedical Ontologies." ICBO 2011.

• [KCap-11] Palla et al. "Using Answer Set Programming for Representing and Reasoning with Preferences and Uncertainty in Dynamic Domains."

• [AI-04] Friedland et al. "Project Halo: Towards a Digital Aristotle." AI Magazine 25(4), 2004.

• [AI-10] Gunning et al. "Project Halo Update – Progress Towards Digital Aristotle." AI Magazine, 2010.

• [KR-04a] Barker et al. "A Question-Answering System for AP Chemistry: Assessing KR&R Technologies." KR 2004.

• [KR-04b] Friedland et al. "Towards a Quantitative, Platform-Independent Analysis of Knowledge Systems." KR 2004.

• [AAAI-07] Barker et al. "Learning by Reading: A Prototype System, Performance Baseline and Lessons Learned." AAAI 2007.

• [KCAP-07] Chaw et al. "Capturing a Taxonomy of Failures During Automatic Interpretation of Questions Posed in Natural Language." K-CAP 2007.

• [AISB-10] Nuxoll et al. "Comparing Forgetting Algorithms for Artificial Episodic Memory Systems." RWWA Workshop at AISB 2010.

• [KCAP-01] Clark, P. et al. "Knowledge Entry as the Graphical Assembly of Components." K-CAP 2001.

• [HALO] Vulcan Inc. Project Halo website: http://projecthalo.com/halotempl.asp?cid=21

• [KM] The Knowledge Machine: http://www.cs.utexas.edu/users/mfkb/RKF/km.html

References


• Thank you!