
How Minds Work
Memories and Learning
Stan Franklin
Computer Science Division &
Institute for Intelligent Systems
The University of Memphis

Human Learning
More or less:
Continual
Quick
Efficient
Accurate
Robust
Flexible
Effortless

Problems with Machine Learning
Requires large, accurate training sets
Little awareness of what's known or not known
Integrates new knowledge poorly into old
Learns one task at a time
Little transfer of learned knowledge to new tasks
Poor at learning from human teachers


For more humanlike learning in machines
Design autonomous software agents and mobile robots
using principles from human learning


Some Principles of Human Learning
There's no learning from scratch
We learn what we attend to
Learning is a trial and error process
Much memory is associative and content addressable


Lessons for Machine Learning
Build in:
Primitive feature detectors
Preferences for learning
Attention mechanism
Base-level activation
Inverse sigmoidal decay
Make memory:
Associative
Content addressable


Types of Human Learning Requiring Distinct Mechanisms
Perceptual learning: identify, classify, situate
Procedural learning: new skills, improve skills, automatize
Episodic learning: what, where, when
Attentional learning: to what to attend


Examples of Learning Mechanisms
Perceptual learning via a Slipnet
Episodic learning via Sparse Distributed Memory
Procedural learning via a Schema Net


A mechanism for perceptual learning
Semantic net with activation passing
Nodes represent features, individuals, categories, ideas
Links represent relations, some excitatory, some inhibitory
Total activation = current + base-level
Percept composed of nodes over threshold
Learning modifies base-level activation (a minimal sketch follows below)

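The following is a minimal Python sketch of the Slipnet mechanics just listed. The names (SlipnetNode, pass_activation, percept) and all numeric constants are illustrative assumptions, not the IDA implementation.

```python
# Minimal Slipnet sketch (illustrative; not the IDA code).
from dataclasses import dataclass, field

@dataclass(eq=False)
class SlipnetNode:
    name: str
    current: float = 0.0       # current activation: decays rapidly
    base_level: float = 0.0    # base-level activation: learned, decays slowly
    links: dict = field(default_factory=dict)   # target node -> weight (+ excitatory, - inhibitory)

    def total_activation(self) -> float:
        # Total activation = current + base-level
        return self.current + self.base_level

def pass_activation(nodes, steps=5, damping=0.5):
    """Spread activation along links for a few steps, until roughly stable."""
    for _ in range(steps):
        incoming = {n: 0.0 for n in nodes}
        for n in nodes:
            for target, weight in n.links.items():
                incoming[target] += weight * n.total_activation()
        for n in nodes:
            n.current = max(0.0, damping * (n.current + incoming[n]))

def percept(nodes, threshold=1.0):
    """The percept is composed of the nodes whose total activation is over threshold."""
    return [n for n in nodes if n.total_activation() > threshold]

def perceptual_learning(conscious_nodes, rate=0.1):
    """Learning strengthens the base-level activation of nodes in the conscious contents."""
    for n in conscious_nodes:
        n.base_level += rate * (1.0 - n.base_level)   # saturating reinforcement
```

In this sketch, primitive feature detectors would set the current activation of their nodes directly from sensory input; pass_activation then propagates it upward before the percept is read off.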

A mechanism for episodic learning
Sparse distributed memory
Few hard locations, HUGE address space
Each hard location contributes to the encoding of many different events
Each event is encoded to many different hard locations
Associative and content addressable
Psychological properties:
Knows when it doesn't know
Tip-of-the-tongue phenomena
(A minimal sketch follows below.)

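Below is a minimal, Kanerva-style sparse distributed memory sketch in Python. The dimension, number of hard locations, and access radius are illustrative assumptions chosen only so the example behaves sensibly at small scale.

```python
# Minimal sparse distributed memory sketch (Kanerva-style); all sizes are illustrative.
import numpy as np

class SDM:
    def __init__(self, dim=256, n_hard=500, radius=115, seed=0):
        rng = np.random.default_rng(seed)
        # A relatively small number of hard locations scattered in a huge binary address space.
        self.hard = rng.integers(0, 2, size=(n_hard, dim))
        self.counters = np.zeros((n_hard, dim), dtype=int)
        self.radius = radius

    def _near(self, address):
        # Hard locations within the access radius (Hamming distance) of the address.
        return np.count_nonzero(self.hard != address, axis=1) <= self.radius

    def write(self, address, data):
        # Each event is spread over many hard locations; each location encodes many events.
        sel = self._near(address)
        self.counters[sel] += np.where(data == 1, 1, -1)

    def read(self, cue):
        # Content-addressable read from a (possibly partial or noisy) cue.
        sel = self._near(cue)
        sums = self.counters[sel].sum(axis=0)
        # Weak or empty support is how the memory "knows when it doesn't know".
        return (sums > 0).astype(int), int(sel.sum())

# Usage (autoassociative): write a pattern at its own address, then recall it from a cue.
# mem = SDM(); v = np.random.default_rng(1).integers(0, 2, 256)
# mem.write(v, v); recalled, support = mem.read(v)
```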

A Mechanism for Procedural Learning
Procedural learning via a schema net
Activation = current + base-level
Procedural learning:
Reinforces base-level activation
Forms new schemas
Schemas instantiate, activate, bind
Incremental learning produces new behavior streams


Cognitive Cycles
Every autonomous agent operates in the world by frequent, probably cascading, sense-process-act cycles
Learning takes place during each cycle
Learning is a function of attention and of arousal level
Feelings and emotions modulate learning (a minimal loop sketch follows below)

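Here is a minimal sense-process-act loop reflecting these points. The hooks on the agent (sense, process, select_action, act, attention, arousal, learn) are placeholders assumed for illustration, not IDA's actual interfaces.

```python
# Illustrative sense-process-act cycle; the agent's hooks are assumed placeholders.
def cognitive_cycle(agent, environment):
    stimulus = agent.sense(environment)          # sensation
    percept = agent.process(stimulus)            # perception / understanding
    action = agent.select_action(percept)        # action selection
    agent.act(action, environment)               # act on the world
    # Learning happens on every cycle; its strength is a function of attention
    # and of arousal level, so feelings and emotions modulate it.
    rate = agent.attention(percept) * agent.arousal()
    agent.learn(percept, action, rate)

def run(agent, environment, cycles=100):
    # Frequent cycles; in the model some may cascade (overlap in time), here they run serially.
    for _ in range(cycles):
        cognitive_cycle(agent, environment)
```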

Human Cognitive Cycle Processing
Hypothesis: Human cognitive processing is via a continuing sequence of cognitive cycles
Duration: Each cognitive cycle takes roughly 200 ms
Cascading: Several cycles may have parts running simultaneously, in parallel
Seriality: Consciousness maintains serial order and the illusion of continuity
Start: A cycle may start with action selection instead of perception


Perceptual Associative Memory
The ability to interpret incoming stimuli by:
recognizing individuals
categorizing them
noting situations
Ubiquitous among animal species
Animals of all sorts can identify food sources, potential mates, potential predators, etc.

Examples of PAM
Pigeons can categorize photos using such concepts as tree, fish, and human
Honeybees can identify letters independently of size, color, position, or font
An African Grey Parrot (Alex) can identify such features as size, number, color, and material of (sets of) objects never before seen


Perceptual Learning Premises
(Almost?) ubiquitous among animals
Evolutionarily older than semantic memory
Distinct neural mechanisms
Conscious awareness sufficient
Facilitated by feelings and emotions
Decays by an inverse sigmoid function
First step in a sense-cognize-act cycle

Sensation
Sensory receptors are directed by action:
Saccades of the eyes
A sniff
The turning of an ear
The sending of an echolocation signal
The environment impinges on the receptors


Perception
Filters sensory input based on expectation
Simultaneously attaches meaning to it
Identifies individuals, categories, situations, and feelings
Produces a percept including individuals, categories, relations, ideas, and some interpreted sensory data, i.e., qualia


Perception in IDA's Cycle
Input arrives through the senses
Perception codelets find features and activate appropriate nodes in the Slipnet
Activation passes from node to node until the Slipnet stabilizes
Streams from different senses converge, and bits of meaning combine into larger chunks
Nodes over threshold form the percept
Sensory stimuli are thus received and interpreted, producing initial meaning

Each Node a Feature Detector
Primitive feature detector: a direct connection to a receptive field
Higher-level feature detectors combine lower-level feature detectors
Object nodes detect the features of an object
Category nodes detect members as features, as well as other features

Perceptual Learning
Using the broadcast contents of consciousness:
Strengthen (or weaken) existing objects, categories, ideas, relations, feelings, etc.
Create new objects, categories, situations


Contents of Consciousness
Slipnet nodes are perceptual symbols
A uniform representation throughout the IDA model
Slipnet nodes comprise the conscious contents


Modifying Existing Nodes
Current activation:
comes from other nodes, starting with primitive feature detectors
decays rapidly
Base-level activation:
decays by an inverse sigmoidal function
Total node activation is a function of the current and base-level activations
Perceptual learning modifies the base-level activation of each node in the conscious contents

Inverse Sigmoidal Decay
[Plot: decay rate as a function of base-level activation]
Low base-level activation: rapid decay
Saturated base-level activation: almost no decay
(A numerical sketch follows below.)
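The talk does not give the exact decay function, so the following is one plausible form, stated as an assumption: the decay rate falls off sigmoidally as base-level activation grows, which produces the behavior on the slide (weakly learned nodes fade fast, saturated ones barely decay).

```python
# One plausible "inverse sigmoidal decay" (an assumption; the exact function is not given):
# the decay rate is a decreasing sigmoid of base-level activation.
import math

def decay_rate(base_level, max_rate=0.5, midpoint=0.5, steepness=10.0):
    # High when base_level is near 0, close to zero once base_level saturates toward 1.
    return max_rate / (1.0 + math.exp(steepness * (base_level - midpoint)))

def decay(base_level, dt=1.0):
    return max(0.0, base_level - decay_rate(base_level) * dt)

# decay_rate(0.1) is roughly 0.49 (rapid decay); decay_rate(0.9) is roughly 0.009 (almost none).
```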

Learning New Nodes
Object nodes: from noting contiguity of motion of features
Category nodes: from noting similarity of objects
Requires specific attention codelets
A generate-and-test procedure, due to inverse sigmoidal decay (sketched below)
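A hedged sketch of the generate-and-test dynamic: a newly created node starts with low base-level activation, so it decays rapidly and is effectively discarded unless it keeps being reinforced by reappearing in the conscious contents. The decay function and all rates are illustrative assumptions like those above.

```python
# Generate-and-test via decay (illustrative rates): a new node survives only if it is
# reinforced often enough to outrun its rapid early decay.
import math

def decay_rate(base, max_rate=0.5, midpoint=0.5, steepness=10.0):
    return max_rate / (1.0 + math.exp(steepness * (base - midpoint)))

def step(base, reinforced, rate=0.6):
    if reinforced:                       # the candidate node made it into conscious contents
        base += rate * (1.0 - base)      # saturating reinforcement
    return max(0.0, base - decay_rate(base))

base = 0.05                              # a freshly generated node: low base-level activation
for reinforced in (True, True, False, True, True, True):
    base = step(base, reinforced)
# Frequent reinforcement drives the base-level activation toward saturation; without it,
# the node decays back toward zero: the "test" half of generate-and-test.
```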


Transient Episodic Memory
Records what, when, how, feelings, actions
Content addressable from partial cues
Consciously noted events are encoded
Modulated by feelings and emotions:
More affect, stronger encoding
More affect, more often in consciousness
Decays in hours or a day (a small sketch follows below)
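A small sketch of the two quantitative claims: encoding strength grows with affect, and the trace decays on a timescale of hours. The exact scaling and the half-life are assumptions for illustration only.

```python
# Illustrative transient-episodic-memory trace: affect strengthens encoding, and the
# trace decays over hours (the scaling and half-life below are assumptions).
def encode_strength(base=1.0, affect=0.0):
    # More affect -> stronger encoding (and, in the model, more frequent conscious rehearsal).
    return base * (1.0 + affect)

def trace_strength(initial, age_seconds, half_life_hours=6.0):
    # Exponential decay with an hours-scale half-life stands in for "decays in hours or a day".
    return initial * 0.5 ** (age_seconds / (half_life_hours * 3600.0))

# An emotionally charged event starts stronger and so outlives a neutral one:
# trace_strength(encode_strength(affect=0.9), 6 * 3600)   # ~0.95
# trace_strength(encode_strength(affect=0.0), 6 * 3600)   # ~0.50
```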

Writing to TEM
Uses sparse distributed memory
Perceptual symbols = universal representation
Primitive feature detectors = subsets of dimensions
Translate higher-level feature detectors to primitive feature detectors for writing (sketched below)

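A hedged sketch of that translation step: each primitive feature detector owns a subset of the SDM word's bit dimensions, and higher-level nodes are expanded into the primitive features they are built from before writing. The feature names and bit assignments below are invented examples, not the IDA model's features.

```python
# Illustrative translation of a percept into an SDM word: each primitive feature owns
# some bit positions ("subsets of dimensions"); higher-level nodes expand to primitives.
# The feature maps below are made-up examples.
import numpy as np

DIM = 64
PRIMITIVE_BITS = {"red": range(0, 8), "round": range(8, 16), "moving": range(16, 24)}
HIGHER_LEVEL = {"ball": ["red", "round"], "rolling ball": ["ball", "moving"]}

def to_primitives(node):
    # Recursively expand a higher-level feature detector into primitive feature detectors.
    if node in PRIMITIVE_BITS:
        return {node}
    return set().union(*(to_primitives(part) for part in HIGHER_LEVEL.get(node, [])))

def encode_percept(nodes, dim=DIM):
    word = np.zeros(dim, dtype=int)
    for node in nodes:
        for prim in to_primitives(node):
            word[list(PRIMITIVE_BITS[prim])] = 1
    return word            # this word is what gets written to (and later cues) the SDM

# encode_percept(["rolling ball"]) sets the bits for red, round, and moving.
```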

Reading from TEM
The contents of working memory form the cue
The cue must be translated to primitive feature detectors
The return from SDM is routed through perception to produce perceptual symbols

Declarative Memory
Autobiographical + semantic
Read from just as TEM is
Writing is by consolidation from TEM
Inverse sigmoidal decay:
Can decay very quickly
Can last for decades



Procedural Learning
Learning new tasks: instructionist
Reinforcing old tasks: selectionist
Learning via consciousness
Primitive effectors (motor neurons & muscles) are not learned
Very short term to very long term


Implementation
Procedural learning via a schema net
Schema (context, action, result) = behavior codelet in priming mode
Primitive effector = empty schema (action only)
Links from schema to derived schema
(A minimal sketch follows below.)

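A minimal sketch of these structures in Python. The class and field names are illustrative assumptions, not the IDA code.

```python
# Minimal schema-net structures (illustrative names).
from dataclasses import dataclass, field
from typing import Callable, FrozenSet, List

@dataclass(eq=False)
class Schema:
    context: FrozenSet[str]           # conditions under which the action is applicable
    action: Callable[[], None]        # the behavior codelet's action (kept in "priming mode")
    result: FrozenSet[str]            # what the action is expected to bring about
    base_level: float = 0.0           # learned strength, reinforced when the action succeeds
    parents: List["Schema"] = field(default_factory=list)   # links from schema to derived schema

def primitive_effector(action: Callable[[], None]) -> Schema:
    # A primitive effector is an "empty" schema: only an action, no context or result.
    return Schema(context=frozenset(), action=action, result=frozenset())

def derive(parent: Schema, context: FrozenSet[str], result: FrozenSet[str]) -> Schema:
    # A derived schema reuses its parent's action under a more specific context and result.
    return Schema(context=context, action=parent.action, result=result, parents=[parent])
```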

Activation of a Schema
Modified as a result of conscious content
Base-level activation: reinforced if the action succeeded
Current activation depends on:
Relevance of the context to the current situation
Relevance of the result to current goals or feelings
Total activation = base-level + current (computed in the sketch below)
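A hedged sketch of the activation computation. The overlap measure and the weights are assumptions; the slide only fixes that current activation depends on context and result relevance and that total activation is base-level plus current.

```python
# Illustrative schema activation; the overlap scoring and the weights are assumptions.
def overlap(a: set, b: set) -> float:
    return len(a & b) / len(a) if a else 0.0

def current_activation(context: set, result: set, situation: set, goals: set,
                       w_context: float = 0.6, w_result: float = 0.4) -> float:
    # Relevance of the context to the current situation, and of the result to current goals/feelings.
    return w_context * overlap(context, situation) + w_result * overlap(result, goals)

def total_activation(base_level: float, context: set, result: set,
                     situation: set, goals: set) -> float:
    # Total activation = base-level + current
    return base_level + current_activation(context, result, situation, goals)

# e.g. a schema whose context matches the situation and whose result matches a goal:
# total_activation(0.3, {"door closed"}, {"door open"}, {"door closed"}, {"door open"})  # -> 1.3
```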

Decay of Schema Activation
Base-level activation: inverse sigmoidal decay
Schemas with low base-level activation decay very rapidly
Saturated schemas hardly decay at all
So learning can be short- or long-term
Current activation decays rapidly


Selectionist Learning
Selectionist = reinforcement
If the conscious broadcast says the expectation was met, and the affect valence is positive:
Increase base-level activation in proportion to the affect level
If the valence is negative, or the expectation was not met:
Decrease base-level activation in proportion to the affect level
(Sketched below.)
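The rule above translates directly into a small update function; the learning-rate constant and the saturating form are assumptions.

```python
# Illustrative selectionist (reinforcement) update of a schema's base-level activation.
def selectionist_update(base_level: float, expectation_met: bool,
                        valence_positive: bool, affect: float, rate: float = 0.1) -> float:
    """affect is the affect level in [0, 1]; rate is an assumed learning-rate constant."""
    if expectation_met and valence_positive:
        # Expectation met and valence positive: reinforce in proportion to the affect level.
        base_level += rate * affect * (1.0 - base_level)
    else:
        # Valence negative, or expectation not met: weaken in proportion to the affect level.
        base_level -= rate * affect * base_level
    return base_level
```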

Instructionist Learning
Behavior = goal context = schema with a parallel compound action
Built by merging two schemas into a third
Behavior stream = goal-context hierarchy = schema with a sequential compound action
Also built by merging two schemas into a third
(A merge sketch follows below.)

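A hedged sketch of the two merges, reusing the Schema shape from the earlier sketch; how the contexts and results of the pair combine is an assumption.

```python
# Illustrative instructionist merges: two schemas combine into a third whose action is a
# parallel or a sequential compound of the originals. Context/result combination is assumed.
from dataclasses import dataclass, field
from typing import Callable, FrozenSet, List

@dataclass(eq=False)
class Schema:
    context: FrozenSet[str]
    action: Callable[[], None]
    result: FrozenSet[str]
    parents: List["Schema"] = field(default_factory=list)

def merge_parallel(a: Schema, b: Schema) -> Schema:
    # Behavior (goal context): both actions are primed together (run back to back here for simplicity).
    def compound():
        a.action(); b.action()
    return Schema(a.context | b.context, compound, a.result | b.result, parents=[a, b])

def merge_sequential(a: Schema, b: Schema) -> Schema:
    # Behavior stream (goal-context hierarchy): a's action runs first, then b's.
    def compound():
        a.action(); b.action()
    return Schema(a.context, compound, b.result, parents=[a, b])
```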

Attentional Learning
Built-in attention codelets for:
Temporal sequence (= causality)
Similarity (= categorization)
Contiguity of motion (= object formation)
An expectation codelet is spawned with each executed behavior (codelet)
An intention codelet is produced with each volitional decision
Others learned?

Email and Web Addresses
Stan Franklin
franklin@memphis.edu
www.cs.memphis.edu/~franklin
Conscious Software Research Group
www.csrg.memphis.edu/

