Posted on 31-Dec-2015
Artificial Neural Network: Hopfield Neural Network (HNN)
Professor 李麗華, Department of Information Management, Chaoyang University of Technology (朝陽科技大學)
Associative Memory (AM) - 1
• Def: Associative memory (AM) is any device that associates a set of predefined output patterns with specific input patterns.
• Two types of AM:
– Auto-associative memory: converts a corrupted input pattern into the stored pattern it most resembles.
– Hetero-associative memory: produces the output pattern that was stored for the most similar input pattern.
Associative Memory (AM) - 2
Models: an AM performs an associative mapping of an input vector X into an output vector V.
EX: Hopfield Neural Network (HNN)
EX: Bidirectional Associative Memory (BAM)
[Figure: block diagram — inputs X1, X2, X3, …, Xn enter an Associative Memory box, which outputs v1, v2, v3, …, vm]
Introduction
• Hopfield Neural Network (HNN) was proposed by Hopfield in 1982.
• HNN is an auto-associative memory network.
• It is a one-layer, fully connected network.
[Figure: one-layer network over nodes X1, X2, …, Xn, with connections between every pair of nodes]
HNN Architecture
• Input: Xi ∈ {-1, +1}
• Output: the same nodes as the input (∵ single-layer network)
• Transfer function:
  Xj_new = +1 if net_j > 0; Xj_old if net_j = 0; -1 if net_j < 0
  (Xj_old refers to the previous value of Xj)
• Weights: W_ij = Σ_p Xi^p Xj^p, with W_ii = 0
• Connections: every node is connected to every other node
[Figure: fully connected single-layer network over X1, X2, …, Xn]
HNN Learning Process
• Learning process:
a. Set up the network, i.e., design the input nodes & connections.
b. Calculate the weight matrix: W_ij = Σ_p Xi^p Xj^p, with W_ii = 0.
c. Store the weight matrix. The learning process is done once the weight matrix is derived.
We obtain an n×n weight matrix, W_{n×n}.
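The learning step above can be sketched in Python (a minimal illustration of the slide's rule, not from the lecture itself; the function name `hnn_learn` is ours):

```python
# Sketch of the HNN learning step: W_ij = sum over patterns p of Xi^p * Xj^p,
# with the diagonal forced to zero (W_ii = 0). Patterns are bipolar (+1/-1) lists.

def hnn_learn(patterns):
    n = len(patterns[0])
    W = [[0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:  # keep W_ii = 0
                    W[i][j] += p[i] * p[j]
    return W

# Two small example patterns (3 components each):
W = hnn_learn([[1, -1, 1], [-1, 1, -1]])
print(W)  # [[0, -2, 2], [-2, 0, -2], [2, -2, 0]]
```

Note the result is symmetric with a zero diagonal, as the slide requires.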
HNN Recall Process
• Recall
a. Read the n×n weight matrix, W_{n×n}.
b. Input the test pattern X for recalling.
c. Compute the new input (i.e., output):
   net_j = Σ_i W_ij Xi  (or net = W · X)
   Xj_new = +1 if net_j > 0; Xj_old if net_j = 0; -1 if net_j < 0
d. Repeat step c until the network converges (i.e., the net values no longer change or the error is very small).
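The recall loop can be sketched as follows (our own minimal illustration; it uses the synchronous update shown on the slide, whereas many Hopfield treatments update one node at a time):

```python
# Sketch of HNN recall: repeatedly apply the transfer function
#   x_new[j] = +1 if net_j > 0, x_old[j] if net_j == 0, -1 if net_j < 0,
# where net_j = sum_i W[i][j] * x[i], until the state stops changing.

def hnn_recall(W, x, max_iters=100):
    n = len(x)
    x = list(x)
    for _ in range(max_iters):
        net = [sum(W[i][j] * x[i] for i in range(n)) for j in range(n)]
        new = [1 if net[j] > 0 else -1 if net[j] < 0 else x[j] for j in range(n)]
        if new == x:  # converged: no component changed
            return new
        x = new
    return x

# Recalling a corrupted pattern with a small 3x3 weight matrix:
W = [[0, -2, 2], [-2, 0, -2], [2, -2, 0]]
print(hnn_recall(W, [1, -1, -1]))  # [1, -1, 1]
```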
Example: Use HNN to memorize patterns (1)
• Use HNN to memorize the following patterns. Let green be represented by "1" and white by "-1". The input data are shown in the table (rows are the stored patterns P1–P4; columns are the six components X1–X6):

P    X1  X2  X3  X4  X5  X6
P1    1  -1   1  -1   1  -1
P2   -1   1  -1   1  -1   1
P3    1   1   1   1   1   1
P4   -1  -1  -1  -1  -1  -1

[Figure: the four patterns drawn as six-cell green/white strips]
Example: Use HNN to memorize patterns (2)
• W_ii = 0
• Each off-diagonal weight is W_ij = Σ_p Xi^p Xj^p, summed over the four patterns P1–P4 in the table. For example:
  W_12 = (1)(-1) + (-1)(1) + (1)(1) + (-1)(-1) = 0, and W_21 = W_12.
• Computing every pair (the matrix is symmetric, W_ji = W_ij):
  W_12 = 0, W_13 = 4, W_14 = 0, W_15 = 4, W_16 = 0
  W_23 = 0, W_24 = 4, W_25 = 0, W_26 = 4
  W_34 = 0, W_35 = 4, W_36 = 0
  W_45 = 0, W_46 = 4
  W_56 = 0
Example: Use HNN to memorize patterns (3)

      | 0  0  4  0  4  0 |
      | 0  0  0  4  0  4 |
W  =  | 4  0  0  0  4  0 |
      | 0  4  0  0  0  4 |
      | 4  0  4  0  0  0 |
      | 0  4  0  4  0  0 |

Recall: net = W · X^t
For the test pattern X^t = (1, -1, 1, -1, 1, -1):
net = W · X^t = (8, -8, 8, -8, 8, -8)
Applying the transfer function, the pattern is recalled as (1, -1, 1, -1, 1, -1), i.e., stored pattern P1.
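The arithmetic on this slide can be double-checked with a short script (ours, not part of the lecture):

```python
# Verify the worked example: build W from the four stored patterns and confirm
# that P1 = (1,-1,1,-1,1,-1) is a stable state of the recall rule.
patterns = [
    [ 1, -1,  1, -1,  1, -1],
    [-1,  1, -1,  1, -1,  1],
    [ 1,  1,  1,  1,  1,  1],
    [-1, -1, -1, -1, -1, -1],
]
n = len(patterns[0])
W = [[0 if i == j else sum(p[i] * p[j] for p in patterns)
      for j in range(n)] for i in range(n)]
print(W[0])  # first row of W: [0, 0, 4, 0, 4, 0]

x = patterns[0]                                   # test pattern P1
net = [sum(W[i][j] * x[i] for i in range(n)) for j in range(n)]
print(net)   # [8, -8, 8, -8, 8, -8]
recalled = [1 if v > 0 else -1 if v < 0 else x[j] for j, v in enumerate(net)]
print(recalled == x)  # True: P1 recalls itself
```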
Artificial Neural Network: Bidirectional Associative Memory (BAM)
Introduction
• Bidirectional Associative Memory (BAM) was proposed by Bart Kosko in 1985.
• It is a hetero-associative memory network.
• It allows the network to memorize a set of patterns X_p and recall the corresponding set of patterns Y_p.
[Figure: two-layer network — input nodes X1^p, X2^p, …, Xn^p fully connected to output nodes Y1, Y2, …, Ym]
BAM Architecture
① Input layer: Xi^p ∈ {-1, +1}
② Output layer: Yj^p ∈ {-1, +1}
③ Weights: W_ij = Σ_p Xi^p Yj^p
④ Connection: it is a 2-layer, fully connected, feed-forward & feed-back network.
[Figure: input nodes X1^p, X2^p, …, Xn^p fully connected to output nodes Y1, Y2, …, Ym]
BAM Architecture (cont.)
⑤ Transfer function:
Xi_new = +1 if net_i > 0; Xi_old if net_i = 0; -1 if net_i < 0, where net_i = Σ_j W_ij Yj
Yj_new = +1 if net_j > 0; Yj_old if net_j = 0; -1 if net_j < 0, where net_j = Σ_i W_ij Xi
BAM Example (1/4)
The four stored pattern pairs (X has six components X1–X6; Y has four components Y1–Y4) and the test pattern:

Pair   X1  X2  X3  X4  X5  X6     Y1  Y2  Y3  Y4
 1      1  -1   1  -1   1  -1      1  -1   1  -1
 2     -1   1  -1   1  -1   1     -1   1  -1   1
 3      1   1   1   1   1   1     -1  -1  -1  -1
 4     -1  -1  -1  -1  -1  -1      1   1   1   1

Test pattern: ●●●○●○ (filled ● = 1, open ○ = -1)
[Figure: the stored patterns and the test pattern drawn as strips of filled and open cells]
BAM Example (2/4)
1. Learning
– Set up the network.
– Set up the weights: W_ij = Σ_p Xi^p Yj^p, using the pattern pairs from the table on the previous slide.
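The BAM learning step can be sketched like this (a minimal illustration; the function name `bam_learn` is ours, and the `pairs` list is the pattern table as read from the example):

```python
# Sketch of BAM learning: W_ij = sum over pairs p of Xi^p * Yj^p (outer-product rule).

def bam_learn(pairs):
    n, m = len(pairs[0][0]), len(pairs[0][1])
    W = [[0] * m for _ in range(n)]
    for X, Y in pairs:
        for i in range(n):
            for j in range(m):
                W[i][j] += X[i] * Y[j]
    return W

# The four (X, Y) pairs from the example table:
pairs = [
    ([ 1, -1,  1, -1,  1, -1], [ 1, -1,  1, -1]),
    ([-1,  1, -1,  1, -1,  1], [-1,  1, -1,  1]),
    ([ 1,  1,  1,  1,  1,  1], [-1, -1, -1, -1]),
    ([-1, -1, -1, -1, -1, -1], [ 1,  1,  1,  1]),
]
W = bam_learn(pairs)
print(W[0])  # [0, -4, 0, -4]
```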
BAM Example (3/4)
2. Recall
① Read the network weights W.
② Read the test pattern X.
③ Compute Y: net_j = Σ_i W_ij Xi, then Yj_new = +1 if net_j > 0; Yj_old if net_j = 0; -1 if net_j < 0.
④ Compute X: net_i = Σ_j W_ij Yj, then Xi_new = +1 if net_i > 0; Xi_old if net_i = 0; -1 if net_i < 0.
⑤ Repeat ③ & ④ until the network converges.
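Steps ①–⑤ can be sketched as a small loop (our own illustration; `bam_recall` is a name we chose):

```python
# Sketch of the BAM recall loop: alternate Y = f(net_j), net_j = sum_i W_ij*X_i,
# and X = f(net_i), net_i = sum_j W_ij*Y_j, keeping the old value when net == 0.

def bam_recall(W, x, max_iters=50):
    n, m = len(W), len(W[0])
    x, y = list(x), [1] * m          # y's start value only matters for net == 0 ties
    for _ in range(max_iters):
        net_y = [sum(W[i][j] * x[i] for i in range(n)) for j in range(m)]
        y_new = [1 if v > 0 else -1 if v < 0 else y[j] for j, v in enumerate(net_y)]
        net_x = [sum(W[i][j] * y_new[j] for j in range(m)) for i in range(n)]
        x_new = [1 if v > 0 else -1 if v < 0 else x[i] for i, v in enumerate(net_x)]
        if x_new == x and y_new == y:  # identical on two passes: converged
            return x, y
        x, y = x_new, y_new
    return x, y

# Tiny 2x2 demonstration:
print(bam_recall([[1, -1], [-1, 1]], [1, -1]))  # ([1, -1], [1, -1])
```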
BAM Example (4/4)
• Application: clustering / recall of a test pattern.
Test pattern: X^t = (1, 1, 1, -1, 1, -1)_{1×6}, drawn as ●●●○●○.
(1) Compute Y from net_j = Σ_i W_ij Xi, then X from net_i = Σ_j W_ij Yj, using the learned weight matrix W; this gives Y = (1, -1, 1, -1) and X = (1, -1, 1, -1, 1, -1).
(2) A second pass gives the same result twice in a row, so the network has converged: the test pattern is recalled as stored pair 1.
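The whole example can be checked end to end (a verification sketch of ours, using the pattern pairs as read from the example table):

```python
# Check the worked BAM example: learn W from the four pairs, then recall on
# the test pattern (1, 1, 1, -1, 1, -1); it should settle on stored pair 1.
pairs = [
    ([ 1, -1,  1, -1,  1, -1], [ 1, -1,  1, -1]),
    ([-1,  1, -1,  1, -1,  1], [-1,  1, -1,  1]),
    ([ 1,  1,  1,  1,  1,  1], [-1, -1, -1, -1]),
    ([-1, -1, -1, -1, -1, -1], [ 1,  1,  1,  1]),
]
n, m = 6, 4
W = [[sum(X[i] * Y[j] for X, Y in pairs) for j in range(m)] for i in range(n)]

x = [1, 1, 1, -1, 1, -1]                        # test pattern
for _ in range(10):
    net_y = [sum(W[i][j] * x[i] for i in range(n)) for j in range(m)]
    y = [1 if v > 0 else -1 for v in net_y]      # no zero nets occur in this run
    net_x = [sum(W[i][j] * y[j] for j in range(m)) for i in range(n)]
    x_new = [1 if v > 0 else -1 for v in net_x]
    if x_new == x:                               # same result on two passes
        break
    x = x_new

print(x)  # [1, -1, 1, -1, 1, -1]  -> X of stored pair 1
print(y)  # [1, -1, 1, -1]         -> Y of stored pair 1
```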