Artificial Neural Network, Chapter 9: Self-Organizing Map (SOM)
朝陽科技大學 資訊管理系, Prof. 李麗華
Introduction
• Proposed by Kohonen in the early 1980s.
• SOM is an unsupervised two-layered network that can organize a topological map from a random starting point.
• SOM is also called Kohonen's self-organizing feature map.
• The resulting map shows the natural relationships among the patterns that are given to the network.
• SOM is well suited to clustering analysis.
Network Structure
– One input layer
– One competitive layer, which is usually a 2-dimensional grid

Input layer: the input vector X
Output layer: competitive layer whose nodes carry a topological map relationship
Weights: randomly assigned
[Figure: input nodes X1, X2 fully connected through weights Wijk to a 2-D grid of output nodes Yjk, each at grid coordinates (x, y).]
Concept of Neighborhood
Center: the winning node C is the center.
Distance: rj = √((Nx − Cx)² + (Ny − Cy)²), where N is an output node and rj is the distance from N to C.
R factor (鄰近係數, neighborhood coefficient): RFj = f(rj, R) = e^(−rj/R), where R is the radius of the neighborhood.
  e^(−rj/R) → 1 when rj = 0
  e^(−rj/R) → 0 when rj = ∞
  e^(−rj/R) = 0.368 when rj = R
The longer the distance, the smaller the neighborhood factor.
R factor adjustment: Rn = R-rate × Rn−1, with R-rate < 1.0.
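The R factor defined above can be sketched as a small function (the function name is illustrative, not from the slides):

```python
import math

def neighborhood_factor(node, center, radius):
    """R factor RF = e^(-r/R): r is the Euclidean distance on the
    output grid from node N to the winning (center) node C."""
    r = math.hypot(node[0] - center[0], node[1] - center[1])
    return math.exp(-r / radius)

# RF is 1 at the center and decays with distance:
# neighborhood_factor((2, 2), (2, 2), 2.0) -> 1.0
# neighborhood_factor((0, 0), (2, 2), 2.0) -> ~0.243
```
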
Learning
1. Set up the network.
2. Randomly assign the weights W.
3. Set the coordinate values of the output-layer nodes N(x, y).
4. Input a training vector X.
5. Compute the winning node.
6. Update the weights by ΔW, using the R factor.
7. ηn = η-rate × ηn−1 ; Rn = R-rate × Rn−1
8. Repeat from 4 to 7 until convergence.
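The eight steps above can be sketched in Python with NumPy; the grid size, decay rates, and epoch count below are illustrative assumptions, not values from the slides:

```python
import numpy as np

def train_som(X, grid=(3, 3), eta=1.0, R=2.0,
              eta_rate=0.9, R_rate=0.9, epochs=20, seed=0):
    """Sketch of the SOM learning procedure (steps 1-8)."""
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    W = rng.uniform(-1, 1, size=(grid[0], grid[1], n_in))  # step 2
    coords = np.dstack(np.meshgrid(np.arange(grid[0]),     # step 3
                                   np.arange(grid[1]),
                                   indexing="ij")).astype(float)
    for _ in range(epochs):
        for x in X:                                        # step 4
            net = ((x - W) ** 2).sum(axis=2)               # step 5
            j, k = np.unravel_index(net.argmin(), net.shape)
            r = np.linalg.norm(coords - coords[j, k], axis=2)
            RF = np.where(r <= R, np.exp(-r / R), 0.0)     # R factor
            W += eta * RF[..., None] * (x - W)             # step 6
        eta *= eta_rate                                    # step 7
        R *= R_rate
    return W
```
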
Reuse the network
1. Setup the network
2. Read the weight matrix
3. Set the coordinate values of the output layer N ( x,y )
4. Read input vector
5. Compute the winning node
6. Output the clustering result Y.
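The reuse phase can be sketched as follows, assuming a trained weight array W of shape (rows, cols, n_inputs); the function name is illustrative:

```python
import numpy as np

def cluster(X, W):
    """Reuse a trained SOM: assign each input vector to the grid
    coordinates (j, k) of its winning node."""
    labels = []
    for x in np.atleast_2d(X):
        net = ((x - W) ** 2).sum(axis=2)  # net_jk = sum_i (X_i - W_ijk)^2
        labels.append(np.unravel_index(net.argmin(), net.shape))
    return labels
```
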
Computation Process
Input: X1~Xn (input vector); Output: Njk

1. Set up the network.
2. Compute the winning node:
   netjk = Σi ( Xi − Wijk )²
   netj*k* = min over j,k of [ netjk ]
3. Yjk = 1 if j = j* and k = k*, 0 otherwise.
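Step 2 and the output rule for Yjk translate directly into code (a sketch; the function name is an assumption):

```python
import numpy as np

def winning_node(x, W):
    """net_jk = sum_i (X_i - W_ijk)^2; the winner (j*, k*) minimizes
    net, and Y_jk = 1 only at the winner."""
    net = ((x - W) ** 2).sum(axis=2)
    j, k = np.unravel_index(net.argmin(), net.shape)
    Y = np.zeros(net.shape)
    Y[j, k] = 1.0
    return (j, k), Y
```
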
Computation Process (cont.)
4. Update the weights:
   ΔWijk = η ( Xi − Wijk ) · RFjk
   RFjk = e^(−rjk/R) when rjk ≤ R
   rjk = √(( j − j* )² + ( k − k* )²)
   Wijk = Wijk + ΔWijk
5. Repeat 1–4 for all inputs.
6. ηn = η-rate · ηn−1 ; Rn = R-rate · Rn−1
7. Repeat until convergence.
Example
• Consider a 2-dimensional clustering problem.
• The vector space includes 5 different clusters, each with 2 sample patterns.
Pattern   X1    X2
1        -0.9  -0.8
2        -0.8   0.6
3         0.9   0.6
4         0.7  -0.4
5        -0.2   0.2
6        -0.7  -0.6
7        -0.9   0.8
8         0.7   0.6
9         0.8  -0.8
10        0.1  -0.2
[Figure: the ten patterns plotted on the input plane from −1 to 1 on both axes, showing the five clusters: {1, 6}, {2, 7}, {3, 8}, {4, 9}, {5, 10}.]
Solution
Set up a 2×9 network (2 inputs, 9 output nodes in a 3×3 grid).
Randomly assign the weights.
Let R = 2.0 and η = 1.0.
Feed in the first pattern [−0.9, −0.8].
RFjk = e^(−rjk/R)
        X1    X2
Wi00  -0.2  -0.8
Wi01   0.2  -0.4
Wi02  -0.3   0.6
Wi10  -0.4   0.6
Wi11  -0.3   0.2
Wi12  -0.6  -0.2
Wi20  -0.7   0.2
Wi21   0.8  -0.6
Wi22  -0.8  -0.6
[Figure: the nine weight vectors Wi00–Wi22 plotted on the input plane from −1 to 1 on both axes.]
• net00 = (−0.9+0.2)² + (−0.8+0.8)² = 0.49
• net01 = (−0.9−0.2)² + (−0.8+0.4)² = 1.37
• net02 = (−0.9+0.3)² + (−0.8−0.6)² = 2.32
• net10 = (−0.9+0.4)² + (−0.8−0.6)² = 2.21
• net11 = (−0.9+0.3)² + (−0.8−0.2)² = 1.36
• net12 = (−0.9+0.6)² + (−0.8+0.2)² = 0.45
• net20 = (−0.9+0.7)² + (−0.8−0.2)² = 1.04
• net21 = (−0.9−0.8)² + (−0.8+0.6)² = 2.93
• net22 = (−0.9+0.8)² + (−0.8+0.6)² = 0.05
min over j,k [ netjk ] = net22 = 0.05 → the winning node is ( j*, k* ) = ( 2, 2 ).
Solution (cont.): Update the weights
The winning node is j* = 2, k* = 2, so for each output node (j, k):
r00 = √((0−2)² + (0−2)²) = 2.828,  RF00 = e^(−2.828/2) = 0.243
r01 = √((0−2)² + (1−2)²) = 2.236,  RF01 = e^(−2.236/2) = 0.327
r02 = √((0−2)² + (2−2)²) = 2.000,  RF02 = e^(−2.000/2) = 0.368
…
r22 = √((2−2)² + (2−2)²) = 0,      RF22 = e^0 = 1
ΔW00 = η · ( X1 − W00 ) · RF00 = 1.0 × ( −0.9 + 0.2 ) × 0.243 = −0.17
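The numbers in this solution can be checked with a short script, assuming the weight values implied by the net computations above:

```python
import math

# Weight table: W[(j, k)] = (W1, W2), matching the net computations
W = {(0, 0): (-0.2, -0.8), (0, 1): (0.2, -0.4), (0, 2): (-0.3, 0.6),
     (1, 0): (-0.4, 0.6),  (1, 1): (-0.3, 0.2), (1, 2): (-0.6, -0.2),
     (2, 0): (-0.7, 0.2),  (2, 1): (0.8, -0.6), (2, 2): (-0.8, -0.6)}
x = (-0.9, -0.8)           # first training pattern
R, eta = 2.0, 1.0

# net_jk = sum_i (X_i - W_ijk)^2; the winner minimizes net
net = {jk: (x[0] - w[0]) ** 2 + (x[1] - w[1]) ** 2 for jk, w in W.items()}
jstar, kstar = min(net, key=net.get)        # winning node (2, 2), net = 0.05

r00 = math.hypot(0 - jstar, 0 - kstar)      # distance from (0,0) to winner
RF00 = math.exp(-r00 / R)                   # ~0.243
dW = eta * (x[0] - W[(0, 0)][0]) * RF00     # ~ -0.17
```
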
[Figure: the weight map after the update; Wi00 has moved by −0.17 in X1 toward the input pattern.]