Clustering (Ishii 2014, Ch. 10)
Introduction | K-Means | Convex Clustering | Summary
Self-introduction
Name: Kota Mori (@KotaMori1)
Work: data analysis, consulting, marketing
Specialty: econometrics
Past research: suicide and media-reporting effects; analysis of the newspaper market
This presentation is given in my personal capacity and does not represent the views of the organization I belong to.
K Mori ch10 Clustering
Motivation
Cluster: a clump of data points lying close together in feature space.
Clustering: analysis that discovers such clusters from the distribution of the data.
1-D and 2-D data can be inspected visually; for 3 or more dimensions, use the math!
Overview

Table of Contents
- Introduction
- The K-Means method
- The convex clustering method
- Summary
K-Means

- Assign each observation to one of c prototypes.
- Taking the number of prototypes c as known, estimate the optimal prototype locations and the assignment rule.

Data: $\{x_k\}_{k=1}^{n}$, with $x_k \in \mathbb{R}^d$ for each k.
Assignments: $\{\delta_k\}_{k=1}^{n}$, with $\delta_k \in \{0,1\}^c$ for each k, where
$$\delta_{jk} = \begin{cases} 1 & \text{if } x_k \text{ belongs to type } j \\ 0 & \text{otherwise.} \end{cases}$$
Prototypes: $\{p_j\}_{j=1}^{c}$, with $p_j \in \mathbb{R}^d$ for each j.
K-Means | Optimization

Quantization error: the distance between each data point and its assigned prototype,
$$e_k \equiv \sum_{j=1}^{c} \delta_{jk}\,\|x_k - p_j\|^2,$$
where $\|\cdot\|$ is the Euclidean norm, so $\|x_k - p_j\|^2$ is the sum of the squared elementwise differences.

Optimization: minimize the total quantization error over p and δ:
$$\min_{p,\delta} e = \min_{p,\delta} \sum_{k=1}^{n} e_k = \min_{p,\delta} \sum_{k=1}^{n} \sum_{j=1}^{c} \delta_{jk}\,\|x_k - p_j\|^2$$
K-Means | First-order conditions

Optimization over $\delta_k$: for each k, pick the type j that minimizes the quantization error:
$$j_k \equiv \arg\min_j \|x_k - p_j\|^2 \qquad (1)$$
$$\delta_{jk} = \begin{cases} 1 & \text{if } j = j_k \\ 0 & \text{otherwise} \end{cases} \qquad (2)$$
K-Means | First-order conditions

Optimization over $p_j$: for each j, minimize the total quantization error of the observations with $\delta_{jk} = 1$:
$$\varepsilon_j \equiv \sum_{k=1}^{n} \delta_{jk}\,\|x_k - p_j\|^2$$
Differentiating with respect to $p_j$ and setting the result to zero (see the formula on the next slide):
$$0 = \frac{\partial \varepsilon_j}{\partial p_j} = -\sum_{k=1}^{n} 2\,\delta_{jk}\,(x_k - p_j)$$
Solving this gives
$$p_j = \frac{\sum_{k=1}^{n} \delta_{jk}\, x_k}{\sum_{k=1}^{n} \delta_{jk}} = \frac{\sum_{x_k \in \omega_j} x_k}{n_j} \qquad (3)$$
* Note that (3) is the within-type mean.
Supplement | Vector derivative formula

$$\frac{\partial}{\partial x}\|x\|^2 = \frac{\partial}{\partial x}\, x'x = \frac{\partial}{\partial x} \sum_{d} x_d^2 = \left( \frac{\partial}{\partial x_1}\sum_{d} x_d^2,\ \frac{\partial}{\partial x_2}\sum_{d} x_d^2,\ \ldots,\ \frac{\partial}{\partial x_D}\sum_{d} x_d^2 \right)' = (2x_1, 2x_2, \ldots, 2x_D)' = 2x$$
K-Means | Iterative solution

Solving (1), (2), and (3) simultaneously is hard, so solve them iteratively:
Step 1: Initialize $(p_1, \ldots, p_c)$.
Step 2: Optimize over $\delta_k$:
$$j_k \equiv \arg\min_j \|x_k - p_j\|^2, \qquad \delta_{jk} = \begin{cases} 1 & \text{if } j = j_k \\ 0 & \text{otherwise} \end{cases}$$
Step 3: Optimize over $p_j$:
$$p'_j = \frac{\sum_{k=1}^{n} \delta_{jk}\, x_k}{\sum_{k=1}^{n} \delta_{jk}} = \frac{\sum_{x_k \in \omega_j} x_k}{n_j}$$
Step 4: If $p'_j = p_j$ for all j, stop; otherwise set $p_j \leftarrow p'_j$ and return to Step 2.
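The four steps above can be sketched in a few lines of NumPy. This is a minimal illustration, not the chapter's reference implementation; the function name, the random initialization at data points, and the empty-cluster guard are my own choices:

```python
import numpy as np

def kmeans(X, c, max_iter=100, seed=0):
    """Iteratively solve the K-Means first-order conditions.
    X: (n, d) data matrix; c: number of prototypes (assumed known)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Step 1: initialize prototypes at c randomly chosen observations
    p = X[rng.choice(n, size=c, replace=False)].copy()
    for _ in range(max_iter):
        # Step 2: hard assignment, j_k = argmin_j ||x_k - p_j||^2
        dist2 = ((X[:, None, :] - p[None, :, :]) ** 2).sum(axis=2)
        j = dist2.argmin(axis=1)
        # Step 3: prototype update, the within-type mean
        # (a type with no members keeps its old prototype)
        p_new = np.array([X[j == t].mean(axis=0) if (j == t).any() else p[t]
                          for t in range(c)])
        # Step 4: stop once the prototypes no longer move
        if np.allclose(p_new, p):
            break
        p = p_new
    return p, j
```

As the next slide notes, the result depends on the initialization, so in practice one typically runs this from several random seeds and keeps the run with the smallest total quantization error.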
K-Means | Iterative solution

1. It always converges: by construction, the objective is non-increasing at each step, and it is bounded below.
2. It depends on the initial values, so a global optimum is not guaranteed.
K-Means and Gaussian Mixtures

K-Means:
$$j_k = \arg\min_j \|x_k - p_j\|^2$$ (hard assignment: exactly one type is chosen)
$$p_j = \frac{\sum_{x_k \in \omega_j} x_k}{n_j}$$ (within-type mean)

Gaussian mixture:
$$P(\omega_j \mid x_k) = \frac{\pi_j \, p(x_k \mid \omega_j)}{\sum_{i=1}^{c} \pi_i \, p(x_k \mid \omega_i)}$$ (soft assignment: a probability is allotted to each type)
$$\mu_j = \frac{\sum_{k=1}^{n} P(\omega_j \mid x_k)\, x_k}{\sum_{k=1}^{n} P(\omega_j \mid x_k)}$$ (weighted mean per type)

In the Gaussian mixture, setting $\Sigma = \sigma^2 I_d$ and letting $\sigma \to 0$ makes the assignment hard.
K-Means and Gaussian Mixtures | Derivation

The mixture's assignment rule (restated):
$$p(x_k \mid \omega_j) = (2\pi)^{-d/2} |\Sigma_j|^{-1/2} \exp\left[-\tfrac{1}{2}(x_k - \mu_j)' \Sigma_j^{-1} (x_k - \mu_j)\right]$$
$$P(\omega_j \mid x_k) = \frac{\pi_j \, p(x_k \mid \omega_j)}{\sum_{i=1}^{c} \pi_i \, p(x_k \mid \omega_i)}$$
Now set $\Sigma = \sigma^2 I_d$:
$$p(x_k \mid \omega_j) = (2\pi\sigma^2)^{-d/2} \exp\left[-\tfrac{1}{2\sigma^2}\|x_k - \mu_j\|^2\right]$$
$$P(\omega_j \mid x_k) = \frac{\pi_j \exp\left[-\tfrac{1}{2\sigma^2}\|x_k - \mu_j\|^2\right]}{\sum_{i=1}^{c} \pi_i \exp\left[-\tfrac{1}{2\sigma^2}\|x_k - \mu_i\|^2\right]} = \frac{1}{\sum_{i=1}^{c} (\pi_i/\pi_j) \exp\left[\tfrac{1}{2\sigma^2}\left(\|x_k - \mu_j\|^2 - \|x_k - \mu_i\|^2\right)\right]} = \frac{1}{1 + \sum_{i \neq j} (\pi_i/\pi_j) \exp\left[\tfrac{1}{2\sigma^2}\left(\|x_k - \mu_j\|^2 - \|x_k - \mu_i\|^2\right)\right]}$$
K-Means and Gaussian Mixtures | Derivation

$$P(\omega_j \mid x_k) = \frac{1}{1 + \sum_{i \neq j} (\pi_i/\pi_j) \exp\left[\tfrac{1}{2\sigma^2}\left(\|x_k - \mu_j\|^2 - \|x_k - \mu_i\|^2\right)\right]}$$
Now let $\sigma \to 0$:
$$\frac{1}{2\sigma^2}\left(\|x_k - \mu_j\|^2 - \|x_k - \mu_i\|^2\right) \to \begin{cases} +\infty & \text{if } \|x_k - \mu_j\|^2 > \|x_k - \mu_i\|^2 \\ -\infty & \text{if } \|x_k - \mu_j\|^2 < \|x_k - \mu_i\|^2 \end{cases}$$
Therefore
$$P(\omega_j \mid x_k) \to \begin{cases} 0 & \text{if } \|x_k - \mu_j\|^2 > \|x_k - \mu_i\|^2 \text{ for some } i \\ 1 & \text{if } \|x_k - \mu_j\|^2 < \|x_k - \mu_i\|^2 \text{ for all } i \neq j \end{cases}$$
That is, the limit is a hard assignment to the nearest type.
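This limit is easy to observe numerically. The sketch below (the helper name `soft_assign` is mine, not from the slides) computes the mixture responsibilities with $\Sigma = \sigma^2 I$ and shows the rows hardening toward one-hot vectors as σ shrinks:

```python
import numpy as np

def soft_assign(X, mu, sigma, pi=None):
    """Responsibilities P(w_j | x_k) for a Gaussian mixture with Sigma = sigma^2 I."""
    n, d = X.shape
    c = mu.shape[0]
    if pi is None:
        pi = np.full(c, 1.0 / c)  # uniform mixing weights by default
    # log of pi_j * exp(-||x_k - mu_j||^2 / (2 sigma^2)), up to a shared constant
    dist2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)
    logits = np.log(pi)[None, :] - dist2 / (2 * sigma ** 2)
    logits -= logits.max(axis=1, keepdims=True)  # stabilize the softmax
    w = np.exp(logits)
    return w / w.sum(axis=1, keepdims=True)

X = np.array([[0.0, 0.0], [2.9, 0.0]])
mu = np.array([[0.0, 0.0], [3.0, 0.0]])
for sigma in (2.0, 0.5, 0.05):
    print(sigma, soft_assign(X, mu, sigma).round(3))
```

For large σ both types receive appreciable probability; for small σ each point is assigned essentially entirely to its nearest center, exactly the K-Means rule.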
K-Means | Experiment
[A sequence of scatter plots of simulated 2-D data (axes X1, X2), showing the K-Means assignments and prototype positions over successive iterations; figures omitted.]
Convex Clustering

- Based on the Gaussian mixture model.
- Every observation is taken as a prototype (i.e. c = n), and the center of cluster i is fixed at $\mu_i = x_i$.
- $\Sigma_i = \sigma^2 I_d$, with σ given as known; consequently there is no update of the distribution parameters.
- Each observation is assigned to the clusters softly (probabilistically).
- In the estimated result, only some of the clusters "survive", so the number of clusters comes out of the estimation itself.
- If the data are sufficiently large and dense, some observation lies near each cluster center, which justifies the assumption $\mu_i = x_i$.
Convex Clustering | Objective function

Log-likelihood:
$$J(\pi) \equiv \sum_{k=1}^{n} \log\left(\sum_{i=1}^{n} \pi_i f_{ik}\right)$$
$$f_{ik} \equiv p(x_k \mid \mu_i, \sigma) = (2\pi\sigma^2)^{-d/2} \exp\left[-\tfrac{1}{2\sigma^2}\|x_k - x_i\|^2\right]$$
The $f_{ik}$ are constants! The objective depends only on π.

Optimization problem:
$$\max_{\pi} J(\pi) \quad \text{subject to } \sum_{i=1}^{n} \pi_i = 1,\ 0 \le \pi \le 1$$
Convex Clustering | First-order conditions

Lagrangian:
$$L(\pi) \equiv \sum_{k=1}^{n} \log\left(\sum_{i=1}^{n} \pi_i f_{ik}\right) + \lambda\left(1 - \sum_{i=1}^{n} \pi_i\right)$$
First-order condition with respect to $\pi_j$:
$$\sum_{k=1}^{n} \frac{f_{jk}}{\sum_{i=1}^{n} \pi_i f_{ik}} = \lambda \qquad (4)$$
Multiply both sides by $\pi_j$, then sum over j:
$$\sum_{k=1}^{n} \frac{\sum_{j=1}^{n} \pi_j f_{jk}}{\sum_{i=1}^{n} \pi_i f_{ik}} = \lambda \sum_{j=1}^{n} \pi_j \;\Longrightarrow\; \sum_{k=1}^{n} 1 = \lambda \;\Longrightarrow\; n = \lambda$$
Convex Clustering | Iterative solution

Substituting λ = n into (4), multiplying both sides by $\pi_j$, and rearranging gives the fixed-point condition
$$\pi_j = n^{-1} \sum_{k=1}^{n} \frac{\pi_j \, f_{jk}}{\sum_{i=1}^{n} \pi_i \, f_{ik}}$$
This cannot be solved analytically, but it can be solved iteratively:
Step 1: Give π an initial value.
Step 2: Update π:
$$\pi_j \leftarrow n^{-1} \sum_{k=1}^{n} \frac{\pi_j \, f_{jk}}{\sum_{i=1}^{n} \pi_i \, f_{ik}}$$
Step 3: If the increase in J(π) is below a threshold, stop; otherwise return to Step 2.
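The fixed-point iteration above can be sketched in NumPy. This is an illustrative implementation under the slide's assumptions ($\mu_i = x_i$, $\Sigma = \sigma^2 I_d$, σ given); the function name and tolerance are my own choices, and the Gaussian normalizing constant is dropped since it cancels in the update:

```python
import numpy as np

def convex_clustering(X, sigma, tol=1e-8, max_iter=1000):
    """Fixed-point iteration for the convex-clustering weights pi.
    Every observation is a candidate prototype (mu_i = x_i); sigma is given."""
    n, d = X.shape
    dist2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    # f[i, k] is proportional to p(x_k | mu_i, sigma); the constant cancels
    f = np.exp(-dist2 / (2 * sigma ** 2))
    pi = np.full(n, 1.0 / n)          # Step 1: uniform initial weights
    J_old = -np.inf
    for _ in range(max_iter):
        mix = f.T @ pi                # mix[k] = sum_i pi_i f_ik
        J = np.log(mix).sum()         # log-likelihood up to a constant
        if J - J_old < tol:           # Step 3: stop when J barely improves
            break
        J_old = J
        pi = pi * (f @ (1.0 / mix)) / n   # Step 2: pi_j <- n^-1 sum_k pi_j f_jk / mix_k
    return pi
```

The weights stay nonnegative and sum to one at every iteration, and in practice most of them shrink toward zero, leaving only the "surviving" clusters.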
Convex Program | Convex Function | Convex Set

Convex program (convex programming):
Definition: the problem of minimizing a convex function over a convex set is called a convex program.
Property: in a convex program, any local optimum is also a global optimum.

Convex function: a function f such that for any $\alpha \in [0,1]$,
$$f(\alpha x_1 + (1-\alpha) x_2) \le \alpha f(x_1) + (1-\alpha) f(x_2)$$

Convex set: a set such that for any elements $x_1, x_2$ of the set and any $\alpha \in [0,1]$, the point $\alpha x_1 + (1-\alpha) x_2$ also belongs to the set.
Convex Program | Convex Function | Convex Set
[Illustrations: a convex vs. a non-convex function; a convex vs. a non-convex set.]
Convex clustering is a convex program

Convex clustering is a convex program, and therefore any local optimum is a global optimum:
$$\max_{\pi} J(\pi) = \sum_{k=1}^{n} \log\left(\sum_{i=1}^{n} \pi_i f_{ik}\right) \quad \text{subject to } \sum_{i=1}^{n} \pi_i = 1,\ 0 \le \pi \le 1$$
Steps of the proof:
1. Max $J(\pi)$ may be rewritten as Min $-J(\pi)$.
2. $-J(\cdot)$ is a convex function.
3. The feasible set is a convex set.
Convex clustering is a convex program | Proof

2. $-J(\cdot)$ is convex. Take any $\pi^1, \pi^2$ and $\alpha \in [0,1]$. Writing $f_k \equiv (f_{1k}, f_{2k}, \ldots, f_{nk})'$, we have $\sum_{i=1}^{n} \pi_i f_{ik} = f_k'\pi$. Then
$$-J(\alpha\pi^1 + (1-\alpha)\pi^2) = \sum_{k=1}^{n} -\log\left(f_k'(\alpha\pi^1 + (1-\alpha)\pi^2)\right) = \sum_{k=1}^{n} -\log\left(\alpha f_k'\pi^1 + (1-\alpha) f_k'\pi^2\right)$$
Since $-\log(\cdot)$ is convex,
$$\le \sum_{k=1}^{n}\left[-\alpha\log(f_k'\pi^1) - (1-\alpha)\log(f_k'\pi^2)\right] = \alpha\left[-J(\pi^1)\right] + (1-\alpha)\left[-J(\pi^2)\right]$$
Convex clustering is a convex program | Proof

3. The feasible set is convex. Suppose $\pi^1, \pi^2$ both satisfy the constraints. For any $\alpha \in [0,1]$,
$$\sum_{i=1}^{n}\left[\alpha\pi^1_i + (1-\alpha)\pi^2_i\right] = \alpha\sum_{i=1}^{n}\pi^1_i + (1-\alpha)\sum_{i=1}^{n}\pi^2_i = 1$$
Also, since $\alpha\pi^1 \in [0,\alpha]$ and $(1-\alpha)\pi^2 \in [0,1-\alpha]$ elementwise, $\alpha\pi^1 + (1-\alpha)\pi^2 \in [0,1]$.
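The convexity inequality in step 2 can also be spot-checked numerically. The sketch below is a sanity check on random positive $f_{ik}$ and random feasible π (my own construction, not part of the proof itself):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
f = rng.uniform(0.1, 1.0, size=(n, n))      # f[i, k] = f_ik, all positive

def neg_J(pi):
    # -J(pi) = -sum_k log(sum_i pi_i f_ik)
    return -np.log(f.T @ pi).sum()

def simplex_point():
    # a random point satisfying sum pi_i = 1, 0 <= pi <= 1
    p = rng.uniform(size=n)
    return p / p.sum()

# Verify -J(a p1 + (1-a) p2) <= a(-J(p1)) + (1-a)(-J(p2)) on random samples
for _ in range(100):
    p1, p2 = simplex_point(), simplex_point()
    for alpha in np.linspace(0.0, 1.0, 11):
        lhs = neg_J(alpha * p1 + (1 - alpha) * p2)
        rhs = alpha * neg_J(p1) + (1 - alpha) * neg_J(p2)
        assert lhs <= rhs + 1e-9
print("convexity inequality holds on all sampled points")
```

A passing check is of course not a proof, but it is a cheap way to catch a sign or indexing error when transcribing the objective.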
Convex Clustering | Experiment
In effect, σ controls the number of clusters.
[Four scatter-plot panels of the same 2-D data (axes X1, X2), estimated with small, medium, large, and extra-large values of σ; figures omitted.]
Summary