Pattern Recognition for Embedded Vision
• Template matching
• Statistical / Structural Pattern Recognition
• Neural networks
Embedded Vision System
Image acquisition → Image Processing → Feature Extraction → Decision Making (Pattern Recognition)
Pattern Recognition model
1. Template matching (see the sketch after this list).
2. Statistical Pattern Recognition:
based on an underlying statistical model of patterns and pattern classes.
3. Structural (or syntactic) Pattern Recognition:
pattern classes are represented by means of formal structures such as grammars, automata, strings, etc.
4. Neural networks:
the classifier is represented as a network of cells modeling the neurons of the human brain (connectionist approach).
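As a concrete illustration of approach 1, here is a minimal template-matching sketch (an illustrative example, not taken from the slides): it slides the template over the image, scores each patch with normalized cross-correlation, and returns the best-matching position.

```python
import numpy as np

def match_template(image, template):
    """Slide the template over the image and return the position with the
    highest normalized cross-correlation (a basic template-matching score)."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.linalg.norm(t)
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.linalg.norm(p) * t_norm
            score = (p * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# Toy example: find a 2x2 bright corner pattern in a small grayscale image.
img = np.array([[0, 0, 0, 0],
                [0, 9, 9, 0],
                [0, 9, 0, 0],
                [0, 0, 0, 0]], dtype=float)
tmpl = np.array([[9, 9],
                 [9, 0]], dtype=float)
print(match_template(img, tmpl))   # expected best position: (1, 1)
```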
Pattern Representation
• A pattern is represented by a set of d features, or attributes, viewed as a d-dimensional feature vector.
x = (x1, x2, …, xd)^T
Two Processes for a Pattern Recognition System
Classification mode: test pattern → Preprocessing → Feature Measurement → Classification
Training mode: training pattern → Preprocessing → Feature Extraction/Selection → Learning
Generic concepts for PR
Feature vector x = (x1, x2, …, xn)^T ∈ X
- A vector of observations (measurements).
- x is a point in the feature space X.
Hidden state y ∈ Y
- Cannot be directly measured.
- Patterns with equal hidden state belong to the same class.
Task
- To design a classifier (decision rule) q: X → Y which decides about a hidden state based on an observation.
Pattern Representation by Feature Vector for Character Recognition
x = [x1, x2, …, xn], each xj a real number
xj may be an object measurement
xj may be a count of object parts
Example: object representation [#holes, Area, moments, …]
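A minimal sketch of building such a feature vector from a binary character image (numpy assumed; the particular features chosen here, area, centroid, and second central moments, are an illustrative subset):

```python
import numpy as np

def character_features(binary_img):
    """Build a small feature vector x = [area, row_centroid, col_centroid,
    mu20, mu02, mu11] from a binary character image (1 = ink, 0 = background).
    A real system would add features such as #holes via connected components."""
    rows, cols = np.nonzero(binary_img)
    area = rows.size                      # number of ink pixels
    rc, cc = rows.mean(), cols.mean()     # centroid
    mu20 = ((rows - rc) ** 2).mean()      # second central moments
    mu02 = ((cols - cc) ** 2).mean()
    mu11 = ((rows - rc) * (cols - cc)).mean()
    return np.array([area, rc, cc, mu20, mu02, mu11])

# Toy 5x5 image of a vertical stroke (roughly the digit "1").
img = np.zeros((5, 5), dtype=int)
img[:, 2] = 1
print(character_features(img))
```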
Example
Task: identity recognition from two features, x = (x1, x2)^T = (height, weight).
The set of hidden states is Y = {H, J}.
The feature space is X = R^2.
Training examples: {(x1, y1), …, (xℓ, yℓ)}.
Linear classifier:
q(x) = H if w·x + b ≥ 0
q(x) = J if w·x + b < 0
The decision boundary is w·x + b = 0.
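A minimal sketch of this linear classifier in code (numpy assumed; the weights and inputs below are made-up illustrative values, not from the slides):

```python
import numpy as np

def q(x, w, b):
    """Linear classifier: return 'H' if w.x + b >= 0, else 'J'."""
    return 'H' if np.dot(w, x) + b >= 0 else 'J'

# Illustrative parameters: a decision boundary in (height, weight) space.
w = np.array([1.0, 0.5])   # weight vector (hypothetical values)
b = -120.0                 # bias (hypothetical value)

print(q(np.array([180.0, 80.0]), w, b))   # -> 'H'
print(q(np.array([100.0, 30.0]), w, b))   # -> 'J'
```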
Pattern Recognition system
Object → Image processing → Feature extraction → Classifier → Class assignment (the classifier is set by a learning algorithm)
• Image acquisition and image processing.
• Feature extraction aims to create discriminative features good for classification.
• Classifier.
• A learning algorithm sets up the PR system from training examples: supervised learning.
Feature extraction
Task: to extract features which are good for classification.
Good features: • Objects from the same class have similar feature values.
• Objects from different classes have different values.
[Figure: examples of “good” features vs. “bad” features.]
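One way to quantify “good vs. bad” is a per-feature separability score; the sketch below (an illustration, not from the slides) computes the ratio of between-class to within-class variance for each feature, so higher values indicate features that separate the classes better.

```python
import numpy as np

def fisher_ratio(X, y):
    """For each feature, compute (between-class variance) / (within-class variance).
    X: (n_samples, n_features) array, y: class labels. Higher = more separable."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        between += Xc.shape[0] * (Xc.mean(axis=0) - overall_mean) ** 2
        within += ((Xc - Xc.mean(axis=0)) ** 2).sum(axis=0)
    return between / within

# Toy data: feature 0 separates the classes, feature 1 is pure noise.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0, 0], 1, (50, 2)),
               rng.normal([5, 0], 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
print(fisher_ratio(X, y))   # first score should be much larger than the second
```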
Feature extraction methods
Feature extraction: the feature vector m = [m1, m2, …, mk]^T is computed from the measurement vector x = [x1, x2, …, xn]^T by a mapping φ, i.e., m = φ(x).
Feature selection: the feature vector [m1, m2, …, mk]^T is a subset of the original measurements [x1, x2, …, xn]^T.
The problem can be expressed as optimization of the parameters θ of the feature extractor φ(θ).
Supervised methods: the objective function is a criterion of separability (discriminability) of labeled examples, e.g., linear discriminant analysis (LDA).
Unsupervised methods: a lower-dimensional representation which preserves important characteristics of the input data is sought, e.g., principal component analysis (PCA).
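A minimal sketch of the unsupervised case using PCA via numpy's SVD (the toy data and the choice of k=1 retained component are arbitrary assumptions):

```python
import numpy as np

def pca_extract(X, k):
    """Project data onto the top-k principal components (unsupervised
    feature extraction): m = W^T (x - mean), with W taken from the SVD."""
    mean = X.mean(axis=0)
    Xc = X - mean                        # center the data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:k].T                         # top-k principal directions
    return (Xc @ W), W, mean

# Toy data: two correlated measurements; one component captures most variance.
rng = np.random.default_rng(1)
t = rng.normal(size=100)
X = np.column_stack([t, 2 * t + 0.1 * rng.normal(size=100)])
M, W, mean = pca_extract(X, k=1)
print(M.shape)   # (100, 1): each pattern reduced to a single feature m1
```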
Classifier
A classifier partitions the feature space X into class-labeled regions X1, X2, …, X|Y| such that
X = X1 ∪ X2 ∪ … ∪ X|Y| and X1 ∩ X2 ∩ … ∩ X|Y| = {0}.
[Figure: a feature space partitioned into decision regions X1, X2, X3.]
Classification consists of determining to which region a feature vector x belongs. Borders between decision regions are called decision boundaries.
Decision-Tree Classifier
Uses subsets of features in sequence.
Feature extraction may be interleaved with classification decisions.
Can be easy to design and efficient in execution.
Decision Trees
[Figure: decision tree for character recognition. The root tests #holes (0, 1, 2); internal nodes test moment of inertia (< t vs. ≥ t), #strokes, and best axis direction (0, 60, 90); leaves assign the classes -, /, 1, x, w, 0, A, 8, B.]
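A sketch of how such a tree can be coded directly as nested tests (the thresholds and the exact branch structure below are simplified assumptions, not a faithful copy of the figure):

```python
def classify_char(num_holes, num_strokes, moment_of_inertia, t=0.5):
    """Toy decision-tree classifier for characters, testing one feature at a
    time, in the spirit of the tree in the figure (structure simplified)."""
    if num_holes == 0:
        if moment_of_inertia < t:
            return '-'
        return '1' if num_strokes == 1 else '/'
    elif num_holes == 1:
        return 'A' if num_strokes == 3 else '0'
    else:  # two or more holes
        return '8' if num_strokes == 0 else 'B'

print(classify_char(num_holes=1, num_strokes=3, moment_of_inertia=0.9))  # 'A'
print(classify_char(num_holes=0, num_strokes=1, moment_of_inertia=0.2))  # '-'
```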
Classification using nearest class mean
Compute the Euclidean distance between feature vector X and the mean of each class.
Choose the closest class, if it is close enough (reject otherwise).
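A minimal nearest-class-mean classifier in code (numpy assumed; the class means and the reject threshold are hypothetical illustrative values):

```python
import numpy as np

def nearest_class_mean(x, class_means, reject_dist=None):
    """Assign x to the class with the closest mean (Euclidean distance);
    return None (reject) if even the closest mean is farther than reject_dist."""
    labels = list(class_means)
    dists = [np.linalg.norm(x - class_means[c]) for c in labels]
    best = int(np.argmin(dists))
    if reject_dist is not None and dists[best] > reject_dist:
        return None                      # reject: no class is close enough
    return labels[best]

# Toy means for two classes in a 2-D feature space (illustrative values).
means = {'A': np.array([0.0, 0.0]), 'B': np.array([10.0, 10.0])}
print(nearest_class_mean(np.array([1.0, 0.5]), means))                    # 'A'
print(nearest_class_mean(np.array([50.0, 50.0]), means, reject_dist=5))   # None
```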
Unsupervised learning
Input: training examples {x1, …, xℓ} without information about the hidden state.
Clustering: the goal is to find clusters of data sharing similar properties.
(For comparison, supervised learning uses a classifier q: X × Θ → Y and a learning algorithm L: (X × Y)^ℓ → Θ.)
A broad class of unsupervised learning algorithms:
{x1, …, xℓ} → Learning algorithm → θ → Classifier → {y1, …, yℓ}
Example of unsupervised learning algorithm
k-Means clustering:
Classifier:
y = q(x) = argmin_{i=1,…,k} ||x − m_i||
Goal is to minimize:
Σ_{i=1}^{ℓ} ||x_i − m_{q(x_i)}||²
Learning algorithm:
m_i = (1/|I_i|) Σ_{j∈I_i} x_j,  where I_i = {j : q(x_j) = i}
[Figure: data points grouped around cluster means m1, m2, m3.]
{x1, …, xℓ} → Learning algorithm → θ = {m1, …, mk} → Classifier → {y1, …, yℓ}
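A minimal k-means sketch following these two steps, alternating the classifier (assignment) and the learning rule (mean update); numpy assumed, and the initialization and stopping rule are my own simplifications:

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Alternate q(x) = argmin_i ||x - m_i|| (assignment) and
    m_i = mean of the points assigned to cluster i (update)."""
    rng = np.random.default_rng(seed)
    means = X[rng.choice(len(X), size=k, replace=False)]  # init from data points
    for _ in range(n_iter):
        # Assignment step (the classifier): label each point with its nearest mean.
        dists = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step (the learning rule): move each mean to its cluster centroid.
        new_means = np.array([X[labels == i].mean(axis=0) if np.any(labels == i)
                              else means[i] for i in range(k)])
        if np.allclose(new_means, means):
            break
        means = new_means
    # Final assignment so the returned labels match the returned means.
    labels = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2).argmin(axis=1)
    return means, labels

# Toy data: two well-separated blobs in a 2-D feature space.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (30, 2)), rng.normal(5, 0.5, (30, 2))])
means, labels = kmeans(X, k=2)
print(np.round(means, 2))   # two means near [0, 0] and [5, 5] (order may vary)
```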