
1

Soft Computing and Application: Fractal Image Compression


J. H. Jeng

Department of Information Engineering

I-Shou University, Kaohsiung County

2

Outline

• Fractal Image Compression (FIC): Encoder and Decoder
• Transform Method
• Genetic Algorithm (GA)
• Spatial Correlation GA FIC
• Particle Swarm Optimization (PSO)
• PSO FIC with Edge Property

3

Multimedia vs. the Heart Sutra (心經): the six senses (eye, ear, nose, tongue, body, mind) and the six sense objects (form, sound, scent, taste, touch, dharma)
• Eye: Text, Graphics, Image, Animation, Video
• Ear: MIDI, Speech, Audio
• Nose: electronic nose, motorcycle exhaust inspection
• Tongue: composition analyzers, blood glucose meters, Terminator III
• Body: pressure and temperature sensors, polymer piezoelectric film
• Mind: Demolition Man

The 7th “Sensor”

4

Digital Image Compression

Finite Set
• a, b, c, … — ASCII
• 你, 我, … — Big5

Geometric Pattern
• Circle — (x, y, r)
• Spline — control points and polynomials

Fractal Image
• Procedure, iteration

Natural Image
• JPEG, GIF

5

Fractal Image – having detail at every scale

6

Mandelbrot Set (0)

7

Mandelbrot Set (1)

8

Mandelbrot Set (2)

9

Mandelbrot Set (3)

10

Logistic Map (0)

11

Logistic Map (1)

12

Logistic Map (2)

13

Logistic Map (3)

14

Fractal Image

15

Affine Transformations

$$W = w_1 \cup w_2 \cup w_3$$

$$w_1\begin{bmatrix}x\\y\end{bmatrix}=\begin{bmatrix}1/2&0\\0&1/2\end{bmatrix}\begin{bmatrix}x\\y\end{bmatrix},\qquad
w_2\begin{bmatrix}x\\y\end{bmatrix}=\begin{bmatrix}1/2&0\\0&1/2\end{bmatrix}\begin{bmatrix}x\\y\end{bmatrix}+\begin{bmatrix}1/2\\0\end{bmatrix},\qquad
w_3\begin{bmatrix}x\\y\end{bmatrix}=\begin{bmatrix}1/2&0\\0&1/2\end{bmatrix}\begin{bmatrix}x\\y\end{bmatrix}+\begin{bmatrix}0\\1/2\end{bmatrix}$$
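As an illustration, a minimal Python sketch of the random-iteration ("chaos game") rendering of such an IFS, assuming the three contractive maps reconstructed above:

```python
import random

# The three contractive affine maps w1, w2, w3 of the Sierpinski-type IFS:
# each halves the coordinates and optionally shifts by 1/2.
MAPS = [
    lambda x, y: (0.5 * x,       0.5 * y),        # w1
    lambda x, y: (0.5 * x + 0.5, 0.5 * y),        # w2
    lambda x, y: (0.5 * x,       0.5 * y + 0.5),  # w3
]

def chaos_game(n_points=50000):
    """Iterate a randomly chosen map each step; the orbit fills the attractor."""
    x, y = random.random(), random.random()
    points = []
    for i in range(n_points):
        x, y = random.choice(MAPS)(x, y)
        if i > 20:          # skip the transient before the orbit reaches the attractor
            points.append((x, y))
    return points

if __name__ == "__main__":
    pts = chaos_game()
    print(f"generated {len(pts)} attractor points")
```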

16

Local Self-Similarity

17

Fractal Image Compression
• Proposed by Barnsley in 1985, realized by Jacquin in 1992
• Partitioned Iterated Function System (PIFS)
• Exploits the self-similarity property of natural images
• Lossy compression

Advantages:
• High compression ratio
• High retrieved image quality
• Zoom invariance

Drawback:
• Time-consuming encoding

18

Search for Best Match

Domain Pool (D) / Range Pool (R) diagram: the range blocks r0, r1, … partition the original image, candidate domain blocks d are drawn from the same image, and each range block is matched against the domain pool in a search for the best match.

19

Expanded Codebook

Search Every Vector in the Domain Pool

For Each Search Entry:
• Eight orientations
• Contrast adjustment
• Brightness adjustment

20

The Best Match

v : range block to be encoded
u(i, j) : search entry in the Domain Pool
u_k(i, j), 1 ≤ k ≤ 8 : the eight orientations

$$\varepsilon(v)=\min_{i,j,k,p,q}\bigl\{\,\|\,p\cdot u_k(i,j)+q-v\,\|^2\bigr\}$$

21

Eight Orientations (Dihedral Group)

$$T=\{t_1,t_2,\ldots,t_8\}$$

t1–t4: the block rotated by 0°, 90°, 180°, and 270°; t5–t8: the flipped (mirrored) versions of those rotations, illustrated by the relabelling of a 2×2 block.

22

Matrix Representations

Cases 1–4: the rotations by 0°, 90°, 180°, and 270°; cases 5–8: the flips of those rotations. Each case corresponds to a 2×2 orthogonal matrix acting on the block coordinates.
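As an illustration, a small numpy sketch that enumerates the eight dihedral orientations of a square block (the mapping from index k to the cases above is an assumption for the sketch):

```python
import numpy as np

def dihedral_orientations(block):
    """Return the eight dihedral orientations t1..t8 of a square block:
    the four rotations followed by the flips of those rotations."""
    rotations = [np.rot90(block, k) for k in range(4)]   # 0, 90, 180, 270 degrees
    flips = [np.fliplr(r) for r in rotations]            # mirrored versions
    return rotations + flips

if __name__ == "__main__":
    b = np.arange(16).reshape(4, 4)
    for k, t in enumerate(dihedral_orientations(b), start=1):
        print(f"t{k}:\n{t}\n")
```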

23

Contrast and Brightness

For an N×N range block v and an oriented domain block u_k(i, j), the least-squares contrast scale p and brightness offset q are

$$p=\frac{N^2\sum_{i=0}^{N-1}\sum_{j=0}^{N-1}u_k(i,j)\,v(i,j)-\Bigl(\sum_{i,j}u_k(i,j)\Bigr)\Bigl(\sum_{i,j}v(i,j)\Bigr)}{N^2\sum_{i,j}u_k(i,j)^2-\Bigl(\sum_{i,j}u_k(i,j)\Bigr)^2}$$

$$q=\frac{1}{N^2}\Bigl(\sum_{i,j}v(i,j)-p\sum_{i,j}u_k(i,j)\Bigr)$$

and the matching error at position (i, j) is

$$\varepsilon(i,j)=\min_{k=1..8}\bigl\{\,\|\,p_k\cdot u_k(i,j)+q_k-v\,\|^2\bigr\}$$
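As an illustration, a Python sketch of the per-entry match using the closed-form p and q above; the function names are illustrative and the domain block is assumed to be already down-sampled to the range-block size:

```python
import numpy as np

def contrast_brightness(u, v):
    """Least-squares contrast p and brightness q minimizing ||p*u + q - v||^2."""
    n = u.size
    su, sv = u.sum(), v.sum()
    suu, suv = (u * u).sum(), (u * v).sum()
    denom = n * suu - su * su
    p = (n * suv - su * sv) / denom if denom != 0 else 0.0
    q = (sv - p * su) / n
    return p, q

def best_match(domain_block, range_block):
    """Try the eight dihedral orientations of one domain block and
    return the smallest error together with (k, p, q)."""
    rotations = [np.rot90(domain_block, r) for r in range(4)]
    candidates = rotations + [np.fliplr(r) for r in rotations]
    best = (np.inf, None, None, None)
    for k, u in enumerate(candidates, start=1):
        p, q = contrast_brightness(u.astype(float), range_block.astype(float))
        err = np.sum((p * u + q - range_block) ** 2)
        if err < best[0]:
            best = (err, k, p, q)
    return best
```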

24

Affine Transform and Coding Format

$$W_k\begin{bmatrix}x\\y\\z\end{bmatrix}=\begin{bmatrix}a_k&b_k&0\\c_k&d_k&0\\0&0&p\end{bmatrix}\begin{bmatrix}x\\y\\z\end{bmatrix}+\begin{bmatrix}i\\j\\q\end{bmatrix}$$

a_k, b_k, c_k, d_k : dihedral transformation
p : contrast scale
q : intensity offset
z : the gray level of a pixel
(x, y) : the position of a pixel
(i, j) : position of the matched domain block

Coding format: (i, j, T_k, p, q), stored with (8, 8, 3, 5, 7) bits.

25

De-Compression

• Assemble all the affine transformations
• Choose any initial image
• Apply the transformations to obtain a new image and proceed recursively
• Stop according to some criterion
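As an illustration, a rough Python sketch of this decoding loop; it assumes each stored code is (rx, ry, i, j, k, p, q) for an 8×8 range block at (rx, ry) matched to a 16×16 domain block at (i, j), which is an assumed layout for the sketch:

```python
import numpy as np

def decode(codes, size=256, range_size=8, iterations=8):
    """Iteratively apply the stored affine transforms to an arbitrary start image."""
    img = np.full((size, size), 128.0)                 # any initial image works
    for _ in range(iterations):
        new = np.zeros_like(img)
        for (rx, ry, i, j, k, p, q) in codes:
            d = img[i:i + 2 * range_size, j:j + 2 * range_size]
            # average 2x2 neighbourhoods: 16x16 domain block -> 8x8
            d = d.reshape(range_size, 2, range_size, 2).mean(axis=(1, 3))
            rotations = [np.rot90(d, r) for r in range(4)]
            u = (rotations + [np.fliplr(r) for r in rotations])[k - 1]
            new[rx:rx + range_size, ry:ry + range_size] = p * u + q
        img = new
    return img
```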

26

The Decoding Iterations

Initial image, Iteration = 1, Iteration = 2, Iteration = 3, Iteration = 4, Iteration = 8

27

Original 256×256 Lena image. Encoding time = 22.4667 minutes, PSNR = 28.515 dB

Full Search Coder

28

Complexity

Image size = 256×256; range block = 8×8:
#Range blocks = (256/8)² = 1024

Domain block = 16×16, down-sampled to 8×8:
#Domain blocks = (256 − 16 + 1)² = 58081

With contrast and brightness adjustment over the eight orientations, for each range block:
#MSE computations = 58081 × 8 = 464648

(Domain Pool (D) / Range Pool (R) diagram as on slide 18.)

29

Deterministic

• Contrast and Brightness: optimization
• The Dihedral Group: transform method

$$\varepsilon(i,j)=\min_{k=1..8}\bigl\{\|p_k\,u_k(i,j)+q_k-v\|^2\bigr\},\qquad\min_{i,j}\{\varepsilon(i,j)\}$$

30

Non-Deterministic

• Classification Method
• Correlation Method
• Soft Computing Method

$$\varepsilon(i,j)=\min_{k=1..8}\bigl\{\|p_k\,u_k(i,j)+q_k-v\|^2\bigr\},\qquad\min_{i,j}\{\varepsilon(i,j)\}$$

31

Transform Method – DCT (Energy)

Discrete Cosine Transform (DCT)
• Parallel structure
• Zonal filter
• Pre- and post-classification

$$F(m,n)=\frac{2}{N}\,C(m)\,C(n)\sum_{x=0}^{N-1}\sum_{y=0}^{N-1}f(x,y)\cos\frac{(2x+1)m\pi}{2N}\cos\frac{(2y+1)n\pi}{2N},\qquad
C(m)=\begin{cases}1/\sqrt{2}, & m=0\\ 1, & m\neq 0\end{cases}$$
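For reference, the 2-D DCT of a block can be computed with scipy; the orthonormal form below matches the F(m, n) definition above:

```python
import numpy as np
from scipy.fftpack import dct

def dct2(block):
    """Orthonormal 2-D DCT-II of a square block (rows, then columns)."""
    return dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')

if __name__ == "__main__":
    f = np.random.rand(8, 8)
    F = dct2(f)
    print(F[0, 0], F[1, 0], F[0, 1])   # DC term and the two lowest AC terms
```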

32

Images in the Frequency Domain

Example: an 8×8 block f(x, y), a dihedrally transformed version g(x, y), and their DCT coefficient blocks F(u, v) and G(u, v).

33

Parallel Architecture using DCT

Diagram of the parallel DCT architecture: the input coefficients a_ij, grouped by even/odd indices, are combined through eight +/− branches to form the output coefficients b_ij.

34

Experiment Result

• Windows 95, Intel Pentium 133 MHz
• Tested image: Lena 256×256
• Range block size: 8×8

            Baseline            DCT                 Zonal filter
            time    PSNR        time    PSNR        time    PSNR
Lena        22.42   28.90       6.46    28.93       3.80    28.18
Baboon      22.42   20.15       6.46    20.16       3.80    19.44
F16         22.42   25.50       6.46    24.54       3.80    24.39
Pepper      22.42   29.86       6.46    29.86       3.80    29.27

(time in minutes, PSNR in dB)

35

(a) Baseline method, time used = 22.42 min, PSNR = 28.90
(b) Fast algorithm, time used = 6.46 min, PSNR = 28.93
(c) Fast algorithm with zonal filters (21 coefficients), time used = 3.8 min, PSNR = 28.18

Experiment Result

36

Transform Method – DHT (VLSI)

Hadamard Transform (HT)
• Parallel structure
• +/− computation only
• Haar wavelet

$$F(m,n)=\frac{1}{N}\sum_{x=0}^{N-1}\sum_{y=0}^{N-1}f(x,y)\,(-1)^{\sum_{k=0}^{r-1}\left[b_k(x)b_k(m)+b_k(y)b_k(n)\right]},\qquad r=\log_2 N$$

where b_k(z) is the k-th bit of the binary representation of z.
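A short Python sketch of the 2-D Hadamard transform using scipy's (naturally ordered) Hadamard matrix, which realizes the bit-wise sign pattern above:

```python
import numpy as np
from scipy.linalg import hadamard

def dht2(block):
    """2-D Hadamard transform of an N x N block (N a power of two): F = H f H / N,
    where H contains only +1/-1 entries, so only additions and subtractions occur."""
    n = block.shape[0]
    H = hadamard(n)
    return H @ block @ H / n

if __name__ == "__main__":
    f = np.random.randint(0, 256, (8, 8)).astype(float)
    print(dht2(f))
```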

37

Images in the Hadamard Domain

Example: two dihedrally related 8×8 blocks u1 and u8 and their Hadamard-domain coefficient blocks F1 and F8 (DHT).

38

Baseline vs. Fast DHT Method

• Windows 98, Intel Pentium II 400 MHz
• Fixed bit rate: 0.4844 (8×8)

Lenna 256×256            Range block 8×8       Range block 4×4
Method                   Baseline    DHT       Baseline    DHT
PSNR (dB)                28.15       28.15     34.21       34.21
Encoding time (min)      33.86       14.63     47.32       17.51
Speed-up ratio           1           2.31      1           2.70

39

Original image: Lena 256×256
Fractal baseline method: block size = 8×8, PSNR = 29.25
Fast DHT method: block size = 8×8, PSNR = 29.25

Experiment Result

40

Soft Computing

Machine Learning
• ANN, FNN, RBFN, CNN
• Statistical learning, Support Vector Machine

Global Optimization Techniques
• Genetic Algorithm, Particle Swarm Optimization, Ant Colony Optimization (survival of the fittest)
• Simulated Annealing (physics)
• Branch and Bound, Tabu Search

To infinity and beyond

41

Evolutionary Computation

Genotype and Phenotype
• Genetic Algorithms (GA)
• Memetic Algorithm (MA)
• Genetic Programming (GP)
• Evolutionary Programming (EP)
• Evolution Strategy (ES)

Social Behavior
• Ant Colony Optimization (ACO)
• Particle Swarm Optimization (PSO)

Genetic Algorithm

• Developed by John Holland in 1975
• Mimics natural selection and natural genetics

Advantages:
• Global search technique
• Suited to rough landscapes

Drawback:
• Final solution usually not optimal

42

43

Schema Genetic Algorithm

Schema: 11##00 (#: don't care)
Matching chromosomes: 110000, 110100, 111000, and 111100

Schema Theorem: short schemata with better-than-average cost occur exponentially more frequently in the next generation; schemata with worse-than-average cost occur less frequently in the next generation.

44

1. Chromosome Formation

A chromosome is the concatenation of the domain-position fields t_x, t_y and the dihedral index d, with bit lengths l_{t_x}, l_{t_y}, l_d; total length l = l_{t_x} + l_{t_y} + l_d.

45

2. Fitness Function

v : range block
u : sub-sampled domain block

$$f=\frac{1}{MSE},\qquad MSE=\frac{1}{64}\sum_{i=0}^{7}\sum_{j=0}^{7}\bigl(u(i,j)-v(i,j)\bigr)^2$$

3. Selection

Rank-proportionate selective mechanism: the population is ranked and divided into a “superior clan” and an “inferior clan”; the superior clan is temporarily held, and the inferior clan is replaced by temporary offspring.

46

4. Crossover

Uniform crossover applied to the temporary offspring: a random binary mask (0/1 per bit) decides whether each bit of offspring 1 comes from parent 1 or parent 2, with offspring 2 taking the complementary choice.

47

5. Mutation

Bit-flip mutation is applied to the t_x, t_y, and d fields of the temporary offspring (the slide marks the MSBs and LSBs of each field); the superior clan together with the mutated temporary offspring forms the population of the next generation.

48

6. Stopping Criterion

a) A pre-specified number of iterations is reached.
b) The same best chromosome is re-selected many times.
c) The MSE of the best chromosome falls below a pre-specified threshold.

49

GA Parameters

Population size: 200 (superior clan: 100)
Crossover rate: 0.6
Mutation rate: 0.1
Stopping criterion:
• Maximum iterations: 200
• Repeated times: 30
• MSE threshold: 20
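As an illustration, a condensed Python sketch of the GA search for a single range block under roughly these parameters; block_error is an assumed callback that decodes a chromosome into (t_x, t_y, d) and returns the MSE of the best contrast/brightness fit:

```python
import random

POP_SIZE, SUPERIOR, CROSS_RATE, MUT_RATE, MAX_GEN, MSE_T = 200, 100, 0.6, 0.1, 200, 20

def ga_search(block_error, n_bits=19):
    """GA for one range block: each chromosome packs (tx, ty, d) into n_bits
    (e.g. 8 + 8 + 3 bits); fitness = 1 / MSE, so lower MSE ranks higher."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(POP_SIZE)]
    for _ in range(MAX_GEN):
        ranked = sorted(pop, key=block_error)            # rank by MSE, best first
        if block_error(ranked[0]) < MSE_T:               # early stop on small error
            return ranked[0]
        superior = ranked[:SUPERIOR]                     # "superior clan" is kept
        offspring = []
        while len(offspring) < POP_SIZE - SUPERIOR:      # rebuild the inferior clan
            p1, p2 = random.sample(superior, 2)
            if random.random() < CROSS_RATE:             # uniform crossover
                mask = [random.randint(0, 1) for _ in range(n_bits)]
                child = [a if m else b for a, b, m in zip(p1, p2, mask)]
            else:
                child = p1[:]
            child = [b ^ (random.random() < MUT_RATE) for b in child]  # bit-flip mutation
            offspring.append(child)
        pop = superior + offspring
    return min(pop, key=block_error)
```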

50

Experimental Results (Lena)

Full search method:
• MSE: 83.54
• PSNR: 28.91 dB
• Encoding time: 6667 sec

Proposed method:
• MSE: 108.38
• PSNR: 27.78 dB
• Encoding time: 192 sec

51

Retrieved Image (Lena)

full search proposed method

52

Experimental Results (Pepper)

Full search method:
• MSE: 67.37
• PSNR: 29.85 dB
• Encoding time: 6667 sec

Proposed method:
• MSE: 85.61
• PSNR: 28.81 dB
• Encoding time: 213 sec

53

Retrieved Image (Pepper)

full search proposed method

54

Spatial Correlation Genetic Algorithm (1)

Two-stage GA, stage 1: spatial correlation. For the current range block r_j, the matched domain blocks of its horizontal, vertical, and diagonal neighbors (r_H, r_V, r_D1, r_D2 with domains d_H, d_V, d_D1, d_D2 and shifts S_H, S_V, S_D1, S_D2; W and L denote the block dimensions) are reused as search candidates.

55

Spatial Correlation Genetic Algorithm (2)

Two-stage GA, stage 2: full-search GA, invoked if the best MSE found in the first stage is greater than a pre-specified threshold.

56

GA Parameters

Spatial Correlation GA:
• Population size: 16 (4×4)
• Length: 32, Width: 16
• Crossover rate: 0.6
• Mutation rate: 0.02
• Generations: 15

Full Search GA:
• Population size: 160
• Crossover rate: 0.6
• Mutation rate: 0.02
• Generations: 15

57

Experimental Results (Full Search)

Original image; full search method: time used = 3141.88 sec, PSNR = 28.91, bit rate = 0.4844

58

Experimental Results (SCGA)

Full GA method: time used = 23.00 sec, PSNR = 27.44, bit rate = 0.4844

Spatial correlation method (threshold T = 50): time used = 13.61 sec, PSNR = 27.41, hit blocks = 495, bit rate = 0.4396

59

60

Particle Swarm Optimization (PSO)
• Introduced in 1995 by Kennedy and Eberhart
• Swarm intelligence
• Simulation of a social model
• Population-based optimization
• Evolutionary computation

Social Psychology Principles
• Bird flocking
• Fish schooling
• Elephant herding

61

Animals and the Society

• a flock of: birds, seagulls, starlings
• a school of: fish, dolphins, shrimp
• a herd of: elephants, horses, oxen, cattle, camels, deer, or swine

62

PSO Example: the searching space at the initial state and after optimization.

63

System Model

V_i^(d) : velocity of particle i in dimension d
i : particle index
d : dimension of the hyperspace
w : inertia weighting
c_P, c_G : learning constants
R_P, R_G : random numbers between 0 and 1
P_i^(d) : position of Pbest
G_i^(d) : position of Gbest
x_i^(d) : position of each particle

$$V_i^{(d)} \leftarrow w\,V_i^{(d)} + c_P R_P\,\bigl(P_i^{(d)} - x_i^{(d)}\bigr) + c_G R_G\,\bigl(G_i^{(d)} - x_i^{(d)}\bigr)$$

$$x_i^{(d)} \leftarrow x_i^{(d)} + V_i^{(d)}$$
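A minimal Python sketch of these velocity and position updates for one particle (the inertia weight and learning constants below are placeholder values):

```python
import random

def pso_step(x, v, pbest, gbest, w=0.7, c_p=1.5, c_g=1.5):
    """One PSO update for a single particle: velocity then position, per dimension."""
    new_x, new_v = [], []
    for xd, vd, pd, gd in zip(x, v, pbest, gbest):
        r_p, r_g = random.random(), random.random()          # R_P, R_G in [0, 1)
        vd = w * vd + c_p * r_p * (pd - xd) + c_g * r_g * (gd - xd)
        new_v.append(vd)
        new_x.append(xd + vd)
    return new_x, new_v

# Usage: after each step, re-evaluate the MSE at the new position and
# update pbest / gbest before the next call.
```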

64

Simple Examples

65

PSO for Fractal Image Compression

For a given range block, search for the most similar domain block in the search space.
• Particle formation: (i, j, k)
• Evaluation function: MSE between the domain block and the range block
• Stopping criteria: fixed number of rounds, or MSE < 20

$$\varepsilon(i,j)=\min_{k=1..8}\bigl\{\|p_k\,u_k(i,j)+q_k-v\|^2\bigr\},\qquad\min_{i,j}\{\varepsilon(i,j)\}$$

66

Example of System Parameters

PSO parameters:
• Population size: 43
• Searching rounds: 27

Stopping criterion:
• Fixed number of rounds
• MSE < 20

67

Experimental Result

Comparative results (Lena, Pepper):
• Full search — MSE: 83.54, 67.37; PSNR: 28.91 dB, 29.85 dB; time: 8120 sec, 8120 sec
• PSO method — MSE: 107.01, 85.61; PSNR: 27.82 dB, 28.81 dB; time: 101 sec, 100 sec

About 80 times faster, with a 1.1 dB decay in PSNR.

68

Lena Image

Retrieved image (Full search method)

Retrieved image (PSO method)

69

Edge-Property Adapted PSO for FIC

• Hybrid method vs. fused methods
• Visual-salience tracking
• Edge-type classifier, 5 edge types
• Predict the best k (dihedral transformation)
• Intuitively direct the swarm velocity according to the edge property

70

Dihedral Transformation

$$T=\{t_1,t_2,\ldots,t_8\}$$

The four rotations (0°, 90°, 180°, 270°) and their flips, as in the earlier dihedral-group slide.

71

Edge Property vs Frequency Domain

Discrete Cosine Transform (DCT)
• F(1,0): energy variation across a vertical line
• F(0,1): energy variation across a horizontal line

$$F(m,n)=\frac{2}{N}\,C(m)\,C(n)\sum_{x=0}^{N-1}\sum_{y=0}^{N-1}f(x,y)\cos\frac{(2x+1)m\pi}{2N}\cos\frac{(2y+1)n\pi}{2N},\qquad
C(m)=\begin{cases}1/\sqrt{2}, & m=0\\ 1, & m\neq 0\end{cases}$$

72

Classification Scheme

S: smooth block
D: diagonal or sub-diagonal block
H: horizontal or vertical block

Scheme:
• If |F(1,0)| < Ts and |F(0,1)| < Ts, type = S
• Else if ||F(1,0)| − |F(0,1)|| < Td, type = D
• Else type = H
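The scheme above translates directly into a short classifier; a Python sketch, where the thresholds Ts and Td are placeholder values:

```python
import numpy as np
from scipy.fftpack import dct

def classify_block(block, ts=8.0, td=8.0):
    """Classify a block as S (smooth), D (diagonal), or H (horizontal/vertical)
    from the magnitudes of the DCT coefficients F(1,0) and F(0,1)."""
    F = dct(dct(block.astype(float), axis=0, norm='ortho'), axis=1, norm='ortho')
    f10, f01 = abs(F[1, 0]), abs(F[0, 1])
    if f10 < ts and f01 < ts:
        return 'S'
    if abs(f10 - f01) < td:
        return 'D'
    return 'H'
```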

73

Prediction Table

74

Intuitive Vector

75

Comparative Results (Lena)

76

Thanks

Every job is the self-portrait of those who did it. Autograph yourself with quality.

Every piece of work reflects the self; whatever passes through my hands must be a fine piece of work.