
1

Presenter:

1131037005, Department of Computer Engineering, Kim Hong-yeon

An Energy Efficient Hierarchical Clustering Algorithm for Wireless Sensor Networks.

Seema Bandyopadhyay, Edward J. Coyle

2

A wireless network consisting of a large number of small sensors with low-power transceivers can be an effective tool for gathering data in a variety of environments.

Weakness of the sensors: limited energy resources.

Abstract.

[Figure: sensors S1-S8 reporting to a single processing center C. S: sensor, C: single processing center.]

3

Clustering. Sensors communicate information only to clusterheads, and the clusterheads then communicate the aggregated information to the processing center, which may save energy.

This paper proposes a distributed, randomized clustering algorithm to organize the sensors in a wireless sensor network into clusters.

Abstract.

[Figure: a plane of sensors grouped into clusters, each with a cluster head.]

4

In this paper, we propose a fast, randomized, distributed algorithm for organizing the sensors in a wireless sensor network into a hierarchy of clusters, with the objective of minimizing the energy spent in communicating the information to the information processing center.

Necessity. For wireless sensor networks with a large number of energy-constrained sensors, it is very important to design a fast algorithm to organize the sensors in clusters so as to minimize the energy used to communicate information from all nodes to the processing center.

Fast, Organize, Minimize energy.

Introduction. Page 2, lines 3-11.

5

Probability. Each sensor in the network becomes a clusterhead (CH) with probability p and advertises itself as a clusterhead to the sensors within its radio range r.

Hops. Any sensor that receives such an advertisement and is not itself a clusterhead joins the cluster of the closest clusterhead. A sensor joins only if it is no more than k hops away from the clusterhead.

Others. Any sensor that is neither a clusterhead nor has joined any cluster becomes a clusterhead itself (a forced clusterhead). These steps are sketched in code below.

Single-level clustering algorithm. Section 3-A
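The three steps above can be illustrated with a small simulation sketch. This is not the authors' implementation: the function name, the uniform sensor placement, and the parameter values are assumptions, and the advertisement forwarding is approximated by treating a sensor as reachable when ceil(distance / r) <= k, i.e. the hop model of assumption d) on the next slide.

import math
import random

def single_level_clustering(sensors, p, k, r, seed=0):
    """sensors: list of (x, y) positions.
    Returns (volunteer CHs, forced CHs, assignment), where assignment[i]
    is the index of the clusterhead that sensor i reports to."""
    rng = random.Random(seed)
    # Step 1: each sensor volunteers as a clusterhead with probability p.
    volunteers = {i for i in range(len(sensors)) if rng.random() < p}
    forced, assignment = set(), {}
    for i, (x, y) in enumerate(sensors):
        if i in volunteers:
            assignment[i] = i
            continue
        # Step 2: join the closest volunteer CH that is within k hops,
        # where a distance d counts as ceil(d / r) hops.
        best, best_d = None, float("inf")
        for h in volunteers:
            d = math.hypot(x - sensors[h][0], y - sensors[h][1])
            if d < best_d and math.ceil(d / r) <= k:
                best, best_d = h, d
        if best is None:
            # Step 3: sensors covered by no advertisement become forced CHs.
            forced.add(i)
            assignment[i] = i
        else:
            assignment[i] = best
    return volunteers, forced, assignment

# Hypothetical example: 1000 sensors on a 100 x 100 field, p = 0.1, k = 2, r = 10.
random.seed(1)
pts = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(1000)]
vol, forc, _ = single_level_clustering(pts, p=0.1, k=2, r=10.0)
print(len(vol), "volunteer clusterheads,", len(forc), "forced clusterheads")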

6

Parameters. Since the objective of our work is to organize the sensors in clusters so as to minimize this energy consumption, we need to find the values of the parameters p and k of our algorithm that ensure minimization of the energy consumption.

Assumptions.

a) The sensors in the WSN are distributed as per a homogeneous spatial Poisson process of intensity λ in 2-dimensional space.

b) All sensors transmit at the same power level and hence have the same radio range r.

c) Data exchanged between two communicating sensors not within each other's radio range is forwarded by other sensors.

d) A distance of d between any sensor and its clusterhead is equivalent to ⌈d/r⌉ hops.

e) Each sensor uses 1 unit of energy to transmit or receive 1 unit of data.

f) A routing infrastructure is in place; hence, when a sensor communicates data to another sensor, only the sensors on the routing path forward the data.

g) The communication environment is contention- and error-free; hence, sensors do not have to retransmit any data.

Single-level clustering algorithm. Section 3-B
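One way to read assumptions d) and e) in code: a transmission over distance d takes ceil(d / r) hops, and each hop of one unit of data costs one energy unit at the transmitter and one at the receiver. The helper names and the factor of 2 per hop are our reading for this sketch, not the paper's notation.

import math

def hops(distance, r):
    """Assumption d): a distance d to the clusterhead is ceil(d / r) hops."""
    return math.ceil(distance / r)

def transfer_energy(distance, r, units=1.0):
    """Assumption e): 1 unit of energy to transmit and 1 to receive 1 unit of
    data, so each hop of a `units`-sized message costs 2 * units (our reading)."""
    return 2.0 * units * hops(distance, r)

print(hops(25.0, 10.0), transfer_energy(25.0, 10.0))  # 3 hops, 6.0 energy units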

7

Computation of the optimal probability p.

N: the number of sensors in a square area A of side 2a; N has mean λ(2a)² = 4a²λ. A: the square area of side 2a.

D_i: the length of the segment from a sensor located at (x_i, y_i) to the processing center. The processing center is at the center of the square.

The total length of the segments from all these sensors to the processing center is given below.

Single-level clustering algorithm. Section 3-B-1)

The total length of the segments from all sensor nodes: $E\left[\sum_{i=1}^{N} D_i \mid N = n\right] = E[D_1 \mid N=n] + \dots + E[D_n \mid N=n] = 0.765\,na$.
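The constant 0.765·a above is the mean distance from a uniformly placed sensor to the center of a square of side 2a. A quick Monte Carlo check; the sample count and function name are arbitrary choices for the sketch.

import random

def mean_distance_to_center(a, samples=200_000, seed=0):
    """Estimate E[D_i]: distance from a uniform point in a 2a x 2a square
    (centered at the origin) to the center."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        x, y = rng.uniform(-a, a), rng.uniform(-a, a)
        total += (x * x + y * y) ** 0.5
    return total / samples

print(mean_distance_to_center(a=1.0))  # about 0.765, i.e. 0.765 * a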

8

Computation of the optimal probability p. The clusterheads and the non-clusterheads are distributed as per independent homogeneous spatial Poisson processes PP1 and PP0 of intensity λ1 = pλ and λ0 = (1 - p)λ, respectively.

Each non-clusterhead joins the cluster of the closest clusterhead, so the clusterheads form a Voronoi tessellation of the plane.

N_v is the random variable denoting the number of PP0 process points in each Voronoi cell.

L_v is the total length of all segments connecting the PP0 process points to the nucleus (the clusterhead) in a Voronoi cell.

Single-level clustering algorithm. Section 3-B-1)

[Figure: a Voronoi cell with its nucleus (the clusterhead) and the member sensors.]
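The clustering can equivalently be viewed as a Voronoi tessellation generated by the clusterheads. The sketch below (our code, not the paper's) thins a Poisson field into clusterheads of intensity pλ and members of intensity (1 - p)λ, assigns every member to its nearest clusterhead, and reports the average cell population and the average total member-to-nucleus distance per cell; by standard Poisson-thinning properties the mean population should come out near (1 - p)/p. Edge effects of the finite square are ignored.

import numpy as np

def voronoi_cell_stats(lam, p, side, seed=0):
    """One realization on a side x side square: clusterheads ~ Poisson(p*lam),
    members ~ Poisson((1-p)*lam); each member joins its nearest clusterhead."""
    rng = np.random.default_rng(seed)
    n_heads = rng.poisson(p * lam * side * side)
    n_members = rng.poisson((1 - p) * lam * side * side)
    heads = rng.uniform(0, side, size=(n_heads, 2))
    members = rng.uniform(0, side, size=(n_members, 2))
    # Distance of every member to every clusterhead (brute force is fine here).
    d = np.linalg.norm(members[:, None, :] - heads[None, :, :], axis=2)
    nearest = d.argmin(axis=1)
    members_per_cell = np.bincount(nearest, minlength=n_heads)      # N_v per cell
    length_per_cell = np.bincount(nearest, weights=d.min(axis=1),
                                  minlength=n_heads)                # L_v per cell
    return members_per_cell.mean(), length_per_cell.mean()

# Hypothetical example: lam = 1, p = 0.1 on a 40 x 40 square.
print(voronoi_cell_stats(lam=1.0, p=0.1, side=40.0))  # roughly (9, ...)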

9

Computation of the optimal probability p.

C1: the total energy used by the sensors in a Voronoi cell to communicate one unit of data to the clusterhead.

C2: the total energy spent by all the sensors communicating 1 unit of data to their respective clusterheads.

Single-level clustering algorithm. Section 3-B-1)

$E[C_1 \mid N = n] = \frac{E[L_v \mid N = n]}{r} = \frac{\lambda_0}{2\,r\,\lambda_1^{3/2}}, \qquad E[C_2 \mid N = n] = np\,E[C_1 \mid N = n]$

10

Computation of the optimal probability p.

C3: the total energy spent by the clusterheads to communicate the aggregated information to the processing center.

C: the total energy spent in the system.

Single-level clustering algorithm. Section 3-B-1)

Equations (3)-(6): $E[C_3 \mid N = n] = 0.765\,\frac{n p\, a}{r}$ and $E[C \mid N = n] = E[C_2 \mid N = n] + E[C_3 \mid N = n]$.

11

Computation of the optimal probability p. Removing the conditioning on N yields E[C].

E[C] is minimized by a value of p that is a solution of equation (9).

The above equation has three roots, two of which are imaginary.

The second derivative of the above function is positive for the only real root of (9), and hence this root minimizes the energy spent.

Single-level clustering algorithm. Section 3-B-1)

12

Computation of the optimal probability p. The only real root of (9) is given in closed form by equation (10), where c is a constant determined by a and λ.

Single-level clustering algorithm. Section 3-B-1)
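Rather than reproduce the closed-form root (10), the optimization can be checked numerically. The cost function below is an assumption reconstructed from the slides' quantities: a member-to-clusterhead term based on the Voronoi cell result and a clusterhead-to-center term based on the 0.765·a mean distance, both measured in hops. The exact constants of equations (8)-(10) may differ; a constant per-hop energy factor would not change the minimizing p.

import math

def expected_cost_per_sensor(p, lam, a):
    """Assumed shape of E[C] / E[N] (up to a 1/r factor):
    (1 - p) / (2 * sqrt(p * lam)) : average member-to-clusterhead distance
    0.765 * a * p                 : clusterheads' average distance to the center"""
    return (1 - p) / (2 * math.sqrt(p * lam)) + 0.765 * a * p

def optimal_p(lam, a, grid=100_000):
    """Grid search for the probability p minimizing the assumed cost."""
    return min((i / grid for i in range(1, grid)),
               key=lambda p: expected_cost_per_sensor(p, lam, a))

# Hypothetical example: intensity 1 sensor per unit area, square of side 200 (a = 100).
print(optimal_p(lam=1.0, a=100.0))  # around 0.02 for these numbers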

13

Computation of the maximum number of hops. Our main reason for limiting the number of hops k was to be able to fix a periodicity for the clusterheads at which they should communicate to the processing center.

The number of hops: setting k = ⌈R_max/r⌉ will also ensure that there will be very few forced clusterheads in the network.

Forced clusterhead: any sensor that is neither a clusterhead nor has joined any cluster becomes a clusterhead itself.

Single-level clustering algorithm. Section 3-B-2)

k = ⌈R_max/r⌉: by Assumption d), a distance of R_max between any sensor and its clusterhead is equivalent to ⌈R_max/r⌉ hops.

14

Computation of the maximum number of hops. Probabilistic approach.

The probability of any point of process PP0 being more than a distance R_max away from all points of process PP1 is very small.

Equivalently, the probability of any sensor being more than ⌈R_max/r⌉ hops away from all volunteer clusterheads is very small.

Let R_v be the radius of the minimal ball centered at the nucleus of a Voronoi cell which contains that Voronoi cell.

We define p_α to be the probability that R_v is greater than a certain value α, i.e. p_α = P[R_v > α].

Then an upper bound on p_α can be proved (equation (11)). If α1 is the value of α such that p_α is less than a chosen small threshold, then R_max is set to α1.

Single-level clustering algorithm. Section 3-B-2)

15

Computation of the maximum number of hops. Equation (11) means that the expected number of sensors that will not join any cluster is negligible if we set R_max (and hence k) as above.

To ensure minimum energy consumption, we will use a very small value for p_α, which implies that the probability of all sensors being within k hops of at least one volunteer clusterhead is very high.

Single-level clustering algorithm. Section 3-B-2)
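The bound of equation (11) is not reproduced above, but R_max (and hence k) can also be estimated empirically: simulate the volunteer clusterhead process, look at the distance from each sensor to its nearest volunteer clusterhead, and take a high quantile of that distance as R_max. The window size, the 99% quantile, and the function name below are arbitrary choices for the sketch, not values from the paper.

import math
import numpy as np

def estimate_rmax_and_k(lam, p, r, side=60.0, eps=0.01, seed=0):
    """Pick R_max so that the empirical probability of a sensor being farther
    than R_max from every volunteer clusterhead is below eps; k = ceil(R_max/r)."""
    rng = np.random.default_rng(seed)
    heads = rng.uniform(0, side, size=(rng.poisson(p * lam * side * side), 2))
    sensors = rng.uniform(0, side, size=(rng.poisson((1 - p) * lam * side * side), 2))
    # Distance from every sensor to its nearest volunteer clusterhead.
    nearest = np.linalg.norm(sensors[:, None, :] - heads[None, :, :], axis=2).min(axis=1)
    r_max = float(np.quantile(nearest, 1.0 - eps))
    return r_max, math.ceil(r_max / r)

# Hypothetical example: lam = 1, p = 0.1, radio range r = 2.
print(estimate_rmax_and_k(lam=1.0, p=0.1, r=2.0))  # roughly (3.8, 2)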

16

Simulation Experiments and Results.

Single-level clustering algorithm. Section 3-C

17

Simulation Experiments and Results.

Single-level clustering algorithm. Section 3-C

18

Procedure. The sensors communicate the gathered data to level-1 clusterheads (CHs). The level-1 CHs aggregate this data and communicate the aggregated data, or estimates based on it, to level-2 CHs, and so on.

Hierarchical clustering algorithm. Section 4

[Figure: a three-level hierarchy of clusterheads (Level 1, Level 2, Level 3).]

19

Algorithm. Each sensor decides to become a level-1 CH with a certain probability p1 and advertises itself as a clusterhead to the sensors within its radio range r.

This advertisement is forwarded to all the sensors within k1 hops of the advertising CH.

Each sensor that receives an advertisement joins the cluster of the closest level-1 CH; the remaining sensors become forced level-1 CHs.

Level-1 CHs then elect themselves as level-2 CHs with a certain probability p2 and broadcast their decision of becoming a level-2 CH.

This decision is forwarded to all the sensors within k2 hops. The level-1 CHs that receive the advertisements from level-2 CHs join the cluster of the closest level-2 CH. All other level-1 CHs become forced level-2 CHs. (A code sketch of this multi-level election follows below.)

Hierarchical clustering algorithm. Section 4-A
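The multi-level election can be sketched by repeating the single-level step level by level: the current level's nodes volunteer with probability p_i, the rest join the nearest volunteer within k_i hops (again approximating forwarding by ceil(distance / r) hops), and uncovered nodes become forced clusterheads. This is our illustration, not the paper's implementation; the positions and parameters in the example are hypothetical.

import math
import random

def elect_level(nodes, positions, prob, hop_limit, r, rng):
    """One level: each node in `nodes` volunteers as a CH with probability
    `prob`; every other node joins the nearest volunteer within `hop_limit`
    hops (distance d ~ ceil(d / r) hops); uncovered nodes become forced CHs."""
    volunteers = [i for i in nodes if rng.random() < prob]
    heads, parent = set(volunteers), {}
    for i in nodes:
        if i in heads:
            parent[i] = i
            continue
        xi, yi = positions[i]
        best, best_d = None, float("inf")
        for h in volunteers:
            d = math.hypot(xi - positions[h][0], yi - positions[h][1])
            if d < best_d and math.ceil(d / r) <= hop_limit:
                best, best_d = h, d
        if best is None:
            heads.add(i)          # forced clusterhead at this level
            parent[i] = i
        else:
            parent[i] = best
    return heads, parent

def hierarchical_clustering(positions, probs, hop_limits, r, seed=0):
    """Build the hierarchy bottom-up: level-1 CHs from the sensors,
    level-2 CHs from the level-1 CHs, and so on."""
    rng = random.Random(seed)
    members, levels = list(range(len(positions))), []
    for prob, hop_limit in zip(probs, hop_limits):
        heads, parent = elect_level(members, positions, prob, hop_limit, r, rng)
        levels.append((heads, parent))
        members = sorted(heads)   # the next level is elected among these CHs
    return levels

# Hypothetical example: 2000 sensors, a 2-level hierarchy.
random.seed(1)
pts = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(2000)]
levels = hierarchical_clustering(pts, probs=[0.1, 0.2], hop_limits=[2, 3], r=8.0)
print([len(heads) for heads, _ in levels])  # CH counts at level 1 and level 2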

20

Optimal parameters for the algorithm. Assumptions and notation:

the number of members in a level-i cluster;

the sum of the distances between the members of a level-i cluster and their level-i CH;

the number of hops from a member to its CH in a typical level-i cluster;

the total number of level-i CHs;

the total cost of communicating information from all level-i CHs to the level-(i+1) CHs;

the total cost of communicating information from the sensors to the data processing center through the hierarchy of clusterheads generated by the clustering algorithm.

By properties of the Poisson process, the level-i CHs are governed by homogeneous Poisson processes of intensities p1·p2·…·pi·λ.

Hierarchical clustering algorithm. Section 4-B
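The intensity statement above is just repeated Poisson thinning: a level-i clusterhead is a sensor that survived i independent elections, so the level-i process has intensity p1·p2·…·pi·λ, and the ratio of consecutive intensities (1/p_i) gives the average number of level-(i-1) CHs per level-i cluster. A tiny helper with hypothetical numbers:

def level_intensities(lam, probs):
    """Intensity of the level-i clusterhead process: lam * p_1 * ... * p_i."""
    out, intensity = [], lam
    for p in probs:
        intensity *= p
        out.append(intensity)
    return out

def mean_members_per_cluster(lam, probs):
    """Average number of level-(i-1) CHs (sensors, for i = 1) per level-i
    cluster: the ratio of consecutive intensities, i.e. 1 / p_i."""
    intensities = [lam] + level_intensities(lam, probs)
    return [intensities[i] / intensities[i + 1] for i in range(len(probs))]

# Hypothetical example: lam = 2 sensors per unit area, p1 = 0.1, p2 = 0.25.
print(level_intensities(2.0, [0.1, 0.25]))         # [0.2, 0.05]
print(mean_members_per_cluster(2.0, [0.1, 0.25]))  # [10.0, 4.0]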

21

Optimal parameters for the algorithm. The sum of the distances of the level-(i-1) CHs from a level-i CH in a typical level-i cluster (or, for i = 1, the sum of the distances of the sensors from a level-1 CH) is given by:

The expected number of level-(i-1) CHs in a typical level-i cluster is given by:

Hierarchical clustering algorithm. Section 4-B

22

Optimal parameters for the algorithm. The expected number of hops between a level-(i-1) CH and its level-i CH in a typical level-i cluster is given by:

The expected number of level-i CHs is given by:

Hierarchical clustering algorithm. Section 4-B

23

Optimal parameters for the algorithm. The expected total cost of communicating information from all the level-(i-1) CHs to their respective level-i CHs is given by:

Hierarchical clustering algorithm. Section 4-B

24

Optimal parameters for the algorithm. The expected total cost of communicating information from the sensors to the processing center in the clustered environment is given by:

Hierarchical clustering algorithm. Section 4-B

[Combines Eq. (6), Eq. (14), Eq. (15), and Eq. (16).]

25

Optimal parameters for the algorithm. By un-conditioning on N, we find:

Hierarchical clustering algorithm. Section 4-B

Eq (8)

26

Optimal parameters for the algorithm. The optimal probabilities p1, p2, … are obtained by minimizing this cost. Following the same arguments as in Section 3-B-2), the maximum hop counts ki can be calculated according to equation (12),

where the quantity bounded in (12) denotes the probability that the number of hops between a member and the clusterhead in a level-i cluster is more than ki.

Hierarchical clustering algorithm. Section 4-B

Eq (12)

27

Numerical Results and Simulations.

Hierarchical clustering algorithm. Section 4-C