
INTRODUCTION TO IMAGE CLASSIFICATION

Lecture 7

Concept of Image Classification

Image classification is the process of assigning the pixels of an image to categories or classes of interest, e.g., vegetation, bare soil, rocky areas, cloud, and shadow.

Concept of Image Classification

Symbols used:

- Number of bands = n
- Number of classes = L
- Each pixel is described by an n-dimensional feature vector x and is assigned to a single class in the set of classes D

GNR401 Dr. A. Bhattacharya

Concept of Image Classification

To classify image data into information classes or categories, the relationship between the data and the classes into which they are classified must be well understood; the classifier must therefore be trained. Classification techniques were originally developed out of research in the pattern recognition field.

Concept of Image Classification

Classification involves the computer program learning the relationship between the data and the information classes. Two aspects are central:

- Learning techniques
- Feature sets

Types of Learning

Supervised learning

- A learning process designed to form a mapping from one set of variables (data) to another set of variables (information classes)
- A teacher is involved in the learning process

Unsupervised learning

- Learning happens without a teacher
- The data space is explored to discover the scientific laws underlying the data distribution

Features

Features are the attributes of the data elements on the basis of which the elements are assigned to various classes. In remote sensing, features are typically measurements made by sensors in different wavelengths of the electromagnetic spectrum: visible / infrared / microwave bands, or texture features.

Features

In medical diagnosis, for example, features may be temperature, blood pressure, lipid profile, blood sugar, and a variety of other data collected through pathological investigations. Features may be qualitative or quantitative. Here the classes could be presence (positive) or absence (negative) of heart disease.

Supervised Classification

In supervised classification, prior domain knowledge is available, using which the classifier can be guided to learn the relationship between the data and the classes. Training samples for each class can be identified using this prior knowledge.

Partially Supervised Classification

Prior knowledge may be available only partially:

- For some classes, and not for others,
- For some dates and not for others in a multitemporal dataset.

Such partial knowledge can be employed for partially supervised classification of images.

Unsupervised Classification

When prior knowledge or the experience of an analyst is missing, the data can still be analyzed by numerical exploration, whereby the data are grouped into subsets or clusters based on statistical similarity.

Supervised vs. Unsupervised Classifiers

Supervised classification generally performs better than unsupervised classification if good-quality training data are available. Unsupervised classification is useful for preliminary analysis of data prior to supervised classification.

Role of Image Classifier

The classifier evaluates a discriminant function that discriminates one class against others: either each class against all other classes (multiclass), or one class against another class (two-class).

Discriminant Function

Let g(ck, x) denote the discriminant function for feature vector x and class ck, k = 1, ..., L. Denote g(ck, x) as gk(x) for simplicity.

Multiclass case: if gk(x) > gl(x) for all l = 1, ..., L, l ≠ k, then x ∈ ck.

Two-class case: g(x) > 0 implies x ∈ c1; g(x) < 0 implies x ∈ c2.
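The decision rules above can be sketched in code. This is a minimal illustration, not part of the lecture; the discriminant functions passed in are arbitrary examples, and any of the classifiers introduced later (parallelepiped, minimum distance, maximum likelihood) amounts to a particular choice of gk.

```python
import numpy as np

def classify(x, discriminants):
    """Multiclass rule: assign x to the class k whose g_k(x) is largest."""
    scores = [g(x) for g in discriminants]
    return int(np.argmax(scores))

def two_class(x, g):
    """Two-class rule: g(x) > 0 -> class c1 (index 0), g(x) < 0 -> c2 (index 1)."""
    return 0 if g(x) > 0 else 1

# Illustrative discriminants (made up for the sketch)
label = classify(np.array([3.0, 1.0]),
                 [lambda x: float(x.sum()), lambda x: float(-x.sum())])
```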

Example of Image Classification

- Recognition of characters or digits from bitmaps of scanned text
- Distinguishing between text and graphics in a scanned document

Prototype / Training Data

With the help of ground truth (or an experienced interpreter), small sets of sample pixels are selected for each class. Careful sampling is important for proper representation of the total pixel population in terms of the samples.

Statistical Characterization of Classes

Each class is characterized by a class-conditional probability density function (pdf) denoted by p(x | ck). The posterior probability P(ck | x) is the probability of class ck given that the pixel's feature vector is x.

Supervised Classification Algorithms

Several algorithms are available for classifying pixels into informational classes, e.g.:

- Parallelepiped
- Maximum Likelihood (ML)
- Support Vector Machines (SVM)
- Artificial Neural Networks (ANN)

Supervised Classification Principles

Consider thematic classes such as forest, marshy vegetation, agricultural land, turbid water, clear water, open soils, manmade objects, desert, etc. Training data consist of small sets of pixels in each class that are reliably selected by a human analyst through experience or with the help of a map of the area.

Supervised Classification Principles

From the training data, each class is characterized by statistics such as:

- Mean vector
- Covariance matrix
- Minimum and maximum gray levels within each band
- Conditional probability density function p(Ci | x), where Ci is the ith class and x is the feature vector

The number of classes into which the image is to be classified should be specified by the user.

Prototype Pixels for Different Classes

The analyst selects a sufficient number of pixels belonging to each class; the required sample size is governed by the mathematical theory of sampling. The samples should be representative of the pixels belonging to the different classes.


Parallelepiped Classifier - Example of a Supervised Classifier

- Assign ranges of values for each class in each band; the ranges define a box (parallelepiped) for each class
- Really a feature space classifier
- A pixel is assigned to a class if its feature vector falls within the corresponding box
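The box test can be sketched as follows. The class ranges and pixel values are made-up numbers; a pixel outside every box stays unclassified (returned here as -1), and overlapping boxes are resolved naively by first match.

```python
import numpy as np

def parallelepiped(pixel, boxes):
    """boxes[k] = (mins, maxs) per band for class k; return the first class
    whose box contains the pixel, or -1 if none (unclassified)."""
    for k, (mins, maxs) in enumerate(boxes):
        if np.all(pixel >= mins) and np.all(pixel <= maxs):
            return k
    return -1

# Illustrative two-band ranges for two classes
boxes = [(np.array([10, 20]), np.array([40, 60])),   # class 0
         (np.array([50, 10]), np.array([90, 30]))]   # class 1
```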


Advantages/Disadvantages of Parallelepiped Classifier

- Does NOT assign every pixel to a class; only the pixels that fall within the ranges are classified
- Many unclassified pixels indicate that the ranges do not represent the classes well
- Problems arise when class ranges overlap: rules must be developed to deal with overlap areas

Minimum Distance Classifier

The method:

- Calculate the statistical (Euclidean) distance from each pixel to each class mean vector
- Assign each pixel to the class whose mean it is closest to


Minimum Distance Classifier

Algorithm:

- Estimate the class mean vector (and covariance matrix) from the training samples:
  mi = mean of Xj over Xj ∈ Ci ;  Si = E{ (X − mi)(X − mi)^T | X ∈ Ci }
- Compute the distance between X and each mi
- Assign X to Ci if d(X, mi) ≤ d(X, mj) for all j
- Optionally, leave X unclassified if max_k P(Ck | X) < Tmin

Minimum Distance Classifier

The classifier normally assigns every pixel, no matter how far it is from a class mean (it still picks the closest class), unless the Tmin condition is applied. The distance can be defined in different ways: Euclidean, Mahalanobis, city block, etc.
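A minimal sketch of the minimum distance rule with Euclidean distance; the class means are made-up stand-ins for means estimated from training samples, and the optional Tmin rejection is omitted for brevity.

```python
import numpy as np

def min_distance_classify(x, means):
    """Assign x to the class whose mean vector is nearest (Euclidean)."""
    dists = [np.linalg.norm(x - m) for m in means]
    return int(np.argmin(dists))

# Class mean vectors, as would be estimated from training samples (made up)
means = [np.array([1.0, 1.0]), np.array([5.0, 5.0])]
```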

Maximum Likelihood Classifier

The maximum likelihood classifier computes the likelihood of a pixel belonging to the different classes conditional on the available features, and assigns the pixel to the class with the highest likelihood.

Likelihood Calculation

The likelihood is taken as the conditional probability P(Ci | x), the probability of class Ci given the pixel vector x. It is generally not possible to directly compute the probability of a class given the feature vector. Instead, it is computed indirectly in terms of the conditional probability of feature vector x given that it belongs to class Ci.

Likelihood Calculation

By Bayes' rule,

P(Ci | x) = P(x | Ci) P(Ci) / P(x)

x is assigned to the class Cj for which P(Cj | x) is maximum. P(Ci) is the prior probability of class Ci in the image; P(x) is the multivariate probability density function of feature x.
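Bayes' rule can be checked numerically; the likelihoods and priors below are made-up values for one pixel, not from the lecture.

```python
import numpy as np

# Made-up likelihoods p(x|Ci) for one pixel, and priors P(Ci)
likelihood = np.array([0.20, 0.05, 0.10])
prior      = np.array([0.5, 0.3, 0.2])

evidence  = float(np.sum(likelihood * prior))   # P(x), the normalizer
posterior = likelihood * prior / evidence       # P(Ci|x); sums to 1
best      = int(np.argmax(posterior))           # class assigned to x
```

Since P(x) is the same for every class, the argmax can equally be taken over likelihood * prior without computing the evidence.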


Likelihood Calculation

If the density function is estimated without assuming a distribution model, the estimation is said to be non-parametric. If a distribution model is assumed and only its parameters are estimated, the estimation is said to be parametric. The training data x, with the class already given, can be used to estimate the conditional density function P(x | Ci).

Likelihood Calculation

The classes are commonly assumed to be normally (Gaussian) distributed in practical parametric classifiers, since the Gaussian distribution is mathematically simple to handle. Each class-conditional density function p(x | Ci) is represented by its mean vector mi and covariance matrix Si:

p(x | Ci) = 1 / ( (2π)^(n/2) |Si|^(1/2) ) · exp( −(1/2) (x − mi)^T Si^(−1) (x − mi) )

where n is the number of bands (the dimensionality of x).


Assumption of Gaussian Distribution

The Gaussian assumption implies that each class has a mean vector mi at which the likelihood of occurrence is highest, and that the likelihood decreases as the feature vector x deviates from the mean vector mi. The smaller the variance, the steeper the decrease; the larger the variance, the slower the decrease.

Likelihood Calculation

Taking the logarithm of the Gaussian distribution, we get the discriminant function

gi(x) = −(1/2)(x − mi)^T Si^(−1) (x − mi) − (n/2) ln 2π − (1/2) ln |Si| + ln P(Ci)

The term (n/2) ln 2π is the same for all classes and can be dropped; the remaining terms differ from class to class.

The term (x − mi)^T Si^(−1) (x − mi) is known as the Mahalanobis distance between x and mi (after Prof. P. C. Mahalanobis, the famous Indian statistician and founder of the Indian Statistical Institute).
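A sketch of the log-likelihood discriminant above, with made-up class statistics. np.linalg.inv is used directly for clarity; practical implementations factor the covariance once per class.

```python
import numpy as np

def gaussian_discriminant(x, mean, cov, prior):
    """g_i(x) = -1/2 (x-mi)^T Si^-1 (x-mi) - 1/2 ln|Si| + ln P(Ci);
    the -(n/2) ln 2*pi term is dropped, being the same for every class."""
    d = x - mean
    maha_sq = float(d @ np.linalg.inv(cov) @ d)   # squared Mahalanobis distance
    return -0.5 * maha_sq - 0.5 * np.log(np.linalg.det(cov)) + np.log(prior)
```

The pixel is assigned to the class with the largest gi(x).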

Interpretation of Mahalanobis Distance

The Mahalanobis distance between x and y is

dM(x, y) = sqrt( (x − y)^T S^(−1) (x − y) )

If the covariance matrix is k·I (I is the unit matrix), then the Mahalanobis distance reduces to a scaled version of the Euclidean distance. In general, the Mahalanobis distance rescales the Euclidean distance according to the extent of variation within the data, given by the covariance matrix S.
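The k·I special case can be verified directly; the vectors and k below are arbitrary test values.

```python
import numpy as np

def mahalanobis(x, y, S):
    """d_M(x, y) = sqrt((x-y)^T S^-1 (x-y))."""
    d = x - y
    return float(np.sqrt(d @ np.linalg.inv(S) @ d))

# With S = k*I, the Mahalanobis distance equals the Euclidean distance / sqrt(k)
x, y, k = np.array([3.0, 1.0]), np.array([0.0, 5.0]), 4.0
d_euc = float(np.linalg.norm(x - y))        # 5.0 for these vectors
d_mah = mahalanobis(x, y, k * np.eye(2))    # 5.0 / sqrt(4.0) = 2.5
```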

Advantages/Disadvantages of Maximum Likelihood Classifier

- Normally classifies every pixel, no matter how far it is from a class mean
- Slowest method: more computationally intensive
- The assumption of normally distributed data is not always true, in which case the results are not likely to be very accurate
- A thresholding condition can be introduced into the classification rule to separately handle ambiguous feature vectors

Nearest-Neighbor Classifier

- Non-parametric in nature
- The algorithm: store the training samples; x is assigned to the class of the nearest training sample (in the feature space)
- This method does not depend on class statistics such as mean and covariance


K-NN Classifier

- Simple in concept, time consuming to implement
- For a pixel to be classified, find the K closest training samples (in terms of feature vector similarity, i.e., smallest feature vector distance)
- Among the K samples, find the most frequently occurring class Cm
- Assign the pixel to class Cm

K-NN Classifier

Let ki be the number of samples of class Ci among the K closest samples, i = 1, 2, ..., L (the number of classes). Note that Σi ki = K.

Define the discriminant function gi(x) = ki. The classifier rule is: assign x to class Cm if gm(x) > gi(x) for all i, i ≠ m.
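The counting rule gi(x) = ki can be sketched as follows; the training set is a small hypothetical two-band example.

```python
import numpy as np
from collections import Counter

def knn_classify(x, samples, labels, K):
    """Count class membership k_i among the K nearest training samples
    and return the class with the largest count (g_i(x) = k_i)."""
    dists = np.linalg.norm(samples - x, axis=1)
    nearest = np.argsort(dists)[:K]
    counts = Counter(int(labels[i]) for i in nearest)
    return counts.most_common(1)[0][0]

# Hypothetical training set in a two-band feature space
samples = np.array([[0., 0.], [1., 0.], [0., 1.], [5., 5.], [6., 5.]])
labels  = np.array([0, 0, 0, 1, 1])
```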

K-NN Classifier

The ki samples of class Ci are not all equally close to the feature vector of pixel x. Therefore the discriminant function is refined further as

gi(x) = [ Σ_{j=1..ki} 1 / d(x, xj) ] / [ Σ_{l=1..L} Σ_{j=1..kl} 1 / d(x, xj) ]

so that the distances of the nearest neighbours to the feature vector of the pixel to be classified are taken into account.
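A sketch of the distance-weighted discriminant, where each of the K nearest neighbours votes with weight 1/d. The small eps guard against division by zero for an exact match is an implementation choice, not in the slide; the training set is the same hypothetical example.

```python
import numpy as np

def weighted_knn_scores(x, samples, labels, K, n_classes, eps=1e-9):
    """Return normalized per-class scores g_i(x): sum of 1/d(x, x_j)
    over the class-i members of the K nearest neighbours."""
    dists = np.linalg.norm(samples - x, axis=1)
    nearest = np.argsort(dists)[:K]
    scores = np.zeros(n_classes)
    for i in nearest:
        scores[int(labels[i])] += 1.0 / (dists[i] + eps)
    return scores / scores.sum()

samples = np.array([[0., 0.], [1., 0.], [0., 1.], [5., 5.], [6., 5.]])
labels  = np.array([0, 0, 0, 1, 1])
```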

K-NN Classifier

In addition, prior probabilities can be taken into account:

gi(x) = ki p(Ci) / Σ_{l=1..L} kl p(Cl)

For each pixel to be classified, the feature-space distances to all training pixels must be computed before the decision is made. This makes the procedure extremely computation intensive, and it is not used when the dimensionality (number of bands) of the feature space is large, e.g., with hyperspectral data.

Spectral Angle Mapper

Computing the covariance matrix, its inverse, and the distance (X − m)^T S^(−1) (X − m) for each pixel is highly time consuming, and if the covariance matrix is close to singular, its inverse can be unstable, leading to erroneous results. In such cases, alternate methods can be applied, such as the Spectral Angle Mapper.

S.A.M. Principle

Each class is represented by a vector vi. The angle θ between the class vector vi and the pixel feature vector x is given by

cos θ = (vi · x) / ( |vi| |x| )

For small values of θ, the value of cos θ is large. The likelihood of x belonging to different classes can be ranked according to the value of cos θ.
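The angle test can be sketched as follows; the 3-band reference spectra are illustrative values, not real class signatures.

```python
import numpy as np

def spectral_angle_cos(v, x):
    """cos(theta) = (v . x) / (|v| |x|)."""
    return float(v @ x / (np.linalg.norm(v) * np.linalg.norm(x)))

def sam_classify(x, refs):
    """Assign x to the class whose reference vector makes the smallest angle
    (i.e., the largest cos(theta))."""
    return int(np.argmax([spectral_angle_cos(v, x) for v in refs]))

# Illustrative 3-band reference spectra for two classes
refs = [np.array([1.0, 2.0, 3.0]), np.array([3.0, 2.0, 1.0])]
```

Note that the angle is unchanged when x is scaled, which makes the method insensitive to overall brightness differences.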

S.A.M. Advantage

The spectral angle is not much affected by minor changes in vi or x. It is also much faster than the Mahalanobis distance computation involved in the ML method.

Feature Data

Given training samples for each class in N-band training data, the image is classified after training the classifier using the training data. When the number of bands N is large, how is the classification affected?

Feature Selection/Reduction

- Feature selection: select a subset of the available features based on some criteria
- Feature reduction: reduce the dimensionality by performing some transformation on the input data, e.g., PCT (principal component transform)
- Without such reduction, classification in a high-dimensional feature space requires large training data


Mixed Pixels

A single pixel may contain parts of many landuse classes, e.g., tree, bare soil, grass. The output of the classifier may then be interpreted as proportions of the classes within a pixel.

Mixed Pixels

When pixel spectra can be compared with pure class spectra, as in hyperspectral remote sensing, the proportion of each class within a pixel can be estimated, assuming a particular model for mixing.



Motivation to Consider Feature Selection

For image classification to succeed, each element should have useful features, i.e., features on which the classes are well separable.

Class Separability in Feature Space

(Figure: scatter plots of classes in the Band 1 / Band 2 feature space, illustrating degrees of class separability)

Low Separability (with one feature)

(Figure: class-conditional densities p(X|Ci); each class is represented by a normal distribution, but with close means mi, mj and high variances, so the classes overlap)

Important Features

- Shape features
- Spectral features
- Texture features
- Transform features

Unsupervised Classification

When prior knowledge or the experience of an analyst is missing, the data can still be analyzed by numerical exploration, whereby the data are grouped into subsets or clusters based on statistical similarity.

Unsupervised Classification

Learning proceeds without a teacher. The aim is to understand the structure of the data using statistical methods such as clustering algorithms.

Clustering Algorithms

Each pixel is a point in an L-dimensional feature space, where L is the number of bands (the letter K is reserved for the number of clusters!). Pixels are grouped to form clusters whose members are more similar to each other than to members of another cluster. In other words, clusters possess low intra-cluster variability and high inter-cluster variability.

K-Means

- Iterative algorithm; the number of clusters K is specified by the user
- Most popular clustering algorithm

Steps:

- Initialize K cluster mean vectors randomly
- Assign each pixel to one of the K clusters based on minimum feature distance
- After all pixels are assigned to the K clusters, recompute each cluster mean
- Iterate till the cluster mean vectors stabilize
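The steps above can be sketched as plain K-means; the "pixels" here are two made-up, well-separated blobs in a two-band feature space, and the empty-cluster guard is an implementation choice.

```python
import numpy as np

def kmeans(X, K, iters=100, seed=0):
    """Plain K-means: random init, assign to nearest mean, recompute means,
    iterate until the means stabilize."""
    rng = np.random.default_rng(seed)
    means = X[rng.choice(len(X), K, replace=False)]
    for _ in range(iters):
        # assign each pixel to the nearest cluster mean
        d = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # recompute each cluster mean (keep the old mean if a cluster is empty)
        new_means = np.array([X[labels == k].mean(axis=0) if np.any(labels == k)
                              else means[k] for k in range(K)])
        if np.allclose(new_means, means):   # means have stabilized
            break
        means = new_means
    return labels, means

# Two well-separated blobs as stand-in "pixels"
X = np.array([[0., 0.], [0.2, 0.1], [0.1, 0.3],
              [5., 5.], [5.1, 4.9], [4.9, 5.2]])
```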

ISODATA ALGORITHM

ISODATA stands for Iterative Self-Organizing Data Analysis Technique A (the last A added to make the acronym sound better). It was developed in the 1960s by Ball and Hall. See Tou and Gonzalez's classic Pattern Recognition Principles for an excellent exposition of clustering algorithms.

User-Specified Parameters for ISODATA

ISODATA is a generalization of the K-means algorithm and involves many user-specified parameters:

- Minimum size of cluster
- Maximum size of cluster
- Maximum intra-cluster variance
- Minimum separation between pairs of clusters
- Maximum number of clusters
- Minimum number of clusters
- Maximum number of iterations
