
4/19/2015
Least mean squares filter - Wikipedia, the free encyclopedia

Least mean squares filter
From Wikipedia, the free encyclopedia

Least mean squares (LMS) algorithms are a class of adaptive filter used to mimic a desired filter by finding the filter coefficients that produce the least mean square of the error signal (the difference between the desired and the actual signal). It is a stochastic gradient descent method in that the filter is only adapted based on the error at the current time. It was invented in 1960 by Stanford University professor Bernard Widrow and his first Ph.D. student, Ted Hoff.

Contents
1 Problem formulation
1.1 Relationship to the least mean squares filter
1.2 Definition of symbols
2 Idea
3 Derivation
4 Simplifications
5 LMS algorithm summary
6 Convergence and stability in the mean
7 Normalised least mean squares filter (NLMS)
7.1 Optimal learning rate
7.2 Proof
8 See also
9 References
10 External links

Problem formulation

Relationship to the least mean squares filter

The realization of the causal Wiener filter looks a lot like the solution to the least squares estimate, except in the signal processing domain. The least squares solution, for input matrix X and output vector y, is

    β̂ = (Xᵀ X)⁻¹ Xᵀ y
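As a small numerical sketch (the matrix X and vector y here are made-up illustrative data, not from the article), the least squares solution above can be computed directly:

```python
import numpy as np

# Hypothetical input matrix X (rows = observations) and output vector y.
# Here y happens to equal 1 + x, so the fit recovers intercept 1, slope 1.
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
y = np.array([1.0, 2.0, 3.0])

# beta = (X^T X)^{-1} X^T y, computed via the numerically safer lstsq
# rather than an explicit matrix inverse.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)
```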

The FIR least mean squares filter is related to the Wiener filter, but minimizing the error criterion of the former does not rely on cross-correlations or autocorrelations. Its solution converges to the Wiener filter solution. Most linear adaptive filtering problems can be formulated using the block diagram above. That is, an unknown system h(n) is to be identified, and the adaptive filter attempts to adapt the filter ĥ(n) to make it as close as possible to h(n), while using only the observable signals x(n), d(n) and e(n); y(n), v(n) and h(n) are not directly observable. Its solution is closely related to the Wiener filter.

Definition of symbols

    n is the number of the current input sample
    p is the number of filter taps
    {·}ᴴ denotes the Hermitian transpose (conjugate transpose)
    x(n) = [x(n), x(n−1), …, x(n−p+1)]ᵀ
    h(n) = [h₀(n), h₁(n), …, h_{p−1}(n)]ᵀ is the impulse response of the unknown system
    y(n) = hᴴ(n) x(n)
    d(n) = y(n) + v(n), where v(n) is additive noise
    ĥ(n) is the estimated filter; interpret it as the estimate of the filter coefficients after n samples
    e(n) = d(n) − ŷ(n) = d(n) − ĥᴴ(n) x(n)

Idea

The basic idea behind the LMS filter is to approach the optimum filter weights by updating the filter weights in a manner that converges to the optimum. The algorithm starts by assuming small weights (zero in most cases), and at each step the weights are updated by finding the gradient of the mean square error. That is, if the MSE gradient is positive, the error would keep increasing if the same weight is used for further iterations, which means we need to reduce the weights. In the same way, if the gradient is negative, we need to increase the weights. So the basic weight update equation is

    W_{n+1} = W_n − μ∇ε[n],

where ε represents the mean square error. The negative sign indicates that we need to change the weights in the direction opposite to the gradient slope.

The mean square error, as a function of the filter weights, is a quadratic function, which means it has only one extremum, which minimises the mean square error; that is the optimal weight. The LMS thus approaches this optimal weight by descending along the mean-square-error vs. filter-weight curve.
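The descent described above can be sketched numerically for a single-tap filter (a minimal illustration under assumed data; the true weight w_true = 0.7 and the signals are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
w_true = 0.7
d = w_true * x  # desired signal from a hypothetical one-tap system

# MSE as a function of the weight: J(w) = E[(d - w x)^2],
# with gradient dJ/dw = -2 E[(d - w x) x].
w, mu = 0.0, 0.1
for _ in range(100):
    grad = -2 * np.mean((d - w * x) * x)
    w = w - mu * grad  # step against the gradient slope

print(w)  # converges towards w_true = 0.7
```

Because the MSE surface is quadratic, each step shrinks the distance to the optimum by a constant factor, which is the geometric convergence the section describes.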

Derivation

The idea behind LMS filters is to use steepest descent to find filter weights ĥ(n) which minimize a cost function. We start by defining the cost function as

    C(n) = E{|e(n)|²},

where e(n) = d(n) − ĥᴴ(n) x(n) is the error at the current sample n and E{·} denotes the expected value.

This cost function (C(n)) is the mean square error, and it is minimized by the LMS. This is where the LMS gets its name. Applying steepest descent means taking the partial derivatives with respect to the individual entries of the filter coefficient (weight) vector:

    ∇C(n) = ∇E{e(n) e*(n)} = 2 E{∇(e(n)) e*(n)},

where ∇ is the gradient operator. Since ∇(e(n)) = −x(n), it follows that

    ∇C(n) = −2 E{x(n) e*(n)}.


Now, ∇C(n) is a vector which points towards the steepest ascent of the cost function. To find the minimum of the cost function we need to take a step in the opposite direction of ∇C(n). To express that in mathematical terms:

    ĥ(n+1) = ĥ(n) − μ/2 ∇C(n) = ĥ(n) + μ E{x(n) e*(n)},

where μ/2 is the step size (adaptation constant). That means we have found a sequential update algorithm which minimizes the cost function. Unfortunately, this algorithm is not realizable until we know E{x(n) e*(n)}.

Generally, the expectation above is not computed. Instead, to run the LMS in an online (updating after each new sample is received) environment, we use an instantaneous estimate of that expectation. See below.

Simplifications

For most systems the expectation E{x(n) e*(n)} must be approximated. This can be done with the following unbiased estimator

    Ê{x(n) e*(n)} = (1/N) Σ_{i=0}^{N−1} x(n−i) e*(n−i),

where N indicates the number of samples we use for that estimate. The simplest case is N = 1:

    Ê{x(n) e*(n)} = x(n) e*(n).

For that simple case the update algorithm follows as

    ĥ(n+1) = ĥ(n) + μ x(n) e*(n).

Indeed, this constitutes the update algorithm for the LMS filter.

LMS algorithm summary

The LMS algorithm for a p-th order filter can be summarized as

    Parameters:     p = filter order
                    μ = step size
    Initialisation: ĥ(0) = zeros(p)
    Computation:    For n = 0, 1, 2, ...
                        x(n) = [x(n), x(n−1), …, x(n−p+1)]ᵀ
                        e(n) = d(n) − ĥᴴ(n) x(n)
                        ĥ(n+1) = ĥ(n) + μ e*(n) x(n)
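The summary above translates almost line for line into code. The following is a sketch for real-valued signals (so the conjugates drop out); the 4-tap "unknown system" h_true and all signals are invented for the demonstration:

```python
import numpy as np

def lms_filter(x, d, p, mu):
    """LMS adaptive filter with p taps and step size mu (real signals).
    Returns the final weight estimate and the error signal."""
    h = np.zeros(p)                      # initialisation: h(0) = 0
    e = np.zeros(len(x))
    for n in range(p, len(x)):
        xn = x[n - p + 1:n + 1][::-1]    # x(n) = [x(n), ..., x(n-p+1)]^T
        e[n] = d[n] - h @ xn             # e(n) = d(n) - h^H(n) x(n)
        h = h + mu * e[n] * xn           # h(n+1) = h(n) + mu e*(n) x(n)
    return h, e

# System identification: recover a hypothetical 4-tap unknown system
# from its (noiseless, for simplicity) output.
rng = np.random.default_rng(1)
x = rng.standard_normal(5000)
h_true = np.array([0.5, -0.3, 0.2, 0.1])
d = np.convolve(x, h_true)[:len(x)]      # desired signal d(n)

h_est, e = lms_filter(x, d, p=4, mu=0.05)
print(h_est)  # close to h_true
```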

Convergence and stability in the mean

As the LMS algorithm does not use the exact values of the expectations, the weights would never reach the optimal weights in the absolute sense, but convergence in the mean is possible. That is, even though the weights may change by small amounts, they change about the optimal weights. However, if the variance with which the weights change is large, convergence in the mean would be misleading. This problem may occur if the value of the step size μ is not chosen properly.

If μ is chosen to be large, the amount by which the weights change depends heavily on the gradient estimate, and so the weights may change by a large value, so that a gradient which was negative at the first instant may now become positive. At the second instant, the weight may change in the opposite direction by a large amount because of the negative gradient, and would thus keep oscillating with a large variance about the optimal weights. On the other hand, if μ is chosen to be too small, the time to converge to the optimal weights will be too large.

Thus, an upper bound on μ is needed, which is given as

    0 < μ < 2/λ_max,

where λ_max is the greatest eigenvalue of the autocorrelation matrix R = E{x(n) xᴴ(n)}. If this condition is not fulfilled, the algorithm becomes unstable and ĥ(n) diverges.

Maximum convergence speed is achieved when

    μ = 2/(λ_max + λ_min),

where λ_min is the smallest eigenvalue of R. Given that μ is less than or equal to this optimum, the convergence speed is determined by λ_min μ, with a larger value yielding faster convergence. This means that faster convergence can be achieved when λ_max is close to λ_min; that is, the maximum achievable convergence speed depends on the eigenvalue spread of R.

A white noise signal has autocorrelation matrix R = σ²I, where σ² is the variance of the signal. In this case all eigenvalues are equal, and the eigenvalue spread is the minimum over all possible matrices. The common interpretation of this result is therefore that the LMS converges quickly for white input signals, and slowly for colored input signals, such as processes with lowpass or highpass characteristics.
It is important to note that the above upper bound on μ only enforces stability in the mean; the coefficients of ĥ(n) can still grow infinitely large, i.e. divergence of the coefficients is still possible. A more practical bound is

    0 < μ < 2/tr[R],

where tr[R] denotes the trace of R. This bound guarantees that the coefficients of ĥ(n) do not diverge (in practice, the value of μ should not be chosen close to this upper bound, since it is somewhat optimistic due to approximations and assumptions made in the derivation of the bound).
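Both bounds, and the eigenvalue-spread effect for colored inputs, can be checked empirically. The sketch below estimates R from sample autocorrelations for a white signal and for an assumed "colored" one (a simple moving average of the white signal, chosen only as an example of a lowpass process):

```python
import numpy as np

def autocorr_matrix(x, p):
    """Sample autocorrelation matrix R = E[x(n) x(n)^T] for p taps,
    built as a Toeplitz matrix from the sample autocorrelation r(k)."""
    r = np.array([np.mean(x[:len(x) - k] * x[k:]) for k in range(p)])
    return np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])

rng = np.random.default_rng(2)
white = rng.standard_normal(50_000)
colored = np.convolve(white, np.ones(4) / 4, mode="same")  # lowpass example

results = {}
for name, sig in [("white", white), ("colored", colored)]:
    R = autocorr_matrix(sig, p=8)
    lam = np.linalg.eigvalsh(R)
    results[name] = (lam.max() / lam.min(),  # eigenvalue spread of R
                     2 / lam.max(),          # stability-in-the-mean bound
                     2 / np.trace(R))        # tighter, more practical bound
    print(name, results[name])
```

The trace bound is always the smaller of the two (tr[R] is the sum of all eigenvalues), and the colored input shows a much larger eigenvalue spread, matching the slow-convergence claim above.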

Normalised least mean squares filter (NLMS)

The main drawback of the "pure" LMS algorithm is that it is sensitive to the scaling of its input x(n). This makes it very hard (if not impossible) to choose a learning rate μ that guarantees stability of the algorithm (Haykin 2002). The normalised least mean squares (NLMS) filter is a variant of the LMS algorithm that solves this problem by normalising with the power of the input. The NLMS algorithm can be summarised as:

    Parameters:     p = filter order
                    μ = step size
    Initialization: ĥ(0) = zeros(p)
    Computation:    For n = 0, 1, 2, ...
                        x(n) = [x(n), x(n−1), …, x(n−p+1)]ᵀ
                        e(n) = d(n) − ĥᴴ(n) x(n)
                        ĥ(n+1) = ĥ(n) + μ e*(n) x(n) / (xᴴ(n) x(n))

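A sketch of the normalised update for real signals follows; the small regulariser eps is an implementation convention (not part of the summary above) that avoids division by zero for silent inputs, and the test system h_true is invented for the example:

```python
import numpy as np

def nlms_filter(x, d, p, mu, eps=1e-8):
    """NLMS: like LMS, but the update is divided by the instantaneous
    input power x^T x, making the step size insensitive to input scaling."""
    h = np.zeros(p)
    e = np.zeros(len(x))
    for n in range(p, len(x)):
        xn = x[n - p + 1:n + 1][::-1]
        e[n] = d[n] - h @ xn
        h = h + mu * e[n] * xn / (xn @ xn + eps)  # normalised update
    return h, e

rng = np.random.default_rng(3)
h_true = np.array([0.5, -0.3, 0.2, 0.1])
# The same mu works for wildly different input scales; an unnormalised
# LMS with fixed mu would diverge at the larger scale.
for scale in (1.0, 100.0):
    x = scale * rng.standard_normal(5000)
    d = np.convolve(x, h_true)[:len(x)]
    h_est, _ = nlms_filter(x, d, p=4, mu=0.5)
    print(scale, h_est)
```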
Optimal learning rate

It can be shown that if there is no interference (v(n) = 0), then the optimal learning rate for the NLMS algorithm is

    μ_opt = 1

and is independent of the input x(n) and the real (unknown) impulse response h(n). In the general case with interference (v(n) ≠ 0), the optimal learning rate is

    μ_opt = E[|y(n) − ŷ(n)|²] / E[|e(n)|²].

The results above assume that the signals v(n) and x(n) are uncorrelated to each other, which is generally the case in practice.
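The ratio defining μ_opt can be estimated by simulation. In this sketch the system h, the current estimate ĥ, and the noise level are all invented for illustration; the point is only that the ratio lies strictly between 0 and 1 whenever interference is present, and approaches 1 as the noise power shrinks:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
x = rng.standard_normal((n, 4))           # white input, 4 taps per sample
h = np.array([0.5, -0.3, 0.2, 0.1])       # true (unknown) system
h_hat = h + 0.1 * rng.standard_normal(4)  # current, imperfect estimate
v = 0.1 * rng.standard_normal(n)          # interference / measurement noise

y = x @ h          # uncorrupted system output y(n)
y_hat = x @ h_hat  # adaptive filter output yhat(n)
e = y + v - y_hat  # error e(n) = d(n) - yhat(n), with d = y + v

# mu_opt = E[|y - yhat|^2] / E[|e|^2]; with v = 0 this ratio is exactly 1.
mu_opt = np.mean((y - y_hat) ** 2) / np.mean(e ** 2)
print(mu_opt)
```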

Proof

Let the filter misalignment be defined as Λ(n) = |h(n) − ĥ(n)|². We can derive the expected misalignment for the next sample as

    E[Λ(n+1)] = E[|ĥ(n) + μ e*(n) x(n)/(xᴴ(n) x(n)) − h(n)|²].

Let δ(n) = ĥ(n) − h(n) and r(n) = ŷ(n) − y(n), so that e(n) = v(n) − r(n). Assuming independence of v(n) and x(n), we have

    E[Λ(n+1)] = Λ(n) + μ² (E[|r(n)|²] + E[|v(n)|²]) / E[xᴴ(n) x(n)] − 2μ E[|r(n)|²] / E[xᴴ(n) x(n)].

The optimal learning rate is found at ∂E[Λ(n+1)]/∂μ = 0, which leads to

    μ_opt = E[|r(n)|²] / (E[|r(n)|²] + E[|v(n)|²]) = E[|y(n) − ŷ(n)|²] / E[|e(n)|²].

See also

Recursive least squares
For statistical techniques relevant to LMS filter see Least squares
Similarities between Wiener and LMS

Multidelay block frequency domain adaptive filter
Zero-forcing equalizer
Kernel adaptive filter

References

Monson H. Hayes: Statistical Digital Signal Processing and Modeling, Wiley, 1996, ISBN 0-471-59431-8
Simon Haykin: Adaptive Filter Theory, Prentice Hall, 2002, ISBN 0-13-048434-2
Simon S. Haykin, Bernard Widrow (Editor): Least-Mean-Square Adaptive Filters, Wiley, 2003, ISBN 0-471-21570-8
Bernard Widrow, Samuel D. Stearns: Adaptive Signal Processing, Prentice Hall, 1985, ISBN 0-13-004029-0
Weifeng Liu, Jose Principe and Simon Haykin: Kernel Adaptive Filtering: A Comprehensive Introduction, John Wiley, 2010, ISBN 0-470-44753-2
Paulo S. R. Diniz: Adaptive Filtering: Algorithms and Practical Implementation, Kluwer Academic Publishers, 1997, ISBN 0-7923-9912-9

External links

LMS Algorithm in Adaptive Antenna Arrays (http://www.antennatheory.com/arrays/weights/lms.php) www.antennatheory.com
LMS Noise cancellation demo (http://www.advsolned.com/example_ale_nc.html) www.advsolned.com

Retrieved from "http://en.wikipedia.org/w/index.php?title=Least_mean_squares_filter&oldid=653139450"
Categories: Digital signal processing | Filter theory | Stochastic algorithms
This page was last modified on 23 March 2015, at 10:43.

