
Algorithmic Complexity

Introduction
Algorithmic complexity is concerned with how fast or slow a particular algorithm performs. We define complexity as a numerical function T(n): time versus the input size n. We want to define the time taken by an algorithm without depending on the implementation details. But you would agree that T(n) does depend on the implementation! A given algorithm will take different amounts of time on the same inputs depending on such factors as processor speed, instruction set, disk speed, and brand of compiler. The way around this is to estimate the efficiency of each algorithm asymptotically. We will measure time T(n) as the number of elementary "steps" (defined in any way), provided each such step takes constant time.

Let us consider a classical example: addition of two integers. We will add two integers digit by digit (or bit by bit), and this will define a "step" in our computational model. Therefore, we say that addition of two n-bit integers takes n steps. Consequently, the total computational time is T(n) = c * n, where c is the time taken by the addition of two bits. On different computers, the addition of two bits might take different amounts of time, say c1 and c2; thus the addition of two n-bit integers takes T(n) = c1 * n and T(n) = c2 * n respectively. This shows that different machines result in different slopes, but the time T(n) grows linearly as the input size increases.
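The step-counting model above can be sketched in code. This is a minimal illustration, not from the original notes; the function name and the explicit step counter are assumptions made for the example.

```python
def add_bits(a_bits, b_bits):
    """Add two equal-length bit lists (least significant bit first).

    Returns (sum_bits, steps); the step count equals n, the input size,
    since each single-bit addition counts as one elementary "step".
    """
    carry = 0
    result = []
    steps = 0
    for x, y in zip(a_bits, b_bits):
        total = x + y + carry          # one elementary bit addition
        result.append(total % 2)
        carry = total // 2
        steps += 1
    if carry:
        result.append(carry)
    return result, steps

# 5 (101) + 3 (011), written least significant bit first:
bits, steps = add_bits([1, 0, 1], [1, 1, 0])
# steps == 3 == n, matching T(n) = c * n in this model
```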

The process of abstracting away details and determining the rate of resource usage in terms of the input size is one of the fundamental ideas in computer science.

Asymptotic Notations

The goal of computational complexity is to classify algorithms according to their performance. We will represent the time function T(n) using the "big O" notation to express an algorithm's runtime complexity. For example, the following statement

T(n) = O(n^2)

says that an algorithm has a quadratic time complexity.

Definition of "big Oh"

For any monotonic functions f(n) and g(n) from the positive integers to the positive integers, we say that f(n) = O(g(n)) when there exist constants c > 0 and n0 > 0 such that

f(n) ≤ c * g(n), for all n ≥ n0

Intuitively, this means that the function f(n) does not grow faster than g(n), or that the function g(n) is an upper bound for f(n), for all sufficiently large n.

Here is a graphic representation of the f(n) = O(g(n)) relation:

Examples:

1 = O(n)
n = O(n^2)
log(n) = O(n)
2n + 1 = O(n)

The "big O" notation is not symmetric: n = O(n^2) but n^2 ≠ O(n).

Exercise. Let us prove n^2 + 2n + 1 = O(n^2). We must find c and n0 such that n^2 + 2n + 1 ≤ c * n^2. Let n0 = 1; then for n ≥ 1

1 + 2n + n^2 ≤ n + 2n + n^2 ≤ n^2 + 2n^2 + n^2 = 4n^2

Therefore, c = 4.
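The bound just derived can be checked numerically; a minimal sketch (the range tested is an arbitrary choice for illustration):

```python
# Verify the exercise's bound: n^2 + 2n + 1 <= 4*n^2 for all n >= n0 = 1.
for n in range(1, 1000):
    assert n**2 + 2*n + 1 <= 4 * n**2
```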

Constant Time: O(1)

An algorithm is said to run in constant time if it requires the same amount of time regardless of the input size. Examples:

array: accessing any element
fixed-size stack: push and pop methods
fixed-size queue: enqueue and dequeue methods
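A fixed-size stack, sketched below, makes the constant-time claim concrete: push and pop each do a fixed amount of work no matter how large the stack's capacity is. The class and its names are illustrative, not from the notes.

```python
class FixedStack:
    """Hypothetical fixed-size stack; push and pop are O(1)."""

    def __init__(self, capacity):
        self.data = [None] * capacity
        self.top = 0

    def push(self, x):
        # O(1): one array write and one increment, independent of capacity
        self.data[self.top] = x
        self.top += 1

    def pop(self):
        # O(1): one decrement and one array read
        self.top -= 1
        return self.data[self.top]

s = FixedStack(4)
s.push(10)
s.push(20)
# s.pop() returns 20; indexing an array element a[i] is likewise O(1)
```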

Linear Time: O(n)

An algorithm is said to run in linear time if its execution time is directly proportional to the input size, i.e. time grows linearly as the input size increases. Examples:

array: linear search, traversing, finding the minimum
ArrayList: contains method
queue: contains method
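Two of the array examples sketched in code; each touches every element at most once, so T(n) = c * n. The function names are illustrative.

```python
def linear_search(a, target):
    """Return the index of target in a, or -1; up to n comparisons."""
    for i, x in enumerate(a):
        if x == target:
            return i
    return -1

def find_minimum(a):
    """Return the smallest element of a non-empty list; n - 1 comparisons."""
    m = a[0]
    for x in a[1:]:
        if x < m:
            m = x
    return m

# linear_search([7, 3, 9], 9) returns 2; find_minimum([7, 3, 9]) returns 3
```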

Logarithmic Time: O(log n)

An algorithm is said to run in logarithmic time if its execution time is proportional to the logarithm of the input size. Example:

binary search

Recall the "twenty questions" game: the task is to guess the value of a hidden number in an interval. Each time you make a guess, you are told whether your guess is too high or too low. The twenty questions game implies a strategy that uses your guess to halve the interval size. This is an example of the general problem-solving method known as binary search:

locate the element a in a sorted (ascending order) array by first comparing a with the middle element and then (if they are not equal) dividing the array into two subarrays; if a is less than the middle element, you repeat the whole procedure in the left subarray, otherwise in the right subarray. The procedure repeats until a is found or the subarray has size zero.
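The halving strategy described above can be written as an iterative binary search on a sorted array; this sketch returns the index of a, or -1 if a is absent.

```python
def binary_search(arr, a):
    """Find a in the ascending-sorted list arr; O(log n) comparisons."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == a:
            return mid
        elif a < arr[mid]:
            hi = mid - 1        # repeat in the left subarray
        else:
            lo = mid + 1        # repeat in the right subarray
    return -1                   # subarray shrank to size zero

# Each iteration halves the interval, so at most about log2(n) iterations run.
```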

Note that log(n) < n as n → ∞. Algorithms that run in O(log n) do not use the whole input.

Quadratic Time: O(n^2)

An algorithm is said to run in quadratic time if its execution time is proportional to the square of the input size. Examples:

bubble sort, selection sort, insertion sort
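Bubble sort, the first of these examples, sketched below: the nested loops perform on the order of n^2 comparisons, which is where the quadratic bound comes from.

```python
def bubble_sort(a):
    """Return a sorted copy of a; roughly n^2 / 2 comparisons in total."""
    a = list(a)
    n = len(a)
    for i in range(n - 1):
        for j in range(n - 1 - i):      # inner pass: compare neighbors
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

# bubble_sort([5, 1, 4, 2]) returns [1, 2, 4, 5]
```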

Definition of "big Omega"

We need notation for a lower bound. A capital omega (Ω) notation is used in this case. We say that f(n) = Ω(g(n)) when there exists a constant c > 0 such that f(n) ≥ c * g(n) for all sufficiently large n. Examples:

n = Ω(1)
n^2 = Ω(n)
n^2 = Ω(n log(n))
2n + 1 = Ω(n)

Definition of "big Theta"

To measure the complexity of a particular algorithm means to find the upper and lower bounds. A new notation, Θ, is used in this case. We say that f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)). Examples:

2n = Θ(n)
n^2 + 2n + 1 = Θ(n^2)

Analysis of Algorithms

The term analysis of algorithms is used to describe approaches to the study of the performance of algorithms. In this course we will perform the following types of analysis:

the worst-case runtime complexity of the algorithm is the function defined by the maximum number of steps taken on any instance of size n.
the best-case runtime complexity of the algorithm is the function defined by the minimum number of steps taken on any instance of size n.
the average-case runtime complexity of the algorithm is the function defined by the average number of steps taken on any instance of size n.
the amortized runtime complexity of the algorithm is the function defined by a sequence of operations applied to the input of size n and averaged over time.
Example. Let us consider an algorithm of sequential searching in an array of size n.

Its worst-case runtime complexity is O(n)
Its best-case runtime complexity is O(1)
Its average-case runtime complexity is O(n/2) = O(n)
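Counting comparisons in sequential search makes the three cases concrete; the explicit step counter below is an instrumentation added for illustration, not part of the algorithm itself.

```python
def sequential_search_steps(a, target):
    """Return the number of comparisons sequential search makes on a."""
    steps = 0
    for x in a:
        steps += 1
        if x == target:
            return steps
    return steps

a = [4, 8, 15, 16, 23]
best = sequential_search_steps(a, 4)     # target first: 1 step   -> O(1)
worst = sequential_search_steps(a, 99)   # target absent: n steps -> O(n)
# averaged over all positions, (n + 1) / 2 steps -> O(n/2) = O(n)
```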

Amortized Time Complexity

Consider a dynamic array stack. In this model push() will double the array size if there is not enough space. Since copying arrays cannot be performed in constant time, we say that push also cannot be done in constant time. In this section, we will show that push() takes amortized constant time.

Let us count the number of copy operations needed to do a sequence of pushes.

push()  copy  old array size  new array size
1       0     1
2       1     1               2
3       2     2               4
4       0     4
5       4     4               8
6       0     8
7       0     8
8       0     8
9       8     8               16

We see that 3 pushes require 2 + 1 = 3 copies.

We see that 5 pushes require 4 + 2 + 1 = 7 copies.

We see that 9 pushes require 8 + 4 + 2 + 1 = 15 copies.

In general, 2^n + 1 pushes require 2^n + 2^(n-1) + ... + 2 + 1 = 2^(n+1) - 1 copies.

Asymptotically speaking, the number of copies is about the same as the number of pushes:

lim (n → ∞) of (2^(n+1) - 1) / (2^n + 1) = 2 = O(1)

We say that the algorithm runs in amortized constant time.
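A doubling dynamic-array stack that tallies its copy operations reproduces the table above; the class and its counter names are illustrative, not from the notes.

```python
class DynArray:
    """Hypothetical doubling stack; counts element copies across resizes."""

    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.data = [None]
        self.copies = 0

    def push(self, x):
        if self.size == self.capacity:
            # Full: double the capacity, copying all current elements.
            self.copies += self.size
            self.capacity *= 2
            self.data = self.data + [None] * (self.capacity - self.size)
        self.data[self.size] = x
        self.size += 1

d = DynArray()
for i in range(9):
    d.push(i)
# 9 pushes require 1 + 2 + 4 + 8 = 15 copies, matching the table;
# copies / pushes stays below 2, i.e. amortized O(1) per push
```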

Victor S. Adamchik, CMU, 2009
