
The CIS121 Emergency Ohgodhelpme

1. Insertion Sort
(a) O(n^2)
2. Merge Sort
(a) O(n log n) (sketch below)
(b) Ω(n log n) is the best possible running time for any comparison-based sort
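A minimal merge sort sketch in Python (function and variable names are my own), showing the split/recurse/merge structure behind the O(n log n) bound:

def merge_sort(a):
    """Sort a list in O(n log n) by splitting, recursing, and merging."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # Merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:]); merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 3]))  # [1, 2, 3, 5, 9]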
3. Binary Search
(a) O(log n)
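A small binary search sketch (assumes a sorted list; names are mine); halving the range each step gives O(log n):

def binary_search(a, target):
    """Return an index of target in sorted list a, or -1; O(log n)."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1        # target can only be in the right half
        else:
            hi = mid - 1        # target can only be in the left half
    return -1

print(binary_search([1, 2, 3, 5, 9], 5))  # 3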
4. Recurrence Relations
(a) Master's Theorem
T(n) = aT(n/b) + Θ(n^k)
i. Case 1: if a > b^k, then T(n) = Θ(n^(log_b a))
ii. Case 2: if a = b^k, then T(n) = Θ(n^k log n)
iii. Case 3: if a < b^k, then T(n) = Θ(n^k)
(b) Example
T(n) = 2T(n/2) + n log n
T(n) = 2[2T(n/2^2) + (n/2) log(n/2)] + n log n
T(n) = 2^2 T(n/2^2) + n log(n/2) + n log n
T(n) = 2^2 [2T(n/2^3) + (n/2^2) log(n/2^2)] + n log(n/2) + n log n
T(n) = 2^3 T(n/2^3) + n log(n/2^2) + n log(n/2) + n log n
T(n) = 2^k T(n/2^k) + n Σ_{i=0}^{k-1} log(n/2^i)
Bottoms out when n/2^k ≈ 1, i.e., k ≈ log n.
T(n) = 2^(log n) T(1) + n Σ_{i=0}^{log n - 1} (log n - i log 2)
T(n) ≤ c k n log n
T(n) = O(n log^2 n)
5. Maximum Contiguous Subsequence
(a) Divide and conquer: O(n log n) (sketch below)
(b) Lemma: Given a smaller problem, I can do it for free
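A hedged sketch of the divide-and-conquer approach: solve each half recursively, then combine with a linear scan for the best subsequence crossing the middle (names and the exact combine code are my own, not necessarily the lecture's version):

def max_subarray(a, lo=0, hi=None):
    """Max contiguous-subsequence sum by divide and conquer; O(n log n)."""
    if hi is None:
        hi = len(a) - 1
    if lo == hi:
        return a[lo]
    mid = (lo + hi) // 2
    # Best sum that crosses the midpoint: extend left and right from mid.
    best_left, total = a[mid], 0
    for i in range(mid, lo - 1, -1):
        total += a[i]
        best_left = max(best_left, total)
    best_right, total = a[mid + 1], 0
    for i in range(mid + 1, hi + 1):
        total += a[i]
        best_right = max(best_right, total)
    crossing = best_left + best_right
    return max(max_subarray(a, lo, mid), max_subarray(a, mid + 1, hi), crossing)

print(max_subarray([-2, 1, -3, 4, -1, 2, 1, -5, 4]))  # 6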
6. Stacks (Linked List)
(a) Push : O(1)
(b) Pop : O(1)
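A minimal linked-list stack sketch (names mine); pushing and popping at the head makes both O(1):

class Node:
    def __init__(self, value, next=None):
        self.value, self.next = value, next

class Stack:
    """Stack backed by a singly linked list; push and pop are O(1)."""
    def __init__(self):
        self.head = None
    def push(self, value):
        self.head = Node(value, self.head)   # new node becomes the head
    def pop(self):
        if self.head is None:
            raise IndexError("pop from empty stack")
        value, self.head = self.head.value, self.head.next
        return value

s = Stack(); s.push(1); s.push(2)
print(s.pop(), s.pop())  # 2 1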
7. Queues (Array)
(a) Enqueue : O(1)
(b) Dequeue : O(1)
(c) Resize when needed (sketch below); memory is cheap, but growth does not have to be exponential
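A circular-array queue sketch (names mine). It doubles the array when full, which is one common resizing policy and an assumption here, not necessarily the growth rule the note above has in mind:

class ArrayQueue:
    """Circular-array queue; enqueue and dequeue are O(1) amortized."""
    def __init__(self, capacity=4):
        self.data = [None] * capacity
        self.head = 0       # index of the front element
        self.size = 0
    def enqueue(self, value):
        if self.size == len(self.data):
            self._resize(2 * len(self.data))   # assumed doubling policy
        self.data[(self.head + self.size) % len(self.data)] = value
        self.size += 1
    def dequeue(self):
        if self.size == 0:
            raise IndexError("dequeue from empty queue")
        value = self.data[self.head]
        self.head = (self.head + 1) % len(self.data)
        self.size -= 1
        return value
    def _resize(self, capacity):
        # Copy elements into a fresh array, starting the front at index 0.
        old, self.data = self.data, [None] * capacity
        for i in range(self.size):
            self.data[i] = old[(self.head + i) % len(old)]
        self.head = 0

q = ArrayQueue(); q.enqueue('a'); q.enqueue('b')
print(q.dequeue(), q.dequeue())  # a b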
8. Binary Heaps
(a) Back a Priority Queue
(b) Complete binary tree - must be filled on bottommost level from left
(c) MaxHeapify(): Θ(log n)
(d) RemoveMax(): O(log n)
(e) IncreaseKey(): O(log n)
(f) Insert(): O(log n)
(g) BuildHeap(): Θ(n)
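A compact array-backed max-heap sketch (0-indexed, children at 2i+1 and 2i+2; names mine) covering MaxHeapify, Insert, RemoveMax, and the Θ(n) BuildHeap loop:

class MaxHeap:
    """Array-backed max-heap."""
    def __init__(self, items=()):
        self.a = list(items)
        # BuildHeap: sift down from the last internal node; Θ(n) overall.
        for i in range(len(self.a) // 2 - 1, -1, -1):
            self._max_heapify(i)
    def _max_heapify(self, i):
        # Sift the value at i down until the heap property holds; O(log n).
        n = len(self.a)
        while True:
            largest, l, r = i, 2 * i + 1, 2 * i + 2
            if l < n and self.a[l] > self.a[largest]:
                largest = l
            if r < n and self.a[r] > self.a[largest]:
                largest = r
            if largest == i:
                return
            self.a[i], self.a[largest] = self.a[largest], self.a[i]
            i = largest
    def insert(self, value):
        # Append at the bottom, then sift up; O(log n).
        self.a.append(value)
        i = len(self.a) - 1
        while i > 0 and self.a[(i - 1) // 2] < self.a[i]:
            self.a[i], self.a[(i - 1) // 2] = self.a[(i - 1) // 2], self.a[i]
            i = (i - 1) // 2
    def remove_max(self):
        # Swap the root with the last element, pop it, sift down; O(log n).
        self.a[0], self.a[-1] = self.a[-1], self.a[0]
        top = self.a.pop()
        if self.a:
            self._max_heapify(0)
        return top

h = MaxHeap([3, 9, 2, 7])
print(h.remove_max(), h.remove_max())  # 9 7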
9. Breadth-first search
(a) Visit all neighbors before moving on to the next node
(b) O(n + m)
(c) Lemma: In a BFS tree, if an edge exists between two nodes, then they must be within the
same layer or within one layer of each other
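A BFS sketch on an adjacency-list graph (dict of lists, my own representation choice), returning each reachable vertex's layer:

from collections import deque

def bfs(adj, s):
    """Breadth-first search from s; O(n + m).
    Returns each reachable vertex's layer (distance in edges from s)."""
    layer = {s: 0}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for v in adj[u]:            # visit all neighbors of u
            if v not in layer:      # before moving on to the next vertex
                layer[v] = layer[u] + 1
                queue.append(v)
    return layer

adj = {1: [2, 3], 2: [1, 4], 3: [1, 4], 4: [2, 3]}
print(bfs(adj, 1))  # {1: 0, 2: 1, 3: 1, 4: 2}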
10. Adjacency Matrix
(a) Space: Θ(n^2)
(b) Edge Exists: O(1)
(c) Visit neighbors from one node: O(n)
11. Adjacency List
(a) Space: O(n + m)
(b) Edge Exists: O(deg(u))
(c) Visit all neighbors: Θ(deg(u))
12. Depth-first search
(a) Go as deep as you can, and then work your way back up until everything has been explored.
(b) O(n + m)
(c) Parenthesis Theorem: The discovery and finish times of a vertex v must either lie within the
discovery and finish times of vertex u or completely outside of d(u) and f(u)
(d) Whitepath Theorem: At d(u), if v is a descendant of u, then there exists a "whitepath" from
u to v
(e) Consider an edge e = (u, v). When e is first explored at vertex u, what is v's color?
i. White: e is a tree edge
ii. Gray: e is a back edge
iii. Black: e is a forward or cross edge
(f) Lemma: Suppose G is undirected. Then all edges are either tree edges or back edges.
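A DFS sketch (names mine) that records discovery/finish times and white/gray/black colors, so the parenthesis theorem and the edge classification above can be read off directly:

def dfs(adj):
    """DFS over all vertices, recording discovery/finish times; O(n + m).
    Colors: white = unvisited, gray = on the stack, black = finished."""
    color = {u: "white" for u in adj}
    d, f, time = {}, {}, [0]

    def visit(u):
        time[0] += 1; d[u] = time[0]; color[u] = "gray"
        for v in adj[u]:
            if color[v] == "white":   # (u, v) is a tree edge
                visit(v)
            # gray -> back edge; black -> forward or cross edge
        color[u] = "black"
        time[0] += 1; f[u] = time[0]

    for u in adj:
        if color[u] == "white":
            visit(u)
    return d, f

adj = {"a": ["b", "c"], "b": ["d"], "c": [], "d": []}
print(dfs(adj))  # nested [d(v), f(v)] intervals, per the parenthesis theorem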
13. Shortest paths
(a) Dijkstra's Algorithm
i. Takes into account the overall distance from the source, not just the next edge, when building the shortest paths
ii. Does not work with negative edge weights
iii. O((n + m) log n) with min-heaps (sketch below)
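A Dijkstra sketch using Python's heapq as the min-heap, with lazy deletion of stale entries (graph format and names are my own):

import heapq

def dijkstra(adj, s):
    """Dijkstra with a binary min-heap; O((n + m) log n).
    adj maps u -> list of (v, weight); weights must be non-negative."""
    dist = {s: 0}
    heap = [(0, s)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry, skip it
        for v, w in adj[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

adj = {"s": [("a", 2), ("b", 5)], "a": [("b", 1)], "b": []}
print(dijkstra(adj, "s"))  # {'s': 0, 'a': 2, 'b': 3}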
14. Minimum Spanning Trees
(a) Prim's Algorithm
i. Repeatedly grow the tree by one edge, choosing the edge with the least edge weight
ii. Negative edge weights okay
iii. O(E log V )
(b) Kruskal's Algorithm
i. Sort edges in increasing order of weights, and then begin adding edges as long as a cycle
is not formed
ii. O(E log V )
(c) Reverse Delete Algorithm
i. Sort edges in decreasing order of weights, and then begin removing edges as long as the
graph remains connected
ii. O(E log V (log log V)^3)
(d) Lemma: If an edge is the minimum cost edge that crosses the cut, then that edge belongs to
every MST
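A lazy version of Prim's algorithm with a min-heap, as one illustration of the MST algorithms above (graph format and names are mine; Kruskal would instead sort edges and use union-find, sketched under item 17):

import heapq

def prim(adj, start):
    """Lazy Prim's algorithm; O(E log V).
    adj maps u -> list of (v, weight); graph is undirected and connected."""
    in_tree = {start}
    edges = [(w, start, v) for v, w in adj[start]]
    heapq.heapify(edges)
    mst = []
    while edges and len(in_tree) < len(adj):
        w, u, v = heapq.heappop(edges)      # cheapest edge leaving the tree
        if v in in_tree:
            continue
        in_tree.add(v)
        mst.append((u, v, w))
        for x, wx in adj[v]:
            if x not in in_tree:
                heapq.heappush(edges, (wx, v, x))
    return mst

adj = {"a": [("b", 1), ("c", 4)], "b": [("a", 1), ("c", 2)], "c": [("a", 4), ("b", 2)]}
print(prim(adj, "a"))  # [('a', 'b', 1), ('b', 'c', 2)]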
15. Strongly connected components
(a) Kosaraju's Algorithm
i. Reverse all edges and run DFS to get the finishing times. Now run DFS on the original
graph, taking vertices in decreasing order of those finishing times (sketch below)
ii. O(V + E)
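A Kosaraju sketch following the two passes above: DFS on the reversed graph for finishing order, then DFS on the original graph in decreasing finish time (names mine):

def kosaraju(adj):
    """Kosaraju's algorithm; O(V + E). Returns a list of SCCs."""
    rev = {u: [] for u in adj}
    for u in adj:
        for v in adj[u]:
            rev[v].append(u)

    order, seen = [], set()
    def dfs1(u):                     # pass 1 on the reversed graph
        seen.add(u)
        for v in rev[u]:
            if v not in seen:
                dfs1(v)
        order.append(u)              # appended at finish time

    for u in adj:
        if u not in seen:
            dfs1(u)

    sccs, assigned = [], set()
    def dfs2(u, comp):               # pass 2 on the original graph
        assigned.add(u); comp.append(u)
        for v in adj[u]:
            if v not in assigned:
                dfs2(v, comp)

    for u in reversed(order):        # decreasing finish time
        if u not in assigned:
            comp = []
            dfs2(u, comp)
            sccs.append(comp)
    return sccs

adj = {1: [2], 2: [3], 3: [1], 4: [3]}
print(kosaraju(adj))  # [[1, 2, 3], [4]]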
16. Topological Sort
(a) Run DFS and make f[], then order vertices in decreasing order of f
(b) Lemma: In a DAG, there is always a vertex with outdegree 0 and another with indegree 0
(c) Lemma: A directed graph G is acyclic iff G has no back edges
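A topological sort sketch via DFS finishing times, reversed at the end (names mine; assumes the input is a DAG):

def topological_sort(adj):
    """Topological order of a DAG: DFS, then reverse finishing order; O(n + m)."""
    seen, order = set(), []
    def visit(u):
        seen.add(u)
        for v in adj[u]:
            if v not in seen:
                visit(v)
        order.append(u)              # u finishes after all its descendants
    for u in adj:
        if u not in seen:
            visit(u)
    return list(reversed(order))     # decreasing finish time

adj = {"shirt": ["tie"], "tie": ["jacket"], "pants": ["jacket"], "jacket": []}
print(topological_sort(adj))  # e.g. ['pants', 'shirt', 'tie', 'jacket']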
17. Union-find
(a) Improves the runtime efficiency of Kruskal's Algorithm; reduces the cost from O(m log n) to O(n α(n)),
where α is the inverse Ackermann function, which for all intents and purposes is at most 5 (sketch below)
(b) Find: O(log n)
(c) Union: O(log n)
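A union-find sketch with union by rank and path compression (names mine):

class UnionFind:
    """Disjoint sets with union by rank and path compression."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n
    def find(self, x):
        # Path compression: point every visited node directly at the root.
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])
        return self.parent[x]
    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return False                 # already in the same set
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx             # attach the shorter tree under the taller
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
        return True

uf = UnionFind(4)
uf.union(0, 1); uf.union(2, 3)
print(uf.find(1) == uf.find(0), uf.find(0) == uf.find(3))  # True False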
18. Binary search trees
(a) Build a sorted array from a BST by performing an in-order traversal
(b) Search(): O(height)
(c) Insert(): O(height)
(d) Delete(): O(height)
(e) Successor(): O(height)
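A small BST sketch (names mine) showing that an in-order traversal produces the keys in sorted order:

class BSTNode:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def insert(root, key):
    """Insert into a BST; O(height)."""
    if root is None:
        return BSTNode(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def in_order(root):
    """In-order traversal of a BST yields its keys in sorted order."""
    return in_order(root.left) + [root.key] + in_order(root.right) if root else []

root = None
for k in [5, 2, 8, 1]:
    root = insert(root, k)
print(in_order(root))  # [1, 2, 5, 8]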
19. AVL trees
(a) Balanced BST with strictly logarithmic height
(b) Lemma: Height of an AVL is O(log n)
20. Hashing
(a) Used to implement dictionaries
(b) Search(): O(1) average
(c) Insert(): O(1) average
(d) Delete(): O(1) average
(e) A bad hash function maps many items to the same slot, leading to collisions
(f) Different items that hash to the same slot can be handled with chaining, and we can keep the
chains short by controlling the load factor (sketch below)
(g) Universal hash functions are drawn from a family that guarantees a low probability of collision
for any pair of distinct keys; designing good hash functions is an ongoing problem in computer science
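A chaining hash table sketch (names mine; the 0.75 load-factor threshold and doubling rehash are assumptions for illustration):

class ChainedHashTable:
    """Hash table with chaining. With a good hash and a bounded load factor,
    Search/Insert/Delete are O(1) on average."""
    def __init__(self, num_buckets=8):
        self.buckets = [[] for _ in range(num_buckets)]
        self.size = 0
    def _bucket(self, key):
        return self.buckets[hash(key) % len(self.buckets)]
    def insert(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)   # overwrite existing key
                return
        bucket.append((key, value))
        self.size += 1
        # Keeping the load factor below a constant keeps the chains short.
        if self.size / len(self.buckets) > 0.75:
            self._rehash(2 * len(self.buckets))
    def search(self, key):
        for k, v in self._bucket(key):
            if k == key:
                return v
        return None
    def _rehash(self, num_buckets):
        items = [kv for bucket in self.buckets for kv in bucket]
        self.buckets = [[] for _ in range(num_buckets)]
        self.size = 0
        for k, v in items:
            self.insert(k, v)

t = ChainedHashTable()
t.insert("cat", 1); t.insert("dog", 2)
print(t.search("cat"), t.search("bird"))  # 1 None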
21. Tries
(a) Find(): O(|pattern|)
(b) Insert(): O(|pattern|)
(c) Delete(): O(|pattern|)
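A minimal trie sketch (names mine), where Insert and Find walk one node per character of the pattern:

class TrieNode:
    def __init__(self):
        self.children = {}       # char -> TrieNode
        self.is_word = False

class Trie:
    """Trie over strings; Insert and Find cost O(|pattern|)."""
    def __init__(self):
        self.root = TrieNode()
    def insert(self, word):
        node = self.root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        node.is_word = True
    def find(self, word):
        node = self.root
        for ch in word:
            if ch not in node.children:
                return False
            node = node.children[ch]
        return node.is_word

t = Trie()
t.insert("car"); t.insert("cart")
print(t.find("car"), t.find("ca"))  # True False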
