
# By V V L Divakar Allavarapu

## UNIT II Predicate Logic

Quantifiers
There are two types of quantifiers: 1. Universal quantifier ∀ (pronounced "for all") 2. Existential quantifier ∃ (pronounced "there exists")

Universal Quantifier
1. All kings are persons: ∀x: king(x) → person(x) 2. All people are literate: ∀x: person(x) → literate(x) 3. All men are people: ∀x: man(x) → person(x) 4. All Pompeians were Romans: ∀x: Pompeian(x) → Roman(x)

Existential Quantifier (∃)

1. There is some person who wrote games: ∃x: person(x) ∧ wrote(x, games) 2. There is a person who wrote chess: ∃x: person(x) ∧ wrote(x, chess) 3. Everyone is loyal to someone: ∀x ∃y: loyalto(x, y)

Predicate Sentences
1. Marcus was a man: man(Marcus) 2. Marcus was a Pompeian: Pompeian(Marcus) 3. All Pompeians were Romans: ∀x: Pompeian(x) → Roman(x) 4. Caesar was a ruler: ruler(Caesar)

Predicate Sentences
5. All Romans were either loyal to Caesar or hated him: ∀x: Roman(x) → loyalto(x, Caesar) ∨ hate(x, Caesar) 6. Everyone is loyal to someone: ∀x ∃y: loyalto(x, y)

Predicate Sentences
7. People only try to assassinate rulers they are not loyal to: ∀x ∀y: person(x) ∧ ruler(y) ∧ tryassassinate(x, y) → ¬loyalto(x, y) 8. Marcus tried to assassinate Caesar: tryassassinate(Marcus, Caesar) 9. All men are people: ∀x: man(x) → person(x)

Predicate Sentences
To answer the question "Was Marcus loyal to Caesar?" we need to prove either loyalto(Marcus, Caesar) or ¬loyalto(Marcus, Caesar).

Predicate Sentences
1. Marcus was a man (1). 2. All men are people (9). 3. Conclusion: Marcus was a person. 4. Marcus tried to assassinate Caesar (8). 5. Caesar was a ruler (4). 6. People only try to assassinate rulers they are not loyal to (7). From (3), (4), (5) and (6) conclude: Marcus was not loyal to Caesar.

Predicate Sentences
¬loyalto(Marcus, Caesar)
↑ (Predicate 7, substitution)
person(Marcus) ∧ ruler(Caesar) ∧ tryassassinate(Marcus, Caesar)
↑ (Predicate 4)
person(Marcus) ∧ tryassassinate(Marcus, Caesar)
↑ (Predicate 8)
person(Marcus)
↑ (Predicate 9)
man(Marcus)
↑ (Predicate 1)

Predicate Sentences

## instance and isa Relationship

Knowledge can be represented as classes, objects, attributes, and superclass/subclass relationships. Inferences can be drawn using property inheritance, in which elements of specific classes inherit the attributes and values of more general classes.

## instance and isa Relationship

The attribute instance is used to represent class membership (an object is an element of a class). The attribute isa is used to represent class inclusion (a superclass/subclass relationship).

## instance and isa Relationship

1. man(Marcus) 2. Pompeian(Marcus) 3. ∀x: Pompeian(x) → Roman(x) 4. Ruler(Caesar) 5. ∀x: Roman(x) → loyalto(x, Caesar) ∨ hate(x, Caesar)

## Using instance Attribute

1. instance(Marcus, man) 2. instance(Marcus, Pompeian) 3. ∀x: instance(x, Pompeian) → instance(x, Roman) 4. instance(Caesar, Ruler) 5. ∀x: instance(x, Roman) → loyalto(x, Caesar) ∨ hate(x, Caesar)

## Using isa Attribute

1. instance(Marcus, man) 2. instance(Marcus, Pompeian) 3. isa(Pompeian, Roman) 4. instance(Caesar, Ruler) 5. ∀x: instance(x, Roman) → loyalto(x, Caesar) ∨ hate(x, Caesar) 6. ∀x ∀y ∀z: instance(x, y) ∧ isa(y, z) → instance(x, z)
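Rule 6 (instance plus isa implies instance) can be sketched as a small lookup; the dict-of-sets encoding is an assumption for illustration.

```python
# instance(x, y) and isa(y, z) imply instance(x, z): walk up isa links.

instance_of = {"Marcus": {"man", "Pompeian"}, "Caesar": {"Ruler"}}
isa = {"Pompeian": {"Roman"}}

def holds_instance(x, cls):
    """True if instance(x, cls) follows from the instance and isa facts."""
    frontier = set(instance_of.get(x, set()))
    while frontier:
        if cls in frontier:
            return True
        # replace each class by its direct superclasses
        frontier = {sup for c in frontier for sup in isa.get(c, set())}
    return False

print(holds_instance("Marcus", "Roman"))  # True
print(holds_instance("Caesar", "Roman"))  # False
```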

## Computable functions and Predicates

Computational predicates such as less-than and greater-than are often used in knowledge representation; they return true or false for their inputs. Examples of computable predicates: gt(1, 0) or lt(0, 1); gt(5, 4) is true while gt(4, 5) is false. Computable functions can appear as arguments, e.g. gt(2 + 4, 5), where 2 + 4 is evaluated first.

cont...
1. Marcus was a man:
man(Marcus) 2. Marcus was a Pompeian: Pompeian(Marcus) 3. Marcus was born in 40 A.D.: born(Marcus, 40) 4. All men are mortal: ∀x: man(x) → mortal(x) 5. All Pompeians died when the volcano erupted in 79 A.D.: erupted(volcano, 79) ∧ ∀x: Pompeian(x) → died(x, 79)

cont...
6. No mortal lives longer than 150 years: ∀x ∀t1 ∀t2: mortal(x) ∧ born(x, t1) ∧ gt(t2 - t1, 150) → dead(x, t2) 7. It is now 1991: now = 1991 8. Alive means not dead: ∀x ∀t: [alive(x, t) → ¬dead(x, t)] ∧ [¬dead(x, t) → alive(x, t)] 9. If someone dies, then he is dead at all later times: ∀x ∀t1 ∀t2: died(x, t1) ∧ gt(t2, t1) → dead(x, t2)

cont...
1. man(Marcus)
2. Pompeian(Marcus) 3. born(Marcus, 40) 4. ∀x: man(x) → mortal(x) 5. erupted(volcano, 79) 6. ∀x: Pompeian(x) → died(x, 79) 7. ∀x ∀t1 ∀t2: mortal(x) ∧ born(x, t1) ∧ gt(t2 - t1, 150) → dead(x, t2) 8. now = 1991 9. ∀x ∀t: [alive(x, t) → ¬dead(x, t)] ∧ [¬dead(x, t) → alive(x, t)] 10. ∀x ∀t1 ∀t2: died(x, t1) ∧ gt(t2, t1) → dead(x, t2)

cont...

Is Marcus alive?

cont...
¬alive(Marcus, now)
↑ (9, substitution)
dead(Marcus, now)
↑ (10, substitution)
died(Marcus, t1) ∧ gt(now, t1)
↑ (6, substitution)
Pompeian(Marcus) ∧ gt(now, 79)
↑ (2)
gt(now, 79)
↑ (8, substitute equals)
gt(1991, 79)
↑ (compute gt)
true
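The chain above can be checked with ordinary computation. This is a hand-coded sketch, not a theorem prover; the facts and the year "now" come from the slides, and dead() simply hard-wires rules 6 and 10.

```python
# Evaluating "Is Marcus alive?" with computable predicates.

def gt(a, b):
    """Computable predicate: greater-than."""
    return a > b

born = {"Marcus": 40}
died = {"Marcus": 79}   # all Pompeians died in 79 A.D.
now = 1991

def dead(person, t):
    # dead if he died at an earlier time (rule 10), or was born more than
    # 150 years before t (rule 6: no mortal lives longer than 150 years)
    return (person in died and gt(t, died[person])) or \
           (person in born and gt(t - born[person], 150))

def alive(person, t):
    # rule 8: alive means not dead
    return not dead(person, t)

print(alive("Marcus", now))  # False
```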

cont...
Disadvantage: many steps are required to prove even simple conclusions.

A variety of processes, such as matching and substitution, must be used to prove simple conclusions.

Resolution

Resolution is a proof procedure by refutation: to prove a statement, resolution attempts to show that the negation of the statement leads to a contradiction with the known statements.

Conversion to Conjunctive Normal Form
All Romans who know Marcus either hate Caesar or think that anyone who hates anyone is crazy. ∀x: [Roman(x) ∧ know(x, Marcus)] → [hate(x, Caesar) ∨ (∀y: (∃z: hate(y, z)) → thinkcrazy(x, y))] CNF equivalent: ¬Roman(x) ∨ ¬know(x, Marcus) ∨ hate(x, Caesar) ∨ ¬hate(y, z) ∨ thinkcrazy(x, y)

## Algorithm : converting to CNF

1. Eliminate →, using a → b = ¬a ∨ b: ∀x: ¬[Roman(x) ∧ know(x, Marcus)] ∨ [hate(x, Caesar) ∨ (∀y: ¬(∃z: hate(y, z)) ∨ thinkcrazy(x, y))]

CNF...
2. Reduce the scope of ¬, using ¬(¬p) = p, ¬(a ∧ b) = ¬a ∨ ¬b, ¬(a ∨ b) = ¬a ∧ ¬b: ∀x: [¬Roman(x) ∨ ¬know(x, Marcus)] ∨ [hate(x, Caesar) ∨ (∀y ∀z: ¬hate(y, z) ∨ thinkcrazy(x, y))]

CNF...
3. Standardize variables apart so that each quantifier binds a unique variable: ∀x: P(x) ∨ ∀x: Q(x) becomes ∀x: P(x) ∨ ∀y: Q(y)

CNF...
4. Move all quantifiers to the left of the formula (prenex form): ∀x ∀y ∀z: [¬Roman(x) ∨ ¬know(x, Marcus)] ∨ [hate(x, Caesar) ∨ (¬hate(y, z) ∨ thinkcrazy(x, y))]

CNF...
5. Eliminate existential quantifiers (∃) by substituting Skolem constants or functions: ∃y: president(y) becomes president(S1); ∀x: ∃y: father-of(y, x) becomes ∀x: father-of(S2(x), x)

CNF...
6. Drop the quantifier prefix: [¬Roman(x) ∨ ¬know(x, Marcus)] ∨ [hate(x, Caesar) ∨ (¬hate(y, z) ∨ thinkcrazy(x, y))] 7. Convert the statement into a conjunction of disjuncts, using distributivity: (a ∧ b) ∨ c = (a ∨ c) ∧ (b ∨ c)
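For the propositional part of the algorithm, steps 1, 2 and 7 can be sketched directly. The nested-tuple formula encoding here is an assumption for illustration, not the slides' notation.

```python
# Propositional CNF sketch: formulas are strings (atoms) or tuples
# ('->', a, b), ('~', a), ('&', a, b), ('v', a, b).

def to_cnf(f):
    if isinstance(f, str):
        return f
    op = f[0]
    if op == '->':                      # step 1: a -> b  ==  ~a v b
        return to_cnf(('v', ('~', f[1]), f[2]))
    if op == '~':                       # step 2: push negation inward
        g = f[1]
        if isinstance(g, str):
            return f
        if g[0] == '~':                 # ~(~p) = p
            return to_cnf(g[1])
        if g[0] == '&':                 # De Morgan
            return to_cnf(('v', ('~', g[1]), ('~', g[2])))
        if g[0] == 'v':
            return to_cnf(('&', ('~', g[1]), ('~', g[2])))
        return to_cnf(('~', to_cnf(g)))  # e.g. ~(a -> b)
    a, b = to_cnf(f[1]), to_cnf(f[2])
    if op == 'v':                       # step 7: distribute v over &
        if isinstance(a, tuple) and a[0] == '&':
            return to_cnf(('&', ('v', a[1], b), ('v', a[2], b)))
        if isinstance(b, tuple) and b[0] == '&':
            return to_cnf(('&', ('v', a, b[1]), ('v', a, b[2])))
    return (op, a, b)

# p -> (q & r)  becomes  (~p v q) & (~p v r)
print(to_cnf(('->', 'p', ('&', 'q', 'r'))))
```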

## Resolution in Propositional Logic

Given axioms:
1. P
2. (P ∧ Q) → R
3. (S ∨ T) → Q
4. T
Step 1: convert all axioms into clause form:
1. P
2. ¬P ∨ ¬Q ∨ R
3. (a) ¬S ∨ Q (b) ¬T ∨ Q
4. T

Propositional Resolution
Step 2: negate the proposition we want to prove and add it to the existing clauses. Example: from the above axioms we want to prove R, so we add ¬R to the clauses.

Propositional Resolution...
Step 3: repeatedly select two clauses, resolve them, and try to show that the negated assumption is wrong.
¬R with ¬P ∨ ¬Q ∨ R [clause 2] gives ¬P ∨ ¬Q
¬P ∨ ¬Q with P [clause 1] gives ¬Q
¬Q with ¬T ∨ Q [clause 3(b)] gives ¬T
¬T with T [clause 4] gives the empty clause: contradiction

Unification
Unification is the process of finding substitutions that make different logical expressions look identical. In propositional logic it is trivial to detect a contradiction such as R ∧ ¬R. In predicate logic, man(Marcus) ∧ ¬man(Marcus) is a contradiction, but man(Marcus) ∧ ¬man(Spot) is not; when arguments contain variables, a matching procedure is needed.

Cont
The solution to this problem is matching with substitution. Example: unify P(x, x) with P(y, z), where x, y and z are variables.
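A compact unification sketch follows; the term encoding (variables are lowercase strings, compound terms are tuples) is an assumption for illustration.

```python
# Unification of first-order terms, e.g. P(x, x) with P(y, z).

def is_var(t):
    return isinstance(t, str) and t[:1].islower()

def walk(t, subst):
    """Follow variable bindings already in the substitution."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst=None):
    """Return a substitution dict unifying a and b, or None on failure."""
    subst = dict(subst or {})
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        subst[a] = b
        return subst
    if is_var(b):
        subst[b] = a
        return subst
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

print(unify(('P', 'x', 'x'), ('P', 'y', 'z')))  # {'x': 'y', 'y': 'z'}
```

With the substitution {x/y, y/z}, both expressions become P(z, z).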

## Non Monotonic Reasoning

Example
ABC Murder Story: let Abbott (A), Babbitt (B) and Cabot (C) be suspects in a murder case. 1. A has an alibi: his name is in the register of a respectable hotel. 2. B also has an alibi: his brother-in-law testified that B was visiting him at the time. 3. C pleads an alibi too, claiming to have been watching a live match at the ground (but we have only his word for it).

Example...
So we can believe: 1. that A did not commit the crime; 2. that B did not commit the crime; 3. that A or B or C did. Conclusion: C committed the crime.

Example...

But C turns out to have been caught on live television at the match. So a new belief is added: 4. that C did not commit the crime. This contradicts belief 3, so some earlier belief must be revised.

Monotonic Reasoning
1. It is complete with respect to the domain of interest. 2. It is consistent. 3. Knowledge increases monotonically as new facts are added. Ex: if KB1 = KBL and KB2 = KBL ∪ F (where F is a set of new facts), then KB1 is a subset of KB2.

## Non Monotonic Reasoning

1. It may not be complete; it allows inferences to be made on the basis of lack of knowledge. 2. It may become inconsistent. 3. Knowledge may decrease when new facts are added.

Approaches

Approaches to handle these problems: 1. Nonmonotonic reasoning (based on belief) 2. Statistical reasoning (based on certainty)

## Logics for Non Monotonic Reasoning

Different Kinds of Reasoning

1. Default reasoning: a) Nonmonotonic Logic (NML) b) Default Logic (DL) 2. Minimalist reasoning: a) Closed World Assumption (CWA)

## Non Monotonic Logic

This is predicate logic augmented with the modal operator M, which can be read as "is consistent".

NML Example

∀x ∀y: Related(x, y) ∧ M GetAlong(x, y) → WillDefend(x, y) For all x and y: if x and y are related, and the fact that x gets along with y is consistent with everything else that is believed, then conclude that x will defend y.

NML Example...

1. ∀x: Republican(x) ∧ M ¬Pacifist(x) → ¬Pacifist(x) 2. ∀x: Quaker(x) ∧ M Pacifist(x) → Pacifist(x) 3. Republican(Marcus) 4. Quaker(Marcus)

Default Logic
It is an alternative logic in which rules are represented in the form A : M B / C

If A is provable and it is consistent to assume B, then conclude C.

Abductive Reasoning
Deduction: given ∀x: A(x) → B(x) and A(Marcus), we conclude B(Marcus). Abduction is the reverse process: given B(Marcus), we conclude A(Marcus). This conclusion may sometimes be wrong.

## Closed World Assumption(CWA)

It is a simple kind of minimalist reasoning. Courses offered: CS 101, CS 203, CS 503. How many courses will be offered: exactly three, or possibly more?

CWA...
The answer may be anywhere from three to infinity: the course assertions do not deny that unmentioned courses are also offered (the information is incomplete), and nothing says the listed courses are different from each other.

CWA...
The assumption is that the provided information is complete, so anything not asserted to be true is assumed to be false. Example: in an airline KB application, for the query "Is there any flight from Vskp to Hyd?", ¬Connect(Vskp, Hyd) is asserted whenever we cannot prove Connect(Vskp, Hyd).
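The airline example can be sketched in a few lines. The particular routes below are illustrative assumptions (only the Vskp/Hyd names come from the slides).

```python
# Closed World Assumption: anything not asserted is taken to be false.

flights = {("Vskp", "Chennai"), ("Chennai", "Hyd")}

def connect(a, b):
    # Failure to prove Connect(a, b) asserts ~Connect(a, b),
    # rather than answering "unknown".
    return (a, b) in flights

print(connect("Vskp", "Chennai"))  # True
print(connect("Vskp", "Hyd"))      # False: not asserted, so assumed false
```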

Implementation Issues
1. How should knowledge be updated incrementally? 2. Many facts are invalidated when new knowledge becomes available; how should this be managed? 3. The theories are not computationally effective. These issues can be handled by search control: depth-first search or breadth-first search.

## Depth First Search

Chronological Backtracking
It is depth-first search with backtracking. It makes a guess at something, thus creating a branch in the search space. If the guess proves wrong, we back up to that point and try an alternative, discarding everything derived after the guess.

Example

We need to know the fact 'F', which can be derived by making some assumption 'A'. From 'F' we also derive the additional facts 'G' and 'H'. Later we derive new facts 'M' and 'N', which are independent of 'A' and 'F'.

Example...

A -> F -> G, H        M, N (independent of A and F)

Example...

At some point a new fact invalidates 'A'. Chronological backtracking then invalidates all of F, G, H, M and N, even though M and N do not depend on the assumption.

Example 2
Problem: finding a time at which three busy people can all attend a meeting. Assumption: the meeting is held on Wednesday. Fact found: all are free at 2:00, so 2:00 is chosen as the meeting time.

Example...
Assume day = Wednesday.

After many steps we find that the only time all the people are available is 2:00 PM. FAIL: a special conference has all the rooms booked on Wednesday.

So we assume another day, repeat the same time-finding process, and for the same reasons again decide on 2:00 PM. We then try to find a room and SUCCEED.

Problem

Because conclusions are withdrawn based on the order in which the search process generated them, instead of on their responsibility for the inconsistency, a great deal of effort may be wasted.

## Dependency Directed Backtracking

It makes a guess at something and associates with each node in the search space one or more justifications recording what the node depends on.

## Two Approaches for Dependency Directed Backtracking

Justification-based Truth Maintenance Systems (JTMS) and Logic-based Truth Maintenance Systems (LTMS)

## Justification based Truth Maintenance Systems (JTMS)

JTMS...
A JTMS provides dependency-directed backtracking and so supports nonmonotonic reasoning.
Example: the ABC murder story

Initially we believe that A is the primary suspect, because he was a beneficiary of the victim and he had no alibi.

contd...

## Using Default Logic: Beneficiary(x) : M ¬Alibi(x) / Suspect(x)

Dependency Network
Suspect A [IN]
+ (IN-list): Beneficiary A [IN]
- (OUT-list): Alibi A [OUT]

Abbott should be a suspect when it is believed that he is a beneficiary and it is not believed that he has an alibi.

Dependency Network...

There are three assertions: 1. Suspect A (A is the primary murder suspect) 2. Beneficiary A (he is a beneficiary of the victim) 3. Alibi A (A was at a hotel at the time)

Dependency Network...
Suspect A [OUT]
+ Beneficiary A [IN]
- Alibi A [IN]
    + Registered A [IN]
    + Far Away [IN]
    - Register Forged A [OUT]

Abbott should not be a suspect when it is believed that he is a beneficiary and it is believed that he has an alibi.

Dependency Network...
Suspect B [OUT]
+ Beneficiary B [IN]
- Alibi B [IN]
    + Says So B-I-L [IN]
    - Lies B-I-L [OUT]

B should not be a suspect when it is believed that he is a beneficiary and it is believed that he has an alibi.

Dependency Network...
Suspect C [IN]
+ Beneficiary C [IN]
- Alibi C [OUT]
    + Tells Truth Cabot [OUT]

C should be a suspect when it is believed that he is a beneficiary and it is not believed that he has an alibi.

Dependency Network...
Suspect C [OUT]
+ Beneficiary C [IN]
- Alibi C [IN]
    + Tells Truth Cabot [IN]
        + C Seen on TV [IN]
        - TV Forgery [OUT]

C should not be a suspect when it is believed that he is a beneficiary and it is believed that he has an alibi.

Dependency Network...
Contradiction [IN], justified by the suspect nodes (Suspect A, Suspect B, Suspect Other): when no suspect node is IN even though someone must be the murderer, the contradiction node becomes IN and some assumption must be retracted.

## Logical Based Truth Maintainance System (LTMS)

It is similar to the JTMS. In a JTMS the nodes in the network are treated as atoms, which assumes no relationships among them except the ones that are explicitly stated in the justifications. Example: we can represent both Lies B-I-L and Not Lies B-I-L and label both of them IN; no contradiction will be detected automatically.

LTMS...

In an LTMS such a contradiction is detected automatically; we need not create an explicit contradiction node.

## Breadth First Search

Statistical Reasoning

Basic Probability

## 1. 0 ≤ P(a) ≤ 1 2. P(a) + P(¬a) = 1 3. P(a ∨ b) = P(a) + P(b) - P(a ∧ b)

Prior Probability
It is the degree of belief in a proposition in the absence of any other information.

## P(A) = 0.3 P(Cavity) = 0.1 It is used only when no other related information is available.

Conditional Probability
Once we have obtained some evidence concerning a previously unknown random variable, conditional probabilities should be used. P(a|b) = 0.2 is the probability of a given known evidence b. P(Cavity|Toothache) = 0.8.

Conditional Probability...

Product rule: P(a ∧ b) = P(a|b) P(b) and P(a ∧ b) = P(b|a) P(a)

Hence P(a|b) = P(a ∧ b) / P(b)

Bayes Theorem
Bayes' rule states: the probability of a hypothesis H being true given known observations E is P(H|E) = P(H ∧ E) / P(E), which gives P(H|E) = P(E|H) P(H) / P(E). For n events with P(A1) + P(A2) + ... + P(An) = 1: P(Ai|B) = P(B|Ai) P(Ai) / [P(B|A1) P(A1) + ... + P(B|An) P(An)]

Bayes Theorem...
A doctor knows that a cavity causes the patient to have a toothache, say, 50% of the time. The prior probability that any patient has a toothache is 1/20, and a cavity 1/1000. So P(Toothache|Cavity) = 0.5, P(Cavity) = 0.001, P(Toothache) = 0.05, and P(Cavity|Toothache) = 0.5 × 0.001 / 0.05 = 0.01.
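The numbers from this example can be checked directly:

```python
# Bayes' rule: P(Cavity|Toothache) = P(Toothache|Cavity) P(Cavity) / P(Toothache)

p_toothache_given_cavity = 0.5
p_cavity = 0.001
p_toothache = 0.05

p_cavity_given_toothache = p_toothache_given_cavity * p_cavity / p_toothache
print(round(p_cavity_given_toothache, 4))  # 0.01
```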

Bayes Network
S: the sprinkler was on last night. W: the grass is wet. R: it rained last night.

The Sprinkler and Rain nodes both point to the Wet node, with P(Wet|Sprinkler) = 0.9 and P(Rain|Wet) = 0.7.

Bayes Network
Cloudy: P(C) = 0.5

Sprinkler (given Cloudy): C = t: P(S) = .10; C = f: P(S) = .50
Rain (given Cloudy): C = t: P(R) = .80; C = f: P(R) = .20

Wet (given Sprinkler and Rain):
S = t, R = t: P(W) = .99
S = t, R = f: P(W) = .90
S = f, R = t: P(W) = .90
S = f, R = f: P(W) = .00

Bayes Theorem
Disadvantages: 1. Too many probabilities need to be provided. 2. Space is needed to store all the probabilities. 3. Time is required to compute the probabilities. 4. The theory suits well-structured situations in which all the data are available and the assumptions are satisfied; unfortunately, these conditions may not hold in reality.

## Certainty Factors and Rule-based systems

cont...
In the MYCIN expert system each rule is associated with a certainty factor, which is a measure of the extent to which the evidence is to be believed. A MYCIN rule looks like: If 1. the stain of the organism is gram-positive, and 2. the morphology is coccus, and 3. the growth is in clumps, then there is suggestive evidence (0.7) that it is staphylococcus.

Certainty Factor

The certainty factor CF[h,e] is defined in terms of two components: MB[h,e], a measure (between 0 and 1) of belief in hypothesis h given evidence e, and MD[h,e], a measure (between 0 and 1) of disbelief in hypothesis h given evidence e. CF[h,e] = MB[h,e] - MD[h,e].

Cont
In the MYCIN model, for two pieces of evidence e1 and e2 supporting hypothesis h, the measures of belief and disbelief combine as: MB[h, e1 ∧ e2] = MB[h, e1] + MB[h, e2] × (1 - MB[h, e1]) and MD[h, e1 ∧ e2] = MD[h, e1] + MD[h, e2] × (1 - MD[h, e1])

Cont
If MD[h, e1 ∧ e2] = 0 and MB[h, e1 ∧ e2] = 1, the evidence (e1 and e2) fully confirms the hypothesis h; if MB[h, e1 ∧ e2] = 0 and MD[h, e1 ∧ e2] = 1, the evidence fully disconfirms h.

Example for CF
Rules r1, ..., r7 each give some support (evidence) for the hypothesis H, the conclusion that it is an elephant: e1: r1: it has a tail, 0.3 e2: r2: it has a trunk, 0.8 e3: r3: it has a heavy body, 0.4 e4: r4: it has four legs, 0.2 e5: r5: it has black colour, 0.1 e6: r6: it has stripes, 0.6 e7: r7: it has long flat ears, 0.6

For rule 1: MB = 0.3 and MD = 0. Including the effect of rule 2: MB = 0.3 + 0.8 × (1 - 0.3) = 0.86, MD = 0. Including rule 3: MB = 0.86 + 0.4 × (1 - 0.86) = 0.916, MD = 0. Including rule 4: MB = 0.916 + 0.2 × (1 - 0.916) = 0.9328, MD = 0.

Cont
Including rule 5: MB = 0.9328 + 0.1 × (1 - 0.9328) ≈ 0.9395, MD = 0. Including rule 6 (stripes count against an elephant): MB ≈ 0.9395, MD = 0.6. Including rule 7: MB ≈ 0.9395 + 0.6 × (1 - 0.9395) ≈ 0.9758, MD = 0.6, so CF = MB - MD ≈ 0.3758.
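The combination chain can be computed with a few lines. Treating r6 (stripes) as the only disbelief-contributing rule is an interpretation of the slides' "MD = 0.6" step.

```python
# MYCIN-style combination of belief/disbelief measures for the elephant
# hypothesis; rule strengths follow the slides.

def combine(m1, m2):
    """MB/MD combination: m[h, e1 & e2] = m1 + m2 * (1 - m1)."""
    return m1 + m2 * (1 - m1)

mb = 0.0
for strength in [0.3, 0.8, 0.4, 0.2, 0.1, 0.6]:   # r1-r5 and r7 support h
    mb = combine(mb, strength)
md = combine(0.0, 0.6)                             # r6 (stripes) disconfirms

cf = mb - md
print(round(mb, 4), round(md, 4), round(cf, 4))
```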

## Dempster Shafer Theory

DST is designed to deal with the distinction between uncertainty and ignorance. It is very useful for handling epistemic information as well as ignorance or lack of information.

cont...

It is expressed in terms of belief and plausibility. Belief, bel(s), measures the strength of the evidence in favour of a proposition s, ranging from 0 to 1. Plausibility is defined as pl(s) = 1 - bel(¬s).

## Weak-Slots and Filler Structures

Introduction
Knowledge can be represented in a slot-and-filler system as a set of entities and their attributes. The structure is useful because, besides supporting inheritance, it enables attribute values to be retrieved quickly, makes properties of relations easy to describe, and fits naturally with object-oriented programming.

Introduction
A slot, in its simplest form, is an attribute-value pair. A filler is a value that a slot can take: a numeric or string value (or any data type), or a pointer to another slot. A weak slot-and-filler structure does not consider the content of the representation.

Semantic Nets

## Semantic nets consist of

Nodes denoting objects; links denoting relations between objects; and link labels that denote particular relations.

Example
Person --isa--> Mammal
Person --has-part--> Nose
Pee-Wee-Reese --instance--> Person
Pee-Wee-Reese --team--> Brooklyn-Dodgers
Pee-Wee-Reese --uniform-color--> Blue

## Representing Binary predicates

Some of the predicates that can be asserted from the above figure are: isa(Person, Mammal), instance(Pee-Wee-Reese, Person), team(Pee-Wee-Reese, Brooklyn-Dodgers), uniform-color(Pee-Wee-Reese, Blue), has-part(Person, Nose).

## Representing Nonbinary predicates

Three or more place predicate can be represented by creating one new object.
Example : score( Cubs, Dodgers, 5-3)
G23 --isa--> Game
G23 --visiting-team--> Cubs
G23 --home-team--> Dodgers
G23 --score--> 5-3

Representing a sentence
John gave the book to Mary.

EV7 --instance--> Give
EV7 --agent--> John
EV7 --object--> BK23
EV7 --beneficiary--> Mary
BK23 --instance--> Book

Relating Entities
John --height--> 72
Bill --height--> 52

Suppose we want to relate these two entities with the fact that John is taller than Bill.

Relating Entities
John --height--> H1
Bill --height--> H2
H1 --greater-than--> H2

The two entities are related by creating two objects, H1 and H2, linked by greater-than.

Relating Entities
John --height--> H1, H1 --value--> 72
Bill --height--> H2, H2 --value--> 52
H1 --greater-than--> H2

## Partitioned Semantic Nets

Consider the simple statement "The dog bit the mail carrier", which can be represented as:

d --isa--> Dogs
b --isa--> Bite
m --isa--> Mail-carrier
b --assailant--> d
b --victim--> m

(The node names d, b and m stand for the particular dog, the biting event and the mail carrier.)

## Partitioned Semantic Nets

If you want to represent quantified expressions in semantic nets, one way is to partition the semantic net into a hierarchical set of spaces. Consider the sentence:
Every dog has bitten a mail carrier: ∀x: dog(x) → ∃y: mail-carrier(y) ∧ bite(x, y)

## Partitioned Semantic Nets

The net is split into spaces: space SA contains the general statement g, an instance of the class GS, whose form is the space S1. Inside S1, a Bite node has a Dogs node as assailant and a Mail-carrier node as victim, and g has a ∀ connection to the Dogs node.

## Partitioned Semantic Nets

In the above figure, g is an instance of the special class GS of general statements. Every general statement has two attributes:

a form, which states the relation that is being asserted, and one or more universal (∀) quantifier connections.

## Partitioned Semantic Nets

Every dog in town has bitten the constable. Here a Town-dog node (isa Dogs) is universally quantified by g; inside the statement space S1 the Bite node's assailant is the Town-dog node and its victim is an instance of Constables.

## Partitioned Semantic Nets

Every dog has bitten every mail carrier. Here the general statement b has a form space S1 containing the Bite node with a Dogs node as assailant and a Mail-carrier node as victim, and b has ∀ connections to both the Dogs node and the Mail-carrier node.

Frames
Semantic nets were initially used to represent labeled connections between objects. As tasks became more complex, the representation needed to be more structured; the more structured the system, the more beneficial it becomes to use frames.

Frames
A frame is a collection of attributes (slots) and associated values that describe some real-world entity. Each frame represents either a class (a set) or an instance (an element of a class).

## Frame system example

Person
    isa: Mammal
    cardinality: 6,000,000,000
    *handed: Right
Adult-Male
    isa: Person
    cardinality: 2,000,000,000
    *handed: Right
ML-Baseball-Player
    isa: Adult-Male
    cardinality: 624
    *height: 6-1
    *bats: equal to handed
    *batting-average: .252
    *team:
    *uniform-colour:
Fielder
    isa: ML-Baseball-Player
    cardinality: 376
    *batting-average: .262

## Frame system example

Pee-Wee-Reese
    instance: Fielder
    height: 5-10
    bats: Right
    batting-average: .309
    team: Brooklyn-Dodgers
    uniform-color: Blue
ML-Baseball-Team
    isa: Team
    cardinality: 26
    *team-size: 24
    *manager:
Brooklyn-Dodgers
    instance: ML-Baseball-Team
    team-size: 24
    manager: Leo-Durocher
    players: {Pee-Wee-Reese, ...}

## Class of All Teams As a Metaclass

Class
    instance: Class
    isa: Class
Team
    instance: Class
    isa: Class
    cardinality: {the number of teams that exist}
    *team-size: {each team has a size}
ML-Baseball-Team
    instance: Class
    isa: Team
    cardinality: 26 {the number of baseball teams that exist}
    *team-size: 24 {the default 24 players per team}
    *manager:

cont...
Brooklyn-Dodgers
    instance: ML-Baseball-Team
    isa: Team
    team-size: 24
    manager: Leo-Durocher
    *uniform-colour: Blue
Pee-Wee-Reese
    instance: Brooklyn-Dodgers
    instance: ML-Baseball-Player
    uniform-colour: Blue
    batting-average: .309

## Classes and Metaclasses

Class (a set of sets) has Team and ML-Baseball-Player among its instances; ML-Baseball-Team is an instance of Class and a subclass of Team; Brooklyn-Dodgers is an instance of ML-Baseball-Team, and Pee-Wee-Reese is an instance of both Brooklyn-Dodgers and ML-Baseball-Player.

## Representing Relationships among classes

Classes can be related to each other: Class1 can be a subset of Class2. Mutually-disjoint-with relates a class to one or more other classes that are guaranteed to have no elements in common with it. Is-covered-by relates a class to a set S of classes whose union covers it; if S is a set of mutually disjoint classes, then S is called a partition of the class.

## Representing Relationships among classes

Pitcher, Catcher, Fielder, American-Leaguer and National-Leaguer are all linked by isa to ML-Baseball-Player; Three-Finger-Brown is an instance of both Pitcher and National-Leaguer.

Cont
ML-Baseball-Player
    is-covered-by: {Pitcher, Catcher, Fielder}, {National-Leaguer, American-Leaguer}
Pitcher
    isa: ML-Baseball-Player
    mutually-disjoint-with: {Catcher, Fielder}
Fielder
    isa: ML-Baseball-Player
    mutually-disjoint-with: {Pitcher, Catcher}
Catcher
    isa: ML-Baseball-Player
    mutually-disjoint-with: {Pitcher, Fielder}
National-Leaguer
    isa: ML-Baseball-Player
Three-Finger-Brown
    instance: Pitcher
    instance: National-Leaguer

Slot-Values as Objects
John height: 72
Bill height:

We could compare slots by making the slots themselves into objects; we use lambda (λ) notation for creating such objects.

Cont
John height: 72; λx (x.height > Bill.height)
Bill height: λx (x.height < John.height)

Inheritance
Bird fly: yes
Ostrich --isa--> Bird, fly: no
Pet-Bird --isa--> Bird
Fifi --instance--> Ostrich
Fifi --instance--> Pet-Bird
Fifi fly: ?

Cont
Republican pacifist: false
Quaker pacifist: true
Dick --instance--> Republican
Dick --instance--> Quaker
Dick pacifist: ?

Solution
The solution to this problem is to use inferential distance rather than path length: Class1 is closer to Class2 than to Class3 if and only if Class1 has an inference path through Class2 to Class3 (Class2 is between Class1 and Class3).

Property inheritance
The set of competing values for a slot S in a frame F contains all those values that:
can be derived from some frame X that is above F in the isa hierarchy, and are not contradicted by some frame Y that has a shorter inferential distance to F than X does.

Bird fly: yes
Ostrich --isa--> Bird, fly: no
Pet-Bird --isa--> Bird
Plumed-Ostrich --isa--> Ostrich
White-Plumed-Ostrich --isa--> Plumed-Ostrich
Fifi --instance--> White-Plumed-Ostrich
Fifi --instance--> Pet-Bird
Fifi fly: ?

Cont
Republican pacifist: false
Quaker pacifist: true
Conservative-Republican --isa--> Republican
Dick --instance--> Conservative-Republican
Dick --instance--> Quaker
Dick pacifist: ?

## Reasoning Capabilities of Frames

Consistency checking to verify that a slot value is legal when it is added to a frame; propagation of definitional values along isa and instance links; and inheritance of default values along isa and instance links.

Frame Languages
The idea of Frame system as a way to represent declarative knowledge has been encapsulated in a series of frame oriented knowledge representation languages
KRL [Bobrow and Winograd, 1977], FRL [Roberts and Goldstein, 1977], RLL, KL-ONE, KRYPTON, NIKL, CYCL, Conceptual Graphs, THEO and FRAMEKIT

## UNIT IV Strong-Slots and Filler Structures

Conceptual Dependency
Semantic networks and frame systems may have specialized links and inference procedures, but there are no rules about what kinds of objects and links are good in general for knowledge representation. Conceptual Dependency (CD) is a theory of how to represent the kind of events described in natural language sentences.
It facilitates drawing inferences from sentences. It is independent of the language in which the sentences were originally stated.

Conceptual Dependency
CD provides
a structure into which nodes representing information can be placed a specific set of primitives at a given level of granularity

## Sentences are represented as a series of diagrams

The agent and the objects are represented The actions are built up from a set of primitive acts which can be modified by tense.

Primitive Acts
ATRANS: transfer of an abstract relationship (e.g., give)
PTRANS: transfer of the physical location of an object (e.g., go)
PROPEL: application of physical force to an object (e.g., push)
MOVE: movement of a body part by its owner (e.g., kick)
GRASP: grasping of an object by an actor (e.g., clutch)
INGEST: ingestion of an object by an animal (e.g., eat)
EXPEL: expulsion of something from the body of an animal (e.g., cry)

Conceptual Dependency
MTRANS: transfer of mental information (e.g., tell)
MBUILD: building new information out of old (e.g., decide)
SPEAK: production of sounds (e.g., say)
ATTEND: focusing of a sense organ toward a stimulus (e.g., listen)

Primitive Concepts
Conceptual categories provide the building blocks; they are the set of allowable dependencies among the concepts in a sentence:
PP: real-world objects (picture producers)
ACT: real-world actions
PA: attributes of objects (modifiers of PPs)
AA: attributes of actions (modifiers of ACTs)
T: times
LOC: locations

Example
Raju ⇔(p) ATRANS, o: book, R: from Raju to the man

Raju gave the man a book. Arrows indicate the direction of dependency; letters above the links indicate particular relationships. The double arrow (⇔) indicates the two-way link between the actor (PP) and the action (ACT). o: object. R: recipient-donor. I: instrument (e.g., eat with a spoon). D: destination (e.g., going home).

Modifiers
The use of tense and mood in describing events is extremely important. The modifiers are:
p: past
f: future
t: transition
ts: start transition
tf: finished transition
k: continuing
delta (∆): timeless
c: conditional
/: negative
?: interrogative
The absence of any modifier implies the present tense.

Conceptual Dependency
Arrows indicate the direction of dependency. The double arrow (⇔) links an actor (PP) to an action (ACT): PP ⇔ ACT. The triple arrow is also a two-way link, but between an object (PP) and its attribute (PA): PP ⇔ PA. It represents isa-type dependencies.

Example CD representations (the two-dimensional diagrams are summarized in linear form):
1. John ran: John ⇔(p) PTRANS
2. John is tall: John ⇔ height (> average)
3. John is a doctor: John ⇔ doctor
4. A nice boy: boy modified by the attribute nice
5. John's dog: dog, poss-by John
6. John pushed the cart: John ⇔(p) PROPEL, o: cart
7. John took the book from Mary: John ⇔(p) ATRANS, o: book, R: from Mary to John
8. John ate ice cream with a spoon: John ⇔(p) INGEST, o: ice cream, I: spoon
9. John fertilized the field: John ⇔(p) PTRANS, o: fertilizer, D: from bag to field
10. The plants grew: plants change state from size = x to size > x
13. John ran yesterday: John ⇔(p) PTRANS, time: yesterday
14. I heard a frog in the woods: frog ⇔ MTRANS of sound to my CP via my ears, location: the woods

One smoked a cigarette: one ⇔(tf) INGEST, o: smoke, from a cigarette

Bill threatened John with a broken nose: Bill ⇔ MTRANS to John of a conditional (cf): if John does not do what Bill wants, Bill will act so that John's nose becomes broken; John believes it.

Advantages: using these primitives involves fewer inference rules; many inference rules are already represented in the CD structure; and the holes in the initial structure help to focus on the points still to be established.

Disadvantages: knowledge must be decomposed into fairly low-level primitives; it is impossible or difficult to find the correct set of primitives; a lot of inference may still be required; and representations can be complex even for relatively simple actions.

Scripts
Scripts are generally used to represent knowledge about common sequences of events. A script is a structure that describes a stereotyped sequence of events in a particular context. A script consists of a set of slots, each associated with some information.

Components of Scripts
Entry conditions: conditions that must, in general, be satisfied before the events described in the script can occur. Result: conditions that will, in general, be true after the events described in the script have occurred. Props: slots representing objects that are involved in the events described in the script. The presence of these objects can be inferred even if they are not mentioned explicitly.

Roles: slots representing people who are involved in the events described in the script. The presence of these people, too, can be inferred even if they are not mentioned explicitly; if specific individuals are mentioned, they can be inserted into the appropriate slots. Track: the specific variation on a more general pattern that is represented by this particular script. Different tracks of the same script will share many but not all components. Scenes: the actual sequences of events that occur, represented in the conceptual dependency formalism.

Planning

Contents
Introduction to Planning Blocks world Problem Components of Planning system
Green's approach, STRIPS

Sussman Anomaly

## Non Linear Planning Using Constraint Posting

TWEAK Algorithm

Hierarchical Planning

Planning
Planning problems are hard problems; they are certainly nontrivial. Methods that focus on ways of decomposing the original problem into appropriate subparts, and on ways of handling interactions among the subparts during the problem-solving process, are often called planning. Planning refers to the process of computing several steps of a problem-solving procedure before executing any of them.

## Block World Problem

There are a number of square blocks, all the same size, which can be stacked one upon another. There is a robot arm that can manipulate the blocks; the arm can hold only one block at a time.

Robot Actions
UNSTACK(A, B): pick up block A from its current position on block B. The arm must be empty and block A must have no block on top of it. STACK(A, B): place block A on block B. The arm must already be holding A and the surface of B must be clear.

Robot Actions
PICKUP(A): pick up block A from the table and hold it. The arm must be empty and there must be nothing on top of block A. PUTDOWN(A): put block A down on the table. The arm must have been holding block A.

Set of Predicates
ON(A, B): block A is on block B. ONTABLE(A): block A is on the table. CLEAR(A): there is nothing on top of block A. HOLDING(A): the arm is holding block A. ARMEMPTY: the arm is holding nothing.

Logical Statements
[∃x: HOLDING(x)] → ¬ARMEMPTY
∀x: ONTABLE(x) → ¬∃y: ON(x,y)
∀x: [¬∃y: ON(y,x)] → CLEAR(x)

## Components of Planning System

Choose the best rule to apply
Apply the chosen rule
Detect when a solution has been found
Detect dead ends
Repair an almost correct solution

## 1. Choose the Best Rule

Isolate a set of differences between the desired goal state and the current state. Identify those rules that are relevant to reducing those differences (Means-Ends Analysis). If several rules are found, choose the best using heuristic information.

2. Apply Rules
In simple systems, applying rules is easy: each rule simply specifies the problem state that would result from its application. In complex systems, we must be able to deal with rules that specify only a small part of the complete problem state. One way is to describe, for each action, each of the changes it makes to the state description.

## Green's Approach (Applying Rules)

The changes to a state produced by the application of a rule

## Green's Approach (Applying Rules)

UNSTACK(x,y): [CLEAR(x,s) ∧ ON(x,y,s)] → [HOLDING(x, DO(UNSTACK(x,y), s)) ∧ CLEAR(y, DO(UNSTACK(x,y), s))]
The initial state of the problem is S0. If we execute UNSTACK(A,B) in state S0, then the state S1 that results from the unstacking operation satisfies HOLDING(A,S1) ∧ CLEAR(B,S1).

## Green's Approach (Applying Rules)

Advantages: resolution can be applied to the state descriptions. Disadvantages: many rules are required to represent a problem, and it is difficult to represent complex problems.

STRIPS(Applying Rules)
In the STRIPS approach, each operator is described by a set of lists of predicates. STRIPS uses three lists: ADD, DELETE and PRECONDITION.
The ADD list contains the things that become TRUE; the DELETE list contains the things that become FALSE; the PRECONDITION list is the set of prerequisites that must be true before the operator can be applied.

## STRIPS (Applying Rules)

STACKS(x,y)
P: CLEAR(y) ∧ HOLDING(x)
D: CLEAR(y) ∧ HOLDING(x)
A: ARMEMPTY ∧ ON(x,y)

UNSTACK(x,y)
P: ON(x,y) ∧ CLEAR(x) ∧ ARMEMPTY
D: ON(x,y) ∧ ARMEMPTY
A: HOLDING(x) ∧ CLEAR(y)

## STRIPS (Applying Rules)

PICKUP(x)
P: CLEAR(x) ∧ ONTABLE(x) ∧ ARMEMPTY
D: ONTABLE(x) ∧ ARMEMPTY
A: HOLDING(x)

PUTDOWN(x)
P: HOLDING(x)
D: HOLDING(x)
A: ONTABLE(x) ∧ ARMEMPTY

## STRIPS (Applying Rules)

If a new attribute is introduced, we do not need to add new axioms for existing operators. Unlike in Green's method, we remove the state indicator and use a database of predicates to indicate the current state. Thus if the last state was ONTABLE(B) ∧ ON(A,B) ∧ CLEAR(A), then after the UNSTACK(A,B) operation the new state is ONTABLE(B) ∧ CLEAR(B) ∧ HOLDING(A) ∧ CLEAR(A).
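The state update just described can be sketched as set operations in Python: check that the precondition list is a subset of the state, remove the DELETE list, and union in the ADD list. This is an illustrative sketch of the STRIPS idea, with a hypothetical tuple encoding for predicates.

```python
# A state is a set of predicate tuples; an operator's P, D and A lists are sets.
def applicable(state, preconditions):
    return preconditions <= state  # every precondition must hold

def apply_op(state, preconditions, delete, add):
    if not applicable(state, preconditions):
        raise ValueError("preconditions not satisfied")
    return (state - delete) | add  # remove DELETE list, union in ADD list

# UNSTACK(A,B), described by its P, D and A lists.
state = {("ONTABLE", "B"), ("ON", "A", "B"), ("CLEAR", "A"), ("ARMEMPTY",)}
new_state = apply_op(
    state,
    preconditions={("ON", "A", "B"), ("CLEAR", "A"), ("ARMEMPTY",)},
    delete={("ON", "A", "B"), ("ARMEMPTY",)},
    add={("HOLDING", "A"), ("CLEAR", "B")},
)
# Matches the slide: ONTABLE(B), CLEAR(B), HOLDING(A), CLEAR(A)
assert new_state == {("ONTABLE", "B"), ("CLEAR", "B"),
                     ("HOLDING", "A"), ("CLEAR", "A")}
```

Note how no frame axioms are needed: everything not mentioned in the DELETE list simply persists.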

3. Detecting a Solution
A planning system has succeeded in finding a solution to a problem when it has found a sequence of operators that transforms the initial problem state into the goal state. In simple problem-solving systems we can detect the solution by a straightforward match of the state descriptions. But in complex problems, where different reasoning mechanisms are used to describe the problem states, those same reasoning mechanisms can be used to discover when a solution has been found.

## 4. Detecting Dead Ends

A planning system must be able to detect when it is exploring a path that can never lead to a solution. The same reasoning mechanisms can be used to detect dead ends. If the search process reasons forward from the initial state, it can prune any path that leads to a state from which the goal state cannot be reached. Similarly, in backward reasoning, some states can be pruned from the search space.

## 5.Repairing an Almost Correct Solution

Completely decomposable problems can be solved by solving the subproblems and combining the subsolutions to yield a solution to the original problem. When trying to solve nearly decomposable problems, one way is to use the Means-Ends Analysis technique to minimize the difference between the initial state and the goal state. A better way is to represent knowledge about what went wrong and then apply a direct patch.

## Goal Stack Planning

Goal stack planning uses a goal stack for solving compound goals, taking advantage of the STRIPS method. The problem solver makes use of a single stack that contains both goals and operators, together with a database that describes the current situation. The operators are described by PRECONDITION, ADD and DELETE lists.

## Goal Stack Planning

Simple Blocks world problem

Example
We can describe the start state as:
ON(B,A) ∧ ONTABLE(A) ∧ ONTABLE(C) ∧ ONTABLE(D) ∧ ARMEMPTY

## We can describe the goal state as

ON(C,A) ∧ ON(B,D) ∧ ONTABLE(A) ∧ ONTABLE(D)

Example
Decompose the problem into four different subproblems in the goal stack:
1. ON(C,A)
2. ON(B,D)
3. ONTABLE(A)
4. ONTABLE(D)

ONTABLE(A) and ONTABLE(D) are already true in the initial state.

Example
Depending on the order in which we want to solve the subproblems, there are two different goal stacks:
(1) ON(C,A), then ON(B,D), then the combined goal ON(C,A) ∧ ON(B,D) ∧ OTAD
(2) ON(B,D), then ON(C,A), then the combined goal ON(C,A) ∧ ON(B,D) ∧ OTAD

OTAD is an abbreviation of ONTABLE(A) ∧ ONTABLE(D).

Example
At each step of the problem-solving process, the top goal on the stack is solved, until the goal stack is empty. As one last check, the original goal is compared to the final state derived from the chosen operators. Choosing the first alternative, the predicate on top of the goal stack is ON(C,A).

Example
First check whether ON(C,A) is true in the current state. It is not, so find an operator that could cause it to be true. Applying the STACK(C,A) operator will achieve the goal ON(C,A), so it replaces ON(C,A) on the stack (top of stack first):
STACK(C,A)
ON(B,D)
ON(C,A) ∧ ON(B,D) ∧ OTAD

Example
In order to apply the STACK(C,A) operator, its preconditions must hold, so we stack those subgoals: CLEAR(A) and HOLDING(C). The resultant goal stack is:
CLEAR(A)
HOLDING(C)
CLEAR(A) ∧ HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) ∧ ON(B,D) ∧ OTAD

Example
Check whether CLEAR(A) is true. It is not. The only operator that could make it true is UNSTACK(B,A), so apply it to the goal stack:
UNSTACK(B,A)
HOLDING(C)
CLEAR(A) ∧ HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) ∧ ON(B,D) ∧ OTAD

The set of preconditions that must be satisfied before applying the UNSTACK(B,A) operator is ON(B,A) ∧ CLEAR(B) ∧ ARMEMPTY.

Example
So the goal stack is:
ON(B,A)
CLEAR(B)
ARMEMPTY
ON(B,A) ∧ CLEAR(B) ∧ ARMEMPTY
UNSTACK(B,A)
HOLDING(C)
CLEAR(A) ∧ HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) ∧ ON(B,D) ∧ OTAD

Example
Compare the top element of the goal stack, ON(B,A), with the database. It is satisfied, so pop it off. The next goal, CLEAR(B), is also satisfied, so pop it off. The next goal, ARMEMPTY, is also satisfied, so pop it off. Now apply the UNSTACK(B,A) operator at the top of the goal stack and pop it off.

Example
The database corresponding to the world model at this point is:
ONTABLE(A) ∧ ONTABLE(C) ∧ ONTABLE(D) ∧ HOLDING(B) ∧ CLEAR(A)

## Goal Stack now is

HOLDING(C)
CLEAR(A) ∧ HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) ∧ ON(B,D) ∧ OTAD

Example
Now attempt to satisfy the goal HOLDING(C). Two operators might make it true: PICKUP(C) and UNSTACK(C,x). Considering only the first operator, the goal stack is:
PICKUP(C)
CLEAR(A) ∧ HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) ∧ ON(B,D) ∧ OTAD

Example
Pushing the preconditions for PICKUP(C) gives the goal stack:
ONTABLE(C)
CLEAR(C)
ARMEMPTY
ONTABLE(C) ∧ CLEAR(C) ∧ ARMEMPTY
PICKUP(C)
CLEAR(A) ∧ HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) ∧ ON(B,D) ∧ OTAD

Example
The top element of the goal stack, ONTABLE(C), is satisfied, and so is the next element, CLEAR(C); pop them from the goal stack. The next element, ARMEMPTY, is not satisfied, since HOLDING(B) is true. The stack is now:
ARMEMPTY
ONTABLE(C) ∧ CLEAR(C) ∧ ARMEMPTY
PICKUP(C)
CLEAR(A) ∧ HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) ∧ ON(B,D) ∧ OTAD

Example
There are two operators that could make ARMEMPTY true: STACK(B,x) and PUTDOWN(B). Which operator should we choose? Looking ahead in the goal stack, we eventually want block B on block D, so we choose to apply STACK(B,D), binding x to D.

So the goal stack now is:
CLEAR(D)
HOLDING(B)
CLEAR(D) ∧ HOLDING(B)
STACK(B,D)
ONTABLE(C) ∧ CLEAR(C) ∧ ARMEMPTY
PICKUP(C)
CLEAR(A) ∧ HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) ∧ ON(B,D) ∧ OTAD

Example
Both CLEAR(D) and HOLDING(B) are satisfied, so pop them from the goal stack and apply STACK(B,D). The resultant database is:
ONTABLE(A) ∧ ONTABLE(C) ∧ ONTABLE(D) ∧ ON(B,D) ∧ ARMEMPTY
The goal stack now is:
PICKUP(C)
CLEAR(A) ∧ HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) ∧ ON(B,D) ∧ OTAD

Example
Now all the preconditions for PICKUP(C) are satisfied, so it can be executed. Then all the preconditions for STACK(C,A) are also satisfied, so execute it and pop these operators. The next predicate, ON(B,D), is already satisfied, so pop it off. One last check of the combined goal ON(C,A) ∧ ON(B,D) ∧ OTAD shows that it is also satisfied.

Example
The problem solver now halts and returns the plan:
1. UNSTACK(B,A)
2. STACK(B,D)
3. PICKUP(C)
4. STACK(C,A)
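The four-step plan can be checked by simulating it with the STRIPS operator schemas from the earlier slides. This is an illustrative sketch: the tuple encoding is an assumption, and the CLEAR facts for B, C and D, implicit in the slide's start state, are added explicitly.

```python
# Each operator maps arguments to its (precondition, delete, add) sets.
OPS = {
    "UNSTACK": lambda x, y: ({("ON", x, y), ("CLEAR", x), ("ARMEMPTY",)},
                             {("ON", x, y), ("ARMEMPTY",)},
                             {("HOLDING", x), ("CLEAR", y)}),
    "STACK":   lambda x, y: ({("CLEAR", y), ("HOLDING", x)},
                             {("CLEAR", y), ("HOLDING", x)},
                             {("ARMEMPTY",), ("ON", x, y)}),
    "PICKUP":  lambda x: ({("CLEAR", x), ("ONTABLE", x), ("ARMEMPTY",)},
                          {("ONTABLE", x), ("ARMEMPTY",)},
                          {("HOLDING", x)}),
    "PUTDOWN": lambda x: ({("HOLDING", x)},
                          {("HOLDING", x)},
                          {("ONTABLE", x), ("ARMEMPTY",)}),
}

def run(state, plan):
    """Execute a plan step by step, checking each precondition."""
    for name, *args in plan:
        pre, dele, add = OPS[name](*args)
        assert pre <= state, f"{name}{tuple(args)}: precondition failed"
        state = (state - dele) | add
    return state

start = {("ON", "B", "A"), ("ONTABLE", "A"), ("ONTABLE", "C"),
         ("ONTABLE", "D"), ("ARMEMPTY",),
         ("CLEAR", "B"), ("CLEAR", "C"), ("CLEAR", "D")}
plan = [("UNSTACK", "B", "A"), ("STACK", "B", "D"),
        ("PICKUP", "C"), ("STACK", "C", "A")]
final = run(start, plan)
goal = {("ON", "C", "A"), ("ON", "B", "D"),
        ("ONTABLE", "A"), ("ONTABLE", "D")}
assert goal <= final  # the returned plan does achieve the goal
```

Every precondition check passes and the goal conjunction holds in the final state, confirming the plan the goal stack planner returned.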

SUSSMAN ANOMALY
Try to solve the below problem

Example
There are two ways to order the subgoals of the above problem:
(1) ON(A,B), then ON(B,C), then the combined goal ON(A,B) ∧ ON(B,C)
(2) ON(B,C), then ON(A,B), then the combined goal ON(A,B) ∧ ON(B,C)
Choose alternative 1.

Example
Working on ON(A,B) builds the goal stack (top of stack first):
ON(C,A)
CLEAR(C)
ARMEMPTY
ON(C,A) ∧ CLEAR(C) ∧ ARMEMPTY
UNSTACK(C,A)
ARMEMPTY
CLEAR(A) ∧ ARMEMPTY
PICKUP(A)
CLEAR(B)
HOLDING(A)
STACK(A,B)
ON(B,C)
ON(A,B) ∧ ON(B,C)

Example
All the preconditions of UNSTACK(C,A) are satisfied, so pop it off and apply the operator. The goal stack now is:
ARMEMPTY
CLEAR(A) ∧ ARMEMPTY
PICKUP(A)
CLEAR(B)
HOLDING(A)
STACK(A,B)
ON(B,C)
ON(A,B) ∧ ON(B,C)

Example
To satisfy the ARMEMPTY precondition of PICKUP(A), simply apply the operator PUTDOWN(C), then pop all the conditions down to ON(B,C) and the combined goal ON(A,B) ∧ ON(B,C). The current state is:
ONTABLE(B) ∧ ON(A,B) ∧ ONTABLE(C) ∧ ARMEMPTY

Example
The sequence of operators applied so far is:
1. UNSTACK(C,A)
2. PUTDOWN(C)
3. PICKUP(A)
4. STACK(A,B)

Example
Then try to achieve the other goal, ON(B,C). The goal stack becomes:
ON(A,B)
CLEAR(A)
ARMEMPTY
ON(A,B) ∧ CLEAR(A) ∧ ARMEMPTY
UNSTACK(A,B)
ARMEMPTY
CLEAR(B) ∧ ARMEMPTY
PICKUP(B)
CLEAR(C)
HOLDING(B)
STACK(B,C)
ON(A,B) ∧ ON(B,C)

Example
All the preconditions of UNSTACK(A,B) are satisfied, so pop it off and apply the operator. The goal stack now is:
ARMEMPTY
CLEAR(B) ∧ ARMEMPTY
PICKUP(B)
CLEAR(C)
HOLDING(B)
STACK(B,C)
ON(A,B) ∧ ON(B,C)

Example
To satisfy the ARMEMPTY precondition of PICKUP(B), simply apply the operator PUTDOWN(A), then pop all the conditions down to the combined goal ON(A,B) ∧ ON(B,C). The current state is:
ON(B,C) ∧ ONTABLE(A) ∧ ONTABLE(C) ∧ ARMEMPTY

Example
But a check shows that the remaining goal ON(A,B) ∧ ON(B,C) is not satisfied. The difference between the current state and the goal state is ON(A,B). The sequence of operators to be added to the plan is:
9. PICKUP(A)
10. STACK(A,B)

Example
Now combine the operators and check that the goal is satisfied:
1. UNSTACK(C,A)
2. PUTDOWN(C)
3. PICKUP(A)
4. STACK(A,B)
5. UNSTACK(A,B)
6. PUTDOWN(A)
7. PICKUP(B)
8. STACK(B,C)
9. PICKUP(A)
10. STACK(A,B)

Example
But the same goal can be achieved with a shorter plan:
1. UNSTACK(C,A)
2. PUTDOWN(C)
3. PICKUP(B)
4. STACK(B,C)
5. PICKUP(A)
6. STACK(A,B)

## Nonlinear Planning Using Constraint Posting

The goal stack planning method solves subgoals one at a time, in order. But difficult problems involve interactions between subproblems. A nonlinear plan is not composed of a linear sequence of complete subplans. Generally, heuristic algorithms are used for nonlinear planning.

## Nonlinear Planning Using Constraint Posting

If we want to solve the above problem:
1. Begin work on the goal ON(A,B) by clearing block A, i.e. putting block C on the table.
2. Achieve ON(B,C) by stacking block B on block C.
3. Complete ON(A,B) by stacking block A on block B.

## Nonlinear Planning Systems

HACKER, an automatic programming system, introduced the basic idea. NOAH was the first nonlinear planning system; in it the goal stack algorithm of STRIPS was transformed into a goal set algorithm. TWEAK used constraint posting as a central technique. In constraint posting, a plan is built up incrementally by adding operators, partially ordering them, and binding the variables within the operators.

Example
Try to solve the Sussman anomaly using nonlinear planning.

Constraint Posting
Constraint posting proceeds by suggesting operators, trying to order them, and producing bindings between variables in the operators and actual blocks.

The initial plan consists of no steps; there is no order or detail at this stage. Gradually, more detail and more constraints about the order of subsets of the steps are introduced, until a completely ordered sequence is created.

## Heuristics for Planning (Constraint Posting)

1. Step Addition: creating new steps (GPS).
2. Promotion: constraining a step to come before another step (Sussman's HACKER).
3. Declobbering: placing a new step between two steps to restore a precondition (NOAH, NONLIN).
4. Simple Establishment: assigning a value to a variable to ensure the preconditions of some step (TWEAK).
5. Separation: preventing the assignment of certain values to a variable (TWEAK).

Introducing new steps to achieve goals or preconditions is called step addition. In our problem, we incrementally generate a nonlinear plan, starting with a plan that contains no steps. By means-ends analysis we choose two goals, ON(A,B) and ON(B,C), and add new steps to achieve them.

CLEAR(B) *HOLDING(A)
---------------------
STACK(A,B)
---------------------
ARMEMPTY ON(A,B)
¬CLEAR(B) ¬HOLDING(A)

CLEAR(C) *HOLDING(B)
---------------------
STACK(B,C)
---------------------
ARMEMPTY ON(B,C)
¬CLEAR(C) ¬HOLDING(B)

Each step is shown with its preconditions above it and its postconditions below it. Deleted postconditions are marked with the ¬ symbol; unachieved preconditions are marked with the * symbol.

To achieve the preconditions of the two steps above we use step addition again
*CLEAR(A) ONTABLE(A) *ARMEMPTY
------------------------------
PICKUP(A)
------------------------------
¬ONTABLE(A) ¬ARMEMPTY HOLDING(A)

*CLEAR(B) ONTABLE(B) *ARMEMPTY
------------------------------
PICKUP(B)
------------------------------
¬ONTABLE(B) ¬ARMEMPTY HOLDING(B)

Promotion
Promotion was first used by Sussman in his HACKER program. Promotion means posting a constraint that one step must precede another. Adding the PICKUP steps does not by itself satisfy the *HOLDING preconditions of the STACK steps, because no ordering constraints are yet present among the steps. S1 ≺ S2 means that step S1 must precede step S2.

Promotion
Each PICKUP step should precede the corresponding STACK step, so we post PICKUP(A) ≺ STACK(A,B) and PICKUP(B) ≺ STACK(B,C). In the step-addition example above, *CLEAR(A) is unachieved because block A is not clear in the initial state. *CLEAR(B) is unachieved even though B is clear in the initial state, because there exists a step STACK(A,B) with postcondition ¬CLEAR(B).

Promotion
So we can achieve CLEAR(B) by stating that the PICKUP(B) step must come before STACK(A,B): PICKUP(B) ≺ STACK(A,B). Now turn to the two unachieved preconditions, *ARMEMPTY and *CLEAR(A), and try to achieve *ARMEMPTY first.

Promotion
The initial state has an empty arm, but each of the operators PICKUP(A) and PICKUP(B) has the postcondition ¬ARMEMPTY, so either operator could prevent the other from executing. So order them:
PICKUP(B) ≺ PICKUP(A)
The order so far is PICKUP(B) ≺ PICKUP(A) ≺ STACK(A,B).

Declobbering
Declobbering means placing a new step in between two old steps. The initial state contains an empty arm, so all the preconditions of PICKUP(B) are satisfied; but the result of PICKUP(B) asserts ¬ARMEMPTY, clobbering the ARMEMPTY precondition of PICKUP(A). This can be solved by inserting another step between PICKUP(B) and PICKUP(A) to reassert ARMEMPTY; STACK(B,C) can achieve this (a heuristic choice). The order becomes:
PICKUP(B) ≺ STACK(B,C) ≺ PICKUP(A)

Simple Establishment
Now try to solve the unachieved precondition *CLEAR(A) of PICKUP(A) by step addition:
*ON(x,A) *CLEAR(x) *ARMEMPTY
----------------------------
UNSTACK(x,A)
----------------------------
¬ARMEMPTY CLEAR(A) HOLDING(x) ¬ON(x,A)

Simple Establishment
Simple establishment means assigning a value to a variable. We introduced the variable x because the only postcondition we are interested in is CLEAR(A). Now bind x = C in the step UNSTACK(x,A).

Simple Establishment
The other preconditions to be satisfied are CLEAR(C) and ARMEMPTY; we use promotion to order the steps:
UNSTACK(x,A) ≺ STACK(B,C)
UNSTACK(x,A) ≺ PICKUP(A)
UNSTACK(x,A) ≺ PICKUP(B)
The order so far is UNSTACK(C,A) ≺ PICKUP(B) ≺ STACK(B,C) ≺ PICKUP(A) ≺ STACK(A,B).

Example
The step PICKUP(B) requires ARMEMPTY, but this is denied by the new UNSTACK(C,A) step. Use declobbering to insert a step such as PUTDOWN(C) between the two steps:
HOLDING(C)
---------------
PUTDOWN(C)
----------------
¬HOLDING(C) ONTABLE(C) ARMEMPTY

Example
The final ordering is UNSTACK(C,A) ≺ PUTDOWN(C) ≺ PICKUP(B) ≺ STACK(B,C) ≺ PICKUP(A) ≺ STACK(A,B), giving the plan:
1. UNSTACK(C,A)
2. PUTDOWN(C)
3. PICKUP(B)
4. STACK(B,C)
5. PICKUP(A)
6. STACK(A,B)

Example
In the nonlinear planning example above we used four heuristics: step addition, promotion, declobbering and simple establishment. The remaining heuristic, separation, prevents the assignment of certain values to variables.

TWEAK Algorithm
1. Initialize S to be the set of propositions in the goal state.
2. Repeat until the set S is empty:
   I. Remove some unachieved proposition P from S.
   II. Achieve P by using one of the heuristics.
   III. Review all the steps, including any additional steps, to find all unachieved preconditions, and add these to S.
3. Complete the plan by converting the partial orders into a total order and performing all necessary instantiations (bindings of the variables).
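Step 3 of the algorithm converts the partial order on steps into a total order. This can be sketched as a topological sort (Kahn's algorithm is used here as one illustrative choice; the original text does not prescribe a method), applied to the ordering constraints posted for the Sussman anomaly plan:

```python
from collections import defaultdict, deque

def total_order(steps, before):
    """before: set of (a, b) pairs meaning step a must precede step b."""
    indeg = {s: 0 for s in steps}
    succ = defaultdict(list)
    for a, b in before:
        succ[a].append(b)
        indeg[b] += 1
    queue = deque(s for s in steps if indeg[s] == 0)
    order = []
    while queue:
        s = queue.popleft()
        order.append(s)
        for t in succ[s]:
            indeg[t] -= 1
            if indeg[t] == 0:
                queue.append(t)
    if len(order) != len(steps):
        raise ValueError("ordering constraints are cyclic")
    return order

steps = ["UNSTACK(C,A)", "PUTDOWN(C)", "PICKUP(B)",
         "STACK(B,C)", "PICKUP(A)", "STACK(A,B)"]
before = {("UNSTACK(C,A)", "PUTDOWN(C)"), ("PUTDOWN(C)", "PICKUP(B)"),
          ("PICKUP(B)", "STACK(B,C)"), ("STACK(B,C)", "PICKUP(A)"),
          ("PICKUP(A)", "STACK(A,B)")}
# The constraints form a chain, so the total order is that chain.
print(total_order(steps, before))
```

A cycle in the posted constraints means the plan is inconsistent, which the length check detects.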

Hierarchical Planning

Hierarchical Planning
The main difficulty in STRIPS-like planning is complexity. One reason for the complexity is that there is no structure: there is no distinction between important and unimportant properties, and no distinction between important and unimportant operators. This observation gives rise to two different kinds of abstraction in planning: abstraction of situations and abstraction of operators.

Cont
It is important to be able to eliminate some of the details of the problem until a solution that addresses the main issues is found Early attempts to do this involved the use of macro operators. But in this approach, no details were eliminated from actual descriptions of the operators.

Cont
Consider an example: you want to visit a friend in Europe, but you have a limited amount of cash to spend. Your first step should be to find the airfares, since finding an affordable flight will be the most difficult part of the task. You should not worry about getting out of your driveway or planning a route to the airport until you are sure you have a flight.

## ABSTRIPS (Hierarchical Planning)

ABSTRIPS actually planned in a hierarchy of abstraction spaces, in each of which preconditions at a lower level of abstraction were ignored.
ABSTRIPS approach is as follows:
First solve the problem completely, considering only preconditions whose criticality value is the highest possible. These values reflect the expected difficulty of satisfying the precondition. To do this, do exactly what STRIPS did, but simply ignore the preconditions of lower than peak criticality. Once this is done, use the constructed plan as the outline of a complete plan and consider preconditions at the next-lowest criticality level. Because this approach explores entire plans at one level of detail before it looks at the lower-level details of any one of them, it has been called length-first approach.
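The criticality idea can be sketched as a simple filter: at each abstraction level, only preconditions at or above a threshold are visible to the planner. The criticality values below are invented for illustration, loosely following the trip example above.

```python
# Hypothetical criticality values (higher = harder to achieve, planned first).
CRITICALITY = {
    "have-ticket": 3,
    "at-airport": 2,
    "car-out-of-driveway": 1,
}

def visible_preconditions(preconds, level):
    """Preconditions considered when planning at a given criticality level."""
    return [p for p in preconds if CRITICALITY[p] >= level]

trip = ["have-ticket", "at-airport", "car-out-of-driveway"]
print(visible_preconditions(trip, 3))  # ['have-ticket']
print(visible_preconditions(trip, 1))  # all three preconditions
```

Planning first at level 3 produces the skeleton plan (get a ticket); lowering the threshold then fills in the progressively less critical details, which is the length-first behavior described above.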

## Other Planning Systems

Triangle Table Meta planning Macro-operators Case based planning

## UNIT V Natural Language Processing

Introduction
Language is meant for Communicating about the world. By studying language, we can come to understand more about the world. We look at how we can exploit knowledge about the world, in combination with linguistic facts, to build computational natural language systems.

Introduction
NLP problem can be divided into two tasks: Processing written text, using lexical, syntactic and semantic knowledge of the language as well as the required real world information. Processing spoken language, using all the information needed above plus additional knowledge about phonology as well as enough added information to handle the further ambiguities that arise in speech.

Steps in NLP
Morphological Analysis: Individual words are analyzed into their components and non word tokens such as punctuation are separated from the words. Syntactic Analysis: Linear sequences of words are transformed into structures that show how the words relate to each other. Semantic Analysis: The structures created by the syntactic analyzer are assigned meanings.

Steps in NLP
Discourse integration: The meaning of an individual sentence may depend on the sentences that precede it and may influence the meanings of the sentences that follow it. Pragmatic Analysis: The structure representing what was said is reinterpreted to determine what was actually meant. For example, the sentence "Do you know what time it is?" should be interpreted as a request to tell the time.

Morphological Analysis
Suppose we have an English interface to an operating system and the following sentence is typed: "I want to print Bill's .init file." Morphological analysis must do the following things: pull apart the word "Bill's" into the proper noun "Bill" and the possessive suffix "'s", and recognize the sequence ".init" as a file extension that is functioning as an adjective in the sentence.

Morphological Analysis
This process will also assign syntactic categories to all the words in the sentence. Consider the word prints. This word is either a plural noun or a third person singular verb ( he prints ).

Syntactic Analysis
Syntactic analysis must exploit the results of morphological analysis to build a structural description of the sentence. The goal of this process, called parsing, is to convert the flat list of words that forms the sentence into a structure that defines the units that are represented by that flat list.

Syntactic Analysis
The important thing here is that a flat sentence has been converted into a hierarchical structure, and that the structure corresponds to meaning units when semantic analysis is performed. Reference markers are shown in parentheses in the parse tree. Each one corresponds to some entity that has been mentioned in the sentence.

Syntactic Analysis
Parse tree for "I want to print Bill's .init file." (shown by indentation):

S (RM1)
  NP: PRO "I" (RM2)
  VP:
    V: "want"
    S (RM3)
      NP: PRO "I" (RM2)
      VP:
        V: "print"
        NP (RM4)
          ADJS: "Bill's" (RM5)
          ADJS: ".init"
          N: "file"

Semantic Analysis
Semantic analysis must do two important things:
It must map individual words into appropriate objects in the knowledge base or database. It must create the correct structures to correspond to the way the meanings of the individual words combine with each other.

Discourse Integration
Specifically we do not know whom the pronoun I or the proper noun Bill refers to. To pin down these references requires an appeal to a model of the current discourse context, from which we can learn that the current user is USER068 and that the only person named Bill about whom we could be talking is USER073. Once the correct referent for Bill is known, we can also determine exactly which file is being referred to.

Pragmatic Analysis
The final step toward effective understanding is to decide what to do as a results. One possible thing to do is to record what was said as a fact and be done with it. For some sentences, whose intended effect is clearly declarative, that is precisely correct thing to do. But for other sentences, including the one, the intended effect is different.

Pragmatic Analysis
We can discover this intended effect by applying a set of rules that characterize cooperative dialogues. The final step in pragmatic processing is to translate, from the knowledge based representation to a command to be executed by the system. The results of the understanding process is Lpr /wsmith/stuff.init

Syntactic Processing

Syntactic Processing
Syntactic Processing is the step in which a flat input sentence is converted into a hierarchical structure that is called parsing. It plays an important role in natural language understanding systems for two reasons:
Semantic processing must operate on sentence elements. If there is no syntactic parsing step, then the semantics system must decide on its own constituents. Thus it can play a significant role in reducing overall system complexity.

Syntactic Processing
Although it is often possible to extract the meaning of a sentence without using grammatical facts, it is not always possible to do so. Consider the examples: The satellite orbited Mars Mars orbited the satellite In the second sentence, syntactic facts demand an interpretation in which a planet revolves around a satellite, despite the apparent improbability of such a scenario.

Syntactic Processing
Almost all the systems that are actually used have two main components: a declarative representation, called a grammar, of the syntactic facts about the language, and a procedure, called a parser, that compares the grammar against input sentences to produce parsed structures.

## Grammars and Parsers

The most common way to represent grammars is as a set of production rules. The first rule below can be read as "a sentence is composed of a noun phrase followed by a verb phrase"; the vertical bar means OR, and ε represents the empty string. Symbols that are further expanded by rules are called nonterminal symbols. Symbols that correspond directly to strings that must be found in an input sentence are called terminal symbols.

## Grammars and Parsers

A simple context-free phrase structure grammar for English:
S → NP VP
NP → the NP1
NP → PRO
NP → PN
NP → NP1
NP1 → ADJS N
ADJS → ε | ADJ ADJS
VP → V
VP → V NP
N → file | printer
PN → Bill
PRO → I
ADJ → short | long | fast
V → printed | created | want
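The grammar above can be exercised with a small backtracking recognizer, a minimal sketch that is not part of the original text. Each rule computes the set of input positions reachable after consuming that nonterminal starting at a given position.

```python
# Toy grammar: nonterminals map to lists of productions; bare words are terminals.
GRAMMAR = {
    "S":    [["NP", "VP"]],
    "NP":   [["the", "NP1"], ["PRO"], ["PN"], ["NP1"]],
    "NP1":  [["ADJS", "N"]],
    "ADJS": [[], ["ADJ", "ADJS"]],           # [] is the empty (epsilon) production
    "VP":   [["V"], ["V", "NP"]],
    "N":    [["file"], ["printer"]],
    "PN":   [["Bill"]],
    "PRO":  [["I"]],
    "ADJ":  [["short"], ["long"], ["fast"]],
    "V":    [["printed"], ["created"], ["want"]],
}

def reach(symbol, words, i):
    """Positions reachable after matching `symbol` starting at position i."""
    if symbol not in GRAMMAR:                 # terminal symbol
        return {i + 1} if i < len(words) and words[i] == symbol else set()
    ends = set()
    for production in GRAMMAR[symbol]:
        positions = {i}
        for sym in production:                # thread positions through the RHS
            positions = {k for j in positions for k in reach(sym, words, j)}
        ends |= positions
    return ends

def parses(sentence):
    words = sentence.split()
    return len(words) in reach("S", words, 0)

print(parses("Bill printed the long file"))   # True
print(parses("printed Bill file"))            # False
```

This is a recognizer rather than a full parser: it accepts or rejects but does not build the tree. Extending `reach` to record which production matched at each position would recover the parse trees shown on the following slides.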

A parse tree for the sentence "Bill printed file" (shown by indentation):

S
  NP: PN "Bill"
  VP:
    V: "printed"
    NP: NP1
      N: "file"

Grammar and parse tree for "John ate the apple":
1. S → NP VP
2. VP → V NP
3. NP → NAME
4. NP → ART N
5. NAME → John
6. V → ate
7. ART → the
8. N → apple

S
  NP: NAME "John"
  VP:
    V: "ate"
    NP:
      ART: "the"
      N: "apple"

## Top-down Vs Bottom-Up parsing

There are two ways this can be done: Top-down Parsing: Begin with start symbol and apply the grammar rules forward until the symbols at the terminals of the tree correspond to the components of the sentence being parsed. Bottom-up parsing: Begin with the sentence to be parsed and apply the grammar rules backward until a single tree whose terminals are the words of the sentence and whose top node is the start symbol has been produced .

## Top-down Vs Bottom-Up parsing

The choice between these two approaches is similar to the choice between forward and backward reasoning in other problem-solving tasks. The most important consideration is the branching factor: is it greater going backward or forward? Sometimes the two approaches are combined into a single method called bottom-up parsing with top-down filtering.

## Finding One Interpretation or Finding Many

Four ways of handling sentences:
All Paths: follow all possible paths and build all the possible intermediate components.
Best Path with Backtracking: follow only one path at a time, but record, at every choice point, the information that is necessary to make another choice if the chosen path fails to lead to a complete interpretation of the sentence.
Best Path with Patchup: follow only one path at a time, but when an error is detected, explicitly shuffle around the components that have already been formed.
Wait and See: follow only one path, but rather than making decisions about the function of each component as it is encountered, procrastinate the decision until enough information is available to make the decision correctly.

## Augmented Transition Networks

An ATN is a top-down parsing procedure that allows various kinds of knowledge to be incorporated into the parsing system so it can operate efficiently. An ATN is similar to a finite state machine in which the class of labels that can be attached to the arcs (which define transitions between states) has been augmented. Consider the sentence "The long file has printed".

## Augmented Transition Network

[ATN network diagram: an S network over states Q1-Q5, an NP network over states Q6, Q7 and Q10, and a PP network over states Q8 and Q9; arcs are labeled NP, AUX, V, Det, Adj, N, NPR, Prep and PP, and states marked /F are final states.]
Parsing the sentence "The long file has printed" proceeds as follows:
1. Begin in state S.
2. Push to NP.
3. Do a category test to see if "the" is a determiner. This test succeeds, so set the DETERMINER register to DEFINITE and go to state Q6.
4. Do a category test to see if "long" is an adjective. This test succeeds, so append "long" to the list contained in the ADJS register. Stay in state Q6.
5. Do a category test to see if "file" is an adjective. This test fails.
6. Do a category test to see if "file" is a noun. This test succeeds, so set the NOUN register to "file" and go to state Q7.
7. Push to PP.
8. Do a category test to see if "has" is a preposition. This test fails, so pop and return failure.
9. There is nothing else that can be done from state Q7, so pop and return the structure built so far.
10. The return causes the machine to be in state Q1, with the SUBJ register set to the structure just returned and the TYPE register set to DCL.
11. Do a category test to see if "has" is a verb. This test succeeds, so set the AUX register to NIL and set the V register to "has". Go to state Q4.
12. Push to NP. Since the next word, "printed", is not a determiner or a proper noun, NP will pop and return failure.
13. The only other thing to do in state Q4 is to halt. But more input remains, so a complete parse has not been found. Backtracking is now required.
14. The last choice point was at state Q1, so return there. The registers AUX and V must be unset.
15. Do a category test to see if "has" is an auxiliary. This test succeeds, so set the AUX register to "has". Go to state Q3.
16. Do a category test to see if "printed" is a verb. This test succeeds, so set the V register to "printed". Go to state Q4.
17. Now, since the input is exhausted, Q4 is an acceptable final state. Pop and return the structure (S DCL (NP (FILE (LONG) DEFINITE)) HAS (VP PRINTED)). This structure is the output of the parse.

ATNs can also be used in a variety of ways. First, the contents of registers can be swapped. If the network were expanded to recognize passive sentences, then at the point the passive was detected, the current contents of the SUBJ register would be transferred to an OBJ register, and the object of the preposition "by" would be placed in the SUBJ register. Compare "Bill printed the file" with "The file was printed by Bill".

Second, arbitrary tests can be placed on the arcs. In each of the arcs above, the test is specified simply as T, but this need not be the case. Suppose that when the first NP is found, its number is determined and recorded in a register called NUMBER. Then the arcs labeled V could have an additional test placed on them that checks that the number of the particular verb found is equal to the value stored in NUMBER.

Unification Grammar
Unification grammars limit the procedural knowledge embedded in the grammar, so that the same grammar can be used both for understanding and for generating sentences (as needed, for example, in speech processing). The major operations performed by a parser while applying a grammar are: matching (of sentence constituents to grammar rules) and building structure (corresponding to the result of combining constituents).

Unification Grammar
A DAG (directed acyclic graph) can be used to define the unification operator. Each DAG represents a set of attribute-value pairs. For example, the words "the" and "file" can be represented as:
[CAT: DET
 LEX: the]
[CAT: N
 LEX: file
 NUMBER: SING]
The result of combining these two words is:
[NP: [DET: the
      HEAD: file
      NUMBER: SING]]
We describe the NP rule NP → DET N as a graph:
[CONSTITUENT1: [CAT: DET
                LEX: {1}]
 CONSTITUENT2: [CAT: N
                LEX: {2}
                NUMBER: {3}]
 BUILD: [NP: [DET: {1}
              HEAD: {2}
              NUMBER: {3}]]]
Note that the order in which attribute-value pairs are stated does not matter; for example, [CAT: DET, LEX: {1}] should match a constituent such as [LEX: the, CAT: DET].

## Algorithm: Unification Grammar

Graph-Unify(G1, G2):
1. If either G1 or G2 is an atomic value (not itself a set of attribute-value pairs), then:
   - If the values conflict, then fail.
   - If either is a variable, then bind it to the other and return that value.
   - Otherwise, return the most general value that is consistent with both original values. Specifically, if disjunction is allowed, return the intersection of the values.
2. Otherwise, do:
   - Set the variable NEW to empty.
   - For each attribute A that is present in either G1 or G2:
     - If A is not present at the top level in the other input, then add A and its value to NEW.
     - If it is, then call Graph-Unify with the two values of A. If that fails, then fail. Otherwise, take the new value of A to be the result of that unification and add A with its value to NEW.
   - If there are any labels attached to G1 or G2, then bind them to NEW.
   - Return NEW.
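A minimal sketch of the unification operation, representing feature structures as plain Python dicts. This simplification omits variables, disjunction and labels, which the full algorithm also handles.

```python
def graph_unify(g1, g2):
    """Unify two feature structures (nested dicts of attribute-value pairs)."""
    if not isinstance(g1, dict) or not isinstance(g2, dict):
        if g1 == g2:              # atomic values must agree exactly
            return g1
        raise ValueError(f"conflict: {g1!r} vs {g2!r}")
    new = {}
    for attr in set(g1) | set(g2):
        if attr not in g2:        # attribute only in g1: copy it
            new[attr] = g1[attr]
        elif attr not in g1:      # attribute only in g2: copy it
            new[attr] = g2[attr]
        else:                     # in both: unify the values recursively
            new[attr] = graph_unify(g1[attr], g2[attr])
    return new

# Attribute order does not matter: these two unify into one structure.
merged = graph_unify({"LEX": "the"}, {"CAT": "DET"})
# merged == {"CAT": "DET", "LEX": "the"}
```

Unifying `{"CAT": "DET"}` with `{"CAT": "N"}` raises a conflict, which corresponds to the "fail" branch of the algorithm above.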

Semantic Analysis

Semantic Analysis
Producing a syntactic parse of a sentence is only the first step toward understanding it. We must still produce a representation of the meaning of the sentence. Because understanding is a mapping process, we must first define the language into which we are trying to map. There is no single definitive language in which all sentence meaning can be described. The choice of a target language for any particular natural language understanding program must depend on what is to be done with the meanings once they are constructed.

## Choice of target language in semantic Analysis

There are two broad families of target languages that are used in NL systems, depending on the role that the natural language system is playing in a larger system:
1. When natural language is being considered as a phenomenon in its own right, as for example when one builds a program whose goal is to read text and then answer questions about it, a target language can be designed specifically to support language processing.
2. When natural language is being used as an interface language to another program (such as a database query system or an expert system), the target language must be legal input to that other program. Thus the design of the target language is driven by the backend program.

Lexical processing
The first step in any semantic processing system is to look up the individual words in a dictionary (or lexicon) and extract their meanings. Many words have several meanings, and it may not be possible to choose the correct one just by looking at the word itself. The process of determining the correct meaning of an individual word is called word sense disambiguation or lexical disambiguation. It is done by associating, with each word in the lexicon, information about the contexts in which each of the word's senses may appear.

For example, the word "diamond" might have the following set of meanings:
1. A geometrical shape with four equal sides.
2. A baseball field.
3. An extremely strong and valuable gemstone.

To select the correct meaning of "diamond" in the sentence "Joan saw Susan's diamond shining from across the room," it is sufficient to know that neither geometrical shapes nor baseball fields shine, but gemstones do.

The baseball-field interpretation of "diamond" can be marked with the semantic marker LOCATION. Some useful semantic markers are:
1. PHYSICAL-OBJECT
2. ANIMATE-OBJECT
3. ABSTRACT-OBJECT
4. TIME
5. LOCATION
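The marker-based disambiguation described above can be sketched as a toy lexicon lookup. The senses, markers, and context verbs below are illustrative assumptions, not entries from any real lexicon.

```python
# Toy lexicon: each sense of "diamond" carries a semantic marker and
# the contexts (verbs here) in which that sense may appear.
# All entries are illustrative, not from a real lexicon.
LEXICON = {
    "diamond": [
        {"sense": "geometric shape", "marker": "ABSTRACT-OBJECT",
         "contexts": {"draw"}},
        {"sense": "baseball field",  "marker": "LOCATION",
         "contexts": {"play", "run"}},
        {"sense": "gemstone",        "marker": "PHYSICAL-OBJECT",
         "contexts": {"shine", "sparkle"}},
    ],
}

def disambiguate(word, context_verb):
    """Return the first sense whose recorded contexts include the verb."""
    for entry in LEXICON[word]:
        if context_verb in entry["contexts"]:
            return entry["sense"]
    return None

# "Joan saw Susan's diamond shining ..." -> the gemstone sense.
print(disambiguate("diamond", "shine"))   # gemstone
```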

Sentence-Level Processing
Several approaches to the problem of creating a semantic representation of a sentence have been developed, including the following:

1. Semantic grammars, which combine syntactic, semantic and pragmatic knowledge into a single set of rules in the form of a grammar.
2. Case grammars, in which the structure that is built by the parser contains some semantic information, although further interpretation may also be necessary.
3. Conceptual parsing, in which syntactic and semantic knowledge are combined into a single interpretation system that is driven by the semantic knowledge.
4. Approximately compositional semantic interpretation, in which semantic processing is applied to the result of performing a syntactic parse.

Semantic grammars
A semantic grammar is a context-free grammar in which the choice of nonterminals and production rules is governed by semantic as well as syntactic function. There is usually a semantic action associated with each grammar rule. The result of parsing and applying all the associated semantic actions is the meaning of the sentence.

Example

    S -> what is FILE-PROPERTY of FILE      {query FILE.FILE-PROPERTY}
    S -> I want to ACTION                   {command ACTION}
    FILE-PROPERTY -> the FILE-PROP          {FILE-PROP}
    FILE-PROP -> extension | protection | creation date | owner   {value}
    FILE -> FILE-NAME | FILE1               {value}
    FILE1 -> USER's FILE2                   {FILE2.owner: USER}
    FILE1 -> FILE2                          {FILE2}
    FILE2 -> EXT file                       {instance: file-struct  extension: EXT}
    EXT -> .init | .txt | .lsp | .for | .ps | .mss   {value}
    ACTION -> print FILE                    {instance: printing  object: FILE}
    ACTION -> print FILE on PRINTER         {instance: printing  object: FILE  printer: PRINTER}
    USER -> Bill | Susan                    {value}
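One path through this grammar can be mimicked with a toy interpreter that recognizes a single sentence pattern and applies the corresponding semantic actions by hand. The regular expression and structure keys below are illustrative; a real semantic-grammar parser would be driven by the rules themselves.

```python
import re

# Toy interpreter for one derivation of the semantic grammar:
# S -> I want to ACTION, ACTION -> print FILE, FILE1 -> USER's FILE2,
# FILE2 -> EXT file. The regex and dict keys are illustrative.
def parse_command(sentence):
    m = re.fullmatch(r"I want to print (\w+)'s (\.\w+) file", sentence)
    if not m:
        return None
    user, ext = m.groups()
    # Each rule's semantic action contributes one piece of the result.
    return {"command": {
        "instance": "printing",
        "object": {"instance": "file-struct",
                   "extension": ext,
                   "owner": user}}}

# Builds the nested command structure that the grammar's actions produce.
print(parse_command("I want to print Bill's .init file"))
```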

## The Result of parsing with a Semantic Grammar

[Figure: parse tree for the sentence "I want to print Bill's .init file" using the semantic grammar above. The semantic actions attached to the rules combine, bottom-up, into the final result:]

    {command: {instance: printing
               object: {instance: file-struct
                        extension: .init
                        owner: Bill}}}

The advantages of semantic grammars are:

1. When the parse is complete, the result can be used immediately, without an additional stage of processing.
2. Many ambiguities that would arise during a strictly syntactic parse can be avoided.
3. Syntactic issues that do not affect the semantics can be ignored.

## The Drawbacks in the Use of Semantic Grammars

1. The number of rules required can become very large, since many syntactic generalizations are missed.
2. Because the number of grammar rules may be very large, the parsing process may be expensive.
Case grammars
Case grammars provide a different approach to the problem of how syntactic and semantic interpretation can be combined. Grammar rules are written to describe syntactic rather than semantic regularities, but the structures the rules produce correspond to semantic relations rather than to strictly syntactic ones. Consider the two sentences:
1. Susan printed the file.
2. The file was printed by Susan.

The case grammar interpretation of the two sentences would both be:

    (printed (agent Susan)
             (object file))
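The point that both surface forms yield a single case frame can be sketched with two hand-coded patterns. The regexes below are illustrative for just these two sentences, not a general case-grammar parser.

```python
import re

# Hand-coded sketch: the active and passive sentences map to the same
# case frame. A real case-grammar parser derives this from syntactic
# rules; the two patterns here are illustrative only.
def case_frame(sentence):
    m = re.fullmatch(r"(\w+) printed the (\w+)", sentence)
    if m:                                   # active voice
        return ("printed", {"agent": m.group(1), "object": m.group(2)})
    m = re.fullmatch(r"The (\w+) was printed by (\w+)", sentence)
    if m:                                   # passive voice
        return ("printed", {"agent": m.group(2), "object": m.group(1)})
    return None

active  = case_frame("Susan printed the file")
passive = case_frame("The file was printed by Susan")
print(active == passive)   # True: one semantic structure, two syntaxes
```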
[Figure: the syntactic parse trees of "Susan printed the file" (S -> NP VP) and "The file was printed by Susan" (S -> NP VP PP) differ, yet both map to the single case-frame representation above.]

Similarly, consider:
1. Mother baked for three hours.    (baked (agent Mother) (time-period 3-hours))
2. The pie baked for three hours.   (baked (object pie) (time-period 3-hours))
[Figure: parse trees for the two sentences. Although "Mother" and "the pie" are both syntactic subjects, "Mother" fills the agent slot while "the pie" fills the object slot.]

Conceptual Parsing
Conceptual parsing is a strategy for finding both the structure and the meaning of a sentence in one step. It is driven by a dictionary that describes the meanings of words as conceptual dependency (CD) structures. Parsing is similar to case grammar parsing, but CD usually provides a greater degree of predictive power.
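A CD structure can be sketched as a nested record. The classic example "John gave Mary a book" is built around the primitive act ATRANS (transfer of an abstract relationship such as possession); the slot names below follow the usual CD presentation but are simplified, and the dict encoding is illustrative.

```python
# A sketch of a conceptual dependency structure as a plain dict, for
# the classic CD example "John gave Mary a book". ATRANS is one of
# CD's primitive acts; slot names here are a simplified rendering.
cd = {
    "act":    "ATRANS",   # transfer of possession
    "actor":  "John",
    "object": "book",
    "from":   "John",
    "to":     "Mary",
}

# A conceptual parser builds this structure, not the surface syntax,
# so a paraphrase such as "Mary was given a book by John" yields the
# same representation.
print(cd["act"])   # ATRANS
```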

[Figure: the CD dictionary entry for the word "want", showing its stative, transitive, and intransitive senses, each predicting the conceptual cases (e.g., a human actor and an object) that must be filled by the rest of the sentence.]

## Discourse and Pragmatic Processing

There are a number of important relationships that may hold between phrases and parts of their discourse contexts, including:

1. Identical entities. Consider the text:
   - Bill had a red balloon.
   - John wanted it.
2. Parts of entities. Consider the text:
   - Sue opened the book she just bought.
   - The title page was torn.
3. Parts of actions. Consider the text:
   - John went on a business trip to New York.
   - He left on an early morning flight.

4. Entities involved in actions. Consider the text:
   - My house was broken into last week.
   - They took the TV and the stereo.
5. Names of individuals. Consider the text:
   - Dave went to the movies.
6. Causal chains. Consider the text:
   - There was a big storm yesterday.
   - The schools were closed today.
7. Planning sequences. Consider the text:
   - Sally wanted a new car.
   - She decided to get a job.
8. Implicit presuppositions. Consider the text:
   - Did Joe fail CS101?
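The identical-entities case is typically handled by reference resolution. A toy recency-based sketch is below; the entity records and the use of semantic markers to filter candidates are illustrative assumptions.

```python
# Toy reference resolution: keep a list of recently mentioned entities
# and bind a pronoun to the most recent compatible one. Entity records
# and marker names are illustrative.
focus = []  # most recently mentioned entity last

def mention(name, kind):
    focus.append({"name": name, "kind": kind})

def resolve_it():
    """Bind 'it' to the most recently mentioned inanimate entity."""
    for entity in reversed(focus):
        if entity["kind"] == "PHYSICAL-OBJECT":
            return entity["name"]
    return None

# "Bill had a red balloon. John wanted it." -> "it" = the balloon.
mention("Bill", "ANIMATE-OBJECT")
mention("balloon", "PHYSICAL-OBJECT")
print(resolve_it())   # balloon
```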

The kinds of knowledge used include:

1. The current focus of the dialogue.
2. A model of each participant's current beliefs.
3. The goal-driven character of dialogue.
4. The rules of conversation shared by all participants.

In using focus in understanding, there are two important parts of using knowledge to facilitate understanding:

1. Focus on the relevant part(s) of the available knowledge base.
2. Use that knowledge to resolve ambiguities and to make connections among things that were said.

The End

References:
1. Artificial Intelligence - Elaine Rich, Kevin Knight
2. Artificial Intelligence: A Modern Approach - Stuart Russell, Peter Norvig