Quantifiers
There are two types of quantifiers:
1. Universal quantifier (∀, pronounced "for all")
2. Existential quantifier (∃, pronounced "there exists")
Universal Quantifier
1. All kings are persons: ∀x: king(x) → person(x)
2. All people are literate: ∀x: person(x) → literate(x)
3. All men are people: ∀x: man(x) → person(x)
4. All Pompeians were Romans: ∀x: Pompeian(x) → Roman(x)
Existential Quantifier (∃)
1. There is a person who wrote games: ∃x: person(x) & wrote(x, games)
2. There is a person who wrote chess: ∃x: person(x) & wrote(x, chess)
3. Everyone is loyal to someone: ∀x: ∃y: loyalto(x, y)
Predicate Sentences
1. Marcus was a man: man(Marcus)
2. Marcus was a Pompeian: Pompeian(Marcus)
3. All Pompeians were Romans: ∀x: Pompeian(x) → Roman(x)
4. Caesar was a ruler: ruler(Caesar)
Predicate Sentences
5. All Romans were either loyal to Caesar or hated him: ∀x: Roman(x) → loyalto(x, Caesar) v hate(x, Caesar)
6. Everyone is loyal to someone: ∀x: ∃y: loyalto(x, y)
Predicate Sentences
7. People only try to assassinate rulers they are not loyal to: ∀x: ∀y: person(x) & ruler(y) & tryassassinate(x, y) → ~loyalto(x, y)
8. Marcus tried to assassinate Caesar: tryassassinate(Marcus, Caesar)
9. All men are people: ∀x: man(x) → person(x)
Predicate Sentences
Answer the question: Was Marcus loyal to Caesar? We need to prove either ~loyalto(Marcus, Caesar) or loyalto(Marcus, Caesar).
Predicate Sentences
1. Marcus was a man (1)
2. All men are people (9)
3. Conclusion: Marcus was a person
4. Marcus tried to assassinate Caesar (8)
5. Caesar was a ruler (4)
6. People only try to assassinate rulers they are not loyal to (7)
Conclude from (3), (4) and (5): Marcus was not loyal to Caesar.
Predicate Sentences
~loyalto(Marcus, Caesar)
↑ (Predicate 7, substitution)
person(Marcus) & ruler(Caesar) & tryassassinate(Marcus, Caesar)
↑ (Predicate 4)
person(Marcus) & tryassassinate(Marcus, Caesar)
↑ (Predicate 8)
person(Marcus)
↑ (Predicate 9)
man(Marcus)
↑ (Predicate 1)
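The chain of inferences above can be mechanized. Below is a minimal forward-chaining sketch in Python; the tuple encoding of facts and rules is invented for illustration and is not part of the slides:

```python
# Forward chaining over the Marcus facts until ~loyalto(Marcus, Caesar)
# is derived.  Facts are tuples; 'x'/'y' in rules are variables.
facts = {
    ("man", "Marcus"),
    ("Pompeian", "Marcus"),
    ("ruler", "Caesar"),
    ("tryassassinate", "Marcus", "Caesar"),
}

rules = [
    ([("man", "x")], ("person", "x")),                            # (9)
    ([("person", "x"), ("ruler", "y"),
      ("tryassassinate", "x", "y")], ("not-loyalto", "x", "y")),  # (7)
]

def substitutions(premises, facts):
    """Return variable bindings that satisfy every premise against the facts."""
    def match(premise, fact, env):
        if len(premise) != len(fact) or premise[0] != fact[0]:
            return None
        env = dict(env)
        for p, f in zip(premise[1:], fact[1:]):
            if p in ("x", "y"):
                if env.get(p, f) != f:
                    return None
                env[p] = f
            elif p != f:
                return None
        return env
    envs = [{}]
    for premise in premises:
        envs = [e2 for e in envs for fact in facts
                if (e2 := match(premise, fact, e)) is not None]
    return envs

changed = True
while changed:
    changed = False
    for premises, conclusion in rules:
        for env in substitutions(premises, facts):
            new = (conclusion[0],) + tuple(env.get(t, t) for t in conclusion[1:])
            if new not in facts:
                facts.add(new)
                changed = True

print(("not-loyalto", "Marcus", "Caesar") in facts)  # True
```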
Predicate Sentences
cont...
1. Marcus was a man: man(Marcus)
2. Marcus was a Pompeian: Pompeian(Marcus)
3. Marcus was born in 40 A.D.: born(Marcus, 40)
4. All men are mortal: ∀x: man(x) → mortal(x)
5. All Pompeians died when the volcano erupted in 79 A.D.: erupted(volcano, 79) & ∀x: Pompeian(x) → died(x, 79)
cont...
6. No mortal lives longer than 150 years: ∀x: ∀t1: ∀t2: mortal(x) & born(x, t1) & gt(t2 - t1, 150) → dead(x, t2)
7. It is now 1991: now = 1991
8. Alive means not dead: ∀x: ∀t: [alive(x, t) → ~dead(x, t)] & [~dead(x, t) → alive(x, t)]
9. If someone dies then he is dead at all later times: ∀x: ∀t1: ∀t2: died(x, t1) & gt(t2, t1) → dead(x, t2)
cont...
1. man(Marcus)
2. Pompeian(Marcus)
3. born(Marcus, 40)
4. ∀x: man(x) → mortal(x)
5. erupted(volcano, 79)
6. ∀x: Pompeian(x) → died(x, 79)
7. ∀x: ∀t1: ∀t2: mortal(x) & born(x, t1) & gt(t2 - t1, 150) → dead(x, t2)
8. now = 1991
9. ∀x: ∀t: [alive(x, t) → ~dead(x, t)] & [~dead(x, t) → alive(x, t)]
10. ∀x: ∀t1: ∀t2: died(x, t1) & gt(t2, t1) → dead(x, t2)
cont...
Is Marcus alive?
cont...
~alive(Marcus, now)
↑ (9, substitution)
dead(Marcus, now)
↑ (10, substitution)
died(Marcus, t1) & gt(now, t1)
↑ (5, 6, substitution)
Pompeian(Marcus) & gt(now, 79)
↑ (2)
gt(now, 79)
↑ (8, substitute equals)
gt(1991, 79)
↑ compute gt
True
cont...
Disadvantages:
1. Many steps are required to prove even simple conclusions.
2. A variety of processes, such as matching and substitution, are used to prove simple conclusions.
Resolution
Resolution is a proof procedure by refutation. To prove a statement, resolution attempts to show that the negation of that statement leads to a contradiction with the known statements.
Conversion to Conjunctive Normal Form
Example: All Romans who know Marcus either hate Caesar or think that anyone who hates anyone is crazy.
∀x: [Roman(x) & know(x, Marcus)] → [hate(x, Caesar) v (∀y: ∃z: hate(y, z) → thinkcrazy(x, y))]
CNF equivalent: ~Roman(x) v ~know(x, Marcus) v hate(x, Caesar) v ~hate(y, z) v thinkcrazy(x, y)

1. Eliminate →, using a → b = ~a v b:
∀x: ~[Roman(x) & know(x, Marcus)] v [hate(x, Caesar) v (∀y: ~(∃z: hate(y, z)) v thinkcrazy(x, y))]
CNF...
2. Reduce the scope of ~, using ~(~p) = p, ~(a & b) = ~a v ~b, ~(a v b) = ~a & ~b:
∀x: [~Roman(x) v ~know(x, Marcus)] v [hate(x, Caesar) v (∀y: ∃z: ~hate(y, z) v thinkcrazy(x, y))]
CNF...
3. Make each quantifier bind a unique variable, by renaming: ∀x: P(x) v ∀x: Q(x) becomes ∀x: P(x) v ∀y: Q(y)
CNF...
4. Move all quantifiers to the left of the formula:
∀x: ∀y: ∃z: [~Roman(x) v ~know(x, Marcus)] v [hate(x, Caesar) v (~hate(y, z) v thinkcrazy(x, y))]
CNF...
5. Eliminate existential quantifiers (∃) by substituting a new function of the enclosing universal variables (Skolemization):
∃y: president(y) becomes president(F1)
∀x: ∃y: father-of(y, x) becomes ∀x: father-of(F(x), x)
CNF...
6. Drop the quantifier prefix:
[~Roman(x) v ~know(x, Marcus)] v [hate(x, Caesar) v (~hate(y, z) v thinkcrazy(x, y))]
7. Convert the statement into a conjunction of disjuncts, using (a & b) v c = (a v c) & (b v c)
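Step 7 is mechanical. Below is a small sketch that distributes v over &; the nested-tuple representation of formulas is invented for illustration:

```python
# Distribute OR over AND so the formula becomes a conjunction of
# disjuncts.  Formulas are ("and", a, b), ("or", a, b), or atom strings.
def to_cnf(f):
    if isinstance(f, str):
        return f
    op, a, b = f
    a, b = to_cnf(a), to_cnf(b)
    if op == "and":
        return ("and", a, b)
    # op == "or": push the disjunction inside any conjunction
    if isinstance(a, tuple) and a[0] == "and":
        return ("and", to_cnf(("or", a[1], b)), to_cnf(("or", a[2], b)))
    if isinstance(b, tuple) and b[0] == "and":
        return ("and", to_cnf(("or", a, b[1])), to_cnf(("or", a, b[2])))
    return ("or", a, b)

# (a & b) v c  ==>  (a v c) & (b v c)
print(to_cnf(("or", ("and", "a", "b"), "c")))
# ('and', ('or', 'a', 'c'), ('or', 'b', 'c'))
```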
Propositional Resolution
Step 2: Negate the proposition we want to prove and add it to the existing clauses.
Example: from the above we want to prove R, so we add ~R to the clauses.
Propositional Resolution...
Step 3: Select pairs of clauses and resolve them, trying to show that our assumption leads to a contradiction.
(~R) with (~P v ~Q v R) [clause 2] gives ~P v ~Q
(~P v ~Q) with P [clause 1] gives ~Q
(~Q) with (~T v Q) [clause 3(b)] gives ~T
(~T) with (T) [clause 4] gives the empty clause: contradiction
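The refutation above can be run mechanically. A minimal propositional resolution sketch, with clauses encoded as frozensets of literal strings (an illustrative encoding, not from the slides):

```python
# Resolution by refutation: keep resolving clause pairs until the
# empty clause (a contradiction) appears, or nothing new is derivable.
def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    for lit in c1:
        if negate(lit) in c2:
            yield (c1 - {lit}) | (c2 - {negate(lit)})

def refutes(clauses):
    clauses = set(clauses)
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a == b:
                    continue
                for r in resolve(a, b):
                    if not r:
                        return True      # empty clause: contradiction
                    new.add(frozenset(r))
        if new <= clauses:
            return False                 # saturated without contradiction
        clauses |= new

# Clauses from the slide, plus the negated goal ~R:
kb = [frozenset(c) for c in ({"P"}, {"~P", "~Q", "R"}, {"~T", "Q"}, {"T"}, {"~R"})]
print(refutes(kb))  # True: R follows from the knowledge base
```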
Unification
Unification is the process of finding substitutions that make different logical expressions look identical.
Propositional logic: R & ~R is an obvious contradiction.
Predicate logic: man(Marcus) & ~man(Marcus) is a contradiction, but man(Marcus) & ~man(Spot) is not; the arguments must also match.
Cont
The solution to this problem is matching and substitution.
Example: Unify P(x, x) with P(y, z). Here x, y, z are variables.
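A minimal unification sketch; treating single lowercase letters as variables is an assumption made purely for illustration:

```python
# Unification: single lowercase letters ('x', 'y', 'z') are variables;
# anything else is a constant or a (functor, args...) tuple.
def is_var(t):
    return isinstance(t, str) and len(t) == 1 and t.islower()

def walk(t, subst):
    """Follow variable bindings until a non-bound term is reached."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(t1, t2, subst=None):
    subst = {} if subst is None else subst
    t1, t2 = walk(t1, subst), walk(t2, subst)
    if t1 == t2:
        return subst
    if is_var(t1):
        return {**subst, t1: t2}
    if is_var(t2):
        return {**subst, t2: t1}
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and len(t1) == len(t2) and t1[0] == t2[0]):
        for a, b in zip(t1[1:], t2[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None

s = unify(("P", "x", "x"), ("P", "y", "z"))
print(s)                                          # {'x': 'y', 'y': 'z'}
print(unify(("man", "Marcus"), ("man", "Spot")))  # None: constants clash
```

After unification all three variables denote the same term, which is exactly what makes P(x, x) and P(y, z) look identical.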
Example
ABC Murder Story: let Abbott (A), Babbitt (B) and Cabot (C) be suspects in a murder case.
1. A has an alibi: his name is in the register of a respected hotel.
2. B also has an alibi: his brother-in-law testified that B was visiting him at the time.
3. C pleads an alibi too, claiming to have been watching a live match at the ground (but we have only his word).
Example...
So we can believe:
1. That A did not commit the crime
2. That B did not commit the crime
3. That A or B or C did
Conclusion?
Example...
But C was caught on live television at the match. So the new belief is:
4. That C did not commit the crime.
Monotonic Reasoning
1. It is complete with respect to the domain of interest.
2. It is consistent.
3. Knowledge increases monotonically as new facts are added.
Example: if KB1 = KBL and KB2 = KBL U F (where F is a set of new facts), then KB1 is a subset of KB2.
Approaches
Approaches to handle these problems 1. Non Monotonic Reasoning (Belief) 2. Statistical Reasoning (Certainty)
Different Kinds of Reasoning
1. Default Reasoning a) Non Monotonic Logic(NML) b) Default Logic(DL) 2. Minimalist Reasoning a) Closed World Assumption (CWA)
This is predicate logic augmented with the modal operator M, which can be read as "is consistent".
NML Example
∀x: ∀y: Related(x, y) & M GetAlong(x, y) → WillDefend(x, y)
For all x and y: if x and y are related, and the fact that x gets along with y is consistent with everything else that is believed, then conclude that x will defend y.
NML Example...
1. ∀x: Republican(x) & M ~Pacifist(x) → ~Pacifist(x)
2. ∀x: Quaker(x) & M Pacifist(x) → Pacifist(x)
3. Republican(Marcus)
4. Quaker(Marcus)
Default Logic
It is an alternative logic in which rules are represented in the form A : M B / C, meaning: if A is provable and it is consistent to assume B, then conclude C.
Abductive Reasoning
Deduction: given ∀x: A(x) → B(x) and A(Marcus), we conclude B(Marcus).
Abduction is the reverse process: given B(Marcus), we conclude A(Marcus). But this is sometimes wrong.
CWA...
The answer may be anywhere from one to infinity. The reason is that the course assertions do not deny that unmentioned courses are also offered (incomplete information), and nothing asserts that the listed courses are distinct from each other.
CWA...
The assumption is that the provided information is complete, so anything not asserted to be true is assumed to be false.
Example (airline KB application): Is there any flight from Vskp to Hyd?
~Connect(Vskp, Hyd) is asserted when we cannot prove Connect(Vskp, Hyd).
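A tiny sketch of the airline example under the CWA; the flight data below is invented for illustration:

```python
# Negation as failure: under the CWA, ~Connect(a, b) is asserted
# exactly when Connect(a, b) cannot be proved from the KB.
flights = {("Vskp", "Delhi"), ("Delhi", "Hyd")}   # hypothetical schedule

def connect(a, b):
    return (a, b) in flights          # the only provable facts

def not_connect(a, b):
    return not connect(a, b)          # unprovable => assumed false

print(not_connect("Vskp", "Hyd"))  # True: no direct flight is asserted
```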
Implementation Issues
1. How can knowledge be updated incrementally?
2. Many facts are invalidated when new knowledge becomes available. How should this be managed?
3. The theories are not computationally effective.
These issues can be handled by search control: depth-first search? breadth-first search?
Chronological Backtracking
It is depth-first search with backtracking. It makes a guess at something, creating a branch in the search space. If the guess turns out to be wrong, we back up to that point and try an alternative, discarding everything derived after the guess.
Example
We need to know a fact 'F', which can be derived by making some assumption 'A'. We also derive additional facts 'G' and 'H' from 'F'. Later we derive new facts 'M' and 'N', which are independent of 'A' and 'F'.
Example...
Derivation structure: A → F; F → G, H; M and N are derived independently.
Example...
At some point a new fact invalidates 'A'. Chronological backtracking then invalidates all of F, G, H, M, N, even though M and N do not depend on the assumption.
Example 2
Problem: finding a time at which three busy people can all attend a meeting.
Assumption: the meeting is held on Wednesday.
Found fact: all are free at 2:00, so 2:00 is chosen as the meeting time.
Example...
Assume day = Wednesday.
After many steps we find that the only time all people are available is 2:00 PM.
FAIL: a special conference has all the rooms booked on Wednesday.
Repeat the same time-finding process and again decide on 2:00 PM, for the same reasons. Then try to find a room: SUCCEED.
Problem
Because facts are withdrawn based on the order in which they were generated by the search process instead of their responsibility for the inconsistency, we may waste a great deal of effort.
Dependency-directed backtracking makes a guess at something and associates with each node in the search space one or more justifications.
Justification-based Truth Maintenance Systems (JTMS)
Logic-based Truth Maintenance Systems (LTMS)
JTMS...
A JTMS has the ability to provide dependency-directed backtracking and so to support nonmonotonic reasoning.
Example: ABC Murder Story
Initially our belief is that A is the primary suspect, because he was a beneficiary and he had no alibi.
contd...
Dependency Network
Suspect A [IN]
  + (IN list): Beneficiary A
  - (OUT list): Alibi Abbott
Abbott should be a suspect when it is believed that he is a beneficiary and it is not believed that he has an alibi.
Dependency Network...
There are three assertions:
1. Suspect A (A is the primary murder suspect)
2. Beneficiary A (A is a beneficiary of the victim)
3. Alibi Abbott (A was at a hotel at the time)
Dependency Network...
Suspect A [OUT]
  + Beneficiary A [IN]
  - Alibi A [IN]
Alibi A [IN]
  + Registered A [IN]
  + Far Away [IN]
  - Registered Forged A [OUT]
Abbott should not be a suspect when it is believed that he is a beneficiary and it is believed that he has an alibi.
Dependency Network...
Suspect B [OUT]
  + Beneficiary B [IN]
  - Alibi B [IN]
Alibi B [IN]
  + Says So B-I-L [IN]
  - Lies B-I-L [OUT]
B should not be a suspect when it is believed that he is a beneficiary and it is believed that he has an alibi.
Dependency Network...
Suspect C [IN]
  + Beneficiary C [IN]
  - Alibi C [OUT]
Alibi C [OUT]
  + Tells Truth Cabot [OUT]
C should be a suspect when it is believed that he is a beneficiary and it is not believed that he has an alibi.
Dependency Network...
Suspect C [OUT]
  + Beneficiary C [IN]
  - Alibi C [IN]
Alibi C [IN]
  + Tells Truth Cabot [IN]
Tells Truth Cabot [IN]
  + C Seen on TV [IN]
  - TV Forgery [OUT]
C should not be a suspect when it is believed that he is a beneficiary and it is believed that he has an alibi.
Dependency Network...
Contradiction [IN]
  - Suspect A
  - Suspect B
  - Suspect C
  - Suspect Other
The contradiction node comes IN when no suspect node is IN.
LTMS
It is similar to the JTMS. In a JTMS the nodes in the network are treated as atoms, which assumes no relationships among them except the ones that are explicitly stated in the justifications.
Example: we can represent Lies B-I-L and Not Lies B-I-L and label both of them IN; no contradiction will be detected automatically.
LTMS...
In an LTMS a contradiction is detected automatically. We need not create an explicit contradiction node.
Statistical Reasoning
Basic Probability
Prior Probability
It is the degree of belief in a proposition in the absence of any other information.
Conditional Probability
Once we have obtained some evidence concerning a previously unknown random variable, conditional probabilities should be used.
P(a|b) = 0.2 is the probability of a given known evidence b.
P(Cavity|Toothache) = 0.8
Conditional Probability...
Product Rule: P(a & b)= P(a|b) P(b) P(a & b)= P(b|a) P(a)
P(a|b)=P(a&b) / P(b)
Bayes Theorem
Bayes' rule states: the probability of the hypothesis (H) being true given known observations (E) is
P(H|E) = P(H & E) / P(E), i.e. P(H|E) = P(E|H) P(H) / P(E)
For n events, if P(A1) + P(A2) + ... + P(An) = 1, then
P(Ai|B) = P(B|Ai) P(Ai) / [P(B|A1) P(A1) + ... + P(B|An) P(An)]
Bayes Theorem...
A doctor knows that a cavity causes a patient to have a toothache 50% of the time. The prior probability that any patient has a toothache is 1/20, and a cavity 1/1000.
P(Toothache|Cavity) = 0.5, P(Cavity) = 0.001, P(Toothache) = 0.05
Finding P(Cavity|Toothache) = 0.5 * 0.001 / 0.05 = 0.01
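The arithmetic can be checked directly:

```python
# Bayes' rule for the cavity example:
# P(Cavity|Toothache) = P(Toothache|Cavity) * P(Cavity) / P(Toothache)
p_t_given_c = 0.5
p_c = 0.001
p_t = 0.05

p_c_given_t = p_t_given_c * p_c / p_t
print(round(p_c_given_t, 4))  # 0.01
```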
Bayes Network
S: Sprinkler was on last night W: Grass is wet R: It rained last night
[Network: the Sprinkler and Rain nodes both point to Wet Grass.]
Bayes Network
Cloudy: P(C) = 0.5
Sprinkler CPT:
  C | P(S)
  t | 0.10
  f | 0.50
Rain CPT:
  C | P(R)
  t | 0.80
  f | 0.20
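From the tables above we can, for example, marginalize out Cloudy to get the prior probabilities of Sprinkler and Rain; a quick sketch:

```python
# P(S) = P(S|c) P(c) + P(S|~c) P(~c), and likewise for Rain.
p_c = 0.5
p_s_given_c = {True: 0.10, False: 0.50}   # P(Sprinkler | Cloudy)
p_r_given_c = {True: 0.80, False: 0.20}   # P(Rain | Cloudy)

p_s = p_s_given_c[True] * p_c + p_s_given_c[False] * (1 - p_c)
p_r = p_r_given_c[True] * p_c + p_r_given_c[False] * (1 - p_c)
print(round(p_s, 2), round(p_r, 2))  # 0.3 0.5
```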
Bayes Theorem
Disadvantages:
1. Too many probabilities need to be provided.
2. Space is needed to store all the probabilities.
3. Time is required to compute the probabilities.
4. The theory is good for well-structured situations in which all the data is available and the assumptions are satisfied; unfortunately these conditions may not occur in reality.
cont...
In the MYCIN expert system each rule is associated with a certainty factor, which is a measure of the extent to which the evidence is to be believed. A MYCIN rule looks like:
If 1. the stain of the organism is gram-positive, and
   2. the morphology is coccus, and
   3. the growth is clumps,
then there is suggestive evidence (0.7) that the organism is staphylococcus.
Certainty Factor
The certainty factor CF[h,e] is defined in terms of two components:
MB[h,e]: a measure (between 0 and 1) of belief in hypothesis h given evidence e.
MD[h,e]: a measure (between 0 and 1) of disbelief in hypothesis h given evidence e.
CF[h,e] = MB[h,e] - MD[h,e]
Cont
In the MYCIN model, for two pieces of evidence e1 and e2 supporting hypothesis h, the measures of belief and disbelief are combined as:
MB[h, e1 & e2] = MB[h,e1] + MB[h,e2] * (1 - MB[h,e1])
MD[h, e1 & e2] = MD[h,e1] + MD[h,e2] * (1 - MD[h,e1])
Cont
If MD[h, e1 & e2] = 0 or MB[h, e1 & e2] = 1, all the evidence (e1 and e2) supports the hypothesis h.
If MB[h, e1 & e2] = 0 or MD[h, e1 & e2] = 1, all the evidence (e1 and e2) disproves the hypothesis h.
Example for CF
Rules r1, r2, ..., r7 give supporting evidence for the hypothesis h, the conclusion that the animal is an elephant:
e1 (r1): it has a tail, 0.3
e2 (r2): it has a trunk, 0.8
e3 (r3): it has a heavy body, 0.4
e4 (r4): it has four legs, 0.2
e5 (r5): it has black colour, 0.1
e6 (r6): it has stripes, 0.6
e7 (r7): it has long flat ears, 0.6
For rule 1: MB = 0.3 and MD = 0.
Including rule 2: MB = 0.3 + 0.8 * (1 - 0.3) = 0.86, MD = 0.
Including rule 3: MB = 0.86 + 0.4 * (1 - 0.86) = 0.916, MD = 0.
Including rule 4: MB = 0.916 + 0.2 * (1 - 0.916) = 0.9328, MD = 0.
Cont
Including rule 5: MB = 0.9328 + 0.1 * (1 - 0.9328) = 0.93952, MD = 0.
Including rule 6 (stripes count against the elephant hypothesis): MB = 0.93952, MD = 0.6.
Including rule 7: MB = 0.93952 + 0.6 * (1 - 0.93952) = 0.975808, MD = 0.6.
So CF[h,e] = MB - MD = 0.975808 - 0.6 ≈ 0.376.
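The combination formula can be applied mechanically; a quick sketch, treating rule 6 (stripes) as evidence of disbelief as the slides do:

```python
# MB[h, e1&e2] = MB[h,e1] + MB[h,e2] * (1 - MB[h,e1]); MD combines the same way.
def combine(old, new):
    return old + new * (1 - old)

mb = md = 0.0
for strength in (0.3, 0.8, 0.4, 0.2, 0.1):  # rules 1-5 (belief)
    mb = combine(mb, strength)
md = combine(md, 0.6)                       # rule 6: stripes, disbelief
mb = combine(mb, 0.6)                       # rule 7 (belief)

cf = mb - md
print(round(mb, 6), md, round(cf, 6))
```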
DST (Dempster-Shafer theory) is designed to deal with the distinction between uncertainty and ignorance. It is very useful for handling epistemic information as well as ignorance or lack of information.
cont...
It is represented by belief and plausibility.
Belief (bel) measures the strength of the evidence, ranging from 0 to 1.
Plausibility is defined as pl(s) = 1 - bel(~s).
Introduction
Knowledge can be represented in a slot-and-filler system as a set of entities and their attributes. Beyond supporting inheritance, the structure is useful because:
It enables attribute values to be retrieved quickly.
Properties of relations are easy to describe.
It fits naturally with object-oriented programming.
Introduction
A slot is an attribute-value pair in its simplest form.
A filler is a value that a slot can take: a numeric or string value (or any data type), or a pointer to another slot.
A weak slot-and-filler structure does not consider the content of the representation.
Semantic Nets
Nodes denote objects. Links denote relations between objects. Link labels denote particular relations.
Example
[Semantic net figures, shown here as labelled links:]
Person --isa--> Mammal
Person --has-part--> Nose
Pee-Wee-Reese --instance--> Person
Pee-Wee-Reese --team--> Brooklyn-Dodgers
Brooklyn-Dodgers --uniform-color--> Blue

G23 --visiting-team--> Cubs
G23 --home-team--> Dodgers
G23 --score--> 5-3
Representing a sentence
John gave the book to Mary.
EV7 --instance--> Give
EV7 --agent--> John
EV7 --object--> BK23
EV7 --beneficiary--> Mary
BK23 --instance--> Book
Relating Entities
John --height--> 72
Bill --height--> 52
Suppose we want to relate these two entities with the fact that John is taller than Bill.
Relating Entities
John --height--> H1
Bill --height--> H2
H1 --greater-than--> H2
Relating Entities
John --height--> H1, H1 --value--> 72
Bill --height--> H2, H2 --value--> 52
H1 --greater-than--> H2
[Figure: partitioned semantic nets for quantified statements about dogs biting mail-carriers and constables. A space SA contains nodes for Dogs, Bite (with assailant and victim links) and Mail-carrier. The quantified statement is a node S1 with an isa link to GS (General Statement), a form link, which states the relation that is being asserted, and one or more universal quantifier (∀) connections. A second net represents the statement about dogs biting constables in the same way.]
Frames
Semantic nets were initially used to represent labeled connections between objects. As tasks became more complex, the representation needed to be more structured; the more structured the system becomes, the more beneficial it is to use frames.
Frames
A frame is a collection of attributes or slots and associated values that describe some real world entity Each frame represents:
Team
  instance :
  isa :
  cardinality :
  *team-size :

ML-Baseball-Team
  isa : Team
  cardinality : {the number of teams that exist}
  *team-size : {the number of players on a team}
  *manager :
cont...
Brooklyn-Dodgers
  instance : ML-Baseball-Team
  team-size : 24
  manager : Leo-Durocher
  *uniform-colour : Blue

Pee-Wee-Reese
  instance : ML-Baseball-Player
  team : Brooklyn-Dodgers
  uniform-colour : Blue
  batting-average :
[Figure: an isa hierarchy in which Pitcher, Catcher and Fielder are subclasses of ML-Baseball-Player, American-Leaguer and National-Leaguer partition the players another way, and Three-Finger-Brown is an instance of both Pitcher and National-Leaguer.]
Cont
ML-Baseball-Player
  is-covered-by : {Pitcher, Catcher, Fielder}, {National-Leaguer, American-Leaguer}

Pitcher
  isa : ML-Baseball-Player
  mutually-disjoint-with : {Catcher, Fielder}

Fielder
  isa : ML-Baseball-Player
  mutually-disjoint-with : {Pitcher, Catcher}

Catcher
  isa : ML-Baseball-Player
  mutually-disjoint-with : {Pitcher, Fielder}

National-Leaguer
  isa : ML-Baseball-Player

Three-Finger-Brown
  instance : Pitcher
  instance : National-Leaguer
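A minimal sketch of frame-style slot lookup with inheritance, using plain Python dicts (an illustrative encoding, not a real frame language):

```python
# Frames are dicts; slot lookup climbs the instance/isa links until a
# filler is found (property inheritance).
frames = {
    "Team": {},
    "ML-Baseball-Team": {"isa": "Team", "team-size": 24},
    "Brooklyn-Dodgers": {"isa": "ML-Baseball-Team",
                         "manager": "Leo-Durocher",
                         "uniform-colour": "Blue"},
    "Pee-Wee-Reese": {"instance": "Brooklyn-Dodgers"},
}

def get_slot(frame, slot):
    while frame is not None:
        filler = frames[frame].get(slot)
        if filler is not None:
            return filler
        frame = frames[frame].get("instance") or frames[frame].get("isa")
    return None

print(get_slot("Pee-Wee-Reese", "uniform-colour"))  # Blue (inherited)
print(get_slot("Brooklyn-Dodgers", "team-size"))    # 24 (from the class)
```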
Slot-Values as Objects
John
  height : 72
Bill
  height :
We could compare the slots by turning the slots themselves into objects; we use lambda (λ) notation to create such objects.
Cont
John
  height : 72 ; λx (x.height > Bill.height)
Bill
  height : λx (x.height < John.height)
Inheritance
[Inheritance network:]
Bird: fly = yes
Ostrich isa Bird, with fly = no
Pet-Bird isa Bird
Fifi instance Ostrich and instance Pet-Bird; fly = ?
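A sketch of the lookup: a breadth-first climb up the links finds the nearest filler, so Ostrich's fly:no beats Bird's fly:yes. (Plain path length is only a heuristic; the inferential-distance criterion discussed later handles the cases where it fails.)

```python
# Nearest-class-wins attribute lookup for the Fifi example.
nodes = {
    "Bird": {"fly": "yes"},
    "Ostrich": {"isa": ["Bird"], "fly": "no"},
    "Pet-Bird": {"isa": ["Bird"]},
    "Fifi": {"instance": ["Ostrich", "Pet-Bird"]},
}

def lookup(name, attr):
    """Breadth-first search up the hierarchy: nearer values are found first."""
    frontier = [name]
    while frontier:
        nxt = []
        for n in frontier:
            if attr in nodes[n]:
                return nodes[n][attr]
            nxt += nodes[n].get("isa", []) + nodes[n].get("instance", [])
        frontier = nxt
    return None

print(lookup("Fifi", "fly"))  # no: Ostrich is closer than Bird
```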
Cont
[Inheritance network:]
Republican: pacifist = false
Quaker: pacifist = true
Dick instance Republican and instance Quaker; pacifist = ?
Solution
The solution to this problem is to use inferential distance instead of path length: class1 is closer to class2 than to class3 if and only if class1 has an inference path through class2 to class3 (class2 is between class1 and class3).
Property inheritance
The set of competing values for a slot S in a frame F contains all those values that:
1. Can be derived from some frame X that is above F in the isa hierarchy, and
2. Are not contradicted by some frame Y that has a shorter inferential distance to F than X does.
[Inheritance network:]
Bird: fly = yes
Ostrich isa Bird, with fly = no
Pet-Bird isa Bird
Plumed-Ostrich isa Ostrich
White-Plumed-Ostrich isa Plumed-Ostrich
Fifi instance White-Plumed-Ostrich and instance Pet-Bird; fly = ?
Cont
[Inheritance network:]
Republican: pacifist = false
Conservative-Republican isa Republican
Dick instance Conservative-Republican; pacifist = ?
Frame Languages
The idea of the frame system as a way to represent declarative knowledge has been encapsulated in a series of frame-oriented knowledge representation languages:
KRL [Bobrow and Winograd, 1977], FRL [Roberts and Goldstein, 1977], RLL, KL-ONE, KRYPTON, NIKL, CYCL, Conceptual Graphs, THEO and FRAMEKIT
Conceptual Dependency
Semantic networks and frame systems may have specialized links and inference procedures, but there are no rules about what kinds of objects and links are good in general for knowledge representation. Conceptual Dependency is a theory of how to represent the events described in natural language sentences in a way that:
1. Facilitates drawing inferences from the sentences.
2. Is independent of the language in which the sentences were originally stated.
Conceptual Dependency
CD provides
a structure into which nodes representing information can be placed, and
a specific set of primitives at a given level of granularity.
Primitive Acts
ATRANS: transfer of an abstract relationship (e.g., give)
PTRANS: transfer of the physical location of an object (e.g., go)
PROPEL: application of physical force to an object (e.g., push)
MOVE: movement of a body part by its owner (e.g., kick)
GRASP: grasping of an object by an actor (e.g., clutch)
INGEST: ingestion of an object by an animal (e.g., eat)
EXPEL: expulsion of something from the body of an animal (e.g., cry)
Conceptual Dependency
MTRANS: transfer of mental information (e.g., tell)
MBUILD: building new information out of old (e.g., decide)
SPEAK: production of sounds (e.g., say)
ATTEND: focusing of a sense organ toward a stimulus (e.g., listen)
Primitive Concepts
Conceptual categories provide the building blocks; they are the set of allowable dependencies among the concepts in a sentence:
PP: real-world objects (picture producers)
ACT: real-world actions
PA: attributes of objects (modifiers of PPs)
AA: attributes of actions (modifiers of ACTs)
T: times
LOC: locations
Example
Raju gave the man a book:
Raju ⇔ ATRANS --o--> book --R--> (to: man, from: Raju), with p (past) above the double arrow.
Arrows indicate the direction of dependency; letters above the arrows indicate particular relationships. Double arrows (⇔) indicate two-way links between the actor (PP) and the action (ACT).
o: object
R: recipient-donor
I: instrument (e.g., eat with a spoon)
D: destination (e.g., going home)
Modifiers
The use of tense and mood in describing events is extremely important. The modifiers are:
p: past
f: future
t: transition
ts: start transition
tf: finished transition
k: continuing
c: conditional
?: interrogative
/: negative
delta: timeless
The absence of any modifier implies the present tense.
Conceptual Dependency
Arrows indicate the direction of dependency.
The double arrow (⇔) is a two-way link between an object (actor), PP, and an action, ACT: PP ⇔ ACT.
The triple arrow is also a two-way link, but between an object, PP, and its attribute, PA. It represents isa-type dependencies.
[Figure: the numbered CD dependency rules (1-14) with example representations, including:
John ate ice cream with a spoon: John ⇔ INGEST, o: ice cream, I: spoon.
John fertilized the field: John ⇔ PTRANS, o: fertilizer, D: from bag to field.
Further examples cover hearing a frog in the woods (MTRANS to the CP via the ears), one smoking a cigarette (INGEST of smoke, with the tf modifier and the dead/alive attribute), and Bill MTRANS-ing a belief to John.]
Advantages with CD
Using these primitives involves fewer inference rules. Many inference rules are already represented in CD structure. The holes in the initial structure help to focus on the points still to be established.
Disadvantages with CD
Knowledge must be decomposed into fairly low-level primitives. It can be impossible or difficult to find the correct set of primitives. A lot of inference may still be required. Representations can be complex even for relatively simple actions.
Scripts
Scripts are generally used to represent knowledge about common sequences of events. A script is a structure that describes a stereotyped sequence of events in a particular context. A script consists of a set of slots, each associated with some information.
Components of Scripts
Entry conditions: conditions that must, in general, be satisfied before the events described in the script can occur.
Results: conditions that will, in general, be true after the events described in the script have occurred.
Props: slots representing objects that are involved in the events described in the script. The presence of these objects can be inferred even if they are not mentioned explicitly.
Roles: slots representing people who are involved in the events described in the script. The presence of these people, too, can be inferred even if they are not mentioned explicitly. If specific individuals are mentioned, they can be inserted into the appropriate slots.
Track: the specific variation on a more general pattern that is represented by this particular script. Different tracks of the same script will share many but not all components.
Scenes: the actual sequences of events that occur. The events are represented in the conceptual dependency formalism.
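A script skeleton can be sketched as a plain dictionary; the restaurant script below is the standard illustration, with fillers invented here rather than taken from the slides:

```python
# A script as a dict of component slots (illustrative fillers).
restaurant_script = {
    "track": "coffee shop",
    "props": ["tables", "menu", "food", "money"],
    "roles": ["customer", "waiter", "cook", "cashier"],
    "entry-conditions": ["customer is hungry", "customer has money"],
    "results": ["customer is not hungry", "customer has less money"],
    "scenes": ["entering", "ordering", "eating", "paying/exiting"],
}

def infer_presence(script, mentioned):
    """Props and roles can be inferred even when not mentioned explicitly."""
    return [p for p in script["props"] + script["roles"]
            if p not in mentioned]

# A story mentioning only the customer and the food still implies a menu:
print("menu" in infer_presence(restaurant_script, {"customer", "food"}))  # True
```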
Planning
Contents
Introduction to Planning Blocks world Problem Components of Planning system
Green's approach, STRIPS
Hierarchical Planning
Planning
Planning problems are hard problems; they are certainly nontrivial. Methods that focus on ways of decomposing the original problem into appropriate subparts, and on ways of handling interactions among the subparts during the problem-solving process, are often called planning. Planning refers to the process of computing several steps of a problem-solving procedure before executing any of them.
Robot Actions
UNSTACK(A,B): pick up block A from its current position on block B. The arm must be empty and block A must have no block on top of it.
STACK(A,B): place block A on block B. The arm must already be holding A and the surface of B must be clear.
Robot Actions
PICKUP(A): pick up block A from the table and hold it. The arm must be empty and there must be nothing on top of block A.
PUTDOWN(A): put block A down on the table. The arm must have been holding block A.
Set of Predicates
ON(A,B): block A is on block B.
ONTABLE(A): block A is on the table.
CLEAR(A): there is nothing on top of block A.
HOLDING(A): the arm is holding block A.
ARMEMPTY: the arm is holding nothing.
Logical Statements
[∃x: HOLDING(x)] → ~ARMEMPTY
∀x: ONTABLE(x) → ~∃y: ON(x, y)
∀x: [~∃y: ON(y, x)] → CLEAR(x)
2. Apply Rules
In simple systems, applying rules is easy: each rule simply specifies the problem state that would result from its application. In complex systems, we must be able to deal with rules that specify only a small part of the complete problem state. One way is to describe, for each action, each of the changes it makes to the state description.
STRIPS(Applying Rules)
In the STRIPS approach each operator is described by a set of lists of predicates. STRIPS has three lists: ADD, DELETE and PRECONDITION.
A list of things that become TRUE, called ADD.
A list of things that become FALSE, called DELETE.
A set of prerequisites that must be true before the operator can be applied.
UNSTACK(x,y)
P: ON(x,y) & CLEAR(x) & ARMEMPTY
D: ON(x,y), ARMEMPTY
A: HOLDING(x), CLEAR(y)
PUTDOWN(x)
P: HOLDING(x)
D: HOLDING(x)
A: ONTABLE(x), ARMEMPTY
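The operator lists can be applied mechanically to a state; a sketch with UNSTACK grounded to blocks B and A (the set-of-tuples encoding is illustrative):

```python
# A state is a set of predicate tuples; an operator fires only if its
# PRECONDITION list holds, then its DELETE and ADD lists are applied.
UNSTACK_B_A = {
    "pre":    {("ON", "B", "A"), ("CLEAR", "B"), ("ARMEMPTY",)},
    "delete": {("ON", "B", "A"), ("ARMEMPTY",)},
    "add":    {("HOLDING", "B"), ("CLEAR", "A")},
}

def apply_op(state, op):
    if not op["pre"] <= state:
        raise ValueError("preconditions not satisfied")
    return (state - op["delete"]) | op["add"]

start = {("ON", "B", "A"), ("ONTABLE", "A"), ("CLEAR", "B"), ("ARMEMPTY",)}
after = apply_op(start, UNSTACK_B_A)
print(("HOLDING", "B") in after, ("ON", "B", "A") in after)  # True False
```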
3. Detecting a Solution
A planning system has succeeded in finding a solution to a problem when it has found a sequence of operators that transforms the initial problem state into the goal state. In simple problem-solving systems we recognize the solution by a straightforward match of the state descriptions. In complex problems, whatever reasoning mechanisms are used to describe the problem states can also be used to discover when a solution has been found.
Example
We can describe the start state as:
ON(B, A) ONTABLE(A) ONTABLE(C) ONTABLE(D) ARMEMPTY
Example
Decompose the problem into four different subproblems in the goal stack:
1. ON(C,A)
2. ON(B,D)
3. ONTABLE(A)
4. ONTABLE(D)
Example
Depending on the order in which we want to solve the subproblems, there are two different orderings of the goal stack (where OTAD abbreviates ONTABLE(A) & ONTABLE(D)):
(1)
ON(C,A)
ON(B,D)
ON(C,A) & ON(B,D) & OTAD
(2)
ON(B,D)
ON(C,A)
ON(C,A) & ON(B,D) & OTAD
Example
At each step of the problem-solving process the top goal on the stack is pursued until the goal stack is empty. As one last check, the original goal is compared to the final state derived from the chosen operators. Choosing the first alternative, the predicate on top of the goal stack is ON(C,A).
Example
First check whether ON(C,A) is true in the current state. It is not, so find an operator that could cause it to be true. Applying the STACK(C,A) operator will lead to the goal ON(C,A):
STACK(C,A)
ON(B,D)
ON(C,A) & ON(B,D) & OTAD
Example
In order to apply the STACK(C,A) operator, its preconditions must hold, so we stack those subgoals, CLEAR(A) and HOLDING(C). The resulting goal stack is:
CLEAR(A)
HOLDING(C)
CLEAR(A) & HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) & ON(B,D) & OTAD
Example
Check whether CLEAR(A) is true. It is not. The only operator that could make it true is UNSTACK(B,A), so push it onto the goal stack:
UNSTACK(B,A)
HOLDING(C)
CLEAR(A) & HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) & ON(B,D) & OTAD
The preconditions that must be satisfied to apply the UNSTACK(B,A) operator are ON(B,A), CLEAR(B) and ARMEMPTY.
Example
So the goal stack is:
ON(B,A)
CLEAR(B)
ARMEMPTY
ON(B,A) & CLEAR(B) & ARMEMPTY
UNSTACK(B,A)
HOLDING(C)
CLEAR(A) & HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) & ON(B,D) & OTAD
Example
Compare the top element of the goal stack, ON(B,A), with the world model: it is satisfied, so pop it off. The next goal, CLEAR(B), is also satisfied; pop it off. The next goal, ARMEMPTY, is also satisfied; pop it off. Now apply the top element of the goal stack, the UNSTACK(B,A) operator, and pop it off.
Example
The database corresponding to the world model at this point is:
ONTABLE(A)
ONTABLE(C)
ONTABLE(D)
HOLDING(B)
CLEAR(A)
Example
Now attempt to satisfy the goal HOLDING(C). Two operators might make it true: PICKUP(C) and UNSTACK(C,x). Considering only the first operator, the goal stack is:
PICKUP(C)
CLEAR(A) & HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) & ON(B,D) & OTAD
Example
The preconditions for PICKUP(C) are ONTABLE(C), CLEAR(C) and ARMEMPTY, so the goal stack becomes:
ONTABLE(C)
CLEAR(C)
ARMEMPTY
ONTABLE(C) & CLEAR(C) & ARMEMPTY
PICKUP(C)
CLEAR(A) & HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) & ON(B,D) & OTAD
Example
The top element of the goal stack, ONTABLE(C), is satisfied, and the next element, CLEAR(C), is also satisfied, so pop them from the goal stack. The next element, ARMEMPTY, is not satisfied, since HOLDING(B) is true. The stack is:
ARMEMPTY
ONTABLE(C) & CLEAR(C) & ARMEMPTY
PICKUP(C)
CLEAR(A) & HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) & ON(B,D) & OTAD
Example
There are two operators that can make ARMEMPTY true: STACK(B,x) and PUTDOWN(B). Which should we choose? Looking ahead in the goal stack, we will eventually need block B on D, so we choose STACK(B,D), binding x to D.
Example
Both CLEAR(D) and HOLDING(B) are satisfied, so pop them from the goal stack and apply STACK(B,D). The resulting database is:
ONTABLE(A)
ONTABLE(C)
ONTABLE(D)
ON(B,D)
ARMEMPTY
The goal stack now is:
PICKUP(C)
CLEAR(A) & HOLDING(C)
STACK(C,A)
ON(B,D)
ON(C,A) & ON(B,D) & OTAD
Example
Now all the preconditions for PICKUP(C) are satisfied, so it can be executed. Then all the preconditions for STACK(C,A) are also satisfied, so execute it, popping both operators. The next predicate, ON(B,D), is already satisfied, so pop it off. One last check of the combined goal ON(C,A) & ON(B,D) & OTAD: also satisfied.
Example
The problem solver now halts and returns the plan:
1. UNSTACK(B,A)
2. STACK(B,D)
3. PICKUP(C)
4. STACK(C,A)
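The four-step plan can be simulated with grounded STRIPS operators; a sketch, where the start state adds the CLEAR facts implied by the block layout:

```python
# Grounded STRIPS operators following the P/D/A lists given earlier.
def op(pre, delete, add):
    return {"pre": set(pre), "delete": set(delete), "add": set(add)}

def unstack(x, y):
    return op([("ON", x, y), ("CLEAR", x), ("ARMEMPTY",)],
              [("ON", x, y), ("ARMEMPTY",)],
              [("HOLDING", x), ("CLEAR", y)])

def stack(x, y):
    return op([("CLEAR", y), ("HOLDING", x)],
              [("CLEAR", y), ("HOLDING", x)],
              [("ARMEMPTY",), ("ON", x, y)])

def pickup(x):
    return op([("ONTABLE", x), ("CLEAR", x), ("ARMEMPTY",)],
              [("ONTABLE", x), ("ARMEMPTY",)],
              [("HOLDING", x)])

def run(state, plan):
    for step in plan:
        assert step["pre"] <= state, "precondition failure"
        state = (state - step["delete"]) | step["add"]
    return state

start = {("ON", "B", "A"), ("ONTABLE", "A"), ("ONTABLE", "C"),
         ("ONTABLE", "D"), ("CLEAR", "B"), ("CLEAR", "C"),
         ("CLEAR", "D"), ("ARMEMPTY",)}
goal = {("ON", "C", "A"), ("ON", "B", "D"),
        ("ONTABLE", "A"), ("ONTABLE", "D")}

final = run(start, [unstack("B", "A"), stack("B", "D"),
                    pickup("C"), stack("C", "A")])
print(goal <= final)  # True: the plan achieves the goal
```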
SUSSMAN ANOMALY
Try to solve the following problem: initially C is on A, with A and B on the table; the goal is ON(A,B) & ON(B,C).
Example
There are two ways to order the goals:
(1)
ON(A,B)
ON(B,C)
ON(A,B) & ON(B,C)
(2)
ON(B,C)
ON(A,B)
ON(A,B) & ON(B,C)
Choose alternative 1.
Example
ON(C,A)
CLEAR(C)
ARMEMPTY
ON(C,A) & CLEAR(C) & ARMEMPTY
UNSTACK(C,A)
ARMEMPTY
CLEAR(A) & ARMEMPTY
PICKUP(A)
CLEAR(B)
HOLDING(A)
CLEAR(B) & HOLDING(A)
STACK(A,B)
ON(B,C)
ON(A,B) & ON(B,C)
Example
All the preconditions of UNSTACK(C,A) are satisfied, so pop it off and apply the operator. The goal stack now is:
ARMEMPTY
CLEAR(A) & ARMEMPTY
PICKUP(A)
CLEAR(B)
HOLDING(A)
CLEAR(B) & HOLDING(A)
STACK(A,B)
ON(B,C)
ON(A,B) & ON(B,C)
Example
To satisfy the ARMEMPTY precondition of PICKUP(A), simply apply the operator PUTDOWN(C), then pop all the conditions down to ON(B,C) and ON(A,B) & ON(B,C). The current state is:
ONTABLE(B)
ON(A,B)
ONTABLE(C)
ARMEMPTY
Example
The sequence of operators applied so far is:
1. UNSTACK(C,A)
2. PUTDOWN(C)
3. PICKUP(A)
4. STACK(A,B)
Example
Then try to achieve the other goal, ON(B,C):
ON(A,B)
CLEAR(A)
ARMEMPTY
ON(A,B) & CLEAR(A) & ARMEMPTY
UNSTACK(A,B)
ARMEMPTY
CLEAR(B) & ARMEMPTY
PICKUP(B)
CLEAR(C)
HOLDING(B)
CLEAR(C) & HOLDING(B)
STACK(B,C)
ON(A,B) & ON(B,C)
Example
All the preconditions of UNSTACK(A,B) are satisfied, so pop it off and apply the operator. The goal stack now is:
ARMEMPTY
CLEAR(B) & ARMEMPTY
PICKUP(B)
CLEAR(C)
HOLDING(B)
CLEAR(C) & HOLDING(B)
STACK(B,C)
ON(A,B) & ON(B,C)
Example
To satisfy the ARMEMPTY precondition of PICKUP(B), simply apply the operator PUTDOWN(A), then pop all the conditions down to ON(A,B) & ON(B,C).
Example
But on checking, the remaining goal ON(A,B) & ON(B,C) is not satisfied. The difference between the current state and the goal state is ON(A,B), so the sequence of operators added to the goal stack is:
9. PICKUP(A)
10. STACK(A,B)
Example
Now combine the operators and check that the goal is satisfied:
1. UNSTACK(C,A)
2. PUTDOWN(C)
3. PICKUP(A)
4. STACK(A,B)
5. UNSTACK(A,B)
6. PUTDOWN(A)
7. PICKUP(B)
8. STACK(B,C)
9. PICKUP(A)
10. STACK(A,B)
Example
But the same goal can be achieved by a better plan: 1. UNSTACK(C,A) 2. PUTDOWN(C) 3. PICKUP(B) 4. STACK(B,C) 5. PICKUP(A) 6. STACK(A,B)
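The six-step plan can be checked mechanically. Below is a minimal sketch (illustrative code, not from the slides) of a STRIPS-style blocks world: each operator carries its preconditions, delete list and add list, and a small interpreter applies the plan to the Sussman-anomaly initial state.

```python
# Minimal STRIPS-style blocks world (a sketch): each operator is a
# (preconditions, delete list, add list) triple over ground predicates
# represented as tuples.
def pickup(x):
    return ({("ONTABLE", x), ("CLEAR", x), ("ARMEMPTY",)},
            {("ONTABLE", x), ("ARMEMPTY",)},
            {("HOLDING", x)})

def putdown(x):
    return ({("HOLDING", x)},
            {("HOLDING", x)},
            {("ONTABLE", x), ("CLEAR", x), ("ARMEMPTY",)})

def stack(x, y):
    return ({("HOLDING", x), ("CLEAR", y)},
            {("HOLDING", x), ("CLEAR", y)},
            {("ON", x, y), ("CLEAR", x), ("ARMEMPTY",)})

def unstack(x, y):
    return ({("ON", x, y), ("CLEAR", x), ("ARMEMPTY",)},
            {("ON", x, y), ("ARMEMPTY",)},
            {("HOLDING", x), ("CLEAR", y)})

def run(state, plan):
    """Apply each operator in turn, checking its preconditions."""
    state = set(state)
    for pre, delete, add in plan:
        assert pre <= state, f"unsatisfied preconditions: {pre - state}"
        state = (state - delete) | add
    return state

# Sussman-anomaly initial state: C is on A; A and B are on the table.
init = {("ON", "C", "A"), ("ONTABLE", "A"), ("ONTABLE", "B"),
        ("CLEAR", "C"), ("CLEAR", "B"), ("ARMEMPTY",)}

plan = [unstack("C", "A"), putdown("C"),
        pickup("B"), stack("B", "C"),
        pickup("A"), stack("A", "B")]

final = run(init, plan)
assert {("ON", "A", "B"), ("ON", "B", "C")} <= final
print("plan valid; goal reached")
```

Running the ten-step plan through the same interpreter also succeeds, which makes the point of the anomaly concrete: both plans are valid, but one is needlessly long.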
Example
Try to solve the Sussman anomaly using nonlinear planning.
Constraint Posting
Constraint posting builds a plan by suggesting operators, trying to order them, and producing bindings between variables in the operators and actual blocks.
The initial plan consists of no steps; there is no order or detail at this stage. Gradually, more detail and more constraints on the order of subsets of the steps are introduced until a completely ordered sequence is created.
Step Addition
Introducing new steps to achieve goals or preconditions is called step addition. In our problem we incrementally generate a nonlinear plan, starting from a plan with no steps. By means-ends analysis we choose the two subgoals ON(A,B) and ON(B,C), and add new steps to achieve them.
Step Addition
CLEAR(B)   *HOLDING(A)
---------------------
      STACK(A,B)
---------------------
ARMEMPTY   ON(A,B)   ~CLEAR(B)   ~HOLDING(A)

CLEAR(C)   *HOLDING(B)
---------------------
      STACK(B,C)
---------------------
ARMEMPTY   ON(B,C)   ~CLEAR(C)   ~HOLDING(B)
Each step is shown with its preconditions above it and its postconditions below it. Deleted postconditions are marked with the (~) symbol; unachieved preconditions are marked with the (*) symbol.
Step Addition
To achieve the preconditions of the two steps above we use step addition again
*CLEAR(A)   ONTABLE(A)   *ARMEMPTY
----------------------------------
            PICKUP(A)
----------------------------------
~ONTABLE(A)   ~ARMEMPTY   HOLDING(A)

*CLEAR(B)   ONTABLE(B)   *ARMEMPTY
----------------------------------
            PICKUP(B)
----------------------------------
~ONTABLE(B)   ~ARMEMPTY   HOLDING(B)
Promotion
Promotion was first used by Sussman in his HACKER program. Promotion means posting a constraint that one step must precede another. Adding the PICKUP steps alone does not satisfy the *HOLDING preconditions of the STACK steps, because no ordering constraints are yet present among the steps. S1 ≺ S2 means that step S1 precedes step S2.
Promotion
The PICKUP steps should precede the STACK steps, so post: PICKUP(A) ≺ STACK(A,B) and PICKUP(B) ≺ STACK(B,C). In the step-addition figure above, *CLEAR(A) is unachieved because block A is not clear in the initial state. *CLEAR(B) is unachieved even though B is clear in the initial state, because there exists a step, STACK(A,B), with postcondition ~CLEAR(B).
Promotion
So we can achieve CLEAR(B) by stating that the PICKUP(B) step must come before STACK(A,B): PICKUP(B) ≺ STACK(A,B). Now turn to the two unachieved preconditions *ARMEMPTY and *CLEAR(A), and first try to achieve *ARMEMPTY.
Promotion
The initial state has an empty arm, but each of the operators PICKUP(A) and PICKUP(B) has the postcondition ~ARMEMPTY. Either operator could therefore prevent the other from executing, so they must be ordered with respect to each other.
Declobbering
Declobbering means placing a new step between two old steps. The initial state contains an empty arm, so all the preconditions of PICKUP(B) are satisfied; but the result of PICKUP(B) denies ARMEMPTY, which PICKUP(A) needs. This can be solved by inserting another step between PICKUP(B) and PICKUP(A) that reasserts ARMEMPTY. STACK(B,C) achieves this (the declobbering heuristic): PICKUP(B) ≺ STACK(B,C) ≺ PICKUP(A).
Simple Establishment
Now try to solve the unachieved precondition *CLEAR(A) of PICKUP(A). By step addition:
*ON(x,A)   *CLEAR(x)   *ARMEMPTY
--------------------------------
          UNSTACK(x,A)
--------------------------------
~ARMEMPTY   CLEAR(A)   HOLDING(x)   ~ON(x,A)
Simple Establishment
Simple establishment means assigning a value to a variable. We introduced the variable x because the only postcondition we are interested in is CLEAR(A). Set x = C in the step UNSTACK(x,A).
Simple Establishment
The other preconditions to be satisfied are CLEAR(C) and ARMEMPTY; we use promotion to order the new step before the others: UNSTACK(x,A) ≺ STACK(B,C), UNSTACK(x,A) ≺ PICKUP(A), UNSTACK(x,A) ≺ PICKUP(B). The ordering so far is:

UNSTACK(C,A) ≺ PICKUP(B) ≺ STACK(B,C) ≺ PICKUP(A) ≺ STACK(A,B)
Example
The step PICKUP(B) requires ARMEMPTY, but this is denied by the new UNSTACK(C,A) step. Use declobbering to insert the step PUTDOWN(C) between the two:
HOLDING(C)
---------------
   PUTDOWN(C)
---------------
~HOLDING(C)   ONTABLE(C)   ARMEMPTY
Example
The final ordering and the resulting plan:
UNSTACK(C,A) ≺ PUTDOWN(C) ≺ PICKUP(B) ≺ STACK(B,C) ≺ PICKUP(A) ≺ STACK(A,B)

1. UNSTACK(C,A)
2. PUTDOWN(C)
3. PICKUP(B)
4. STACK(B,C)
5. PICKUP(A)
6. STACK(A,B)
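Turning a partial order of steps into a linear plan is just topological sorting. A short sketch (illustrative, using Python's standard graphlib) linearizes the constraints posted by promotion and declobbering:

```python
from graphlib import TopologicalSorter

# Ordering constraints posted for the Sussman anomaly (S1 ≺ S2 pairs).
precedes = [("UNSTACK(C,A)", "PUTDOWN(C)"),
            ("PUTDOWN(C)", "PICKUP(B)"),
            ("PICKUP(B)", "STACK(B,C)"),
            ("STACK(B,C)", "PICKUP(A)"),
            ("PICKUP(A)", "STACK(A,B)")]

ts = TopologicalSorter()
for before, after in precedes:
    ts.add(after, before)      # 'after' depends on 'before'

plan = list(ts.static_order())
print(plan)
```

Because the constraints here form a chain, there is exactly one linearization: the six-step plan above. In general a partial order may admit many linearizations, all of them valid plans.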
Example
In the nonlinear-planning example above we used four heuristics: step addition, promotion, declobbering and simple establishment. A fifth heuristic, separation, prevents the assignment of certain values to variables.
TWEAK Algorithm
1. Initialize S to be the set of propositions in the goal state.
2. Repeat:
   I.  Remove some unachieved proposition P from S.
   II. Achieve P by using one of the heuristics.
Hierarchical Planning
Hierarchical Planning
The main difficulty in STRIPS-like planning is complexity. One reason for the complexity is that there is no structure: there is no distinction between important and unimportant properties, and no distinction between important and unimportant operators. This observation gives rise to two different ways of introducing abstraction into planning: abstraction of situations and abstraction of operators.
Cont
It is important to be able to eliminate some of the details of the problem until a solution that addresses the main issues is found. Early attempts to do this involved the use of macro-operators, but in that approach no details were eliminated from the actual descriptions of the operators.
Cont
Consider an example: you want to visit a friend in Europe, but you have a limited amount of cash to spend. Your first step should be to find the airfares, since finding an affordable flight will be the most difficult part of the task. You should not worry about getting out of your driveway, planning a route to the airport, and so on, until you are sure you have a flight.
Introduction
Language is meant for Communicating about the world. By studying language, we can come to understand more about the world. We look at how we can exploit knowledge about the world, in combination with linguistic facts, to build computational natural language systems.
Introduction
The NLP problem can be divided into two tasks: processing written text, using lexical, syntactic and semantic knowledge of the language as well as the required real-world information; and processing spoken language, using all of the above plus additional knowledge about phonology and enough added information to handle the further ambiguities that arise in speech.
Steps in NLP
Morphological Analysis: Individual words are analyzed into their components and non word tokens such as punctuation are separated from the words. Syntactic Analysis: Linear sequences of words are transformed into structures that show how the words relate to each other. Semantic Analysis: The structures created by the syntactic analyzer are assigned meanings.
Steps in NLP
Discourse integration: The meaning of an individual sentence may depend on the sentences that precede it and may influence the meanings of the sentences that follow it. Pragmatic Analysis: The structure representing what was said is reinterpreted to determine what was actually meant. For example, the sentence "Do you know what time it is?" should be interpreted as a request to tell the time.
Morphological Analysis
Suppose we have an English interface to an operating system and the following sentence is typed: "I want to print Bill's .init file." Morphological analysis must do the following things: pull apart the word "Bill's" into the proper noun "Bill" and the possessive suffix "'s"; and recognize the sequence ".init" as a file extension that is functioning as an adjective in the sentence.
Morphological Analysis
This process will also assign syntactic categories to all the words in the sentence. Consider the word "prints": this word is either a plural noun or a third-person singular verb (as in "he prints").
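The two morphological operations described above can be sketched as simple string processing; the lexicon and rules below are illustrative, not the book's:

```python
# Toy morphological analyzer (lexicon and rules are illustrative only).
LEXICON = {"i": "PRONOUN", "want": "VERB", "to": "PARTICLE",
           "print": "VERB", "bill": "PROPER-NOUN", "file": "NOUN"}

def analyze(token):
    """Split off a possessive suffix and tag file extensions."""
    t = token.lower()
    if t.startswith("."):
        # e.g. ".init": a file extension functioning as an adjective
        return [(token, "FILE-EXTENSION")]
    if t.endswith("'s"):
        stem = t[:-2]
        return [(stem, LEXICON.get(stem, "UNKNOWN")),
                ("'s", "POSSESSIVE-SUFFIX")]
    return [(t, LEXICON.get(t, "UNKNOWN"))]

print(analyze("Bill's"))   # → [('bill', 'PROPER-NOUN'), ("'s", 'POSSESSIVE-SUFFIX')]
print(analyze(".init"))    # → [('.init', 'FILE-EXTENSION')]
```

A real analyzer would of course handle many more suffixes (plural -s, past -ed, and so on) and ambiguous cases such as "prints".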
Syntactic Analysis
Syntactic analysis must exploit the results of morphological analysis to build a structural description of the sentence. The goal of this process, called parsing, is to convert the flat list of words that forms the sentence into a structure that defines the units that are represented by that flat list.
Syntactic Analysis
The important thing here is that a flat sentence has been converted into a hierarchical structure and that the structure correspond to meaning units when semantic analysis is performed. Reference markers are shown in the parenthesis in the parse tree. Each one corresponds to some entity that has been mentioned in the sentence.
Syntactic Analysis
S (RM1)
  NP
    PRO: I (RM2)
  VP
    V: want
    S (RM3)
      NP
        PRO: I (RM2)
      VP
        V: print
        NP (RM4)
          ADJS: Bill's (RM5)
          ADJS: .init
          N: file
Semantic Analysis
Semantic analysis must do two important things:
It must map individual words into appropriate objects in the knowledge base or database. It must create the correct structures to correspond to the way the meanings of the individual words combine with each other.
Discourse Integration
Specifically, we do not know whom the pronoun "I" or the proper noun "Bill" refer to. To pin down these references requires an appeal to a model of the current discourse context, from which we can learn that the current user is USER068 and that the only person named Bill about whom we could be talking is USER073. Once the correct referent for Bill is known, we can also determine exactly which file is being referred to.
Pragmatic Analysis
The final step toward effective understanding is to decide what to do as a result. One possible thing to do is to record what was said as a fact and be done with it. For some sentences, whose intended effect is clearly declarative, that is precisely the correct thing to do. But for other sentences, including this one, the intended effect is different.
Pragmatic Analysis
We can discover this intended effect by applying a set of rules that characterize cooperative dialogues. The final step in pragmatic processing is to translate from the knowledge-based representation to a command to be executed by the system. The result of the understanding process is: lpr /wsmith/stuff.init
Syntactic Processing
Syntactic Processing
Syntactic Processing is the step in which a flat input sentence is converted into a hierarchical structure that is called parsing. It plays an important role in natural language understanding systems for two reasons:
Semantic processing must operate on sentence constituents. If there is no syntactic parsing step, then the semantic system must decide on its own constituents. Thus syntactic parsing can play a significant role in reducing overall system complexity.
Syntactic Processing
Although it is often possible to extract the meaning of a sentence without using grammatical facts, it is not always possible to do so. Consider the examples: "The satellite orbited Mars" and "Mars orbited the satellite". In the second sentence, syntactic facts demand an interpretation in which a planet revolves around a satellite, despite the apparent improbability of such a scenario.
Syntactic Processing
Almost all the systems that are actually used have two main components: a declarative representation, called a grammar, of the syntactic facts about the language; and a procedure, called a parser, that compares the grammar against input sentences to produce parsed structures.
[Figure: parse trees for "Bill printed the file" and "John ate the apple", together with an ATN network diagram whose states include Q1–Q9 and a final state F.]
Parsing the sentence "The long file has printed", execution proceeds as follows:

1. Begin in state S.
2. Push to NP.
3. Do a category test to see if "the" is a determiner.
4. This test succeeds, so set the DETERMINER register to DEFINITE and go to state Q6.
5. Do a category test to see if "long" is an adjective.
6. This test succeeds, so append "long" to the list contained in the ADJS register. Stay in state Q6.
7. Do a category test to see if "file" is an adjective. This test fails.
8. Do a category test to see if "file" is a noun. This test succeeds, so set the NOUN register to "file" and go to state Q7.
9. Push to PP.
10. Do a category test to see if "has" is a preposition. This test fails, so pop and signal failure.
11. There is nothing else that can be done from state Q7, so pop and return the structure.
12. The return causes the machine to be in state Q1, with the SUBJ register set to the structure just returned and the TYPE register set to DCL.
13. Do a category test to see if "has" is a verb. This test succeeds, so set the AUX register to NIL and set the V register to "has". Go to state Q4.
14. Push to NP. Since the next word, "printed", is not a determiner or a proper noun, NP will pop and return failure.
15. The only other thing to do in state Q4 is to halt. But more input remains, so a complete parse has not been found. Backtracking is now required.
16. The last choice point was at state Q1, so return there. The registers AUX and V must be unset.
17. Do a category test to see if "has" is an auxiliary. This test succeeds, so set the AUX register to "has". Go to state Q3.
18. Do a category test to see if "printed" is a verb. This test succeeds, so set the V register to "printed". Go to state Q4.
19. Now, since the input is exhausted, Q4 is an acceptable final state. Pop and return the structure:
    (S DCL (NP (FILE (LONG) DEFINITE)) HAS (VP PRINTED))
    This structure is the output of the parse.
ATNs can also be used in a variety of other ways:
- The contents of registers can be swapped. If the network were expanded to recognize passive sentences, then at the point the passive was detected, the current contents of the SUBJ register would be transferred to an OBJ register, and the object of the preposition "by" would be placed in the SUBJ register:
  Bill printed the file.
  The file was printed by Bill.
- Arbitrary tests can be placed on the arcs. In each of the arcs above, the test is specified simply as T, but this need not be the case. Suppose that when the first NP is found, its number is determined and recorded in a register called NUMBER. Then the arcs labeled V could have an additional test placed on them that checks that the number of the particular verb found is equal to the value stored in NUMBER.
Unification Grammar
Unification grammars limit procedurality, which matters in applications such as speech understanding, and allow understanding and generation from the same grammar. The major operations performed by a parser while applying such a grammar are: matching (of sentence constituents to grammar rules) and building structure (corresponding to the result of combining constituents).
Unification Grammar
A DAG (directed acyclic graph) can be used to define the unification operator. Each DAG represents a set of attribute-value pairs. For example, the words "the" and "file" are represented as:

[CAT: DET        [CAT: N
 LEX: the]        LEX: file
                  NUMBER: SING]

The result of combining these two words is:

[NP: [DET: the
      HEAD: file
      NUMBER: SING]]

We describe the NP rule as NP → DET N. As a graph, the rule is:

[CONSTITUENT1: [CAT: DET
                LEX: {1}]
 CONSTITUENT2: [CAT: N
                LEX: {2}
                NUMBER: {3}]
 BUILD: [NP: [DET: {1}
              HEAD: {2}
              NUMBER: {3}]]]

Note that the order in which attribute-value pairs are stated does not matter; for example, [CAT: DET LEX: the] should match a constituent such as [LEX: the CAT: DET].
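The unification operation itself can be sketched over nested dictionaries; this illustrates the idea rather than the exact DAG algorithm:

```python
def unify(a, b):
    """Unify two feature structures: dicts merge recursively,
    atomic values must be equal. Returns None on a clash."""
    if isinstance(a, dict) and isinstance(b, dict):
        out = dict(a)
        for key, bval in b.items():
            if key in out:
                merged = unify(out[key], bval)
                if merged is None:
                    return None
                out[key] = merged
            else:
                out[key] = bval
        return out
    return a if a == b else None

det  = {"CAT": "DET", "LEX": "the"}
noun = {"CAT": "N", "LEX": "file", "NUMBER": "SING"}

# Attribute order does not matter:
assert unify(det, {"LEX": "the", "CAT": "DET"}) == det
# A clash (e.g. NUMBER: SING vs NUMBER: PL) fails:
assert unify(noun, {"NUMBER": "PL"}) is None

# The BUILD part of the NP rule, with {1}, {2}, {3} bound:
np = {"NP": {"DET": det["LEX"], "HEAD": noun["LEX"],
             "NUMBER": noun["NUMBER"]}}
print(np)
```

The `{1}`, `{2}`, `{3}` tags in the grammar rule correspond to the shared values copied into the built NP structure; a fuller implementation would represent them as reentrant nodes in the DAG rather than by copying.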
Semantic Analysis
Semantic Analysis
Producing a syntactic parse of a sentence is only the first step toward understanding it. We must still produce a representation of the meaning of the sentence. Because understanding is a mapping process, we must first define the language into which we are trying to map. There is no single definitive language in which all sentence meaning can be described. The choice of a target language for any particular natural language understanding program must depend on what is to be done with the meanings once they are constructed.
Lexical processing
The first step in any semantic processing system is to look up the individual words in a dictionary (or lexicon) and extract their meanings. Many words have several meanings, and it may not be possible to choose the correct one just by looking at the word itself. The process of determining the correct meaning of an individual word is called word sense disambiguation or lexical disambiguation. It is done by associating, with each word in the lexicon, information about the contexts in which each of the word's senses may appear.
Lexical processing
For example, the word "diamond" might have the following set of meanings: a geometrical shape with four equal sides; a baseball field; an extremely strong and valuable gemstone. To select the correct meaning of "diamond" in the sentence "Joan saw Susan's diamond shining from across the room", it helps to know that neither geometrical shapes nor baseball fields shine, but gemstones do.
Lexical processing
The baseball-field sense of "diamond", for example, can be marked with the semantic marker LOCATION. Some useful semantic markers are:
PHYSICAL-OBJECT
ANIMATE-OBJECT
ABSTRACT-OBJECT
TIME
LOCATION
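Selecting a sense by semantic markers can be sketched as matching a verb's selectional restriction against the markers of each sense; the marker assignments and the restriction for "shine" below are illustrative:

```python
# Each sense of "diamond" carries semantic markers; the verb "shine"
# is assumed to prefer a PHYSICAL-OBJECT subject.
SENSES = {
    "geometrical shape": {"ABSTRACT-OBJECT"},
    "baseball field":    {"LOCATION"},
    "gemstone":          {"PHYSICAL-OBJECT"},
}

def disambiguate(word_senses, required_markers):
    """Return the senses whose markers satisfy the restriction."""
    return [sense for sense, marks in word_senses.items()
            if required_markers <= marks]

print(disambiguate(SENSES, {"PHYSICAL-OBJECT"}))   # → ['gemstone']
```

This is the mechanism behind the "diamond" example: only the gemstone sense satisfies the subject restriction of "shine", so the other senses are discarded.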
Sentence-Level Processing
Several approaches to the problem of creating a semantic representation of a sentence have been developed, including the following:
- Semantic grammars, which combine syntactic, semantic and pragmatic knowledge into a single set of rules in the form of a grammar.
- Case grammars, in which the structure that is built by the parser contains some semantic information, although further interpretation may also be necessary.
- Conceptual parsing, in which syntactic and semantic knowledge are combined into a single interpretation system that is driven by the semantic knowledge.
- Approximately compositional semantic interpretation, in which semantic processing is applied to the result of performing a syntactic parse.
Semantic grammars
A semantic grammar is a context-free grammar in which the choice of nonterminals and production rules is governed by semantic as well as syntactic function. There is usually a semantic action associated with each grammar rule. The result of parsing and applying all the associated semantic actions is the meaning of the sentence.
Example
S → what is FILE-PROPERTY of FILE   {query FILE.FILE-PROPERTY}
S → I want to ACTION   {command ACTION}
FILE-PROPERTY → the FILE-PROP   {FILE-PROP}
FILE-PROP → extension | protection | creation date | owner   {value}
FILE → FILE-NAME | FILE1   {value}
FILE1 → USER's FILE2   {FILE2.owner: USER}
FILE1 → FILE2   {FILE2}
FILE2 → EXT file   {instance: file-struct, extension: EXT}
Example
EXT → .init | .txt | .lsp | .for | .ps | .mss   {value}
ACTION → print FILE   {instance: printing, object: FILE}
ACTION → print FILE on PRINTER   {instance: printing, object: FILE, printer: PRINTER}
USER → Bill | Susan   {value}
[Figure: parse of "I want to print Bill's .init file" under this grammar; the FILE1 node carries the structure {instance: file-struct, extension: .init, owner: Bill}.]
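A fragment of the semantic grammar above can be sketched as a recursive-descent parser whose rule actions build the meaning directly; only the "I want to print …" derivation is implemented, and the helper names are ours:

```python
# Parses "I want to print <USER>'s <EXT> file" using the semantic-grammar
# rules above; each rule's action builds part of the meaning structure.
EXTS = {".init", ".txt", ".lsp", ".for", ".ps", ".mss"}
USERS = {"Bill", "Susan"}

def parse(tokens):
    # S -> I want to ACTION   {command ACTION}
    if tokens[:4] != ["I", "want", "to", "print"]:
        raise ValueError("only the 'I want to print ...' rule is implemented")
    return {"command": {"instance": "printing",
                        "object": parse_file(tokens[4:])}}

def parse_file(tokens):
    struct = {"instance": "file-struct"}
    if tokens and tokens[0].endswith("'s") and tokens[0][:-2] in USERS:
        # FILE1 -> USER's FILE2   {FILE2.owner: USER}
        struct["owner"] = tokens[0][:-2]
        tokens = tokens[1:]
    if tokens and tokens[0] in EXTS:
        # FILE2 -> EXT file   {instance: file-struct, extension: EXT}
        struct["extension"] = tokens[0]
        tokens = tokens[1:]
    assert tokens == ["file"], "expected the literal word 'file'"
    return struct

meaning = parse(["I", "want", "to", "print", "Bill's", ".init", "file"])
print(meaning)
```

The resulting structure mirrors the one shown in the figure: a printing command whose object is a file-struct with extension .init and owner Bill.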
Semantic grammars
The advantages of semantic grammars are:
- When the parse is complete, the result can be used immediately, without an additional stage of processing.
- Many ambiguities that would arise during a strictly syntactic parse can be avoided.
- Syntactic issues that do not affect the semantics can be ignored.
The disadvantages are:
- The number of rules required can become very large, since many syntactic generalizations are missed.
- Because the number of grammar rules may be very large, the parsing process may be expensive.
Case grammars
Case grammars provide a different approach to the problem of how syntactic and semantic interpretation can be combined. Grammar rules are written to describe syntactic rather than semantic regularities, but the structures the rules produce correspond to semantic relations rather than to strictly syntactic ones. Consider two sentences: "Susan printed the file." and "The file was printed by Susan."
Case grammars
The case-grammar interpretation of the two sentences would be the same: (printed (agent Susan) (object file))
[Figure: syntactic parse trees of the two sentences, which differ in structure yet map to the same case frame.]
Case grammars
Mother baked for three hours. → (baked (agent Mother) (timeperiod 3-hours))
The pie baked for three hours. → (baked (object pie) (timeperiod 3-hours))
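The way the same surface subject position fills different case slots can be sketched as a small lookup; the animacy test below is a crude, illustrative stand-in for real selectional restrictions:

```python
# Case-grammar sketch: map a parsed clause to a case frame. Whether the
# surface subject fills the agent or object slot depends on voice and
# (for intransitives like "baked") on whether the subject is animate.
ANIMATE = {"Susan", "Mother"}

def case_frame(verb, subject, obj=None, by_phrase=None, passive=False):
    slots = []
    if passive:                       # "The file was printed by Susan"
        slots.append(("object", subject))
        if by_phrase:
            slots.append(("agent", by_phrase))
    elif subject in ANIMATE:          # "Mother baked ..."
        slots.append(("agent", subject))
        if obj:
            slots.append(("object", obj))
    else:                             # "The pie baked ..."
        slots.append(("object", subject))
    return (verb, sorted(slots))

# Active and passive sentences yield the same case frame:
assert case_frame("printed", "Susan", obj="file") == \
       case_frame("printed", "file", by_phrase="Susan", passive=True)
print(case_frame("baked", "pie"))     # → ('baked', [('object', 'pie')])
```

This reproduces both examples: "Susan printed the file" and "The file was printed by Susan" map to one frame, while "pie" lands in the object slot even though it is the syntactic subject.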
[Figure: parse trees for the two "baked" sentences; the same surface subject position fills different case slots.]
Conceptual Parsing
Conceptual parsing is a strategy for finding both the structure and the meaning of a sentence in one step. It is driven by a dictionary that describes the meanings of words as conceptual dependency (CD) structures. Conceptual parsing is similar to case-grammar parsing, but CD usually provides a greater degree of predictive power.
Conceptual Parsing
[Figure: CD dictionary entry for the word "want", distinguishing a stative sense, a transitive sense and an intransitive sense, each with conceptual expectations such as a human actor, an object, and the state pleased.]
Understanding multi-sentence texts requires knowledge of several kinds:
- The entities involved in an action. Consider the text: "My house was broken into last week. They took the TV and the stereo."
- Names of individuals. Consider the text: "Dave went to the movies."
- Causal chains. Consider the text: "There was a big storm yesterday. The schools were closed today."
- Planning sequences. Consider the text: "Sally wanted a new car. She decided to get a job."
- Implicit presuppositions. Consider the text: "Did Joe fail CS101?"
The End
References:
1. Elaine Rich and Kevin Knight, Artificial Intelligence.
2. Stuart Russell and Peter Norvig, Artificial Intelligence: A Modern Approach.