
BIT 24 (1984), 288-301

PROPOSITIONS AND SPECIFICATIONS OF PROGRAMS

IN MARTIN-LÖF'S TYPE THEORY

BENGT NORDSTRÖM and JAN SMITH


Programming Methodology Group, Department of Computer Sciences, University of Göteborg and
Chalmers University of Technology, S-41296 Göteborg, Sweden

Abstract.
The constructive meaning of mathematical propositions makes it possible to identify specifications
for computer programs with propositions. In Martin-Löf's type theory, constructing a program
satisfying a specification corresponds to proving the proposition expressing the specification.
These ideas are explained and some examples of specifications are given.

1. Introduction.

The task of the programmer is to write an efficient and correct program to a
given specification. The major concern has been efficiency as is seen in the
structure of traditional imperative programming languages. This has been
pointed out, for instance, by Backus in his Turing Lecture [1]. The languages
reflect the design of the traditional computers in many ways. Locations in the
memory of the computer are reflected as pointers and as arguments passed by
reference. The size and structure of a word in the memory are reflected as
limitations on the objects that may be passed as arguments or returned as values
from a procedure.
The basic computation step in a computer is a modification of its state.
Likewise, an imperative program is seen as a state to state transformation. This
often results in side effects, which make it difficult to understand a program
from an understanding of the smaller programs which, in general, it is
composed of. Thus, present day programs are often poorly understood and
accounts of programs are at a low level of abstraction.
Especially in the light of the rapid development of cheap and fast hardware, it
seems unavoidable that efficiency is going to play a diminished role in the
future. Programs will be used for solving larger and more complex tasks, so it
will become more and more important to be able to reason about the properties
of the programs. If correctness is going to be a major concern, it is necessary
that programming languages are designed together with proof rules which
make it possible to reason about programs. But, to be used, the rules must be

Received June 1983. Revised May 1984.



natural in the sense that they should allow simple formalization of intuitive
reasoning. That traditional programming languages are very far from this
requirement is illustrated by O'Donnell [19], who has some examples of
published incorrect rules for a weak imperative language containing only
assignments, conditionals, while-statements and function definitions.
In mathematical logic, formal languages in which one can express substantial
parts of mathematics have a long history, going back to Frege's Begriffsschrift
from 1879. So, if we want a programming language in which it should be
possible not just to write down programs but also to reason about them, an
obvious attempt is to see whether any of the formalizations used for
mathematics could also be used for programming.
Today, the standard formalization of classical, i.e. nonconstructive,
mathematics is the axiomatization of set theory given by Zermelo in 1908.
However, there is no notion of computation in classical set theory. So, using a
formalization of nonconstructive set theory for programming involves a serious
problem: there is no natural way of representing programs. For instance, the
notion of function cannot be used because functions in classical set theory are in
general not computable.
Out of the foundational crisis of mathematics in the first decades of this
century, constructive mathematics arose as an independent branch of
mathematics, mainly developed by Brouwer under the name intuitionism.
Constructive mathematics did not get much support because of the general
belief that important parts of mathematics were impossible to develop
constructively. By the work of Bishop, however, this belief has been shown to be
mistaken. In his book "Foundations of constructive analysis" [2], he
constructively rebuilds central parts of classical analysis; and he does it in a
way that demonstrates that constructive mathematics can be as simple and
elegant as classical mathematics. Bishop [3] also envisaged the possibility of
using a formalization of constructive mathematics for programming, starting
from Gödel's [9] theory of computable functionals of finite type. Constable [6],
Goto [8], Sato [21], and Takasu [26] have also proposed constructive
mathematics as a foundation of programming.
Martin-Löf's type theory [15] was developed with the aim of being a
formalization of constructive mathematics. Its rules are formulated in the style
of Gentzen's natural deduction system for predicate logic, a formal system
which Gentzen set up in 1934 with the intention that it should be as close as
possible to actual reasoning. Martin-Löf [16] has suggested that type theory
also could be viewed as a programming language. As such it is a typed
functional language without assignments or other imperative features.
Compared with other programming languages, it has a very rich type structure
in that the type of a program can be used to completely specify the task of the
program; it can be used to describe what the program should do without
describing how the program performs its task. This is in contrast with, for
instance, the transformational approach of Burstall and Darlington [5].
According to them, a specification is given by a program which is constructed
without paying attention to efficiency, and then this program is transformed to
a more efficient one. In type theory, a specification may be given without giving
any program that satisfies the specification. This is similar to mathematics in
general: you may very well formulate a proposition, like e.g. Fermat's Last
Theorem, without having any idea of how to prove it. The constructive
explanation of propositions makes it possible to identify specifications and
mathematical propositions. In type theory, propositions are represented by
types, and in the following section we give the constructive explanation of the
notion of proposition and its connection with the notion of type. The remainder
of the paper gives some examples of how specifications can be expressed in this
framework.

2. The constructive explanation of the logical constants.


2.1. Propositions and types
How is it possible that propositions can be represented as types? In order to
understand that, we will explain the intuitionistic meaning of the logical
constants, specifically as done by Heyting [10]. In classical mathematics, a
proposition is thought of as being true or false independently of whether we can
prove or disprove it. On the other hand, a proposition is constructively true
only if we have a method of proving it. For example, classically the law of
excluded middle, A ∨ ¬A, is valid since any proposition A is either true or false.
Constructively, however, a disjunction is true only if we can prove one of the
disjuncts. Since we have no method of proving or disproving an arbitrary
proposition A, we have no proof of A ∨ ¬A and therefore the law of excluded
middle is not intuitionistically valid.
So, the constructive explanations of the logical concepts must be spelled out
in terms of proofs and not in terms of a world of mathematical objects existing
independently of us. First, let us consider only implication and conjunction.

A proof of A ⊃ B is a function (method, program) which to each proof of A
gives a proof of B.

For example, in order to prove A ⊃ A we have to give an effectively
computable function which to each proof of A gives a proof of A; the obvious
choice is the identity function λx.x, using the λ-notation.

A proof of A & B is a pair whose first component is a proof of A and whose


second component is a proof of B.

If we denote the left projection by fst, i.e. fst((a, b)) = a where (a, b) is the pair
of a and b, λx.fst(x) is a proof of (A & B) ⊃ A, which can be seen as follows.


Assume that

x is a proof of A & B.

Since x must be a pair whose first component is a proof of A, we get

fst(x) is a proof of A.

Hence, λx.fst(x) is a function which to each proof of A & B gives a proof of A,
i.e. λx.fst(x) is a proof of (A & B) ⊃ A.
The idea behind propositions as types is to identify a proposition with the
type of its proofs. That a proposition is true then means that its corresponding
type is nonempty. For implication and conjunction we get, in view of the
explanations above,

A ⊃ B is identified with A → B, the type of functions from A to B

and

A & B is identified with A × B, the cartesian product of A and B.

Using the λ-notation, the objects of A → B are of the form λx.b(x), where
b(x) ∈ B when x ∈ A, and the objects of type A × B are of the form (a, b) where
a ∈ A and b ∈ B.
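These identifications can be tried out directly in a proof assistant. The following two lines are a minimal Lean 4 sketch (ours, not the paper's), where a type plays the role of a proposition and an inhabitant plays the role of a proof:

    example (A : Type) : A → A :=
      fun x => x          -- the identity function λx.x proves A ⊃ A

    example (A B : Type) : A × B → A :=
      fun x => x.fst      -- λx.fst(x) proves (A & B) ⊃ A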
These identifications may seem rather obvious, but they were not observed
until Curry [7], and then only as a formal correspondence of the types of the
basic combinators and logical axioms for a language only involving implication.
This was extended to first order intuitionistic arithmetic by Howard [11] in
1969. Similar ideas also occur in de Bruijn [4] and Läuchli [14]. Scott [22] was
the first to suggest a theory of constructions in which propositions are
introduced as types. The idea of using constructions to represent proofs is also
related to recursive realizability interpretations, first developed by Kleene [12]
for intuitionistic arithmetic and extensively used in metamathematical
investigations of constructive mathematics.
These ideas are incorporated in Martin-Löf's type theory, which has enough
types to express all the logical constants. In particular, type theory has function
types and cartesian products which, as we have seen, make it possible to express
implication and conjunction. Let us now see what type forming operators are
needed for the remaining logical constants.
A disjunction is constructively true only if we can prove one of the disjuncts.
So a proof of A v B is either a proof of A or a proof of B together with an
indication of which disjunct we have a proof of.

Hence,

A ∨ B is identified with A + B, the disjoint union of A and B.

The objects of A + B are of the form inl(a) and inr(b), where a ∈ A and b ∈ B.
The negation of a proposition A can be defined by

¬A ≡ A ⊃ ⊥

where ⊥ stands for absurdity, i.e. the proposition which has no proof. If we let
∅ denote the empty type, we have that

¬A is identified with the type A → ∅

using the interpretation of implication.
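Continuing the Lean 4 sketch (again ours, not the paper's), the disjoint union is written ⊕ and the empty type Empty, so the last two identifications read:

    example (A B : Type) : A → A ⊕ B :=
      fun a => Sum.inl a              -- a proof of A gives a proof of A ∨ B

    example (A : Type) : A × (A → Empty) → Empty :=
      fun p => p.snd p.fst            -- A together with ¬A yields absurdity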


To express propositional logic we only require types that are available in
many programming languages. In order to deal with the quantifiers, however,
we need operations defined on families of types, i.e. types B(x) depending on an
arbitrary object x of some type A. Heyting's explanation of the existential
quantifier is the following.
A proof of (∃x ∈ A)B(x) consists of an object a of type A together with a proof
of B(a).
So a proof of (∃x ∈ A)B(x) is a pair whose first component a is an object of
type A and whose second component is a proof of B(a). The corresponding type
is the disjoint union (Σx ∈ A)B(x) of a family of types B(x), where x ∈ A. The
objects of this type are pairs (a, b) where a ∈ A and b ∈ B(a). We get the
following interpretation of the existential quantifier:

(∃x ∈ A)B(x) is identified with the type (Σx ∈ A)B(x).

Finally, we have the universal quantifier.

A proof of (∀x ∈ A)B(x) is a function (method, program) which to each object a
of type A gives a proof of B(a).

The type corresponding to the universal quantifier is the cartesian product of a
family of types, denoted by (Πx ∈ A)B(x). The objects of this type are functions
which, when applied to an object a of type A, give an object of type B(a). Hence,

(∀x ∈ A)B(x) is identified with the type (Πx ∈ A)B(x).

The objects of (Πx ∈ A)B(x) are of the form λx.b(x) where b(x) ∈ B(x) when
x ∈ A.
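In Lean 4 the corresponding type formers are the Σ-type and the dependent function type; the following sketch, with a family B : A → Type in place of B(x), shows how their objects are introduced and used:

    example (A : Type) (B : A → Type) (a : A) (b : B a) : Σ x : A, B x :=
      ⟨a, b⟩        -- the pair (a, b) proves (∃x ∈ A)B(x)

    example (A : Type) (B : A → Type) (f : (x : A) → B x) (a : A) : B a :=
      f a           -- applying a proof of (∀x ∈ A)B(x) to a gives a proof of B(a)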

Except for the empty type, we have not yet introduced any types that correspond
to atomic propositions. One such type is the equality type a =_A b, which
expresses that a and b are equal objects of type A. Recalling that a proposition
is identified with the type of its proofs, we see that this type has to be nonempty
if and only if a and b are equal. If a and b are equal objects of type A, we
postulate that the constant e is the only object of the type a =_A b. This is
similar to recursive realizability interpretations of arithmetics where one usually
lets the natural number 0 realize a true atomic formula.
Besides the types for interpreting logic, there are, of course, the usual types
needed for programming: natural numbers, lists and enumeration types like
Boolean. There are also other ways of introducing types. For instance, types
may be defined by primitive recursion. As an example, a type F(n) depending on
n ∈ N, where N is the type of natural numbers, may be introduced by the
recursion

F(0) = N
F(succ(n)) = N → F(n)

so, for instance, F(3) = N → (N → (N → N)).
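The same family can be written by recursion on the natural numbers in Lean 4; this is only an illustrative sketch:

    def F : Nat → Type
      | 0     => Nat
      | n + 1 => Nat → F n

    example : F 3 = (Nat → (Nat → (Nat → Nat))) := rfl   -- F(3) computes as stated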


For a presentation of the formal rules of type theory we refer to [16, 17, 18,
20].
2.2. An example
The proposition

(1) (∃x ∈ A)(∀y ∈ B)C(x, y) ⊃ (∀y ∈ B)(∃x ∈ A)C(x, y)

corresponds to the type

(2) (Σx ∈ A)(Πy ∈ B)C(x, y) → (Πy ∈ B)(Σx ∈ A)C(x, y).

The proposition (1) is constructively true as can be seen by the following


argument. Assume that the antecedent (∃x ∈ A)(∀y ∈ B)C(x, y) is true. Then there
exists an object a of type A such that (∀y ∈ B)C(a, y) is true. So, if y is an object
of type B, C(a, y) is true. Hence, (∃x ∈ A)C(x, y) is true, and, since y is arbitrary,
we get that (∀y ∈ B)(∃x ∈ A)C(x, y) is true. We will use this informal proof to
construct an object of the type (2).
Assume that

(3) z ∈ (Σx ∈ A)(Πy ∈ B)C(x, y)

which corresponds to the assumption that (∃x ∈ A)(∀y ∈ B)C(x, y) is true,


because we assume that we have an object z of the corresponding type. Since z
is an object of a Σ-type, z must be a pair, and, if we let fst and snd denote the
left and right projections respectively, we get

(4) fst(z) ∈ A

and

(5) snd(z) ∈ (Πy ∈ B)C(fst(z), y).

Here fst(z) corresponds to the object a in the proof of (1) above and snd(z) is a
proof of (∀y ∈ B)C(a, y). Assume that

(6) y ∈ B.

If we denote the result of applying a function f to an object a by ap(f, a), we


get from (5) and (6)
(7) ap(snd(z), y) ∈ C(fst(z), y).

(7) corresponds to the truth of C(a, y) for an arbitrary object y of type B in


the proof of (1). From (4) and (7), we get

(8) (fst(z), ap(snd(z), y)) ∈ (Σx ∈ A)C(x, y)

which corresponds to the truth of (∃x ∈ A)C(x, y). Since (8) is derived from the
assumption (6), we get

(9) λy.(fst(z), ap(snd(z), y)) ∈ (Πy ∈ B)(Σx ∈ A)C(x, y),

i.e., we have constructed, from the assumption (3), an object of the type
corresponding to the proposition (∀y ∈ B)(∃x ∈ A)C(x, y). Since (9) is derived
from the assumption (3), we finally get

λz.λy.(fst(z), ap(snd(z), y)) ∈ ((Σx ∈ A)(Πy ∈ B)C(x, y) → (Πy ∈ B)(Σx ∈ A)C(x, y)).

The steps in this proof are all in accordance with the formal rules of type
theory.
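As a check, the same construction can be transcribed into a modern proof assistant. The following Lean 4 fragment is only a sketch (not part of the paper), with Lean's Σ-type and dependent function type standing in for the paper's Σ and Π:

    example (A B : Type) (C : A → B → Type) :
        (Σ x : A, (y : B) → C x y) → (y : B) → Σ x : A, C x y :=
      fun z y => ⟨z.fst, z.snd y⟩   -- the term λz.λy.(fst(z), ap(snd(z), y))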
Note that, in this example, we have used the word proof for two different
notions. First, for an object of a type corresponding to a proposition and,
second, for a derivation that an object is of a certain type. For a detailed
discussion of this important distinction, we refer to Sundholm [25].

2.3. Propositions as specifications


Kolmogorov [13] suggested that a proposition should be understood as a
problem in the following way.
If A and B are problems then

A & B is the problem of solving both of the problems A and B.

A ∨ B is the problem of solving at least one of the problems A and B.

A ⊃ B is the problem of solving the problem B under the assumption


that the problem A can be solved.

This explanation of the logical constants can be used to specify the task of a
program in the following way.

A & B is a specification of programs which, when executed, yield a pair (a, b),
where a is a program for the task A and b is a program for the task B.

A ∨ B is a specification of programs which, when executed, either yield inl(a) or
inr(b), where a is a program for A and b is a program for B.

A ⊃ B is a specification of programs which, when executed, yield λx.b(x),
where b(x) is a program for B under the assumption that x is a program for A.

These explanations can be extended to the quantifiers:


(∀x ∈ A)B(x) specifies programs which, when executed, yield λx.b(x), where
b(x) is a program for B(x) under the assumption that x is an object of the type
A. This means that, when a program for (∀x ∈ A)B(x) is applied to an arbitrary
object x of type A, the result will be a program for B(x).

(∃x ∈ A)B(x) specifies programs which, when executed, yield (a, b), where a is
an object of A and b a program for B(a). So, to solve the task (∃x ∈ A)B(x), it is
necessary to find an object a of A such that B(a) holds.
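As a small illustration of this reading (our example, not the paper's), the specification "for every natural number x there is a y with x < y" and a program meeting it can be written in Lean 4 as follows, where Nat.lt_succ_self is the library lemma x < x + 1:

    example : ∀ x : Nat, ∃ y : Nat, x < y :=
      fun x => ⟨x + 1, Nat.lt_succ_self x⟩   -- the program pairs the witness x + 1 with a proof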

3. Some examples of specifications in type theory.

When we specify the task of a program in type theory, we write down a


proposition which also can be read as the type of the program. If we prove the
proposition, we also obtain a program for the task. The proposition describes
what the program should do without describing how. For instance, the task of a
sorting program is to output an ordered permutation of its input. This
description of the task does not say anything about how it should be solved and
there are many ways of solving it, each corresponding to a sorting program.
There are also tasks which we have no hope of solving. An example is the task
of deciding whether two arbitrary functions mapping integers to integers yield
equal results for all arguments. Since we know that there is no recursive

function deciding if two recursive functions are equal, we have no hope of


finding such a program.
In this section, we will give some simple examples of specifications in type
theory. We will first specify the task of finding the greatest common divisor of
two numbers. We will then show how to specify a compiler and, finally, we
specify a program to generate a KWIC-index.

3.1. Greatest common divisor


Consider the following specification of a program which finds the greatest
common divisor of two natural numbers:

(1) (∀x ∈ N)(∀y ∈ N)(∃z ∈ N)[(z | x) & (z | y) & (∀u ∈ N)((u | x) & (u | y) ⊃ u ≤ z)]


where z | x ≡ (∃y ∈ N)(y * z = x). A program for this task is a function which,
when applied to natural numbers x and y, gives a pair (a, b) where a is a natural
number and b is a program for the task

(a | x) & (a | y) & (∀u ∈ N)((u | x) & (u | y) ⊃ u ≤ a).

So a is the greatest common divisor of x and y and b is a proof of this.


Differently expressed, if f is a program for (1), then

fst(ap(ap(f, x), y))

is the greatest common divisor of x and y and

snd(ap(ap(f, x), y))

is a proof of this fact.


Note that, in order to make the specification (1), we need not construct the
program f. However, if we give a constructive proof of (1), we will obtain a
program f and thereby a program which computes the greatest common
divisor of two natural numbers. See Smith [23] for an example of the use of
the identification of propositions and types in deriving an actual program.
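For comparison, the specification (1) can be transcribed into Lean 4. The transcription below only states the specification (the names Divides and GcdSpec are chosen here for illustration); proving it would again yield a program:

    def Divides (z x : Nat) : Prop := ∃ y : Nat, y * z = x

    def GcdSpec : Prop :=
      ∀ x y : Nat, ∃ z : Nat,
        Divides z x ∧ Divides z y ∧ ∀ u : Nat, Divides u x ∧ Divides u y → u ≤ z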

3.2. A compiler
Considering the restricted case where all programs in the source and target
languages terminate, the general task of a compiler can be specified in the
following way. Let all programs in the source language be represented by
objects of a type L and all programs in the target language by objects of a type
L'. It is convenient to let the representation be some kind of tree structure
corresponding to the abstract syntax of the program, but the actual
representation is immaterial for our purpose. Assume that the operational
semantics of the languages are given by M and M', interpreters for L and L',
respectively. Hence

M(s) ∈ (I → O) when s ∈ L
M'(t) ∈ (I → O) when t ∈ L'

where I → O is the type of all functions from inputs to outputs.


The task of a compiler c from L to L' can now be described as follows:
for an arbitrary program s in L, find a program t in L' which has the same
input-output behaviour as s, i.e.

(1) (∀s ∈ L)(∃t ∈ L')(M(s) =_{I→O} M'(t))

where the equality between the two functions M(s) and M'(t) means that
ap(M(s), i) = ap(M'(t), i) for all inputs i. We can illustrate this by the commuting
diagram

              c
        L ---------> L'
         \          /
        M \        / M'
           v      v
            I → O

If we prove the proposition (1) in type theory, we get a construction f such that

f ∈ (∀s ∈ L)(∃t ∈ L')(M(s) =_{I→O} M'(t))

and, if s is a source program, i.e. s ∈ L, we know that

ap(f, s) ∈ (∃t ∈ L')(M(s) =_{I→O} M'(t)).

So fst(ap(f, s)) is an object in L' with the same input-output behaviour as s. We


can now define the compiler by

c ≡ λs.fst(ap(f, s)).

There is no hope that the specification (1) is executable for nontrivial languages.
It should also be noticed that the specification does not determine the translated
program t uniquely from a given program s. So two compilers producing
different codes both satisfy the specification. In the transformational approach
[5], the specification defines the target programs in terms of the given source

programs uniquely, so an optimizing compiler does not satisfy the original


compiler specification.
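A Lean 4 transcription of the compiler specification might look as follows; the types L, L', I, O and the interpreters M, M' are parameters standing for the assumptions above, and the equality =_{I→O} is unfolded into pointwise equality:

    def CompilerSpec (L L' I O : Type) (M : L → I → O) (M' : L' → I → O) : Prop :=
      ∀ s : L, ∃ t : L', ∀ i : I, M s i = M' t i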

3.3. KWIC-index generation
This example is a specification of a problem from a workshop on Program
Specification [24] which was held in Aarhus, Denmark, in August 1981. The
informal specification is copied from the proceedings of the conference.

3.3.1. INFORMAL SPECIFICATION
Consider a program which generates a KWIC index (keyword in context). A
title is a list of words which are either significant or nonsignificant. A rotation of
a list is a cyclic shift of the words in the list, and a significant rotation is a
rotation in which the first word is significant. Given a set of titles and a set of
nonsignificant words, the program should produce an alphabetically sorted list
of the significant rotations of the titles.
titles
THE THREE LITTLE PIGS.
SNOW WHITE AND THE SEVEN DWARVES.

nonsignificant words
THE, THREE, AND, SEVEN

should produce:
DWARVES. SNOW WHITE AND THE SEVEN
LITTLE PIGS. THE THREE
PIGS. THE THREE LITTLE
SNOW WHITE AND THE SEVEN DWARVES.
WHITE AND THE SEVEN DWARVES. SNOW

3.3.2. FORMAL SPECIFICATION
Suppose that we have an enumeration type Printable_char which is a subset
of the enumeration type Ascii.
A title is a list of words:

Title ≡ List(Word)
Word ≡ List(Printable_char).

If necessary, we could require that a title is a nonempty list of nonempty words:

Title' ≡ (∃x ∈ List(Word'))[x ≠ nil]
Word' ≡ (∃x ∈ List(Printable_char))[x ≠ nil].

An element in the type Title' is a pair, the first component being a nonempty list
of words and the second component a proof that the first component is
nonempty. A rotation of a list is a cyclic shift of the elements in the list. We can
define Rotation(y, t), the proposition that y is a rotation of t, in the following
way:

Rotation(y, t) ≡ (∃n ∈ N)[shift^n(y) = t] where

shift(nil) = nil
shift(a.s) = s <> (a.nil)

f^0(x) = x
f^(n+1)(x) = f(f^n(x))

and <> is the concatenation operator

nil <> y = y
a.s <> y = a.(s <> y).
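These definitions translate almost literally into Lean 4; in the sketch below the names shift, iter and Rotation are ours, and iter plays the role of the iteration f^n:

    def shift {α : Type} : List α → List α
      | []     => []
      | a :: s => s ++ [a]

    def iter {α : Type} (f : α → α) : Nat → α → α
      | 0,     x => x
      | n + 1, x => f (iter f n x)

    def Rotation {α : Type} (y t : List α) : Prop :=
      ∃ n : Nat, iter shift n y = t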

A significant rotation is a rotation in which the first word is significant. Let
Signrot(y, t, n) be the proposition that the nonempty list y is a significant
rotation of t with respect to the list n of nonsignificant words. Then we may
define Signrot(y, t, n) by

Signrot(y, t, n) ≡ Rotation(y, t) & ¬(first(y) in_Word n)

where

a in_A nil = ⊥
a in_A b.s = [a =_A b] ∨ (a in_A s).

In the KWIC example, the output is sorted according to a lexicographical
ordering between the titles. In general, assume that a <_A b is a proposition which
is true if and only if the element a is less than the element b in A. We let ⊤
stand for the proposition which is always true, represented by a set with one
element

⊤ ≡ {•}

We can define a lexicographical ordering between the elements in List(A) by:

Lex(A)(<_A)(nil, nil) = ⊤
Lex(A)(<_A)(a.s, nil) = ⊥
Lex(A)(<_A)(nil, b.t) = ⊤
Lex(A)(<_A)(a.s, b.t) = ([a =_A b] & Lex(A)(<_A)(s, t)) ∨ (a <_A b).

So, if < is an ordering between the printable characters, then
Lex(Printable_char)(<) is an ordering between the words and
Lex(Word)(Lex(Printable_char)(<)) is an ordering between the titles.
The proposition Ordered(x) that the list of titles x is sorted according to a
lexicographical ordering can be defined by the equalities

Ordered(nil) = ⊤
Ordered(a.nil) = ⊤
Ordered(a.b.s) = Lex(Word)(Lex(Printable_char)(<))(a, b) & Ordered(b.s).
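A Lean 4 sketch of these two definitions, with the element ordering passed as a parameter (the names LexList and OrderedList are ours, chosen to avoid clashes with library names):

    def LexList {α : Type} (lt : α → α → Prop) : List α → List α → Prop
      | [],     []     => True
      | _ :: _, []     => False
      | [],     _ :: _ => True
      | a :: s, b :: t => (a = b ∧ LexList lt s t) ∨ lt a b

    def OrderedList {α : Type} (le : α → α → Prop) : List α → Prop
      | []          => True
      | [_]         => True
      | a :: b :: s => le a b ∧ OrderedList le (b :: s)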

We have now made enough definitions to be able to formally specify the KWIC
program.
Given a list t of titles and a list n of nonsignificant words, the program should
produce an alphabetically sorted list of significant rotations of the titles. This
means that the output should be ordered and a list of words should appear as
an element in the output if and only if it is a significant rotation of some title in
the input, i.e. if we assume that t ∈ List(Title), n ∈ List(Word), then a proof of the
proposition

(∃x ∈ List(Title))(Ordered(x) &
(∀y ∈ Title)(y in x ↔ (∃z ∈ Title)((z in t) & Signrot(y, z, n))))

where

A ↔ B ≡ (A ⊃ B) & (B ⊃ A)

would yield a pair whose first component has the properties we are looking for.
A complete specification of the task of the KWIC program is now given by the
proposition

(∀t ∈ List(Title))(∀n ∈ List(Word))
(∃x ∈ List(Title))(Ordered(x) &
(∀y ∈ Title)(y in x ↔ (∃z ∈ Title)((z in t) & Signrot(y, z, n)))).

This specification does not say how to solve the problem, but if we prove this
proposition in type theory we would obtain a function f which when applied to
a list of titles t and a list n of words gives a pair. The first component of the pair
is the desired result, i.e.

fst(ap(ap(f, t), n))

gives a KWIC-index of t with respect to the list of nonsignificant words n.
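Putting the pieces together, the final specification has roughly the following shape in Lean 4; Title is unfolded to List Word, OrderedTitles and Signrot are parameters standing for the definitions sketched above, and ∈ is Lean's list membership playing the role of in:

    def KwicSpec (Word : Type)
        (OrderedTitles : List (List Word) → Prop)
        (Signrot : List Word → List Word → List Word → Prop) : Prop :=
      ∀ (t : List (List Word)) (n : List Word),
        ∃ x : List (List Word),
          OrderedTitles x ∧
          ∀ y : List Word, y ∈ x ↔ ∃ z : List Word, z ∈ t ∧ Signrot y z n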


REFERENCES

1. J. Backus, Can programming be liberated from the von Neumann style? A functional style and its
   algebra of programs, Comm. ACM, Vol. 21 no. 8 pp. 613-641 (August 1978).
2. E. Bishop, Foundations of Constructive Analysis, McGraw-Hill, New York (1967).
3. E. Bishop, Mathematics as a numerical language, pp. 53-71 in Intuitionism and Proof Theory,
   ed. Myhill, Kino, Vesley, North-Holland, Amsterdam (1970).
4. N. G. de Bruijn, A survey of the project Automath, pp. 579-606 in To H. B. Curry: Essays on
   Combinatory Logic, Lambda Calculus and Formalism, ed. J. P. Seldin and J. R. Hindley,
   Academic Press, London (1980).
5. R. M. Burstall and J. Darlington, A transformation system for developing recursive programs, J.
   of the ACM, Vol. 24 no. 1 (January 1977).
6. R. L. Constable, Constructive mathematics and automatic program writers, pp. 229-233 in
   Proceedings of IFIP Congress, North-Holland, Ljubljana (1971).
7. H. B. Curry and R. Feys, Combinatory Logic, Vol. 1, North-Holland, Amsterdam (1958).
8. S. Goto, Program synthesis from natural deduction proofs, IJCAI 1979, Tokyo.
9. K. Gödel, Über eine bisher noch nicht benützte Erweiterung des finiten Standpunktes, Dialectica,
   Vol. 12 pp. 280-287 (1958).
10. A. Heyting, Intuitionism, an Introduction, North-Holland (1956).
11. W. A. Howard, The formulae-as-types notion of construction, pp. 479-490 in To H. B. Curry:
    Essays on Combinatory Logic, Lambda Calculus and Formalism, ed. J. P. Seldin and J. R.
    Hindley, Academic Press, London (1980).
12. S. C. Kleene, On the interpretation of intuitionistic number theory, Journal of Symbolic Logic,
    Vol. 10 pp. 109-124 (1945).
13. A. N. Kolmogorov, Zur Deutung der intuitionistischen Logik, Mathematische Zeitschrift, Vol. 35
    pp. 58-65 (1932).
14. H. Läuchli, An abstract notion of realizability for which intuitionistic predicate logic is complete,
    in Intuitionism and Proof Theory, ed. Myhill, Kino and Vesley, North-Holland, Amsterdam (1970).
15. P. Martin-Löf, An intuitionistic theory of types: predicative part, pp. 73-118 in Logic
    Colloquium 1973, ed. H. E. Rose and J. C. Shepherdson, North-Holland, Amsterdam (1975).
16. P. Martin-Löf, Constructive mathematics and computer programming, pp. 153-175 in Logic,
    Methodology and Philosophy of Science, VI, North-Holland, Amsterdam (1982), Proc. of the
    6th Int. Cong., Hannover, 1979.
17. B. Nordström, Programming in constructive set theory: some examples, in Proc. ACM 1981
    Conference on Functional Languages and Computer Architecture, Wentworth-by-the-Sea,
    Portsmouth, New Hampshire (October 1981).
18. B. Nordström, K. Petersson, and J. Smith, An Introduction to Type Theory, Programming
    Methodology Group, Chalmers University of Technology, Göteborg. In preparation.
19. M. J. O'Donnell, A Critique of the Foundations of Hoare-style Programming Logic, CACM,
    Vol. 25 no. 12 pp. 927-934 (December 1982).
20. K. Petersson, A Programming System for Type Theory, Memo 21, Programming Methodology
    Group, Chalmers University of Technology, Göteborg (1982).
21. M. Sato, Toward a mathematical theory of program synthesis, Proc. of the 6th IJCAI 1979.
22. D. Scott, Constructive validity, pp. 237-275 in Symposium on Automatic Demonstration,
    Lecture Notes in Mathematics, Vol. 125, Springer-Verlag, Berlin (1970).
23. J. Smith, The identification of propositions and types in Martin-Löf's type theory: A
    programming example, in International Conference on Foundations of Computation Theory,
    Borgholm, Sweden (August 21-27, 1983). Lecture Notes in Computer Science, Vol. 158,
    Springer-Verlag.
24. J. Staunstrup, Proceedings of a Workshop on Program Specification, Aarhus, Denmark (August
    1981). Lecture Notes in Computer Science, Vol. 134, Springer-Verlag.
25. G. Sundholm, Constructions, proofs and the meaning of the logical constants, Journal of
    Philosophical Logic 12 (1983) pp. 151-172.
26. S. Takasu, Proofs and Programs, Proc. of the 3rd IBM Symp. on Math. Foundations of
    Computer Science 1978, IBM Japan.
