Course Information
• Instructor: Karamjit Cheema
– Office: 215
– Email: karamjit.kaur@thapar.edu
• Course Web Page: http://sites.google.com/site/thaparcs015/
Grading
• Midterm: 24%
• Lab Eval: 20%
• Quizzes: 16%
• Final: 40%
Course Outline
• Introduction to Compiling
• Lexical Analysis
• Syntax Analysis
– Context Free Grammars
– Top-Down Parsing, LL Parsing
– Bottom-Up Parsing, LR Parsing
• Syntax-Directed Translation
– Attribute Definitions
– Evaluation of Attribute Definitions
• Semantic Analysis, Type Checking
• Run-Time Organization
• Intermediate Code Generation
Syllabus: Part I
Introduction to Compiling:
Compilers,
Analysis of the source program,
Phases of a compiler,
Compilation and interpretation.
How are Languages Implemented?
Interpreter
An interpreter reads the source code one instruction or line at a time, converts that
line into machine code, and executes it. The machine code is then discarded and
the next line is read.
Advantages:
– Simple.
– We can interrupt it while it is running, change the program, and either continue or start again.
– It lets the programmer know immediately when and where problems exist in the code.
Disadvantages:
– Slow: every line has to be translated every time it is executed, even if it is executed many times
as the program runs.
– Source code of an interpreted language cannot run without the interpreter.
– Because it sees only one line at a time, it cannot perform the optimizations a compiler can.
– Because the interpreter must be loaded into memory, there is less space available during
execution; a compiler is only loaded into memory during the compilation stage, so only the
machine code is resident in memory at run time.
– Once we have a compiled file, we can re-run it any time we want without having to use the
compiler each time; with an interpreted language, however, we would have to re-interpret the
program each time we wanted to run it.
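The read–translate–execute–discard cycle described above can be sketched as a toy line-at-a-time interpreter. The three-instruction language and the `run_line` helper here are invented purely for illustration:

```python
# Toy line-at-a-time interpreter: each line is translated and executed
# immediately, and the translation is discarded before the next line is
# read.  The instruction set (set/add/print) is invented for illustration.

def run_line(line, env):
    """Translate one source line and execute it on the spot."""
    op, *args = line.split()
    if op == "set":            # set x 5     ->  x = 5
        env[args[0]] = int(args[1])
    elif op == "add":          # add x y     ->  x = x + y
        env[args[0]] += env[args[1]]
    elif op == "print":        # print x
        print(env[args[0]])

def interpret(source):
    env = {}
    for line in source.splitlines():   # one line at a time
        if line.strip():
            run_line(line, env)        # nothing translated is kept
    return env

program = """set x 2
set y 3
add x y
print x"""
interpret(program)   # prints 5
```

Note that if the same line appeared inside a loop, it would be re-translated on every iteration, which is exactly the "slow" disadvantage listed above.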
The Context of a Compiler
History of Compilers
The first compiler was written by Grace Hopper, in 1952, for the A-0 programming
language.
The FORTRAN team led by John Backus at IBM is generally credited as having
introduced the first complete compiler, in 1957.
COBOL was an early language to be compiled on multiple architectures, in 1960.
Early compilers were written in assembly language.
The first self-hosting compiler (one capable of compiling its own source code in a
high-level language) was created for Lisp by Tim Hart and Mike Levin at MIT in 1962.
Since the 1970s it has become common practice to implement a compiler in the
language it compiles, although both Pascal and C have been popular choices for
implementation language. Building a self-hosting compiler is a bootstrapping
problem: the first such compiler for a language must be compiled either by a
compiler written in a different language, or (as in Hart and Levin's Lisp compiler)
compiled by running the compiler in an interpreter.
Classification of Compilers
A native or hosted compiler is one whose output is intended to directly
run on the same type of computer and operating system that the
compiler itself runs on.
The output of a cross compiler, by contrast, is designed to run on a different
platform: a cross compiler creates executable code for a platform other than the
one on which the compiler itself runs. Cross compilers are often used when
developing software for embedded systems that are not intended to support a
software development environment.
COMPILERS
• A compiler is a program that takes a program written in a source language
and translates it into an equivalent program in a target language.
• While translating, the compiler also reports error messages found in the source program.
Major Parts of Compilers
• There are two major parts of a compiler: Analysis and Synthesis
– In the analysis phase, an intermediate representation is created from the given source program.
– In the synthesis phase, the equivalent target program is created from that intermediate representation.
Phases of A Compiler
Pre-Processing Phase
Syntax Analyzer
• A Syntax Analyzer creates the syntactic structure (generally a parse
tree) of the given program.
• A syntax analyzer is also called a parser.
• A parse tree describes the syntactic structure of the program.
(Figure: parse tree of an assignment statement, with leaves oldval and 12.)
Syntax Analyzer (CFG)
• The syntax of a language is specified by a context free grammar
(CFG).
• The rules in a CFG are mostly recursive.
• A syntax analyzer checks whether a given program satisfies the rules
implied by a CFG or not.
– If it does, the syntax analyzer creates a parse tree for the given program.
Syntax Analyzer versus Lexical Analyzer
• Which constructs of a program should be recognized by the lexical
analyzer, and which ones by the syntax analyzer?
– Both of them do similar things, but the lexical analyzer deals with simple,
non-recursive constructs of the language.
– The syntax analyzer deals with recursive constructs of the language.
– The lexical analyzer simplifies the job of the syntax analyzer.
– The lexical analyzer recognizes the smallest meaningful units (tokens) in a
source program.
– The syntax analyzer works on the smallest meaningful units (tokens) in a
source program to recognize meaningful structures in our programming
language.
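The division of labor above can be illustrated with a minimal lexical analyzer: each token is described by a simple, non-recursive pattern. The token names and patterns below are illustrative, not taken from any particular compiler:

```python
import re

# A minimal lexical analyzer: recognizes the smallest meaningful units
# (tokens) using non-recursive regular-expression patterns.
# Token names and patterns are chosen only for this example.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("ASSIGN", r":="),
    ("PLUS",   r"\+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("SKIP",   r"\s+"),          # whitespace, discarded
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(text):
    tokens = []
    for m in MASTER.finditer(text):
        if m.lastgroup != "SKIP":
            tokens.append((m.lastgroup, m.group()))
    return tokens

print(tokenize("newval := oldval + 12"))
# [('IDENT', 'newval'), ('ASSIGN', ':='), ('IDENT', 'oldval'),
#  ('PLUS', '+'), ('NUMBER', '12')]
```

Notice that no pattern here can nest or recurse; recognizing a structure such as a parenthesized expression of arbitrary depth is exactly what must be left to the syntax analyzer.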
Parsing Techniques
• Depending on how the parse tree is created, there are different parsing
techniques.
• These parsing techniques are categorized into two groups:
– Top-Down Parsing,
– Bottom-Up Parsing
• Top-Down Parsing:
– Construction of the parse tree starts at the root, and proceeds towards the leaves.
– Efficient top-down parsers can be easily constructed by hand.
– Recursive Predictive Parsing, Non-Recursive Predictive Parsing (LL Parsing).
• Bottom-Up Parsing:
– Construction of the parse tree starts at the leaves, and proceeds towards the root.
– Normally efficient bottom-up parsers are created with the help of some software tools.
– Bottom-up parsing is also known as shift-reduce parsing.
– Operator-Precedence Parsing – simple, restrictive, easy to implement
– LR Parsing – a much more general form of shift-reduce parsing; variants include LR, SLR, LALR
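A recursive predictive (top-down) parser of the kind mentioned above can be written by hand. The following sketch parses a tiny assignment grammar; the grammar, token format, and tree shape are invented for illustration:

```python
# Hand-written recursive predictive (top-down) parser for the toy grammar:
#
#   stmt -> IDENT ':=' expr
#   expr -> term ('+' term)*
#   term -> IDENT | NUMBER
#
# Construction of the parse tree starts at the root (stmt) and proceeds
# towards the leaves, one procedure per nonterminal.

def parse(tokens):
    pos = 0

    def peek():
        return tokens[pos][0] if pos < len(tokens) else None

    def eat(kind):
        nonlocal pos
        got, text = tokens[pos]
        assert got == kind, f"expected {kind}, got {got}"
        pos += 1
        return text

    def term():
        if peek() == "NUMBER":
            return ("num", eat("NUMBER"))
        return ("id", eat("IDENT"))

    def expr():
        node = term()
        while peek() == "PLUS":      # predict the next rule by lookahead
            eat("PLUS")
            node = ("+", node, term())
        return node

    target = eat("IDENT")
    eat("ASSIGN")
    return ("assign", target, expr())

tokens = [("IDENT", "newval"), ("ASSIGN", ":="),
          ("IDENT", "oldval"), ("PLUS", "+"), ("NUMBER", "12")]
print(parse(tokens))
# ('assign', 'newval', ('+', ('id', 'oldval'), ('num', '12')))
```

Each nonterminal becomes one procedure, and the single token of lookahead (`peek`) is what makes the parser "predictive": it decides which production to use without backtracking.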
Semantic Analyzer
• A semantic analyzer checks the source program for semantic errors and
collects type information for code generation.
• Type checking is an important part of the semantic analyzer.
• Normally, semantic information cannot be represented by the context-free
grammars used in syntax analysis.
• Context-free grammars used in syntax analysis are therefore augmented with
attributes (semantic rules)
– the result is a syntax-directed translation,
– attribute grammars
• Ex:
newval := oldval + 12
• The type of the identifier newval must match the type of the expression (oldval + 12)
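The type check described in the example can be sketched as a small recursive walk over a parse tree. The symbol table and tree representation below are invented for illustration:

```python
# Minimal type checker: walks an expression tree and verifies that the
# type of the assignment target matches the type of the expression.
# The tree shape and the symbol table (symtab) are illustrative only.

def check(node, symtab):
    kind = node[0]
    if kind == "num":
        return "int"
    if kind == "id":
        return symtab[node[1]]                     # declared type
    if kind == "+":
        left = check(node[1], symtab)
        right = check(node[2], symtab)
        assert left == right == "int", "operands of + must be int"
        return "int"
    if kind == "assign":
        target_type = symtab[node[1]]
        expr_type = check(node[2], symtab)
        # e.g. the type of newval must match the type of (oldval + 12)
        assert target_type == expr_type, (
            f"cannot assign {expr_type} to {target_type}")
        return target_type

tree = ("assign", "newval", ("+", ("id", "oldval"), ("num", "12")))
print(check(tree, {"newval": "int", "oldval": "int"}))   # prints int
```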
Intermediate Code Generation
• A compiler may produce explicit intermediate code representing the
source program.
• This intermediate code is generally machine- (architecture-) independent,
but its level is close to the level of machine code.
• Ex:
newval := oldval * fact + 1
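Generating three-address intermediate code for an expression like the one above can be sketched as a recursive walk that emits one instruction per operator, each writing a fresh temporary. The mnemonics (`MULT`, `ADD`, `MOV`) and temporary names follow the style used later in these slides, but the generator itself is illustrative:

```python
from itertools import count

# Sketch of three-address intermediate code generation: each operator
# becomes one instruction whose result goes into a fresh temporary.
# Tree shape, mnemonics, and temporary naming are illustrative only.

def generate(tree):
    code = []
    temps = count(1)

    def gen(node):
        kind = node[0]
        if kind == "num":
            return "#" + node[1]                  # literal operand
        if kind == "id":
            return node[1]                        # named operand
        if kind in ("+", "*"):
            left = gen(node[1])
            right = gen(node[2])
            result = f"temp{next(temps)}"
            op = "ADD" if kind == "+" else "MULT"
            code.append(f"{op} {left},{right},{result}")
            return result
        if kind == "assign":
            code.append(f"MOV {gen(node[2])},,{node[1]}")

    gen(tree)
    return code

# newval := oldval * fact + 1
tree = ("assign", "newval",
        ("+", ("*", ("id", "oldval"), ("id", "fact")), ("num", "1")))
for line in generate(tree):
    print(line)
# MULT oldval,fact,temp1
# ADD temp1,#1,temp2
# MOV temp2,,newval
```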
Code Optimizer (for Intermediate Code Generator)
• The code optimizer optimizes the code produced by the intermediate
code generator in terms of time and space.
• Ex:
MULT id2,id3,temp1
ADD temp1,#1,id1
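One simple optimization that produces code of the shape shown above is copy elimination: when a temporary is computed and then immediately MOVed into a variable, the result can be written into the variable directly. This sketch handles only that one pattern and assumes the three-address text format used in these slides:

```python
# Sketch of one simple peephole optimization: if the last instruction is
# "MOV tempN,,x" and the previous instruction's result is tempN, write
# the result into x directly and drop the MOV.  Handles only this one
# pattern; the instruction format is the one used in the slides.

def eliminate_copy(code):
    out = list(code)
    if len(out) >= 2 and out[-1].startswith("MOV "):
        src, _, dest = out[-1][4:].split(",")     # "MOV src,,dest"
        prev = out[-2]
        if prev.endswith("," + src):              # prev writes src
            out[-2] = prev[:-len(src)] + dest     # retarget the result
            out.pop()                             # MOV is now redundant
    return out

code = ["MULT id2,id3,temp1",
        "ADD temp1,#1,temp2",
        "MOV temp2,,id1"]
print(eliminate_copy(code))
# ['MULT id2,id3,temp1', 'ADD temp1,#1,id1']
```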
Code Generator
• Produces the target code for a specific architecture.
• The target program is normally a relocatable object file containing
machine code.
• Ex:
(assume an architecture in which at least one operand of every instruction is
a machine register)
MOVE id2,R1
MULT id3,R1
ADD #1,R1
MOVE R1,id1
Decompiler
A decompiler translates a file containing information at a relatively low level of
abstraction (usually designed to be computer-readable rather than human-readable)
into a form having a higher level of abstraction (usually designed to be
human-readable).
It can be used for the recovery of lost source code, and is also useful in some
cases for computer security, interoperability and error correction.