Compiler



A compiler is a computer program that translates a program written in one computer language (called the source language) into an equivalent program written in another computer language (called the output, object, or target language).


Introduction and history

Most compilers translate source code written in a high level language to object code or machine language that may be directly executed by a computer or a virtual machine. Translation from a low level language to a high level one is also possible; the program that performs it is normally known as a decompiler if it reconstructs a high level language program which could have generated the low level language program. Compilers also exist which translate from one high level language to another (source-to-source compilers), or sometimes to an intermediate language that still needs further processing; these are sometimes known as cascaders.

Typical compilers output object files, which contain machine code augmented by information about the names and locations of entry points and external calls (calls to functions not contained in the object). A set of object files, which need not all have been produced by the same compiler provided the compilers share a common output format, may then be linked together to create the final executable, which can be run directly by a user.
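
To make the idea concrete, the following Python sketch models an "object" as machine code plus the entry points it exports and the external calls it still needs, and shows a toy linker resolving those calls. The structures are invented for illustration and bear no relation to real object formats such as ELF or COFF.

  # A toy model of object files and linking: each object carries code, the
  # entry points it exports, and the external symbols it still needs.
  from dataclasses import dataclass, field

  @dataclass
  class ObjectFile:
      code: list                                       # instructions; "CALL ?" marks an unresolved call
      exports: dict = field(default_factory=dict)      # symbol name -> offset within this object
      relocations: dict = field(default_factory=dict)  # offset of a call -> external symbol name

  def link(objects):
      """Concatenate objects, build a global symbol table, patch external calls."""
      image, symbols, pending = [], {}, []
      for obj in objects:
          base = len(image)
          for name, offset in obj.exports.items():
              symbols[name] = base + offset
          for offset, name in obj.relocations.items():
              pending.append((base + offset, name))
          image.extend(obj.code)
      for offset, name in pending:
          image[offset] = f"CALL {symbols[name]}"      # resolve the external reference
      return image

  main_obj = ObjectFile(code=["PUSH 2", "CALL ?", "HALT"],
                        exports={"main": 0}, relocations={1: "double"})
  lib_obj = ObjectFile(code=["MUL 2", "RET"], exports={"double": 0})
  print(link([main_obj, lib_obj]))
  # ['PUSH 2', 'CALL 3', 'HALT', 'MUL 2', 'RET']

The two objects could have come from different compilers; all the toy linker relies on is the shared format of code, exports, and relocations.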

Several experimental compilers were developed in the 1950s, but the FORTRAN team led by John Backus at IBM is generally credited as having introduced the first complete compiler, in 1957. COBOL was an early language to be compiled on multiple architectures, in 1960. [1]

The idea of compilation quickly caught on, and most of the principles of compiler design were developed during the 1960s.

A compiler is itself a computer program written in some implementation language. Early compilers were written in assembly language. The first self-hosting compiler -- capable of compiling its own source code in a high-level language -- was created for Lisp by Hart and Levin at MIT in 1962. [2] The use of high-level languages for writing compilers gained added impetus in the early 1970s when Pascal and C compilers were written in their own languages. Building a self-hosting compiler is a bootstrapping problem -- the first such compiler for a language must be compiled either by a compiler written in a different language, or (as in Hart and Levin's Lisp compiler) compiled by running the compiler in an interpreter.

During the 1990s a large number of free compilers and compiler development tools were developed for all kinds of languages, both as part of the GNU project and other open-source initiatives. Some of them are considered to be of high quality, and their freely available source code is a rewarding read for anyone interested in modern compiler concepts.

Types of compilers

A compiler may produce code intended to run on the same type of computer and operating system ("platform") as the compiler itself runs on. This is sometimes called a native-code compiler. Alternatively, it might produce code designed to run on a different platform. This is known as a cross compiler. Cross compilers are very useful when bringing up a new hardware platform for the first time (see bootstrapping). A "source to source compiler" is a type of compiler that takes a high level language as its input and outputs a high level language. For example, an automatic parallelizing compiler will frequently take in a high level language program as input, transform the code, and annotate it with parallel directives (e.g. OpenMP) or language constructs (e.g. Fortran's DOALL statements).
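
As a deliberately naive sketch of such a source-to-source transformation, the Python fragment below prefixes serial C loops with an OpenMP directive by textual rewriting. The function name is invented, and a real parallelizing compiler would operate on the program's syntax tree and first prove the loop iterations independent via dependence analysis rather than annotate unconditionally.

  import re

  def annotate_parallel_loops(c_source: str) -> str:
      """Naively prefix every `for` loop with an OpenMP 'parallel for' pragma."""
      return re.sub(r"^(\s*)for\s*\(",
                    r"\1#pragma omp parallel for\n\1for (",
                    c_source, flags=re.MULTILINE)

  serial = "for (int i = 0; i < n; i++)\n    c[i] = a[i] + b[i];\n"
  print(annotate_parallel_loops(serial))
  # #pragma omp parallel for
  # for (int i = 0; i < n; i++)
  #     c[i] = a[i] + b[i];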

  • One-pass compiler , like early compilers for Pascal
    • The compilation is done in one pass, hence it is very fast.
  • Threaded code compiler (or interpreter), like most implementations of FORTH
    • This kind of compiler can be thought of as a database lookup program: it simply replaces given strings in the source with given binary code. The level of this binary code can vary; in fact, some FORTH compilers can compile programs that do not even need an operating system. (A minimal sketch of this dictionary-driven style appears after this list.)
  • Incremental compiler, like many Lisp systems
    • Individual functions can be compiled in a run-time environment that also includes interpreted functions. Incremental compilation dates back to 1962 and the first Lisp compiler, and is still used in Common Lisp systems.
  • Stage compiler that compiles to assembly language of a theoretical machine, like some Prolog implementations
    • This Prolog machine is also known as the Warren abstract machine (or WAM). Byte-code compilers for Java, Python (and many more) are also a subtype of this.
  • Just-in-time compiler, used by Smalltalk and Java systems
    • Applications are delivered in bytecode, which is compiled to native machine code just prior to execution.
  • A retargetable compiler is a compiler that can relatively easily be modified to generate code for different CPU architectures. The object code produced by these is frequently of lesser quality than that produced by a compiler developed specifically for a processor. Retargetable compilers are often also cross compilers. GCC is an example of a retargetable compiler.
  • A parallelizing compiler converts a serial input program into a form suitable for efficient execution on a parallel computer architecture.
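
The dictionary-driven, threaded-code style mentioned in the FORTH item above can be sketched in a few lines of Python. The word names and primitives are invented for illustration; real FORTH systems compile to much lower-level threaded code, but the principle is the same: compiling a word means replacing each source word with a reference to already-compiled code, and running it means walking that list.

  # Each primitive word is a small routine operating on a data stack.
  def _add(s): s.append(s.pop() + s.pop())
  def _mul(s): s.append(s.pop() * s.pop())
  def _dup(s): s.append(s[-1])
  def _dot(s): print(s.pop())

  dictionary = {"+": _add, "*": _mul, "DUP": _dup, ".": _dot}

  def compile_word(name, source):
      """Compile a definition into a 'thread' of references to existing code."""
      thread = []
      for word in source.split():
          if word in dictionary:
              thread.append(dictionary[word])          # look the word up
          else:
              literal = int(word)                      # numbers become push-literal steps
              thread.append(lambda s, n=literal: s.append(n))
      dictionary[name] = lambda s, t=thread: [step(s) for step in t]

  stack = []
  compile_word("SQUARE", "DUP *")
  compile_word("DEMO", "5 SQUARE .")
  dictionary["DEMO"](stack)                            # prints 25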

Compiled vs. interpreted languages

Many people divide higher-level programming languages into compiled languages and interpreted languages. However, there is rarely anything about a language that requires it to be compiled or interpreted. Compilers and interpreters are implementations of languages, not languages themselves. The categorization usually reflects the most popular or widespread implementations of a language -- for instance, BASIC is thought of as an interpreted language and C as a compiled one, despite the existence of BASIC compilers and C interpreters. There are exceptions, however: some language specifications assume the use of a compiler (as with C), while others spell out that implementations must include a compilation facility (as with Common Lisp).

Compiler design

In the past, compilers were divided into many passes (see note 1 below) to save space. A pass in this context is a run of the compiler through the source code of the program to be compiled, building up the compiler's internal data (such as the evolving symbol table and other assisting data). When each pass is finished, the compiler can free the internal data space needed during that pass. This 'multipass' method of compiling was the common compiler technology of the time, largely because the main memories of host computers were small relative to the source code and the compiler's data.

Many modern compilers share a common 'two stage' design. The front end translates the source language into an intermediate representation. The second stage is the back end, which works with the internal representation to produce code in the output language. The front end and back end may operate as separate passes, or the front end may call the back end as a subroutine, passing it the intermediate representation.

This approach mitigates complexity by separating the concerns of the front end, which typically revolve around language semantics, error checking, and the like, from the concerns of the back end, which concentrates on producing output that is both efficient and correct. It also allows a single back end to serve multiple source languages, and similarly allows different back ends to be used for different targets.
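
A toy illustration of the split, assuming an invented tuple-based intermediate representation rather than any real compiler's IR: two different back ends consume the same IR that some front end produced, one emitting code for a hypothetical stack machine and one emitting C-like source.

  # Three-address IR as tuples with invented opcodes.
  ir = [
      ("load",  "t1", "a"),         # t1 := a
      ("load",  "t2", "b"),         # t2 := b
      ("add",   "t3", "t1", "t2"),  # t3 := t1 + t2
      ("store", "x",  "t3"),        # x  := t3
  ]

  def backend_stack_machine(ir):
      """Back end for a hypothetical stack machine."""
      out = []
      for ins in ir:
          if ins[0] == "load":    out.append(f"PUSH {ins[2]}")
          elif ins[0] == "add":   out.append("ADD")
          elif ins[0] == "store": out.append(f"POP {ins[1]}")
      return out

  def backend_c_source(ir):
      """Back end that emits (naive) C-like source from the same IR."""
      exprs, out = {}, []
      for ins in ir:
          if ins[0] == "load":    exprs[ins[1]] = ins[2]
          elif ins[0] == "add":   exprs[ins[1]] = f"{exprs[ins[2]]} + {exprs[ins[3]]}"
          elif ins[0] == "store": out.append(f"{ins[1]} = {exprs[ins[2]]};")
      return out

  print(backend_stack_machine(ir))  # ['PUSH a', 'PUSH b', 'ADD', 'POP x']
  print(backend_c_source(ir))       # ['x = a + b;']

Neither back end needs to know which source language the IR came from; the IR is the only contract between the two stages.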

Often, optimizers and error checkers can be shared by both front ends and back ends if they are designed to operate on the intermediate language that a front end passes to a back end. This can let many compilers (combinations of front ends and back ends) reuse the large amounts of work that often go into code analyzers and optimizers.

Certain languages, due to rules requiring variables and other objects to be declared, and executable procedures to be predeclared, before they are referenced or used, can be compiled in a single pass. The Pascal programming language is well known for this capability, and in fact many Pascal compilers are themselves written in Pascal, because of the language's rigid specification and because a single pass suffices to compile Pascal programs.

Compiler front end

The compiler front end consists of multiple phases itself, each informed by formal language theory:

  1. Lexical analysis - breaking the source code text into small pieces ('tokens' or 'terminals'), each representing a single atomic unit of the language, for instance a keyword, identifier, or symbol name. The token language is typically a regular language, so a finite state automaton constructed from a regular expression can be used to recognize it. This phase is also called lexing or scanning. (A small lexer and parser sketch follows this list.)
  2. Syntax analysis - identifying the syntactic structure of the source code. This phase focuses only on structure: it checks the order of tokens and recognizes the hierarchical constructs they form. It is also called parsing.
  3. Semantic analysis - determining the meaning of the program code and preparing for output. Type checking is performed in this phase, and it is where most compile errors surface.
  4. Intermediate language generation - an equivalent of the original program is created in an intermediate language.
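
The sketch below (Python, with an invented token set and a small grammar for arithmetic expressions) illustrates the first two phases: a regular-expression lexer and a recursive-descent parser that builds the tree later phases would consume.

  import re

  TOKEN_RE = re.compile(r"\s*(?:(\d+)|([A-Za-z_]\w*)|(.))")

  def lex(text):
      """Lexical analysis: break the text into (kind, value) tokens."""
      tokens = []
      for number, name, op in TOKEN_RE.findall(text):
          if number:  tokens.append(("NUMBER", int(number)))
          elif name:  tokens.append(("IDENT", name))
          else:       tokens.append(("OP", op))
      return tokens + [("EOF", None)]

  class Parser:
      """Syntax analysis for:  expr   -> term (('+'|'-') term)*
                               term   -> factor (('*'|'/') factor)*
                               factor -> NUMBER | IDENT | '(' expr ')'   """
      def __init__(self, tokens):
          self.tokens, self.pos = tokens, 0

      def peek(self):
          return self.tokens[self.pos]

      def advance(self):
          token = self.tokens[self.pos]
          self.pos += 1
          return token

      def expr(self):
          node = self.term()
          while self.peek() in (("OP", "+"), ("OP", "-")):
              node = (self.advance()[1], node, self.term())
          return node

      def term(self):
          node = self.factor()
          while self.peek() in (("OP", "*"), ("OP", "/")):
              node = (self.advance()[1], node, self.factor())
          return node

      def factor(self):
          kind, value = self.advance()
          if kind in ("NUMBER", "IDENT"):
              return value
          if (kind, value) == ("OP", "("):
              node = self.expr()
              assert self.advance() == ("OP", ")"), "expected ')'"
              return node
          raise SyntaxError(f"unexpected token {value!r}")

  tokens = lex("price + 2 * tax")
  print(tokens)                 # [('IDENT', 'price'), ('OP', '+'), ('NUMBER', 2), ('OP', '*'), ('IDENT', 'tax'), ('EOF', None)]
  print(Parser(tokens).expr())  # ('+', 'price', ('*', 2, 'tax'))

Semantic analysis would then walk the resulting tree to check, for example, that 'price' and 'tax' are declared with types for which '+' and '*' are defined, before an intermediate representation is emitted.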

Compiler back end

While there are applications where only the compiler front end is necessary, such as static language verification tools, a real compiler hands the intermediate representation generated by the front end to the back end, which produces a functionally equivalent program in the output language. This is done in multiple steps:

  1. Compiler analysis - the process of gathering program information from the intermediate representation of the input source files. Typical analyses are define-use and use-define chains, data dependence analysis, alias analysis, and so on. Accurate analysis is the basis for any compiler optimization. The call graph and control flow graph are usually also built during the analysis phase.
  2. Optimization - the intermediate language representation is transformed into functionally equivalent but faster (or smaller) forms. Popular optimizations are inline expansion, dead code elimination, constant propagation, loop transformation, register allocation, and even automatic parallelization. (A small constant-propagation and dead-code-elimination sketch follows this list.)
  3. Code generation - the transformed intermediate language is translated into the output language, usually the native machine language of the system. This involves resource and storage decisions, such as deciding which variables to place in registers or memory, and the selection and scheduling of appropriate machine instructions along with their associated addressing modes (see also the Sethi-Ullman algorithm).
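
A minimal sketch of the optimization step, assuming the same kind of invented three-address, tuple-based intermediate representation as above: constant propagation and folding followed by a simple liveness-based dead code elimination.

  def optimize(ir, live_outputs):
      """Constant propagation/folding, then dead code elimination."""
      consts, folded = {}, []
      # Constant propagation and folding: substitute known values and
      # evaluate additions whose operands are both constants.
      for op, dest, *args in ir:
          args = [consts.get(a, a) for a in args]
          if op == "const":
              consts[dest] = args[0]
          elif op == "add" and all(isinstance(a, int) for a in args):
              consts[dest] = args[0] + args[1]
          else:
              folded.append((op, dest, *args))
      # Dead code elimination: walking backwards, keep only instructions
      # whose results are still needed (a crude liveness check).
      needed, kept = set(live_outputs), []
      for op, dest, *args in reversed(folded):
          if dest in needed or op == "store":
              kept.append((op, dest, *args))
              needed.update(a for a in args if isinstance(a, str))
      return list(reversed(kept))

  ir = [
      ("const", "t1", 2),
      ("const", "t2", 3),
      ("add",   "t3", "t1", "t2"),  # folds to the constant 5
      ("add",   "t4", "t3", "x"),   # x is unknown until run time
      ("add",   "t5", "t3", "x"),   # dead: t5 is never used
      ("store", "y",  "t4"),
  ]
  print(optimize(ir, live_outputs=["y"]))
  # [('add', 't4', 5, 'x'), ('store', 'y', 't4')]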

Notes

  1. A pass has also been known as a parse in some textbooks. The idea is that the source code is parsed by gradual, iterative refinement to produce the completely translated object code at the end of the process. There is, however, some dispute over the general use of parse for all of those phases (passes), since some of them, e.g. object code generation, are arguably not parsing as such.

References

  • Understanding and Writing Compilers (ISBN 0333217322) by Richard Bornat is an unusually helpful book, being one of the few that adequately explains the recursive generation of machine instructions from a parse tree. Having learnt his subject in the early days of mainframes and minicomputers, the author has many useful insights that more recent books often fail to convey.
