Compiler: source program --> target program; works offline (translation happens before execution)
Interpreter: a program that processes the source program directly and outputs the results; works online (results are printed as it runs)
Compiler construction embodies many core ideas of computer science: algorithms, data structures, and software engineering.
The compiler is itself a very important area of research.
A compiler's functionality can also be divided into a front end and a back end. The front end consists of lexical analysis and syntax analysis; the back end consists of instruction generation and instruction optimization.
Assembly code --> assembler --> linker
The main role of lexical analysis:
- Character stream --> lexical analysis --> token stream (the character stream is cut into tokens)
- In C the character stream is a sequence of ASCII characters; in Java it is Unicode.
- Token stream: a data structure defined internally by the compiler.
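As a minimal sketch of this cutting step, the following tokenizer turns a character stream into a token stream; the token kinds, patterns, and the `Token` class are hypothetical choices for illustration, not part of any real compiler.

```python
import re
from dataclasses import dataclass

@dataclass
class Token:
    kind: str   # e.g. "NUM", "ID", "OP" (names chosen for this sketch)
    text: str   # the matched characters

# Hypothetical patterns for a tiny language; tried in order at each position.
TOKEN_PATTERNS = [
    ("NUM", r"\d+"),
    ("ID",  r"[A-Za-z_]\w*"),
    ("OP",  r"[+\-*/=]"),
    ("WS",  r"\s+"),   # whitespace is consumed but not emitted as a token
]

def tokenize(chars: str) -> list[Token]:
    """Cut a character stream into a token stream."""
    tokens, pos = [], 0
    while pos < len(chars):
        for kind, pattern in TOKEN_PATTERNS:
            m = re.match(pattern, chars[pos:])
            if m:
                if kind != "WS":
                    tokens.append(Token(kind, m.group()))
                pos += m.end()
                break
        else:
            raise SyntaxError(f"unexpected character {chars[pos]!r}")
    return tokens

print([(t.kind, t.text) for t in tokenize("x = 42 + y")])
# [('ID', 'x'), ('OP', '='), ('NUM', '42'), ('OP', '+'), ('ID', 'y')]
```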
Manual implementation: purely hand-written code; relatively complex and error-prone, but it is the currently popular approach (used by GCC 4.0 and LLVM). Advantage: precise control.
- Transition diagram and the transition-diagram algorithm
- The transition diagram for identifiers; keywords can be recognized as part of the identifier diagram.
- A keyword table: build a hash table H containing all the keywords.
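The two ideas above combine naturally: scan a lexeme with the identifier transition diagram, then consult the keyword table H to decide whether it is a keyword. Below is a small sketch under that assumption; the keyword set and return format are made up for illustration.

```python
# The keyword table H, here a hash set (the keywords are illustrative).
KEYWORDS = {"if", "else", "while", "return"}

def scan_identifier(chars: str, pos: int):
    """Simulate the identifier transition diagram:
    start --letter/_--> in_id --letter/digit/_--> in_id (loop), then accept."""
    if pos >= len(chars) or not (chars[pos].isalpha() or chars[pos] == "_"):
        return None, pos  # diagram rejects: not an identifier start
    start = pos
    while pos < len(chars) and (chars[pos].isalnum() or chars[pos] == "_"):
        pos += 1
    lexeme = chars[start:pos]
    # Keywords are scanned with the same diagram; the hash-table lookup
    # decides whether the lexeme is a keyword or a plain identifier.
    kind = "KEYWORD" if lexeme in KEYWORDS else "ID"
    return (kind, lexeme), pos

print(scan_identifier("while x", 0))   # (('KEYWORD', 'while'), 5)
print(scan_identifier("count1 =", 0))  # (('ID', 'count1'), 6)
```

Using one diagram plus a table lookup keeps the automaton small: there is no need for a separate diagram per keyword.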
Automatic generator approach: fast prototyping and less code, but the details are harder to control.
Syntax analysis (the parser)
Token stream --> parser --> abstract syntax tree --> semantic analyzer --> intermediate code
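To illustrate the parser stage of this pipeline, here is a minimal recursive-descent sketch that turns a token stream into an abstract syntax tree, represented as nested tuples. The grammar (`expr -> term ('+' term)*`, `term -> NUM`) is a hypothetical toy, chosen only to show the shape of the transformation.

```python
def parse_expr(tokens, pos=0):
    """expr -> term ('+' term)* ; builds a left-associative AST."""
    node, pos = parse_term(tokens, pos)
    while pos < len(tokens) and tokens[pos] == "+":
        right, pos2 = parse_term(tokens, pos + 1)
        node, pos = ("+", node, right), pos2
    return node, pos

def parse_term(tokens, pos):
    """term -> NUM ; a number token becomes a leaf node."""
    return int(tokens[pos]), pos + 1

ast, _ = parse_expr(["1", "+", "2", "+", "3"])
print(ast)  # ('+', ('+', 1, 2), 3)
```

The nested-tuple AST is then what the semantic analyzer walks to produce intermediate code.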
Compiler Principles: Introduction