1 <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
2 "http://www.w3.org/TR/html4/strict.dtd">
6 <title>Kaleidoscope: Adding JIT and Optimizer Support</title>
7 <meta http-equiv="Content-Type" content="text/html; charset=utf-8">
8 <meta name="author" content="Chris Lattner">
9 <link rel="stylesheet" href="../llvm.css" type="text/css">
14 <div class="doc_title">Kaleidoscope: Adding JIT and Optimizer Support</div>
17 <li><a href="index.html">Up to Tutorial Index</a></li>
20 <li><a href="#intro">Chapter 4 Introduction</a></li>
21 <li><a href="#trivialconstfold">Trivial Constant Folding</a></li>
22 <li><a href="#optimizerpasses">LLVM Optimization Passes</a></li>
23 <li><a href="#jit">Adding a JIT Compiler</a></li>
24 <li><a href="#code">Full Code Listing</a></li>
<li><a href="LangImpl5.html">Chapter 5</a>: Extending the Language: Control Flow</li>
31 <div class="doc_author">
32 <p>Written by <a href="mailto:sabre@nondot.org">Chris Lattner</a></p>
35 <!-- *********************************************************************** -->
36 <div class="doc_section"><a name="intro">Chapter 4 Introduction</a></div>
37 <!-- *********************************************************************** -->
39 <div class="doc_text">
41 <p>Welcome to Chapter 4 of the "<a href="index.html">Implementing a language
42 with LLVM</a>" tutorial. Chapters 1-3 described the implementation of a simple
43 language and added support for generating LLVM IR. This chapter describes
44 two new techniques: adding optimizer support to your language, and adding JIT
45 compiler support. These additions will demonstrate how to get nice, efficient code
46 for the Kaleidoscope language.</p>
50 <!-- *********************************************************************** -->
<div class="doc_section"><a name="trivialconstfold">Trivial Constant Folding</a></div>
53 <!-- *********************************************************************** -->
55 <div class="doc_text">
58 Our demonstration for Chapter 3 is elegant and easy to extend. Unfortunately,
59 it does not produce wonderful code. The IRBuilder, however, does give us
60 obvious optimizations when compiling simple code:</p>
62 <div class="doc_code">
64 ready> <b>def test(x) 1+2+x;</b>
65 Read function definition:
66 define double @test(double %x) {
68 %addtmp = add double 3.000000e+00, %x
<p>This code is not a literal transcription of the AST built by parsing the input. That would be:</p>
77 <div class="doc_code">
79 ready> <b>def test(x) 1+2+x;</b>
80 Read function definition:
81 define double @test(double %x) {
83 %addtmp = add double 2.000000e+00, 1.000000e+00
84 %addtmp1 = add double %addtmp, %x
<p>Constant folding, as seen above, is a very common and very
important optimization: so much so that many language implementors build
constant folding support into their AST representation.</p>
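<p>For contrast, here is roughly what AST-level constant folding looks like (a
hypothetical, LLVM-free sketch; the <tt>Expr</tt> node and <tt>foldAdd</tt>
helper are invented for illustration and are not part of the Kaleidoscope
code):</p>

```cpp
#include <cassert>
#include <memory>

// Hypothetical AST node: either a numeric literal or a binary '+'.
struct Expr {
  bool IsNumber;
  double Val;
  std::unique_ptr<Expr> LHS, RHS;
  Expr() : IsNumber(false), Val(0) {}
};

static std::unique_ptr<Expr> makeNumber(double V) {
  std::unique_ptr<Expr> E(new Expr());
  E->IsNumber = true;
  E->Val = V;
  return E;
}

// foldAdd - Build an add node, but collapse it to a literal when both
// operands are already literals.  Without IRBuilder, every node
// constructor in the AST needs a check like this.
static std::unique_ptr<Expr> foldAdd(std::unique_ptr<Expr> L,
                                     std::unique_ptr<Expr> R) {
  if (L->IsNumber && R->IsNumber)
    return makeNumber(L->Val + R->Val);
  std::unique_ptr<Expr> E(new Expr());
  E->LHS = std::move(L);
  E->RHS = std::move(R);
  return E;
}
```

<p>Building "<tt>1+2</tt>" through <tt>foldAdd</tt> yields a literal
<tt>3</tt> node directly, which is exactly the behavior the IRBuilder gives
you for free.</p>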
<p>With LLVM, you don't need this support in the AST. Since all calls to build
LLVM IR go through the LLVM IR builder, the builder itself checks to see if
there is a constant folding opportunity when you call it. If so, it just does
the constant fold and returns the constant instead of creating an instruction.</p>
99 <p>Well, that was easy :). In practice, we recommend always using
100 <tt>IRBuilder</tt> when generating code like this. It has no
101 "syntactic overhead" for its use (you don't have to uglify your compiler with
102 constant checks everywhere) and it can dramatically reduce the amount of
LLVM IR that is generated in some cases (particularly for languages with a macro
preprocessor or that use a lot of constants).</p>
106 <p>On the other hand, the <tt>IRBuilder</tt> is limited by the fact
107 that it does all of its analysis inline with the code as it is built. If you
108 take a slightly more complex example:</p>
110 <div class="doc_code">
112 ready> <b>def test(x) (1+2+x)*(x+(1+2));</b>
113 ready> Read function definition:
114 define double @test(double %x) {
116 %addtmp = add double 3.000000e+00, %x
117 %addtmp1 = add double %x, 3.000000e+00
118 %multmp = mul double %addtmp, %addtmp1
124 <p>In this case, the LHS and RHS of the multiplication are the same value. We'd
125 really like to see this generate "<tt>tmp = x+3; result = tmp*tmp;</tt>" instead
126 of computing "<tt>x+3</tt>" twice.</p>
128 <p>Unfortunately, no amount of local analysis will be able to detect and correct
129 this. This requires two transformations: reassociation of expressions (to
130 make the add's lexically identical) and Common Subexpression Elimination (CSE)
131 to delete the redundant add instruction. Fortunately, LLVM provides a broad
132 range of optimizations that you can use, in the form of "passes".</p>
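<p>The combined effect of these two passes can be sketched without LLVM at all.
The toy value numberer below (hypothetical code, not from the tutorial or the
LLVM API) canonicalizes commutative operands, a crude stand-in for
reassociation, and then reuses a previously computed value when the same key
reappears, which is the essence of CSE:</p>

```cpp
#include <algorithm>
#include <cassert>
#include <map>
#include <string>
#include <tuple>

// Key for a binary expression: operator plus canonically ordered operands.
typedef std::tuple<char, std::string, std::string> ExprKey;

// cseNumber - Return a value name for "L Op R", reusing the existing name
// if an equivalent expression has already been numbered.  Sorting the
// operands makes "3+x" and "x+3" lexically identical, so the table lookup
// can spot the common subexpression.
static std::string cseNumber(std::map<ExprKey, std::string> &Table,
                             char Op, std::string L, std::string R) {
  if (L > R)
    std::swap(L, R);  // canonical order; valid for commutative ops only
  ExprKey K(Op, L, R);
  std::map<ExprKey, std::string>::iterator It = Table.find(K);
  if (It != Table.end())
    return It->second;  // common subexpression: no new instruction needed
  std::string Name = "%tmp" + std::to_string(Table.size());
  Table[K] = Name;
  return Name;
}
```

<p>Numbering "<tt>3+x</tt>" and then "<tt>x+3</tt>" produces the same value
name, mirroring how reassociation plus GVN collapse the two <tt>addtmp</tt>
instructions in the example above.</p>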
136 <!-- *********************************************************************** -->
<div class="doc_section"><a name="optimizerpasses">LLVM Optimization Passes</a></div>
139 <!-- *********************************************************************** -->
141 <div class="doc_text">
143 <p>LLVM provides many optimization passes, which do many different sorts of
144 things and have different tradeoffs. Unlike other systems, LLVM doesn't hold
145 to the mistaken notion that one set of optimizations is right for all languages
146 and for all situations. LLVM allows a compiler implementor to make complete
147 decisions about what optimizations to use, in which order, and in what
<p>As a concrete example, LLVM supports "whole module" passes, which look
across as large a body of code as they can (often a whole file, but if run
at link time, this can be a substantial portion of the whole program), and
"per-function" passes, which just operate on a single function at a time
without looking at other functions. For more information
155 on passes and how they are run, see the <a href="../WritingAnLLVMPass.html">How
to Write a Pass</a> document and the <a href="../Passes.html">List of LLVM
Passes</a>.</p>
159 <p>For Kaleidoscope, we are currently generating functions on the fly, one at
160 a time, as the user types them in. We aren't shooting for the ultimate
161 optimization experience in this setting, but we also want to catch the easy and
162 quick stuff where possible. As such, we will choose to run a few per-function
163 optimizations as the user types the function in. If we wanted to make a "static
164 Kaleidoscope compiler", we would use exactly the code we have now, except that
165 we would defer running the optimizer until the entire file has been parsed.</p>
167 <p>In order to get per-function optimizations going, we need to set up a
168 <a href="../WritingAnLLVMPass.html#passmanager">FunctionPassManager</a> to hold and
169 organize the LLVM optimizations that we want to run. Once we have that, we can
170 add a set of optimizations to run. The code looks like this:</p>
172 <div class="doc_code">
174 ExistingModuleProvider *OurModuleProvider =
175 new ExistingModuleProvider(TheModule);
177 FunctionPassManager OurFPM(OurModuleProvider);
179 // Set up the optimizer pipeline. Start with registering info about how the
180 // target lays out data structures.
181 OurFPM.add(new TargetData(*TheExecutionEngine->getTargetData()));
182 // Do simple "peephole" optimizations and bit-twiddling optzns.
183 OurFPM.add(createInstructionCombiningPass());
184 // Reassociate expressions.
185 OurFPM.add(createReassociatePass());
186 // Eliminate Common SubExpressions.
187 OurFPM.add(createGVNPass());
188 // Simplify the control flow graph (deleting unreachable blocks, etc).
189 OurFPM.add(createCFGSimplificationPass());
191 OurFPM.doInitialization();
193 // Set the global so the code gen can use this.
194 TheFPM = &OurFPM;
196 // Run the main "interpreter loop" now.
201 <p>This code defines two objects, an <tt>ExistingModuleProvider</tt> and a
202 <tt>FunctionPassManager</tt>. The former is basically a wrapper around our
203 <tt>Module</tt> that the PassManager requires. It provides certain flexibility
that we're not going to take advantage of here, so I won't dive into any details
about it.</p>
<p>The meat of the matter here is the definition of "<tt>OurFPM</tt>". It
208 requires a pointer to the <tt>Module</tt> (through the <tt>ModuleProvider</tt>)
209 to construct itself. Once it is set up, we use a series of "add" calls to add
a bunch of LLVM passes. The first pass is basically boilerplate; it adds a pass
211 so that later optimizations know how the data structures in the program are
212 laid out. The "<tt>TheExecutionEngine</tt>" variable is related to the JIT,
213 which we will get to in the next section.</p>
<p>In this case, we choose to add four optimization passes. The passes we chose
216 here are a pretty standard set of "cleanup" optimizations that are useful for
217 a wide variety of code. I won't delve into what they do but, believe me,
218 they are a good starting place :).</p>
220 <p>Once the PassManager is set up, we need to make use of it. We do this by
221 running it after our newly created function is constructed (in
222 <tt>FunctionAST::Codegen</tt>), but before it is returned to the client:</p>
224 <div class="doc_code">
226 if (Value *RetVal = Body->Codegen()) {
227 // Finish off the function.
228 Builder.CreateRet(RetVal);
230 // Validate the generated code, checking for consistency.
231 verifyFunction(*TheFunction);
233 <b>// Optimize the function.
234 TheFPM->run(*TheFunction);</b>
241 <p>As you can see, this is pretty straightforward. The
242 <tt>FunctionPassManager</tt> optimizes and updates the LLVM Function* in place,
improving (hopefully) its body. With this in place, we can try our test above
again:</p>
246 <div class="doc_code">
248 ready> <b>def test(x) (1+2+x)*(x+(1+2));</b>
249 ready> Read function definition:
250 define double @test(double %x) {
252 %addtmp = add double %x, 3.000000e+00
253 %multmp = mul double %addtmp, %addtmp
259 <p>As expected, we now get our nicely optimized code, saving a floating point
260 add instruction from every execution of this function.</p>
262 <p>LLVM provides a wide variety of optimizations that can be used in certain
263 circumstances. Some <a href="../Passes.html">documentation about the various
passes</a> is available, but it isn't very complete. Another good source of
ideas is to look at the passes that <tt>llvm-gcc</tt> or
<tt>llvm-ld</tt> run. The "<tt>opt</tt>" tool allows you to
experiment with passes from the command line, so you can see if they do
anything.</p>
<p>Now that we have reasonable code coming out of our front-end, let's talk
about executing it!</p>
275 <!-- *********************************************************************** -->
276 <div class="doc_section"><a name="jit">Adding a JIT Compiler</a></div>
277 <!-- *********************************************************************** -->
279 <div class="doc_text">
281 <p>Code that is available in LLVM IR can have a wide variety of tools
282 applied to it. For example, you can run optimizations on it (as we did above),
283 you can dump it out in textual or binary forms, you can compile the code to an
284 assembly file (.s) for some target, or you can JIT compile it. The nice thing
285 about the LLVM IR representation is that it is the "common currency" between
286 many different parts of the compiler.
289 <p>In this section, we'll add JIT compiler support to our interpreter. The
290 basic idea that we want for Kaleidoscope is to have the user enter function
291 bodies as they do now, but immediately evaluate the top-level expressions they
292 type in. For example, if they type in "1 + 2;", we should evaluate and print
out 3. If they define a function, they should be able to call it from the
command line.</p>
296 <p>In order to do this, we first declare and initialize the JIT. This is done
297 by adding a global variable and a call in <tt>main</tt>:</p>
299 <div class="doc_code">
301 <b>static ExecutionEngine *TheExecutionEngine;</b>
305 <b>// Create the JIT. This takes ownership of the module and module provider.
306 TheExecutionEngine = EngineBuilder(OurModuleProvider).create();</b>
312 <p>This creates an abstract "Execution Engine" which can be either a JIT
313 compiler or the LLVM interpreter. LLVM will automatically pick a JIT compiler
for you if one is available for your platform, otherwise it will fall back to
the interpreter.</p>
317 <p>Once the <tt>ExecutionEngine</tt> is created, the JIT is ready to be used.
318 There are a variety of APIs that are useful, but the simplest one is the
319 "<tt>getPointerToFunction(F)</tt>" method. This method JIT compiles the
320 specified LLVM Function and returns a function pointer to the generated machine
321 code. In our case, this means that we can change the code that parses a
322 top-level expression to look like this:</p>
324 <div class="doc_code">
326 static void HandleTopLevelExpression() {
327 // Evaluate a top-level expression into an anonymous function.
328 if (FunctionAST *F = ParseTopLevelExpr()) {
329 if (Function *LF = F->Codegen()) {
330 LF->dump(); // Dump the function for exposition purposes.
332 <b>// JIT the function, returning a function pointer.
333 void *FPtr = TheExecutionEngine->getPointerToFunction(LF);
335 // Cast it to the right type (takes no arguments, returns a double) so we
336 // can call it as a native function.
337 double (*FP)() = (double (*)())(intptr_t)FPtr;
338 fprintf(stderr, "Evaluated to %f\n", FP());</b>
343 <p>Recall that we compile top-level expressions into a self-contained LLVM
344 function that takes no arguments and returns the computed double. Because the
345 LLVM JIT compiler matches the native platform ABI, this means that you can just
346 cast the result pointer to a function pointer of that type and call it directly.
This means there is no difference between JIT compiled code and native machine
348 code that is statically linked into your application.</p>
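<p>The cast-and-call step is ordinary C++ once you have a raw code address.
Here is the same pattern in standalone form, with a statically compiled
function standing in for the JIT's output (a sketch; <tt>anonExpr</tt> is
invented for illustration):</p>

```cpp
#include <cassert>
#include <stdint.h>

// Stand-in for JIT output: the same shape as an anonymous top-level
// expression, i.e. "takes no arguments, returns a double".
static double anonExpr() { return 4.0 + 5.0; }

// callAsDouble - Given an untyped code address, cast it to the known
// signature and call it, just as HandleTopLevelExpression does with the
// pointer returned by getPointerToFunction.
static double callAsDouble(void *FPtr) {
  double (*FP)() = (double (*)())(intptr_t)FPtr;
  return FP();
}
```

<p>Converting between <tt>void*</tt> and function pointers is technically
implementation-defined in C++, but it is required to work on POSIX platforms,
which is what the JIT relies on here.</p>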
<p>With just these two changes, let's see how Kaleidoscope works now!</p>
352 <div class="doc_code">
354 ready> <b>4+5;</b>
355 define double @""() {
357 ret double 9.000000e+00
360 <em>Evaluated to 9.000000</em>
<p>Well, this looks like it is basically working. The dump of the function
365 shows the "no argument function that always returns double" that we synthesize
366 for each top-level expression that is typed in. This demonstrates very basic
367 functionality, but can we do more?</p>
369 <div class="doc_code">
371 ready> <b>def testfunc(x y) x + y*2; </b>
372 Read function definition:
373 define double @testfunc(double %x, double %y) {
375 %multmp = mul double %y, 2.000000e+00
376 %addtmp = add double %multmp, %x
380 ready> <b>testfunc(4, 10);</b>
381 define double @""() {
383 %calltmp = call double @testfunc( double 4.000000e+00, double 1.000000e+01 )
387 <em>Evaluated to 24.000000</em>
391 <p>This illustrates that we can now call user code, but there is something a bit
subtle going on here. Note that we only invoked the JIT on the anonymous
functions that <em>call testfunc</em>, but we never invoked it
on <em>testfunc</em> itself. What actually happened here is that the JIT
395 scanned for all non-JIT'd functions transitively called from the anonymous
396 function and compiled all of them before returning
397 from <tt>getPointerToFunction()</tt>.</p>
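<p>This "compile everything reachable" behavior is just a transitive closure
over the call graph. A toy version (hypothetical sketch; the real JIT walks
the LLVM IR itself rather than a string map):</p>

```cpp
#include <cassert>
#include <map>
#include <set>
#include <string>
#include <vector>

// collectReachable - Gather every not-yet-compiled function reachable from
// Root in a call graph: the set the JIT must compile before it can return
// a pointer from getPointerToFunction.
static std::set<std::string> collectReachable(
    const std::map<std::string, std::vector<std::string> > &CallGraph,
    const std::string &Root) {
  std::set<std::string> ToCompile;
  std::vector<std::string> Worklist(1, Root);
  while (!Worklist.empty()) {
    std::string F = Worklist.back();
    Worklist.pop_back();
    if (!ToCompile.insert(F).second)
      continue;  // already visited
    std::map<std::string, std::vector<std::string> >::const_iterator It =
        CallGraph.find(F);
    if (It != CallGraph.end())
      Worklist.insert(Worklist.end(), It->second.begin(), It->second.end());
  }
  return ToCompile;
}
```

<p>For the session above, the graph is just "anonymous function calls
<tt>testfunc</tt>", so asking for the anonymous function's address forces
<tt>testfunc</tt> to be compiled too.</p>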
399 <p>The JIT provides a number of other more advanced interfaces for things like
400 freeing allocated machine code, rejit'ing functions to update them, etc.
401 However, even with this simple code, we get some surprisingly powerful
402 capabilities - check this out (I removed the dump of the anonymous functions,
403 you should get the idea by now :) :</p>
405 <div class="doc_code">
407 ready> <b>extern sin(x);</b>
409 declare double @sin(double)
411 ready> <b>extern cos(x);</b>
413 declare double @cos(double)
415 ready> <b>sin(1.0);</b>
416 <em>Evaluated to 0.841471</em>
418 ready> <b>def foo(x) sin(x)*sin(x) + cos(x)*cos(x);</b>
419 Read function definition:
420 define double @foo(double %x) {
422 %calltmp = call double @sin( double %x )
423 %multmp = mul double %calltmp, %calltmp
424 %calltmp2 = call double @cos( double %x )
425 %multmp4 = mul double %calltmp2, %calltmp2
426 %addtmp = add double %multmp, %multmp4
430 ready> <b>foo(4.0);</b>
431 <em>Evaluated to 1.000000</em>
<p>Whoa, how does the JIT know about sin and cos? The answer is surprisingly
simple: in this example, the JIT started execution of a function and got to a
function call. It
438 realized that the function was not yet JIT compiled and invoked the standard set
439 of routines to resolve the function. In this case, there is no body defined
440 for the function, so the JIT ended up calling "<tt>dlsym("sin")</tt>" on the
441 Kaleidoscope process itself.
442 Since "<tt>sin</tt>" is defined within the JIT's address space, it simply
patches up calls in the module to call the libm version of <tt>sin</tt>
directly.</p>
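<p>You can reproduce the JIT's fallback directly. The sketch below assumes a
POSIX system where <tt>dlsym</tt> with <tt>RTLD_DEFAULT</tt> can search the
running process, and that libm is linked into the process, as it is for any
program that references <tt>sin</tt>:</p>

```cpp
#include <cassert>
#include <math.h>
#include <dlfcn.h>
#include <stdint.h>

// resolveInProcess - Resolve a symbol the way the JIT's default fallback
// does: search the running process's own address space with dlsym.
static double resolveInProcess(const char *Name, double Arg) {
  void *Sym = dlsym(RTLD_DEFAULT, Name);
  if (!Sym)
    return 0;  // unresolved; the real JIT reports an error instead
  double (*F)(double) = (double (*)(double))(intptr_t)Sym;
  return F(Arg);
}

// Referencing sin directly keeps libm linked into this binary, mirroring
// the Kaleidoscope interpreter, whose host process links libm.
static double (*const KeepLibmLinked)(double) = sin;
```

<p>On older glibc you may need <tt>-ldl</tt> at link time for <tt>dlsym</tt>;
since glibc 2.34 it lives in libc itself.</p>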
446 <p>The LLVM JIT provides a number of interfaces (look in the
447 <tt>ExecutionEngine.h</tt> file) for controlling how unknown functions get
448 resolved. It allows you to establish explicit mappings between IR objects and
449 addresses (useful for LLVM global variables that you want to map to static
450 tables, for example), allows you to dynamically decide on the fly based on the
451 function name, and even allows you to have the JIT compile functions lazily the
452 first time they're called.</p>
454 <p>One interesting application of this is that we can now extend the language
455 by writing arbitrary C++ code to implement operations. For example, if we add:
458 <div class="doc_code">
460 /// putchard - putchar that takes a double and returns 0.
extern "C"
double putchard(double X) {
  putchar((char)X);
  return 0;
}
469 <p>Now we can produce simple output to the console by using things like:
470 "<tt>extern putchard(x); putchard(120);</tt>", which prints a lowercase 'x' on
471 the console (120 is the ASCII code for 'x'). Similar code could be used to
implement file I/O, console input, and many other capabilities in
Kaleidoscope.</p>
475 <p>This completes the JIT and optimizer chapter of the Kaleidoscope tutorial. At
476 this point, we can compile a non-Turing-complete programming language, optimize
477 and JIT compile it in a user-driven way. Next up we'll look into <a
478 href="LangImpl5.html">extending the language with control flow constructs</a>,
479 tackling some interesting LLVM IR issues along the way.</p>
483 <!-- *********************************************************************** -->
484 <div class="doc_section"><a name="code">Full Code Listing</a></div>
485 <!-- *********************************************************************** -->
487 <div class="doc_text">
490 Here is the complete code listing for our running example, enhanced with the
491 LLVM JIT and optimizer. To build this example, use:
494 <div class="doc_code">
497 g++ -g toy.cpp `llvm-config --cppflags --ldflags --libs core jit interpreter native` -O3 -o toy
504 If you are compiling this on Linux, make sure to add the "-rdynamic" option
as well. This makes sure that the external functions are resolved properly
at runtime.</p>
508 <p>Here is the code:</p>
510 <div class="doc_code">
512 #include "llvm/DerivedTypes.h"
513 #include "llvm/ExecutionEngine/ExecutionEngine.h"
514 #include "llvm/ExecutionEngine/Interpreter.h"
515 #include "llvm/ExecutionEngine/JIT.h"
516 #include "llvm/LLVMContext.h"
517 #include "llvm/Module.h"
518 #include "llvm/ModuleProvider.h"
519 #include "llvm/PassManager.h"
520 #include "llvm/Analysis/Verifier.h"
521 #include "llvm/Target/TargetData.h"
522 #include "llvm/Target/TargetSelect.h"
523 #include "llvm/Transforms/Scalar.h"
524 #include "llvm/Support/IRBuilder.h"
525 #include <cstdio>
526 #include <string>
528 #include <vector>
529 using namespace llvm;
531 //===----------------------------------------------------------------------===//
533 //===----------------------------------------------------------------------===//
535 // The lexer returns tokens [0-255] if it is an unknown character, otherwise one
536 // of these for known things.
541 tok_def = -2, tok_extern = -3,
544 tok_identifier = -4, tok_number = -5
547 static std::string IdentifierStr; // Filled in if tok_identifier
548 static double NumVal; // Filled in if tok_number
550 /// gettok - Return the next token from standard input.
551 static int gettok() {
552 static int LastChar = ' ';
554 // Skip any whitespace.
555 while (isspace(LastChar))
556 LastChar = getchar();
558 if (isalpha(LastChar)) { // identifier: [a-zA-Z][a-zA-Z0-9]*
559 IdentifierStr = LastChar;
560 while (isalnum((LastChar = getchar())))
561 IdentifierStr += LastChar;
563 if (IdentifierStr == "def") return tok_def;
564 if (IdentifierStr == "extern") return tok_extern;
565 return tok_identifier;
if (isdigit(LastChar) || LastChar == '.') {   // Number: [0-9.]+
  std::string NumStr;
  do {
    NumStr += LastChar;
    LastChar = getchar();
573 } while (isdigit(LastChar) || LastChar == '.');
575 NumVal = strtod(NumStr.c_str(), 0);
579 if (LastChar == '#') {
580 // Comment until end of line.
581 do LastChar = getchar();
582 while (LastChar != EOF && LastChar != '\n' && LastChar != '\r');
588 // Check for end of file. Don't eat the EOF.
592 // Otherwise, just return the character as its ascii value.
593 int ThisChar = LastChar;
594 LastChar = getchar();
598 //===----------------------------------------------------------------------===//
599 // Abstract Syntax Tree (aka Parse Tree)
600 //===----------------------------------------------------------------------===//
602 /// ExprAST - Base class for all expression nodes.
605 virtual ~ExprAST() {}
606 virtual Value *Codegen() = 0;
609 /// NumberExprAST - Expression class for numeric literals like "1.0".
610 class NumberExprAST : public ExprAST {
613 NumberExprAST(double val) : Val(val) {}
614 virtual Value *Codegen();
617 /// VariableExprAST - Expression class for referencing a variable, like "a".
618 class VariableExprAST : public ExprAST {
621 VariableExprAST(const std::string &name) : Name(name) {}
622 virtual Value *Codegen();
625 /// BinaryExprAST - Expression class for a binary operator.
626 class BinaryExprAST : public ExprAST {
630 BinaryExprAST(char op, ExprAST *lhs, ExprAST *rhs)
631 : Op(op), LHS(lhs), RHS(rhs) {}
632 virtual Value *Codegen();
635 /// CallExprAST - Expression class for function calls.
636 class CallExprAST : public ExprAST {
638 std::vector<ExprAST*> Args;
640 CallExprAST(const std::string &callee, std::vector<ExprAST*> &args)
641 : Callee(callee), Args(args) {}
642 virtual Value *Codegen();
645 /// PrototypeAST - This class represents the "prototype" for a function,
646 /// which captures its name, and its argument names (thus implicitly the number
647 /// of arguments the function takes).
650 std::vector<std::string> Args;
652 PrototypeAST(const std::string &name, const std::vector<std::string> &args)
653 : Name(name), Args(args) {}
658 /// FunctionAST - This class represents a function definition itself.
663 FunctionAST(PrototypeAST *proto, ExprAST *body)
664 : Proto(proto), Body(body) {}
669 //===----------------------------------------------------------------------===//
671 //===----------------------------------------------------------------------===//
673 /// CurTok/getNextToken - Provide a simple token buffer. CurTok is the current
674 /// token the parser is looking at. getNextToken reads another token from the
675 /// lexer and updates CurTok with its results.
677 static int getNextToken() {
678 return CurTok = gettok();
681 /// BinopPrecedence - This holds the precedence for each binary operator that is
683 static std::map<char, int> BinopPrecedence;
685 /// GetTokPrecedence - Get the precedence of the pending binary operator token.
686 static int GetTokPrecedence() {
687 if (!isascii(CurTok))
690 // Make sure it's a declared binop.
691 int TokPrec = BinopPrecedence[CurTok];
692 if (TokPrec <= 0) return -1;
696 /// Error* - These are little helper functions for error handling.
ExprAST *Error(const char *Str) { fprintf(stderr, "Error: %s\n", Str); return 0; }
698 PrototypeAST *ErrorP(const char *Str) { Error(Str); return 0; }
699 FunctionAST *ErrorF(const char *Str) { Error(Str); return 0; }
701 static ExprAST *ParseExpression();
705 /// ::= identifier '(' expression* ')'
706 static ExprAST *ParseIdentifierExpr() {
707 std::string IdName = IdentifierStr;
709 getNextToken(); // eat identifier.
711 if (CurTok != '(') // Simple variable ref.
712 return new VariableExprAST(IdName);
715 getNextToken(); // eat (
716 std::vector<ExprAST*> Args;
719 ExprAST *Arg = ParseExpression();
723 if (CurTok == ')') break;
726 return Error("Expected ')' or ',' in argument list");
734 return new CallExprAST(IdName, Args);
737 /// numberexpr ::= number
738 static ExprAST *ParseNumberExpr() {
739 ExprAST *Result = new NumberExprAST(NumVal);
740 getNextToken(); // consume the number
744 /// parenexpr ::= '(' expression ')'
745 static ExprAST *ParseParenExpr() {
746 getNextToken(); // eat (.
747 ExprAST *V = ParseExpression();
751 return Error("expected ')'");
752 getNextToken(); // eat ).
757 /// ::= identifierexpr
760 static ExprAST *ParsePrimary() {
762 default: return Error("unknown token when expecting an expression");
763 case tok_identifier: return ParseIdentifierExpr();
764 case tok_number: return ParseNumberExpr();
765 case '(': return ParseParenExpr();
770 /// ::= ('+' primary)*
771 static ExprAST *ParseBinOpRHS(int ExprPrec, ExprAST *LHS) {
772 // If this is a binop, find its precedence.
774 int TokPrec = GetTokPrecedence();
776 // If this is a binop that binds at least as tightly as the current binop,
777 // consume it, otherwise we are done.
778 if (TokPrec < ExprPrec)
781 // Okay, we know this is a binop.
783 getNextToken(); // eat binop
785 // Parse the primary expression after the binary operator.
786 ExprAST *RHS = ParsePrimary();
789 // If BinOp binds less tightly with RHS than the operator after RHS, let
790 // the pending operator take RHS as its LHS.
791 int NextPrec = GetTokPrecedence();
792 if (TokPrec < NextPrec) {
793 RHS = ParseBinOpRHS(TokPrec+1, RHS);
794 if (RHS == 0) return 0;
798 LHS = new BinaryExprAST(BinOp, LHS, RHS);
803 /// ::= primary binoprhs
805 static ExprAST *ParseExpression() {
806 ExprAST *LHS = ParsePrimary();
809 return ParseBinOpRHS(0, LHS);
813 /// ::= id '(' id* ')'
814 static PrototypeAST *ParsePrototype() {
815 if (CurTok != tok_identifier)
816 return ErrorP("Expected function name in prototype");
818 std::string FnName = IdentifierStr;
822 return ErrorP("Expected '(' in prototype");
824 std::vector<std::string> ArgNames;
825 while (getNextToken() == tok_identifier)
826 ArgNames.push_back(IdentifierStr);
828 return ErrorP("Expected ')' in prototype");
831 getNextToken(); // eat ')'.
833 return new PrototypeAST(FnName, ArgNames);
836 /// definition ::= 'def' prototype expression
837 static FunctionAST *ParseDefinition() {
838 getNextToken(); // eat def.
839 PrototypeAST *Proto = ParsePrototype();
840 if (Proto == 0) return 0;
842 if (ExprAST *E = ParseExpression())
843 return new FunctionAST(Proto, E);
847 /// toplevelexpr ::= expression
848 static FunctionAST *ParseTopLevelExpr() {
849 if (ExprAST *E = ParseExpression()) {
850 // Make an anonymous proto.
851 PrototypeAST *Proto = new PrototypeAST("", std::vector<std::string>());
852 return new FunctionAST(Proto, E);
857 /// external ::= 'extern' prototype
858 static PrototypeAST *ParseExtern() {
859 getNextToken(); // eat extern.
860 return ParsePrototype();
863 //===----------------------------------------------------------------------===//
865 //===----------------------------------------------------------------------===//
867 static Module *TheModule;
868 static IRBuilder<> Builder(getGlobalContext());
869 static std::map<std::string, Value*> NamedValues;
870 static FunctionPassManager *TheFPM;
872 Value *ErrorV(const char *Str) { Error(Str); return 0; }
874 Value *NumberExprAST::Codegen() {
875 return ConstantFP::get(getGlobalContext(), APFloat(Val));
878 Value *VariableExprAST::Codegen() {
879 // Look this variable up in the function.
880 Value *V = NamedValues[Name];
881 return V ? V : ErrorV("Unknown variable name");
884 Value *BinaryExprAST::Codegen() {
885 Value *L = LHS->Codegen();
886 Value *R = RHS->Codegen();
887 if (L == 0 || R == 0) return 0;
890 case '+': return Builder.CreateAdd(L, R, "addtmp");
891 case '-': return Builder.CreateSub(L, R, "subtmp");
892 case '*': return Builder.CreateMul(L, R, "multmp");
894 L = Builder.CreateFCmpULT(L, R, "cmptmp");
895 // Convert bool 0/1 to double 0.0 or 1.0
896 return Builder.CreateUIToFP(L, Type::getDoubleTy(getGlobalContext()),
898 default: return ErrorV("invalid binary operator");
902 Value *CallExprAST::Codegen() {
903 // Look up the name in the global module table.
904 Function *CalleeF = TheModule->getFunction(Callee);
906 return ErrorV("Unknown function referenced");
908 // If argument mismatch error.
909 if (CalleeF->arg_size() != Args.size())
910 return ErrorV("Incorrect # arguments passed");
912 std::vector<Value*> ArgsV;
913 for (unsigned i = 0, e = Args.size(); i != e; ++i) {
914 ArgsV.push_back(Args[i]->Codegen());
915 if (ArgsV.back() == 0) return 0;
918 return Builder.CreateCall(CalleeF, ArgsV.begin(), ArgsV.end(), "calltmp");
921 Function *PrototypeAST::Codegen() {
922 // Make the function type: double(double,double) etc.
923 std::vector<const Type*> Doubles(Args.size(),
924 Type::getDoubleTy(getGlobalContext()));
925 FunctionType *FT = FunctionType::get(Type::getDoubleTy(getGlobalContext()),
928 Function *F = Function::Create(FT, Function::ExternalLinkage, Name, TheModule);
930 // If F conflicted, there was already something named 'Name'. If it has a
931 // body, don't allow redefinition or reextern.
932 if (F->getName() != Name) {
933 // Delete the one we just made and get the existing one.
934 F->eraseFromParent();
935 F = TheModule->getFunction(Name);
937 // If F already has a body, reject this.
938 if (!F->empty()) {
939 ErrorF("redefinition of function");
943 // If F took a different number of args, reject.
944 if (F->arg_size() != Args.size()) {
945 ErrorF("redefinition of function with different # args");
950 // Set names for all arguments.
952 for (Function::arg_iterator AI = F->arg_begin(); Idx != Args.size();
954 AI->setName(Args[Idx]);
956 // Add arguments to variable symbol table.
957 NamedValues[Args[Idx]] = AI;
963 Function *FunctionAST::Codegen() {
966 Function *TheFunction = Proto->Codegen();
967 if (TheFunction == 0)
970 // Create a new basic block to start insertion into.
971 BasicBlock *BB = BasicBlock::Create(getGlobalContext(), "entry", TheFunction);
972 Builder.SetInsertPoint(BB);
974 if (Value *RetVal = Body->Codegen()) {
975 // Finish off the function.
976 Builder.CreateRet(RetVal);
978 // Validate the generated code, checking for consistency.
979 verifyFunction(*TheFunction);
981 // Optimize the function.
982 TheFPM->run(*TheFunction);
987 // Error reading body, remove function.
988 TheFunction->eraseFromParent();
992 //===----------------------------------------------------------------------===//
993 // Top-Level parsing and JIT Driver
994 //===----------------------------------------------------------------------===//
996 static ExecutionEngine *TheExecutionEngine;
998 static void HandleDefinition() {
999 if (FunctionAST *F = ParseDefinition()) {
1000 if (Function *LF = F->Codegen()) {
1001 fprintf(stderr, "Read function definition:");
1005 // Skip token for error recovery.
1010 static void HandleExtern() {
1011 if (PrototypeAST *P = ParseExtern()) {
1012 if (Function *F = P->Codegen()) {
1013 fprintf(stderr, "Read extern: ");
1017 // Skip token for error recovery.
1022 static void HandleTopLevelExpression() {
1023 // Evaluate a top-level expression into an anonymous function.
1024 if (FunctionAST *F = ParseTopLevelExpr()) {
1025 if (Function *LF = F->Codegen()) {
1026 // JIT the function, returning a function pointer.
1027 void *FPtr = TheExecutionEngine->getPointerToFunction(LF);
1029 // Cast it to the right type (takes no arguments, returns a double) so we
1030 // can call it as a native function.
1031 double (*FP)() = (double (*)())(intptr_t)FPtr;
1032 fprintf(stderr, "Evaluated to %f\n", FP());
1035 // Skip token for error recovery.
1040 /// top ::= definition | external | expression | ';'
1041 static void MainLoop() {
1043 fprintf(stderr, "ready> ");
1045 case tok_eof: return;
1046 case ';': getNextToken(); break; // ignore top-level semicolons.
1047 case tok_def: HandleDefinition(); break;
1048 case tok_extern: HandleExtern(); break;
1049 default: HandleTopLevelExpression(); break;
1054 //===----------------------------------------------------------------------===//
1055 // "Library" functions that can be "extern'd" from user code.
1056 //===----------------------------------------------------------------------===//
1058 /// putchard - putchar that takes a double and returns 0.
extern "C"
double putchard(double X) {
  putchar((char)X);
  return 0;
}
1065 //===----------------------------------------------------------------------===//
1066 // Main driver code.
1067 //===----------------------------------------------------------------------===//
1070 InitializeNativeTarget();
1071 LLVMContext &Context = getGlobalContext();
1073 // Install standard binary operators.
1074 // 1 is lowest precedence.
1075 BinopPrecedence['<'] = 10;
1076 BinopPrecedence['+'] = 20;
1077 BinopPrecedence['-'] = 20;
1078 BinopPrecedence['*'] = 40; // highest.
1080 // Prime the first token.
1081 fprintf(stderr, "ready> ");
1084 // Make the module, which holds all the code.
1085 TheModule = new Module("my cool jit", Context);
1087 ExistingModuleProvider *OurModuleProvider =
1088 new ExistingModuleProvider(TheModule);
1090 // Create the JIT. This takes ownership of the module and module provider.
1091 TheExecutionEngine = EngineBuilder(OurModuleProvider).create();
1093 FunctionPassManager OurFPM(OurModuleProvider);
1095 // Set up the optimizer pipeline. Start with registering info about how the
1096 // target lays out data structures.
1097 OurFPM.add(new TargetData(*TheExecutionEngine->getTargetData()));
1098 // Do simple "peephole" optimizations and bit-twiddling optzns.
1099 OurFPM.add(createInstructionCombiningPass());
1100 // Reassociate expressions.
1101 OurFPM.add(createReassociatePass());
1102 // Eliminate Common SubExpressions.
1103 OurFPM.add(createGVNPass());
1104 // Simplify the control flow graph (deleting unreachable blocks, etc).
1105 OurFPM.add(createCFGSimplificationPass());
1107 OurFPM.doInitialization();
1109 // Set the global so the code gen can use this.
1110 TheFPM = &OurFPM;
1112 // Run the main "interpreter loop" now.
1117 // Print out all of the generated code.
1118 TheModule->dump();
1125 <a href="LangImpl5.html">Next: Extending the language: control flow</a>
1128 <!-- *********************************************************************** -->
1131 <a href="http://jigsaw.w3.org/css-validator/check/referer"><img
1132 src="http://jigsaw.w3.org/css-validator/images/vcss" alt="Valid CSS!"></a>
1133 <a href="http://validator.w3.org/check/referer"><img
1134 src="http://www.w3.org/Icons/valid-html401" alt="Valid HTML 4.01!"></a>
1136 <a href="mailto:sabre@nondot.org">Chris Lattner</a><br>
1137 <a href="http://llvm.org">The LLVM Compiler Infrastructure</a><br>
1138 Last modified: $Date: 2007-10-17 11:05:13 -0700 (Wed, 17 Oct 2007) $