Automated renaming of gimple subclasses
[official-gcc.git] / gcc / cgraphunit.c
1 /* Driver of optimization process
2 Copyright (C) 2003-2014 Free Software Foundation, Inc.
3 Contributed by Jan Hubicka
5 This file is part of GCC.
7 GCC is free software; you can redistribute it and/or modify it under
8 the terms of the GNU General Public License as published by the Free
9 Software Foundation; either version 3, or (at your option) any later
10 version.
12 GCC is distributed in the hope that it will be useful, but WITHOUT ANY
13 WARRANTY; without even the implied warranty of MERCHANTABILITY or
14 FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
15 for more details.
17 You should have received a copy of the GNU General Public License
18 along with GCC; see the file COPYING3. If not see
19 <http://www.gnu.org/licenses/>. */
/* This module implements the main driver of the compilation process.
The main scope of this file is to act as an interface between
tree-based frontends and the backend.
The front end is supposed to use the following functionality
(an illustrative usage sketch follows at the end of this comment):
- finalize_function
This function is called once the front end has parsed the whole body of a
function and it is certain that neither the function body nor the declaration
will change.
33 (There is one exception needed for implementing GCC extern inline
34 function.)
- varpool_finalize_decl
This function has the same behavior as the above but is used for static
variables.
- add_asm_node
Inserts a new toplevel ASM statement.
- finalize_compilation_unit
This function is called once the (source level) compilation unit is
finalized and it will no longer change.
50 The symbol table is constructed starting from the trivially needed
51 symbols finalized by the frontend. Functions are lowered into
52 GIMPLE representation and callgraph/reference lists are constructed.
53 Those are used to discover other necessary functions and variables.
55 At the end the bodies of unreachable functions are removed.
57 The function can be called multiple times when multiple source level
58 compilation units are combined.
60 - compile
This passes control to the back end.  Optimizations are performed and
the final assembler is generated.  This is done in the following way.  Note
that with link-time optimization the process is split into three
stages (compile time, link-time analysis and parallel link-time as
indicated below).
68 Compile time:
70 1) Inter-procedural optimization.
71 (ipa_passes)
73 This part is further split into:
75 a) early optimizations. These are local passes executed in
76 the topological order on the callgraph.
The purpose of early optimizations is to optimize away simple
things that may otherwise confuse IP analysis.  Very simple
propagation across the callgraph is done, e.g. to discover
functions without side effects, and simple inlining is performed.
83 b) early small interprocedural passes.
85 Those are interprocedural passes executed only at compilation
time.  These include, for example, transactional memory lowering,
87 unreachable code removal and other simple transformations.
89 c) IP analysis stage. All interprocedural passes do their
90 analysis.
Interprocedural passes differ from small interprocedural
passes by their ability to operate across the whole program
at link time.  Their analysis stage is performed early to
reduce both linking times and link-time memory usage by
not having to represent the whole program in memory.
d) LTO streaming.  When doing LTO, everything important gets
99 streamed into the object file.
Compile time and/or link-time analysis stage (WPA):
At link time the units get streamed back and the symbol table is
merged.  Function bodies are not streamed in and are not
available.
106 e) IP propagation stage. All IP passes execute their
107 IP propagation. This is done based on the earlier analysis
108 without having function bodies at hand.
109 f) Ltrans streaming. When doing WHOPR LTO, the program
is partitioned and streamed into multiple object files.
112 Compile time and/or parallel linktime stage (ltrans)
114 Each of the object files is streamed back and compiled
separately.  Now the function bodies become available
116 again.
118 2) Virtual clone materialization
119 (cgraph_materialize_clone)
IP passes can produce copies of existing functions (such
as versioned clones or inline clones) without actually
manipulating their bodies by creating virtual clones in
the callgraph.  At this time the virtual clones are
turned into real functions.
126 3) IP transformation
All IP passes transform function bodies based on the earlier
decisions of the IP propagation.
131 4) late small IP passes
Simple IP passes working within a single program partition.
135 5) Expansion
136 (expand_all_functions)
At this stage functions that need to be output into the
assembler are identified and compiled in topological order.
140 6) Output of variables and aliases
Now it is known which variable references were not optimized
out, and thus all remaining variables are output to the file.
144 Note that with -fno-toplevel-reorder passes 5 and 6
145 are combined together in cgraph_output_in_order.
Finally there are functions to manipulate the callgraph from
the backend.
- cgraph_add_new_function is used to add backend-produced
functions introduced after the unit is finalized.
The functions are enqueued for later processing and inserted
into the callgraph with cgraph_process_new_functions.
- cgraph_function_versioning
produces a copy of a function into a new one (a version)
and applies simple transformations.  */
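/* Illustrative sketch (not part of the original file): a front end typically
   drives the interface described above roughly as follows.  The exact
   spelling of the entry points depends on the symtab interface in this
   revision, so treat this as a schematic outline only.

     tree fndecl = ...;                      // FUNCTION_DECL with a parsed body
     cgraph_node::finalize_function (fndecl, false);

     tree vardecl = ...;                     // file-scope VAR_DECL
     varpool_node::finalize_decl (vardecl);

     add_asm_node (asm_str);                 // toplevel asm statement

     // Once the whole translation unit has been parsed:
     finalize_compilation_unit ();
     // ... after which compile () runs the IPA, expansion and output
     // stages described above.  */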
160 #include "config.h"
161 #include "system.h"
162 #include "coretypes.h"
163 #include "tm.h"
164 #include "tree.h"
165 #include "varasm.h"
166 #include "stor-layout.h"
167 #include "stringpool.h"
168 #include "output.h"
169 #include "rtl.h"
170 #include "basic-block.h"
171 #include "tree-ssa-alias.h"
172 #include "internal-fn.h"
173 #include "gimple-fold.h"
174 #include "gimple-expr.h"
175 #include "is-a.h"
176 #include "gimple.h"
177 #include "gimplify.h"
178 #include "gimple-iterator.h"
179 #include "gimplify-me.h"
180 #include "gimple-ssa.h"
181 #include "tree-cfg.h"
182 #include "tree-into-ssa.h"
183 #include "tree-ssa.h"
184 #include "tree-inline.h"
185 #include "langhooks.h"
186 #include "toplev.h"
187 #include "flags.h"
188 #include "debug.h"
189 #include "target.h"
190 #include "diagnostic.h"
191 #include "params.h"
192 #include "fibheap.h"
193 #include "intl.h"
194 #include "function.h"
195 #include "ipa-prop.h"
196 #include "tree-iterator.h"
197 #include "tree-pass.h"
198 #include "tree-dump.h"
199 #include "gimple-pretty-print.h"
200 #include "output.h"
201 #include "coverage.h"
202 #include "plugin.h"
203 #include "ipa-inline.h"
204 #include "ipa-utils.h"
205 #include "lto-streamer.h"
206 #include "except.h"
207 #include "cfgloop.h"
208 #include "regset.h" /* FIXME: For reg_obstack. */
209 #include "context.h"
210 #include "pass_manager.h"
211 #include "tree-nested.h"
212 #include "gimplify.h"
213 #include "dbgcnt.h"
215 /* Queue of cgraph nodes scheduled to be added into cgraph. This is a
216 secondary queue used during optimization to accommodate passes that
217 may generate new functions that need to be optimized and expanded. */
218 vec<cgraph_node *> cgraph_new_nodes;
220 static void expand_all_functions (void);
221 static void mark_functions_to_output (void);
222 static void handle_alias_pairs (void);
224 /* Used for vtable lookup in thunk adjusting. */
225 static GTY (()) tree vtable_entry_type;
/* Determine if a symbol declaration is needed.  That is, it is visible to
something outside this translation unit or to something magic in the
system configury.  */
230 bool
231 symtab_node::needed_p (void)
233 /* Double check that no one output the function into assembly file
234 early. */
235 gcc_checking_assert (!DECL_ASSEMBLER_NAME_SET_P (decl)
236 || !TREE_SYMBOL_REFERENCED (DECL_ASSEMBLER_NAME (decl)));
238 if (!definition)
239 return false;
241 if (DECL_EXTERNAL (decl))
242 return false;
244 /* If the user told us it is used, then it must be so. */
245 if (force_output)
246 return true;
248 /* ABI forced symbols are needed when they are external. */
249 if (forced_by_abi && TREE_PUBLIC (decl))
250 return true;
252 /* Keep constructors, destructors and virtual functions. */
253 if (TREE_CODE (decl) == FUNCTION_DECL
254 && (DECL_STATIC_CONSTRUCTOR (decl) || DECL_STATIC_DESTRUCTOR (decl)))
255 return true;
257 /* Externally visible variables must be output. The exception is
258 COMDAT variables that must be output only when they are needed. */
259 if (TREE_PUBLIC (decl) && !DECL_COMDAT (decl))
260 return true;
262 return false;
265 /* Head and terminator of the queue of nodes to be processed while building
266 callgraph. */
268 static symtab_node symtab_terminator;
269 static symtab_node *queued_nodes = &symtab_terminator;
/* Add NODE to the queue starting at QUEUED_NODES.
The queue is linked via AUX pointers and terminated by a pointer to the
symtab_terminator sentinel.  */
274 static void
275 enqueue_node (symtab_node *node)
277 if (node->aux)
278 return;
279 gcc_checking_assert (queued_nodes);
280 node->aux = queued_nodes;
281 queued_nodes = node;
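/* Illustrative sketch (this mirrors the consumer loop in analyze_functions
   further below): the queue built by enqueue_node is drained by following
   the AUX pointers until the symtab_terminator sentinel is reached.

     while (queued_nodes != &symtab_terminator)
       {
         symtab_node *node = queued_nodes;
         queued_nodes = (symtab_node *) node->aux;
         // ... process NODE; its AUX stays non-NULL so it is not re-queued.
       }
*/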
284 /* Process CGRAPH_NEW_FUNCTIONS and perform actions necessary to add these
285 functions into callgraph in a way so they look like ordinary reachable
286 functions inserted into callgraph already at construction time. */
288 void
289 symbol_table::process_new_functions (void)
291 tree fndecl;
293 if (!cgraph_new_nodes.exists ())
294 return;
296 handle_alias_pairs ();
/* Note that this queue may grow as it is being processed, as the new
298 functions may generate new ones. */
299 for (unsigned i = 0; i < cgraph_new_nodes.length (); i++)
301 cgraph_node *node = cgraph_new_nodes[i];
302 fndecl = node->decl;
303 switch (state)
305 case CONSTRUCTION:
306 /* At construction time we just need to finalize function and move
307 it into reachable functions list. */
309 cgraph_node::finalize_function (fndecl, false);
310 call_cgraph_insertion_hooks (node);
311 enqueue_node (node);
312 break;
314 case IPA:
315 case IPA_SSA:
/* When IPA optimization has already started, do all essential
transformations that have already been performed on the whole
318 cgraph but not on this function. */
320 gimple_register_cfg_hooks ();
321 if (!node->analyzed)
322 node->analyze ();
323 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
324 if (state == IPA_SSA
325 && !gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
326 g->get_passes ()->execute_early_local_passes ();
327 else if (inline_summary_vec != NULL)
328 compute_inline_parameters (node, true);
329 free_dominance_info (CDI_POST_DOMINATORS);
330 free_dominance_info (CDI_DOMINATORS);
331 pop_cfun ();
332 call_cgraph_insertion_hooks (node);
333 break;
335 case EXPANSION:
336 /* Functions created during expansion shall be compiled
337 directly. */
338 node->process = 0;
339 call_cgraph_insertion_hooks (node);
340 node->expand ();
341 break;
343 default:
344 gcc_unreachable ();
345 break;
349 cgraph_new_nodes.release ();
/* As a GCC extension we allow redefinition of the function.  The
semantics when the two bodies differ are not well defined.
We replace the old body with the new body, so in unit-at-a-time mode
we always use the new body, while in normal mode we may end up with
the old body inlined into some functions and the new body expanded and
inlined in others.
??? It may make more sense to use one body for inlining and the other
body for expanding the function, but this is difficult to do.  */
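/* Example (user-level, illustrative only) of the extern inline redefinition
   described above, with gnu89 inline semantics; the second definition causes
   finalize_function to call reset () on the node and mark it
   redefined_extern_inline:

     extern inline int twice (int x) { return x + x; }
     int twice (int x) { return 2 * x; }
*/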
362 void
363 cgraph_node::reset (void)
365 /* If process is set, then we have already begun whole-unit analysis.
366 This is *not* testing for whether we've already emitted the function.
367 That case can be sort-of legitimately seen with real function redefinition
368 errors. I would argue that the front end should never present us with
369 such a case, but don't enforce that for now. */
370 gcc_assert (!process);
372 /* Reset our data structures so we can analyze the function again. */
373 memset (&local, 0, sizeof (local));
374 memset (&global, 0, sizeof (global));
375 memset (&rtl, 0, sizeof (rtl));
376 analyzed = false;
377 definition = false;
378 alias = false;
379 weakref = false;
380 cpp_implicit_alias = false;
382 remove_callees ();
383 remove_all_references ();
386 /* Return true when there are references to the node. */
388 bool
389 symtab_node::referred_to_p (void)
391 ipa_ref *ref = NULL;
393 /* See if there are any references at all. */
394 if (iterate_referring (0, ref))
395 return true;
396 /* For functions check also calls. */
397 cgraph_node *cn = dyn_cast <cgraph_node *> (this);
398 if (cn && cn->callers)
399 return true;
400 return false;
403 /* DECL has been parsed. Take it, queue it, compile it at the whim of the
404 logic in effect. If NO_COLLECT is true, then our caller cannot stand to have
405 the garbage collector run at the moment. We would need to either create
406 a new GC context, or just not compile right now. */
408 void
409 cgraph_node::finalize_function (tree decl, bool no_collect)
411 cgraph_node *node = cgraph_node::get_create (decl);
413 if (node->definition)
415 /* Nested functions should only be defined once. */
416 gcc_assert (!DECL_CONTEXT (decl)
417 || TREE_CODE (DECL_CONTEXT (decl)) != FUNCTION_DECL);
418 node->reset ();
419 node->local.redefined_extern_inline = true;
422 notice_global_symbol (decl);
423 node->definition = true;
424 node->lowered = DECL_STRUCT_FUNCTION (decl)->cfg != NULL;
426 /* With -fkeep-inline-functions we are keeping all inline functions except
427 for extern inline ones. */
428 if (flag_keep_inline_functions
429 && DECL_DECLARED_INLINE_P (decl)
430 && !DECL_EXTERNAL (decl)
431 && !DECL_DISREGARD_INLINE_LIMITS (decl))
432 node->force_output = 1;
/* When not optimizing, also output static functions (see
PR24561), but don't do so for always_inline functions, functions
436 declared inline and nested functions. These were optimized out
437 in the original implementation and it is unclear whether we want
438 to change the behavior here. */
439 if ((!optimize
440 && !node->cpp_implicit_alias
441 && !DECL_DISREGARD_INLINE_LIMITS (decl)
442 && !DECL_DECLARED_INLINE_P (decl)
443 && !(DECL_CONTEXT (decl)
444 && TREE_CODE (DECL_CONTEXT (decl)) == FUNCTION_DECL))
445 && !DECL_COMDAT (decl) && !DECL_EXTERNAL (decl))
446 node->force_output = 1;
448 /* If we've not yet emitted decl, tell the debug info about it. */
449 if (!TREE_ASM_WRITTEN (decl))
450 (*debug_hooks->deferred_inline_function) (decl);
452 /* Possibly warn about unused parameters. */
453 if (warn_unused_parameter)
454 do_warn_unused_parameter (decl);
456 if (!no_collect)
457 ggc_collect ();
459 if (symtab->state == CONSTRUCTION
460 && (node->needed_p () || node->referred_to_p ()))
461 enqueue_node (node);
/* Add the function FNDECL to the call graph.
Unlike finalize_function, this function is intended to be used
by the middle end and allows insertion of a new function at an arbitrary
point of compilation.  The function can be in high, low or SSA form
GIMPLE.
The function is assumed to be reachable and to have its address taken (so no
API-breaking optimizations are performed on it).
The main work done by this function is to enqueue the function for later
processing, avoiding the need for the passes to be re-entrant.  */
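/* Illustrative sketch (not part of the original file): middle-end code that
   synthesizes a helper function builds the FUNCTION_DECL and its GIMPLE body
   with the usual tree/gimple helpers and then registers it (schematic only):

     tree fndecl = ...;                              // freshly built FUNCTION_DECL
     cgraph_node::add_new_function (fndecl, true);   // true: body already lowered
     // The node is queued and later picked up by
     // symbol_table::process_new_functions.
*/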
476 void
477 cgraph_node::add_new_function (tree fndecl, bool lowered)
479 gcc::pass_manager *passes = g->get_passes ();
480 cgraph_node *node;
481 switch (symtab->state)
483 case PARSING:
484 cgraph_node::finalize_function (fndecl, false);
485 break;
486 case CONSTRUCTION:
487 /* Just enqueue function to be processed at nearest occurrence. */
488 node = cgraph_node::get_create (fndecl);
489 if (lowered)
490 node->lowered = true;
491 cgraph_new_nodes.safe_push (node);
492 break;
494 case IPA:
495 case IPA_SSA:
496 case EXPANSION:
/* Bring the function into finalized state and enqueue it for later
analysis and compilation.  */
499 node = cgraph_node::get_create (fndecl);
500 node->local.local = false;
501 node->definition = true;
502 node->force_output = true;
503 if (!lowered && symtab->state == EXPANSION)
505 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
506 gimple_register_cfg_hooks ();
507 bitmap_obstack_initialize (NULL);
508 execute_pass_list (cfun, passes->all_lowering_passes);
509 passes->execute_early_local_passes ();
510 bitmap_obstack_release (NULL);
511 pop_cfun ();
513 lowered = true;
515 if (lowered)
516 node->lowered = true;
517 cgraph_new_nodes.safe_push (node);
518 break;
520 case FINISHED:
521 /* At the very end of compilation we have to do all the work up
522 to expansion. */
523 node = cgraph_node::create (fndecl);
524 if (lowered)
525 node->lowered = true;
526 node->definition = true;
527 node->analyze ();
528 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
529 gimple_register_cfg_hooks ();
530 bitmap_obstack_initialize (NULL);
531 if (!gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
532 g->get_passes ()->execute_early_local_passes ();
533 bitmap_obstack_release (NULL);
534 pop_cfun ();
535 node->expand ();
536 break;
538 default:
539 gcc_unreachable ();
542 /* Set a personality if required and we already passed EH lowering. */
543 if (lowered
544 && (function_needs_eh_personality (DECL_STRUCT_FUNCTION (fndecl))
545 == eh_personality_lang))
546 DECL_FUNCTION_PERSONALITY (fndecl) = lang_hooks.eh_personality ();
549 /* Analyze the function scheduled to be output. */
550 void
551 cgraph_node::analyze (void)
553 tree decl = this->decl;
554 location_t saved_loc = input_location;
555 input_location = DECL_SOURCE_LOCATION (decl);
557 if (thunk.thunk_p)
559 create_edge (cgraph_node::get (thunk.alias),
560 NULL, 0, CGRAPH_FREQ_BASE);
561 if (!expand_thunk (false, false))
563 thunk.alias = NULL;
564 return;
566 thunk.alias = NULL;
568 if (alias)
569 resolve_alias (cgraph_node::get (alias_target));
570 else if (dispatcher_function)
572 /* Generate the dispatcher body of multi-versioned functions. */
573 cgraph_function_version_info *dispatcher_version_info
574 = function_version ();
575 if (dispatcher_version_info != NULL
576 && (dispatcher_version_info->dispatcher_resolver
577 == NULL_TREE))
579 tree resolver = NULL_TREE;
580 gcc_assert (targetm.generate_version_dispatcher_body);
581 resolver = targetm.generate_version_dispatcher_body (this);
582 gcc_assert (resolver != NULL_TREE);
585 else
587 push_cfun (DECL_STRUCT_FUNCTION (decl));
589 assign_assembler_name_if_neeeded (decl);
/* Make sure to gimplify bodies only once.  While analyzing a
592 function we lower it, which will require gimplified nested
593 functions, so we can end up here with an already gimplified
594 body. */
595 if (!gimple_has_body_p (decl))
596 gimplify_function_tree (decl);
597 dump_function (TDI_generic, decl);
599 /* Lower the function. */
600 if (!lowered)
602 if (nested)
603 lower_nested_functions (decl);
604 gcc_assert (!nested);
606 gimple_register_cfg_hooks ();
607 bitmap_obstack_initialize (NULL);
608 execute_pass_list (cfun, g->get_passes ()->all_lowering_passes);
609 free_dominance_info (CDI_POST_DOMINATORS);
610 free_dominance_info (CDI_DOMINATORS);
611 compact_blocks ();
612 bitmap_obstack_release (NULL);
613 lowered = true;
616 pop_cfun ();
618 analyzed = true;
620 input_location = saved_loc;
/* The C++ frontend produces same-body aliases all over the place, even before PCH
gets streamed out.  It relies on us linking the aliases with their function
in order to do the fixups, but ipa-ref is not PCH safe.  Consequently we
first produce aliases without links, but once the C++ FE is sure it won't stream
PCH we build the links via this function.  */
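/* Example (illustrative; the exact set of aliases is decided by the C++ FE
   and the ABI): for

     struct S { S () {}  ~S () {} };

   the complete-object and base-object constructor (and destructor) variants
   may be emitted as same-body aliases of one another instead of two identical
   bodies; those are the cpp_implicit_alias nodes linked up here once PCH
   streaming can no longer happen.  */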
629 void
630 symbol_table::process_same_body_aliases (void)
632 symtab_node *node;
633 FOR_EACH_SYMBOL (node)
634 if (node->cpp_implicit_alias && !node->analyzed)
635 node->resolve_alias
636 (TREE_CODE (node->alias_target) == VAR_DECL
637 ? (symtab_node *)varpool_node::get_create (node->alias_target)
638 : (symtab_node *)cgraph_node::get_create (node->alias_target));
639 cpp_implicit_aliases_done = true;
642 /* Process attributes common for vars and functions. */
644 static void
645 process_common_attributes (symtab_node *node, tree decl)
647 tree weakref = lookup_attribute ("weakref", DECL_ATTRIBUTES (decl));
649 if (weakref && !lookup_attribute ("alias", DECL_ATTRIBUTES (decl)))
651 warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
652 "%<weakref%> attribute should be accompanied with"
653 " an %<alias%> attribute");
654 DECL_WEAK (decl) = 0;
655 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
656 DECL_ATTRIBUTES (decl));
659 if (lookup_attribute ("no_reorder", DECL_ATTRIBUTES (decl)))
660 node->no_reorder = 1;
663 /* Look for externally_visible and used attributes and mark cgraph nodes
664 accordingly.
666 We cannot mark the nodes at the point the attributes are processed (in
667 handle_*_attribute) because the copy of the declarations available at that
668 point may not be canonical. For example, in:
670 void f();
671 void f() __attribute__((used));
673 the declaration we see in handle_used_attribute will be the second
674 declaration -- but the front end will subsequently merge that declaration
675 with the original declaration and discard the second declaration.
677 Furthermore, we can't mark these nodes in finalize_function because:
679 void f() {}
680 void f() __attribute__((externally_visible));
682 is valid.
684 So, we walk the nodes at the end of the translation unit, applying the
685 attributes at that point. */
687 static void
688 process_function_and_variable_attributes (cgraph_node *first,
689 varpool_node *first_var)
691 cgraph_node *node;
692 varpool_node *vnode;
694 for (node = symtab->first_function (); node != first;
695 node = symtab->next_function (node))
697 tree decl = node->decl;
698 if (DECL_PRESERVE_P (decl))
699 node->mark_force_output ();
700 else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
702 if (! TREE_PUBLIC (node->decl))
703 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
704 "%<externally_visible%>"
705 " attribute have effect only on public objects");
707 if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
708 && (node->definition && !node->alias))
710 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
711 "%<weakref%> attribute ignored"
712 " because function is defined");
713 DECL_WEAK (decl) = 0;
714 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
715 DECL_ATTRIBUTES (decl));
718 if (lookup_attribute ("always_inline", DECL_ATTRIBUTES (decl))
719 && !DECL_DECLARED_INLINE_P (decl)
720 /* redefining extern inline function makes it DECL_UNINLINABLE. */
721 && !DECL_UNINLINABLE (decl))
722 warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
723 "always_inline function might not be inlinable");
725 process_common_attributes (node, decl);
727 for (vnode = symtab->first_variable (); vnode != first_var;
728 vnode = symtab->next_variable (vnode))
730 tree decl = vnode->decl;
731 if (DECL_EXTERNAL (decl)
732 && DECL_INITIAL (decl))
733 varpool_node::finalize_decl (decl);
734 if (DECL_PRESERVE_P (decl))
735 vnode->force_output = true;
736 else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
738 if (! TREE_PUBLIC (vnode->decl))
739 warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
740 "%<externally_visible%>"
741 " attribute have effect only on public objects");
743 if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
744 && vnode->definition
745 && DECL_INITIAL (decl))
747 warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
748 "%<weakref%> attribute ignored"
749 " because variable is initialized");
750 DECL_WEAK (decl) = 0;
751 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
752 DECL_ATTRIBUTES (decl));
754 process_common_attributes (vnode, decl);
/* Mark DECL as finalized.  By finalizing the declaration, the frontend instructs
the middle end to output the variable to the asm file, if it is needed or
externally visible.  */
762 void
763 varpool_node::finalize_decl (tree decl)
765 varpool_node *node = varpool_node::get_create (decl);
767 gcc_assert (TREE_STATIC (decl) || DECL_EXTERNAL (decl));
769 if (node->definition)
770 return;
771 notice_global_symbol (decl);
772 node->definition = true;
773 if (TREE_THIS_VOLATILE (decl) || DECL_PRESERVE_P (decl)
774 /* Traditionally we do not eliminate static variables when not
optimizing and when not doing toplevel reorder.  */
776 || node->no_reorder
777 || ((!flag_toplevel_reorder
778 && !DECL_COMDAT (node->decl)
779 && !DECL_ARTIFICIAL (node->decl))))
780 node->force_output = true;
782 if (symtab->state == CONSTRUCTION
783 && (node->needed_p () || node->referred_to_p ()))
784 enqueue_node (node);
785 if (symtab->state >= IPA_SSA)
786 node->analyze ();
787 /* Some frontends produce various interface variables after compilation
788 finished. */
789 if (symtab->state == FINISHED
790 || (!flag_toplevel_reorder
791 && symtab->state == EXPANSION))
792 node->assemble_decl ();
/* EDGE is a polymorphic call.  Mark all possible targets as reachable
and, if there is only one target, perform trivial devirtualization.
REACHABLE_CALL_TARGETS collects target lists we have already walked to
avoid duplicate work.  */
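/* Example (user-level, illustrative only) of a call that this trivial
   devirtualization can resolve when optimizing with -fdevirtualize: because
   B is final, the only possible target of the virtual call is B::f, so the
   indirect edge can be made direct.

     struct A { virtual int f (); };
     struct B final : A { int f () { return 1; } };

     int g (B *b) { return b->f (); }
*/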
800 static void
801 walk_polymorphic_call_targets (hash_set<void *> *reachable_call_targets,
802 cgraph_edge *edge)
804 unsigned int i;
805 void *cache_token;
806 bool final;
807 vec <cgraph_node *>targets
808 = possible_polymorphic_call_targets
809 (edge, &final, &cache_token);
811 if (!reachable_call_targets->add (cache_token))
813 if (symtab->dump_file)
814 dump_possible_polymorphic_call_targets
815 (symtab->dump_file, edge);
817 for (i = 0; i < targets.length (); i++)
819 /* Do not bother to mark virtual methods in anonymous namespace;
820 either we will find use of virtual table defining it, or it is
821 unused. */
822 if (targets[i]->definition
823 && TREE_CODE
824 (TREE_TYPE (targets[i]->decl))
825 == METHOD_TYPE
826 && !type_in_anonymous_namespace_p
827 (method_class_type
828 (TREE_TYPE (targets[i]->decl))))
829 enqueue_node (targets[i]);
833 /* Very trivial devirtualization; when the type is
834 final or anonymous (so we know all its derivation)
835 and there is only one possible virtual call target,
836 make the edge direct. */
837 if (final)
839 if (targets.length () <= 1 && dbg_cnt (devirt))
841 cgraph_node *target;
842 if (targets.length () == 1)
843 target = targets[0];
844 else
845 target = cgraph_node::create
846 (builtin_decl_implicit (BUILT_IN_UNREACHABLE));
848 if (symtab->dump_file)
850 fprintf (symtab->dump_file,
851 "Devirtualizing call: ");
852 print_gimple_stmt (symtab->dump_file,
853 edge->call_stmt, 0,
854 TDF_SLIM);
856 if (dump_enabled_p ())
858 location_t locus = gimple_location_safe (edge->call_stmt);
859 dump_printf_loc (MSG_OPTIMIZED_LOCATIONS, locus,
860 "devirtualizing call in %s to %s\n",
861 edge->caller->name (), target->name ());
864 edge->make_direct (target);
865 edge->redirect_call_stmt_to_callee ();
866 if (symtab->dump_file)
868 fprintf (symtab->dump_file,
869 "Devirtualized as: ");
870 print_gimple_stmt (symtab->dump_file,
871 edge->call_stmt, 0,
872 TDF_SLIM);
/* Discover all functions and variables that are trivially needed, analyze
them as well as all functions and variables referred to by them.  */
882 static void
883 analyze_functions (void)
885 /* Keep track of already processed nodes when called multiple times for
886 intermodule optimization. */
887 static cgraph_node *first_analyzed;
888 cgraph_node *first_handled = first_analyzed;
889 static varpool_node *first_analyzed_var;
890 varpool_node *first_handled_var = first_analyzed_var;
891 hash_set<void *> reachable_call_targets;
893 symtab_node *node;
894 symtab_node *next;
895 int i;
896 ipa_ref *ref;
897 bool changed = true;
898 location_t saved_loc = input_location;
900 bitmap_obstack_initialize (NULL);
901 symtab->state = CONSTRUCTION;
902 input_location = UNKNOWN_LOCATION;
/* Ugly, but the fixup cannot happen at the time the same-body alias is created;
the C++ FE is confused about the COMDAT groups being right.  */
906 if (symtab->cpp_implicit_aliases_done)
907 FOR_EACH_SYMBOL (node)
908 if (node->cpp_implicit_alias)
909 node->fixup_same_cpp_alias_visibility (node->get_alias_target ());
910 if (optimize && flag_devirtualize)
911 build_type_inheritance_graph ();
/* Analysis adds static variables that in turn add references to new functions.
So we need to iterate the process until it stabilizes.  */
915 while (changed)
917 changed = false;
918 process_function_and_variable_attributes (first_analyzed,
919 first_analyzed_var);
921 /* First identify the trivially needed symbols. */
922 for (node = symtab->first_symbol ();
923 node != first_analyzed
924 && node != first_analyzed_var; node = node->next)
926 /* Convert COMDAT group designators to IDENTIFIER_NODEs. */
927 node->get_comdat_group_id ();
928 if (node->needed_p ())
930 enqueue_node (node);
931 if (!changed && symtab->dump_file)
932 fprintf (symtab->dump_file, "Trivially needed symbols:");
933 changed = true;
934 if (symtab->dump_file)
935 fprintf (symtab->dump_file, " %s", node->asm_name ());
936 if (!changed && symtab->dump_file)
937 fprintf (symtab->dump_file, "\n");
939 if (node == first_analyzed
940 || node == first_analyzed_var)
941 break;
943 symtab->process_new_functions ();
944 first_analyzed_var = symtab->first_variable ();
945 first_analyzed = symtab->first_function ();
947 if (changed && symtab->dump_file)
948 fprintf (symtab->dump_file, "\n");
950 /* Lower representation, build callgraph edges and references for all trivially
needed symbols and all symbols referred to by them.  */
952 while (queued_nodes != &symtab_terminator)
954 changed = true;
955 node = queued_nodes;
956 queued_nodes = (symtab_node *)queued_nodes->aux;
957 cgraph_node *cnode = dyn_cast <cgraph_node *> (node);
958 if (cnode && cnode->definition)
960 cgraph_edge *edge;
961 tree decl = cnode->decl;
/* ??? It is possible to create an extern inline function
and later use the weak alias attribute to kill its body.
965 See gcc.c-torture/compile/20011119-1.c */
966 if (!DECL_STRUCT_FUNCTION (decl)
967 && !cnode->alias
968 && !cnode->thunk.thunk_p
969 && !cnode->dispatcher_function)
971 cnode->reset ();
972 cnode->local.redefined_extern_inline = true;
973 continue;
976 if (!cnode->analyzed)
977 cnode->analyze ();
979 for (edge = cnode->callees; edge; edge = edge->next_callee)
980 if (edge->callee->definition)
981 enqueue_node (edge->callee);
982 if (optimize && flag_devirtualize)
984 cgraph_edge *next;
986 for (edge = cnode->indirect_calls; edge; edge = next)
988 next = edge->next_callee;
989 if (edge->indirect_info->polymorphic)
990 walk_polymorphic_call_targets (&reachable_call_targets,
991 edge);
995 /* If decl is a clone of an abstract function,
996 mark that abstract function so that we don't release its body.
997 The DECL_INITIAL() of that abstract function declaration
998 will be later needed to output debug info. */
999 if (DECL_ABSTRACT_ORIGIN (decl))
1001 cgraph_node *origin_node
1002 = cgraph_node::get_create (DECL_ABSTRACT_ORIGIN (decl));
1003 origin_node->used_as_abstract_origin = true;
1006 else
1008 varpool_node *vnode = dyn_cast <varpool_node *> (node);
1009 if (vnode && vnode->definition && !vnode->analyzed)
1010 vnode->analyze ();
1013 if (node->same_comdat_group)
1015 symtab_node *next;
1016 for (next = node->same_comdat_group;
1017 next != node;
1018 next = next->same_comdat_group)
1019 enqueue_node (next);
1021 for (i = 0; node->iterate_reference (i, ref); i++)
1022 if (ref->referred->definition)
1023 enqueue_node (ref->referred);
1024 symtab->process_new_functions ();
1027 if (optimize && flag_devirtualize)
1028 update_type_inheritance_graph ();
1030 /* Collect entry points to the unit. */
1031 if (symtab->dump_file)
1033 fprintf (symtab->dump_file, "\n\nInitial ");
1034 symtab_node::dump_table (symtab->dump_file);
1037 if (symtab->dump_file)
1038 fprintf (symtab->dump_file, "\nRemoving unused symbols:");
1040 for (node = symtab->first_symbol ();
1041 node != first_handled
1042 && node != first_handled_var; node = next)
1044 next = node->next;
1045 if (!node->aux && !node->referred_to_p ())
1047 if (symtab->dump_file)
1048 fprintf (symtab->dump_file, " %s", node->name ());
1049 node->remove ();
1050 continue;
1052 if (cgraph_node *cnode = dyn_cast <cgraph_node *> (node))
1054 tree decl = node->decl;
1056 if (cnode->definition && !gimple_has_body_p (decl)
1057 && !cnode->alias
1058 && !cnode->thunk.thunk_p)
1059 cnode->reset ();
1061 gcc_assert (!cnode->definition || cnode->thunk.thunk_p
1062 || cnode->alias
1063 || gimple_has_body_p (decl));
1064 gcc_assert (cnode->analyzed == cnode->definition);
1066 node->aux = NULL;
1068 for (;node; node = node->next)
1069 node->aux = NULL;
1070 first_analyzed = symtab->first_function ();
1071 first_analyzed_var = symtab->first_variable ();
1072 if (symtab->dump_file)
1074 fprintf (symtab->dump_file, "\n\nReclaimed ");
1075 symtab_node::dump_table (symtab->dump_file);
1077 bitmap_obstack_release (NULL);
1078 ggc_collect ();
1079 /* Initialize assembler name hash, in particular we want to trigger C++
1080 mangling and same body alias creation before we free DECL_ARGUMENTS
1081 used by it. */
1082 if (!seen_error ())
1083 symtab->symtab_initialize_asm_name_hash ();
1085 input_location = saved_loc;
1088 /* Translate the ugly representation of aliases as alias pairs into nice
1089 representation in callgraph. We don't handle all cases yet,
1090 unfortunately. */
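/* Example (user-level, illustrative only) of declarations that reach this
   function as alias pairs:

     int target_fn (void) { return 0; }
     int alias_fn (void) __attribute__ ((alias ("target_fn")));   // ordinary alias

     // weakref: the target need not be defined in this unit
     static int weakref_fn (void) __attribute__ ((weakref ("external_fn")));
*/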
1092 static void
1093 handle_alias_pairs (void)
1095 alias_pair *p;
1096 unsigned i;
1098 for (i = 0; alias_pairs && alias_pairs->iterate (i, &p);)
1100 symtab_node *target_node = symtab_node::get_for_asmname (p->target);
1102 /* Weakrefs with target not defined in current unit are easy to handle:
1103 they behave just as external variables except we need to note the
1104 alias flag to later output the weakref pseudo op into asm file. */
1105 if (!target_node
1106 && lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)) != NULL)
1108 symtab_node *node = symtab_node::get (p->decl);
1109 if (node)
1111 node->alias_target = p->target;
1112 node->weakref = true;
1113 node->alias = true;
1115 alias_pairs->unordered_remove (i);
1116 continue;
1118 else if (!target_node)
1120 error ("%q+D aliased to undefined symbol %qE", p->decl, p->target);
1121 symtab_node *node = symtab_node::get (p->decl);
1122 if (node)
1123 node->alias = false;
1124 alias_pairs->unordered_remove (i);
1125 continue;
1128 if (DECL_EXTERNAL (target_node->decl)
1129 /* We use local aliases for C++ thunks to force the tailcall
1130 to bind locally. This is a hack - to keep it working do
1131 the following (which is not strictly correct). */
1132 && (TREE_CODE (target_node->decl) != FUNCTION_DECL
1133 || ! DECL_VIRTUAL_P (target_node->decl))
1134 && ! lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)))
1136 error ("%q+D aliased to external symbol %qE",
1137 p->decl, p->target);
1140 if (TREE_CODE (p->decl) == FUNCTION_DECL
1141 && target_node && is_a <cgraph_node *> (target_node))
1143 cgraph_node *src_node = cgraph_node::get (p->decl);
1144 if (src_node && src_node->definition)
1145 src_node->reset ();
1146 cgraph_node::create_alias (p->decl, target_node->decl);
1147 alias_pairs->unordered_remove (i);
1149 else if (TREE_CODE (p->decl) == VAR_DECL
1150 && target_node && is_a <varpool_node *> (target_node))
1152 varpool_node::create_alias (p->decl, target_node->decl);
1153 alias_pairs->unordered_remove (i);
1155 else
1157 error ("%q+D alias in between function and variable is not supported",
1158 p->decl);
1159 warning (0, "%q+D aliased declaration",
1160 target_node->decl);
1161 alias_pairs->unordered_remove (i);
1164 vec_free (alias_pairs);
1168 /* Figure out what functions we want to assemble. */
1170 static void
1171 mark_functions_to_output (void)
1173 cgraph_node *node;
1174 #ifdef ENABLE_CHECKING
1175 bool check_same_comdat_groups = false;
1177 FOR_EACH_FUNCTION (node)
1178 gcc_assert (!node->process);
1179 #endif
1181 FOR_EACH_FUNCTION (node)
1183 tree decl = node->decl;
1185 gcc_assert (!node->process || node->same_comdat_group);
1186 if (node->process)
1187 continue;
1189 /* We need to output all local functions that are used and not
1190 always inlined, as well as those that are reachable from
1191 outside the current compilation unit. */
1192 if (node->analyzed
1193 && !node->thunk.thunk_p
1194 && !node->alias
1195 && !node->global.inlined_to
1196 && !TREE_ASM_WRITTEN (decl)
1197 && !DECL_EXTERNAL (decl))
1199 node->process = 1;
1200 if (node->same_comdat_group)
1202 cgraph_node *next;
1203 for (next = dyn_cast<cgraph_node *> (node->same_comdat_group);
1204 next != node;
1205 next = dyn_cast<cgraph_node *> (next->same_comdat_group))
1206 if (!next->thunk.thunk_p && !next->alias
1207 && !next->comdat_local_p ())
1208 next->process = 1;
1211 else if (node->same_comdat_group)
1213 #ifdef ENABLE_CHECKING
1214 check_same_comdat_groups = true;
1215 #endif
1217 else
1219 /* We should've reclaimed all functions that are not needed. */
1220 #ifdef ENABLE_CHECKING
1221 if (!node->global.inlined_to
1222 && gimple_has_body_p (decl)
/* FIXME: in an ltrans unit when the offline copy is outside a partition but
inline copies are inside a partition, we can end up not removing the body
since we no longer have an analyzed node pointing to it.  */
1226 && !node->in_other_partition
1227 && !node->alias
1228 && !node->clones
1229 && !DECL_EXTERNAL (decl))
1231 node->debug ();
1232 internal_error ("failed to reclaim unneeded function");
1234 #endif
1235 gcc_assert (node->global.inlined_to
1236 || !gimple_has_body_p (decl)
1237 || node->in_other_partition
1238 || node->clones
1239 || DECL_ARTIFICIAL (decl)
1240 || DECL_EXTERNAL (decl));
1245 #ifdef ENABLE_CHECKING
1246 if (check_same_comdat_groups)
1247 FOR_EACH_FUNCTION (node)
1248 if (node->same_comdat_group && !node->process)
1250 tree decl = node->decl;
1251 if (!node->global.inlined_to
1252 && gimple_has_body_p (decl)
1253 /* FIXME: in an ltrans unit when the offline copy is outside a
1254 partition but inline copies are inside a partition, we can
1255 end up not removing the body since we no longer have an
1256 analyzed node pointing to it. */
1257 && !node->in_other_partition
1258 && !node->clones
1259 && !DECL_EXTERNAL (decl))
1261 node->debug ();
1262 internal_error ("failed to reclaim unneeded function in same "
1263 "comdat group");
1266 #endif
/* DECL is a FUNCTION_DECL.  Initialize datastructures so DECL is a function
in lowered GIMPLE form.  IN_SSA is true if the GIMPLE is in SSA form.
Set current_function_decl and cfun to the newly constructed empty function body.
Return the basic block in the function body.  */
1275 basic_block
1276 init_lowered_empty_function (tree decl, bool in_ssa)
1278 basic_block bb;
1280 current_function_decl = decl;
1281 allocate_struct_function (decl, false);
1282 gimple_register_cfg_hooks ();
1283 init_empty_tree_cfg ();
1285 if (in_ssa)
1287 init_tree_ssa (cfun);
1288 init_ssa_operands (cfun);
1289 cfun->gimple_df->in_ssa_p = true;
1290 cfun->curr_properties |= PROP_ssa;
1293 DECL_INITIAL (decl) = make_node (BLOCK);
1295 DECL_SAVED_TREE (decl) = error_mark_node;
1296 cfun->curr_properties |= (PROP_gimple_lcf | PROP_gimple_leh | PROP_gimple_any
1297 | PROP_cfg | PROP_loops);
1299 set_loops_for_fn (cfun, ggc_cleared_alloc<loops> ());
1300 init_loops_structure (cfun, loops_for_fn (cfun), 1);
1301 loops_for_fn (cfun)->state |= LOOPS_MAY_HAVE_MULTIPLE_LATCHES;
1303 /* Create BB for body of the function and connect it properly. */
1304 bb = create_basic_block (NULL, (void *) 0, ENTRY_BLOCK_PTR_FOR_FN (cfun));
1305 make_edge (ENTRY_BLOCK_PTR_FOR_FN (cfun), bb, EDGE_FALLTHRU);
1306 make_edge (bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
1307 add_bb_to_loop (bb, ENTRY_BLOCK_PTR_FOR_FN (cfun)->loop_father);
1309 return bb;
1312 /* Adjust PTR by the constant FIXED_OFFSET, and by the vtable
1313 offset indicated by VIRTUAL_OFFSET, if that is
1314 non-null. THIS_ADJUSTING is nonzero for a this adjusting thunk and
1315 zero for a result adjusting thunk. */
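/* Example (illustrative only).  For a this-adjusting thunk the code below
   first adds FIXED_OFFSET to the incoming pointer and then, if VIRTUAL_OFFSET
   is non-null, reads the object's vptr, loads the vcall offset stored
   VIRTUAL_OFFSET bytes into the vtable and adds that as well.  A typical
   source-level trigger is multiple inheritance:

     struct A { virtual void f (); int a; };
     struct B { virtual void g (); int b; };
     struct C : A, B { void g (); };
     // Calling C::g through a pointer to B goes via a this-adjusting thunk
     // that shifts `this' from the B subobject back to the start of C.
*/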
1317 static tree
1318 thunk_adjust (gimple_stmt_iterator * bsi,
1319 tree ptr, bool this_adjusting,
1320 HOST_WIDE_INT fixed_offset, tree virtual_offset)
1322 gassign *stmt;
1323 tree ret;
1325 if (this_adjusting
1326 && fixed_offset != 0)
1328 stmt = gimple_build_assign
1329 (ptr, fold_build_pointer_plus_hwi_loc (input_location,
1330 ptr,
1331 fixed_offset));
1332 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1335 /* If there's a virtual offset, look up that value in the vtable and
1336 adjust the pointer again. */
1337 if (virtual_offset)
1339 tree vtabletmp;
1340 tree vtabletmp2;
1341 tree vtabletmp3;
1343 if (!vtable_entry_type)
1345 tree vfunc_type = make_node (FUNCTION_TYPE);
1346 TREE_TYPE (vfunc_type) = integer_type_node;
1347 TYPE_ARG_TYPES (vfunc_type) = NULL_TREE;
1348 layout_type (vfunc_type);
1350 vtable_entry_type = build_pointer_type (vfunc_type);
1353 vtabletmp =
1354 create_tmp_reg (build_pointer_type
1355 (build_pointer_type (vtable_entry_type)), "vptr");
1357 /* The vptr is always at offset zero in the object. */
1358 stmt = gimple_build_assign (vtabletmp,
1359 build1 (NOP_EXPR, TREE_TYPE (vtabletmp),
1360 ptr));
1361 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1363 /* Form the vtable address. */
1364 vtabletmp2 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp)),
1365 "vtableaddr");
1366 stmt = gimple_build_assign (vtabletmp2,
1367 build_simple_mem_ref (vtabletmp));
1368 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1370 /* Find the entry with the vcall offset. */
1371 stmt = gimple_build_assign (vtabletmp2,
1372 fold_build_pointer_plus_loc (input_location,
1373 vtabletmp2,
1374 virtual_offset));
1375 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1377 /* Get the offset itself. */
1378 vtabletmp3 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp2)),
1379 "vcalloffset");
1380 stmt = gimple_build_assign (vtabletmp3,
1381 build_simple_mem_ref (vtabletmp2));
1382 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1384 /* Adjust the `this' pointer. */
1385 ptr = fold_build_pointer_plus_loc (input_location, ptr, vtabletmp3);
1386 ptr = force_gimple_operand_gsi (bsi, ptr, true, NULL_TREE, false,
1387 GSI_CONTINUE_LINKING);
1390 if (!this_adjusting
1391 && fixed_offset != 0)
1392 /* Adjust the pointer by the constant. */
1394 tree ptrtmp;
1396 if (TREE_CODE (ptr) == VAR_DECL)
1397 ptrtmp = ptr;
1398 else
1400 ptrtmp = create_tmp_reg (TREE_TYPE (ptr), "ptr");
1401 stmt = gimple_build_assign (ptrtmp, ptr);
1402 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1404 ptr = fold_build_pointer_plus_hwi_loc (input_location,
1405 ptrtmp, fixed_offset);
1408 /* Emit the statement and gimplify the adjustment expression. */
1409 ret = create_tmp_reg (TREE_TYPE (ptr), "adjusted_this");
1410 stmt = gimple_build_assign (ret, ptr);
1411 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1413 return ret;
1416 /* Expand thunk NODE to gimple if possible.
1417 When FORCE_GIMPLE_THUNK is true, gimple thunk is created and
1418 no assembler is produced.
1419 When OUTPUT_ASM_THUNK is true, also produce assembler for
1420 thunks that are not lowered. */
1422 bool
1423 cgraph_node::expand_thunk (bool output_asm_thunks, bool force_gimple_thunk)
1425 bool this_adjusting = thunk.this_adjusting;
1426 HOST_WIDE_INT fixed_offset = thunk.fixed_offset;
1427 HOST_WIDE_INT virtual_value = thunk.virtual_value;
1428 tree virtual_offset = NULL;
1429 tree alias = callees->callee->decl;
1430 tree thunk_fndecl = decl;
1431 tree a;
1434 if (!force_gimple_thunk && this_adjusting
1435 && targetm.asm_out.can_output_mi_thunk (thunk_fndecl, fixed_offset,
1436 virtual_value, alias))
1438 const char *fnname;
1439 tree fn_block;
1440 tree restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1442 if (!output_asm_thunks)
1444 analyzed = true;
1445 return false;
1448 if (in_lto_p)
1449 get_body ();
1450 a = DECL_ARGUMENTS (thunk_fndecl);
1452 current_function_decl = thunk_fndecl;
1454 /* Ensure thunks are emitted in their correct sections. */
1455 resolve_unique_section (thunk_fndecl, 0, flag_function_sections);
1457 DECL_RESULT (thunk_fndecl)
1458 = build_decl (DECL_SOURCE_LOCATION (thunk_fndecl),
1459 RESULT_DECL, 0, restype);
1460 DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
1461 fnname = IDENTIFIER_POINTER (DECL_ASSEMBLER_NAME (thunk_fndecl));
1463 /* The back end expects DECL_INITIAL to contain a BLOCK, so we
1464 create one. */
1465 fn_block = make_node (BLOCK);
1466 BLOCK_VARS (fn_block) = a;
1467 DECL_INITIAL (thunk_fndecl) = fn_block;
1468 init_function_start (thunk_fndecl);
1469 cfun->is_thunk = 1;
1470 insn_locations_init ();
1471 set_curr_insn_location (DECL_SOURCE_LOCATION (thunk_fndecl));
1472 prologue_location = curr_insn_location ();
1473 assemble_start_function (thunk_fndecl, fnname);
1475 targetm.asm_out.output_mi_thunk (asm_out_file, thunk_fndecl,
1476 fixed_offset, virtual_value, alias);
1478 assemble_end_function (thunk_fndecl, fnname);
1479 insn_locations_finalize ();
1480 init_insn_lengths ();
1481 free_after_compilation (cfun);
1482 set_cfun (NULL);
1483 TREE_ASM_WRITTEN (thunk_fndecl) = 1;
1484 thunk.thunk_p = false;
1485 analyzed = false;
1487 else
1489 tree restype;
1490 basic_block bb, then_bb, else_bb, return_bb;
1491 gimple_stmt_iterator bsi;
1492 int nargs = 0;
1493 tree arg;
1494 int i;
1495 tree resdecl;
1496 tree restmp = NULL;
1498 gcall *call;
1499 greturn *ret;
1501 if (in_lto_p)
1502 get_body ();
1503 a = DECL_ARGUMENTS (thunk_fndecl);
1505 current_function_decl = thunk_fndecl;
1507 /* Ensure thunks are emitted in their correct sections. */
1508 resolve_unique_section (thunk_fndecl, 0, flag_function_sections);
1510 DECL_IGNORED_P (thunk_fndecl) = 1;
1511 bitmap_obstack_initialize (NULL);
1513 if (thunk.virtual_offset_p)
1514 virtual_offset = size_int (virtual_value);
1516 /* Build the return declaration for the function. */
1517 restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1518 if (DECL_RESULT (thunk_fndecl) == NULL_TREE)
1520 resdecl = build_decl (input_location, RESULT_DECL, 0, restype);
1521 DECL_ARTIFICIAL (resdecl) = 1;
1522 DECL_IGNORED_P (resdecl) = 1;
1523 DECL_RESULT (thunk_fndecl) = resdecl;
1524 DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
1526 else
1527 resdecl = DECL_RESULT (thunk_fndecl);
1529 bb = then_bb = else_bb = return_bb = init_lowered_empty_function (thunk_fndecl, true);
1531 bsi = gsi_start_bb (bb);
1533 /* Build call to the function being thunked. */
1534 if (!VOID_TYPE_P (restype))
1536 if (DECL_BY_REFERENCE (resdecl))
1537 restmp = gimple_fold_indirect_ref (resdecl);
1538 else if (!is_gimple_reg_type (restype))
1540 restmp = resdecl;
1541 add_local_decl (cfun, restmp);
1542 BLOCK_VARS (DECL_INITIAL (current_function_decl)) = restmp;
1544 else
1545 restmp = create_tmp_reg (restype, "retval");
1548 for (arg = a; arg; arg = DECL_CHAIN (arg))
1549 nargs++;
1550 auto_vec<tree> vargs (nargs);
1551 if (this_adjusting)
1552 vargs.quick_push (thunk_adjust (&bsi, a, 1, fixed_offset,
1553 virtual_offset));
1554 else if (nargs)
1555 vargs.quick_push (a);
1557 if (nargs)
1558 for (i = 1, arg = DECL_CHAIN (a); i < nargs; i++, arg = DECL_CHAIN (arg))
1560 tree tmp = arg;
1561 if (!is_gimple_val (arg))
1563 tmp = create_tmp_reg (TYPE_MAIN_VARIANT
1564 (TREE_TYPE (arg)), "arg");
1565 gimple stmt = gimple_build_assign (tmp, arg);
1566 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1568 vargs.quick_push (tmp);
1570 call = gimple_build_call_vec (build_fold_addr_expr_loc (0, alias), vargs);
1571 callees->call_stmt = call;
1572 gimple_call_set_from_thunk (call, true);
1573 if (restmp)
1575 gimple_call_set_lhs (call, restmp);
1576 gcc_assert (useless_type_conversion_p (TREE_TYPE (restmp),
1577 TREE_TYPE (TREE_TYPE (alias))));
1579 gsi_insert_after (&bsi, call, GSI_NEW_STMT);
1580 if (!(gimple_call_flags (call) & ECF_NORETURN))
1582 if (restmp && !this_adjusting
1583 && (fixed_offset || virtual_offset))
1585 tree true_label = NULL_TREE;
1587 if (TREE_CODE (TREE_TYPE (restmp)) == POINTER_TYPE)
1589 gimple stmt;
1590 /* If the return type is a pointer, we need to
1591 protect against NULL. We know there will be an
1592 adjustment, because that's why we're emitting a
1593 thunk. */
1594 then_bb = create_basic_block (NULL, (void *) 0, bb);
1595 return_bb = create_basic_block (NULL, (void *) 0, then_bb);
1596 else_bb = create_basic_block (NULL, (void *) 0, else_bb);
1597 add_bb_to_loop (then_bb, bb->loop_father);
1598 add_bb_to_loop (return_bb, bb->loop_father);
1599 add_bb_to_loop (else_bb, bb->loop_father);
1600 remove_edge (single_succ_edge (bb));
1601 true_label = gimple_block_label (then_bb);
1602 stmt = gimple_build_cond (NE_EXPR, restmp,
1603 build_zero_cst (TREE_TYPE (restmp)),
1604 NULL_TREE, NULL_TREE);
1605 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1606 make_edge (bb, then_bb, EDGE_TRUE_VALUE);
1607 make_edge (bb, else_bb, EDGE_FALSE_VALUE);
1608 make_edge (return_bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
1609 make_edge (then_bb, return_bb, EDGE_FALLTHRU);
1610 make_edge (else_bb, return_bb, EDGE_FALLTHRU);
1611 bsi = gsi_last_bb (then_bb);
1614 restmp = thunk_adjust (&bsi, restmp, /*this_adjusting=*/0,
1615 fixed_offset, virtual_offset);
1616 if (true_label)
1618 gimple stmt;
1619 bsi = gsi_last_bb (else_bb);
1620 stmt = gimple_build_assign (restmp,
1621 build_zero_cst (TREE_TYPE (restmp)));
1622 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1623 bsi = gsi_last_bb (return_bb);
1626 else
1627 gimple_call_set_tail (call, true);
1629 /* Build return value. */
1630 ret = gimple_build_return (restmp);
1631 gsi_insert_after (&bsi, ret, GSI_NEW_STMT);
1633 else
1635 gimple_call_set_tail (call, true);
1636 remove_edge (single_succ_edge (bb));
1639 cfun->gimple_df->in_ssa_p = true;
1640 /* FIXME: C++ FE should stop setting TREE_ASM_WRITTEN on thunks. */
1641 TREE_ASM_WRITTEN (thunk_fndecl) = false;
1642 delete_unreachable_blocks ();
1643 update_ssa (TODO_update_ssa);
1644 #ifdef ENABLE_CHECKING
1645 verify_flow_info ();
1646 #endif
1647 free_dominance_info (CDI_DOMINATORS);
1649 /* Since we want to emit the thunk, we explicitly mark its name as
1650 referenced. */
1651 thunk.thunk_p = false;
1652 lowered = true;
1653 bitmap_obstack_release (NULL);
1655 current_function_decl = NULL;
1656 set_cfun (NULL);
1657 return true;
1660 /* Assemble thunks and aliases associated to node. */
1662 void
1663 cgraph_node::assemble_thunks_and_aliases (void)
1665 cgraph_edge *e;
1666 ipa_ref *ref;
1668 for (e = callers; e;)
1669 if (e->caller->thunk.thunk_p)
1671 cgraph_node *thunk = e->caller;
1673 e = e->next_caller;
1674 thunk->expand_thunk (true, false);
1675 thunk->assemble_thunks_and_aliases ();
1677 else
1678 e = e->next_caller;
1680 FOR_EACH_ALIAS (this, ref)
1682 cgraph_node *alias = dyn_cast <cgraph_node *> (ref->referring);
1683 bool saved_written = TREE_ASM_WRITTEN (decl);
1685 /* Force assemble_alias to really output the alias this time instead
1686 of buffering it in same alias pairs. */
1687 TREE_ASM_WRITTEN (decl) = 1;
1688 do_assemble_alias (alias->decl,
1689 DECL_ASSEMBLER_NAME (decl));
1690 alias->assemble_thunks_and_aliases ();
1691 TREE_ASM_WRITTEN (decl) = saved_written;
1695 /* Expand function specified by node. */
1697 void
1698 cgraph_node::expand (void)
1700 location_t saved_loc;
1702 /* We ought to not compile any inline clones. */
1703 gcc_assert (!global.inlined_to);
1705 announce_function (decl);
1706 process = 0;
1707 gcc_assert (lowered);
1708 get_body ();
1710 /* Generate RTL for the body of DECL. */
1712 timevar_push (TV_REST_OF_COMPILATION);
1714 gcc_assert (symtab->global_info_ready);
1716 /* Initialize the default bitmap obstack. */
1717 bitmap_obstack_initialize (NULL);
1719 /* Initialize the RTL code for the function. */
1720 current_function_decl = decl;
1721 saved_loc = input_location;
1722 input_location = DECL_SOURCE_LOCATION (decl);
1723 init_function_start (decl);
1725 gimple_register_cfg_hooks ();
1727 bitmap_obstack_initialize (&reg_obstack); /* FIXME, only at RTL generation*/
1729 execute_all_ipa_transforms ();
1731 /* Perform all tree transforms and optimizations. */
1733 /* Signal the start of passes. */
1734 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_START, NULL);
1736 execute_pass_list (cfun, g->get_passes ()->all_passes);
1738 /* Signal the end of passes. */
1739 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_END, NULL);
1741 bitmap_obstack_release (&reg_obstack);
1743 /* Release the default bitmap obstack. */
1744 bitmap_obstack_release (NULL);
1746 /* If requested, warn about function definitions where the function will
1747 return a value (usually of some struct or union type) which itself will
1748 take up a lot of stack space. */
1749 if (warn_larger_than && !DECL_EXTERNAL (decl) && TREE_TYPE (decl))
1751 tree ret_type = TREE_TYPE (TREE_TYPE (decl));
1753 if (ret_type && TYPE_SIZE_UNIT (ret_type)
1754 && TREE_CODE (TYPE_SIZE_UNIT (ret_type)) == INTEGER_CST
1755 && 0 < compare_tree_int (TYPE_SIZE_UNIT (ret_type),
1756 larger_than_size))
1758 unsigned int size_as_int
1759 = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (ret_type));
1761 if (compare_tree_int (TYPE_SIZE_UNIT (ret_type), size_as_int) == 0)
1762 warning (OPT_Wlarger_than_, "size of return value of %q+D is %u bytes",
1763 decl, size_as_int);
1764 else
1765 warning (OPT_Wlarger_than_, "size of return value of %q+D is larger than %wd bytes",
1766 decl, larger_than_size);
1770 gimple_set_body (decl, NULL);
1771 if (DECL_STRUCT_FUNCTION (decl) == 0
1772 && !cgraph_node::get (decl)->origin)
1774 /* Stop pointing to the local nodes about to be freed.
1775 But DECL_INITIAL must remain nonzero so we know this
1776 was an actual function definition.
1777 For a nested function, this is done in c_pop_function_context.
1778 If rest_of_compilation set this to 0, leave it 0. */
1779 if (DECL_INITIAL (decl) != 0)
1780 DECL_INITIAL (decl) = error_mark_node;
1783 input_location = saved_loc;
1785 ggc_collect ();
1786 timevar_pop (TV_REST_OF_COMPILATION);
1788 /* Make sure that BE didn't give up on compiling. */
1789 gcc_assert (TREE_ASM_WRITTEN (decl));
1790 set_cfun (NULL);
1791 current_function_decl = NULL;
/* It would make a lot more sense to output thunks before the function body to get
more forward and fewer backward jumps.  This however would require solving a
problem with comdats.  See PR48668.  Also aliases must come after the function
itself to make one-pass assemblers, like the one on AIX, happy.  See PR 50689.
FIXME: Perhaps thunks should be moved before the function IFF they are not in
comdat groups.  */
1799 assemble_thunks_and_aliases ();
1800 release_body ();
1801 /* Eliminate all call edges. This is important so the GIMPLE_CALL no longer
1802 points to the dead function body. */
1803 remove_callees ();
1804 remove_all_references ();
/* Node comparer that is responsible for the order that corresponds
to the time when a function was first run.  */
1810 static int
1811 node_cmp (const void *pa, const void *pb)
1813 const cgraph_node *a = *(const cgraph_node * const *) pa;
1814 const cgraph_node *b = *(const cgraph_node * const *) pb;
/* Functions with time profile must be before those without profile.  */
1817 if (!a->tp_first_run || !b->tp_first_run)
1818 return a->tp_first_run - b->tp_first_run;
1820 return a->tp_first_run != b->tp_first_run
1821 ? b->tp_first_run - a->tp_first_run
1822 : b->order - a->order;
1825 /* Expand all functions that must be output.
1827 Attempt to topologically sort the nodes so function is output when
1828 all called functions are already assembled to allow data to be
1829 propagated across the callgraph. Use a stack to get smaller distance
1830 between a function and its callees (later we may choose to use a more
1831 sophisticated algorithm for function reordering; we will likely want
1832 to use subsections to make the output functions appear in top-down
1833 order). */
1835 static void
1836 expand_all_functions (void)
1838 cgraph_node *node;
1839 cgraph_node **order = XCNEWVEC (cgraph_node *,
1840 symtab->cgraph_count);
1841 unsigned int expanded_func_count = 0, profiled_func_count = 0;
1842 int order_pos, new_order_pos = 0;
1843 int i;
1845 order_pos = ipa_reverse_postorder (order);
1846 gcc_assert (order_pos == symtab->cgraph_count);
/* The garbage collector may remove inline clones we eliminate during
optimization, so we must be sure not to reference them.  */
1850 for (i = 0; i < order_pos; i++)
1851 if (order[i]->process)
1852 order[new_order_pos++] = order[i];
1854 if (flag_profile_reorder_functions)
1855 qsort (order, new_order_pos, sizeof (cgraph_node *), node_cmp);
1857 for (i = new_order_pos - 1; i >= 0; i--)
1859 node = order[i];
1861 if (node->process)
1863 expanded_func_count++;
1864 if (node->tp_first_run)
1865 profiled_func_count++;
1867 if (symtab->dump_file)
1868 fprintf (symtab->dump_file,
1869 "Time profile order in expand_all_functions:%s:%d\n",
1870 node->asm_name (), node->tp_first_run);
1871 node->process = 0;
1872 node->expand ();
1876 if (dump_file)
1877 fprintf (dump_file, "Expanded functions with time profile (%s):%u/%u\n",
1878 main_input_filename, profiled_func_count, expanded_func_count);
1880 if (symtab->dump_file && flag_profile_reorder_functions)
1881 fprintf (symtab->dump_file, "Expanded functions with time profile:%u/%u\n",
1882 profiled_func_count, expanded_func_count);
1884 symtab->process_new_functions ();
1885 free_gimplify_stack ();
1887 free (order);
1890 /* This is used to sort the node types by the cgraph order number. */
1892 enum cgraph_order_sort_kind
1894 ORDER_UNDEFINED = 0,
1895 ORDER_FUNCTION,
1896 ORDER_VAR,
1897 ORDER_ASM
1900 struct cgraph_order_sort
1902 enum cgraph_order_sort_kind kind;
1903 union
1905 cgraph_node *f;
1906 varpool_node *v;
1907 asm_node *a;
1908 } u;
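/* A symbol's order number is used as an index into an array of these
   structures; KIND records which member of the union U is valid for
   that slot.  */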
1911 /* Output all functions, variables, and asm statements according to
1912 their order fields, which record the order in which they appeared
1913 in the file. This implements -fno-toplevel-reorder. In this mode we
1914 may output functions and variables which don't really need to be
1915 output.
1916 When NO_REORDER is true, only do this for symbols marked no-reorder. */
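/* For example, with -fno-toplevel-reorder a unit containing (in this
   source order)

     asm ("# marker");
     static int counter;
     static void bump (void) { counter++; }

   is emitted as the asm string first, then COUNTER, then BUMP, following
   the order numbers assigned when the symbols were created.  */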
1918 static void
1919 output_in_order (bool no_reorder)
1921 int max;
1922 cgraph_order_sort *nodes;
1923 int i;
1924 cgraph_node *pf;
1925 varpool_node *pv;
1926 asm_node *pa;
1927 max = symtab->order;
1928 nodes = XCNEWVEC (cgraph_order_sort, max);
1930 FOR_EACH_DEFINED_FUNCTION (pf)
1932 if (pf->process && !pf->thunk.thunk_p && !pf->alias)
1934 if (no_reorder && !pf->no_reorder)
1935 continue;
1936 i = pf->order;
1937 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1938 nodes[i].kind = ORDER_FUNCTION;
1939 nodes[i].u.f = pf;
1943 FOR_EACH_DEFINED_VARIABLE (pv)
1944 if (!DECL_EXTERNAL (pv->decl))
1946 if (no_reorder && !pv->no_reorder)
1947 continue;
1948 i = pv->order;
1949 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1950 nodes[i].kind = ORDER_VAR;
1951 nodes[i].u.v = pv;
1954 for (pa = symtab->first_asm_symbol (); pa; pa = pa->next)
1956 i = pa->order;
1957 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1958 nodes[i].kind = ORDER_ASM;
1959 nodes[i].u.a = pa;
1962 /* In toplevel reorder mode we output all statics; mark them as needed. */
1964 for (i = 0; i < max; ++i)
1965 if (nodes[i].kind == ORDER_VAR)
1966 nodes[i].u.v->finalize_named_section_flags ();
1968 for (i = 0; i < max; ++i)
1970 switch (nodes[i].kind)
1972 case ORDER_FUNCTION:
1973 nodes[i].u.f->process = 0;
1974 nodes[i].u.f->expand ();
1975 break;
1977 case ORDER_VAR:
1978 nodes[i].u.v->assemble_decl ();
1979 break;
1981 case ORDER_ASM:
1982 assemble_asm (nodes[i].u.a->asm_str);
1983 break;
1985 case ORDER_UNDEFINED:
1986 break;
1988 default:
1989 gcc_unreachable ();
1993 symtab->clear_asm_symbols ();
1995 free (nodes);
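/* Run the inter-procedural pass lists: the small IPA passes (when not
   reading LTO), the analysis stage of the regular IPA passes, LTO
   summary streaming when requested, and finally the regular IPA passes
   themselves unless their execution is deferred to link time.  */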
1998 static void
1999 ipa_passes (void)
2001 gcc::pass_manager *passes = g->get_passes ();
2003 set_cfun (NULL);
2004 current_function_decl = NULL;
2005 gimple_register_cfg_hooks ();
2006 bitmap_obstack_initialize (NULL);
2008 invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_START, NULL);
2010 if (!in_lto_p)
2012 execute_ipa_pass_list (passes->all_small_ipa_passes);
2013 if (seen_error ())
2014 return;
2017 /* This extra symtab_remove_unreachable_nodes pass tends to catch some
2018 devirtualization and other changes where removal needs to iterate. */
2019 symtab->remove_unreachable_nodes (true, symtab->dump_file);
2021 /* If pass_all_early_optimizations was not scheduled, the state of
2022 the cgraph will not be properly updated. Update it now. */
2023 if (symtab->state < IPA_SSA)
2024 symtab->state = IPA_SSA;
2026 if (!in_lto_p)
2028 /* Generate coverage variables and constructors. */
2029 coverage_finish ();
2031 /* Process new functions added. */
2032 set_cfun (NULL);
2033 current_function_decl = NULL;
2034 symtab->process_new_functions ();
2036 execute_ipa_summary_passes
2037 ((ipa_opt_pass_d *) passes->all_regular_ipa_passes);
2040 /* Some targets need to handle LTO assembler output specially. */
2041 if (flag_generate_lto)
2042 targetm.asm_out.lto_start ();
2044 if (!in_lto_p)
2045 ipa_write_summaries ();
2047 if (flag_generate_lto)
2048 targetm.asm_out.lto_end ();
2050 if (!flag_ltrans && (in_lto_p || !flag_lto || flag_fat_lto_objects))
2051 execute_ipa_pass_list (passes->all_regular_ipa_passes);
2052 invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_END, NULL);
2054 bitmap_obstack_release (NULL);
2058 /* Return the symbol that DECL is an alias of, as named by its "alias" attribute. */
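/* For instance, for a decl declared as

     void foo (void) __attribute__ ((alias ("bar")));

   this returns the identifier for "bar", taken from the string argument
   of the attribute.  */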
2060 static tree
2061 get_alias_symbol (tree decl)
2063 tree alias = lookup_attribute ("alias", DECL_ATTRIBUTES (decl));
2064 return get_identifier (TREE_STRING_POINTER
2065 (TREE_VALUE (TREE_VALUE (alias))));
2069 /* Weakrefs may be associated with external decls and thus not output
2070 at expansion time. Emit all necessary aliases. */
2072 void
2073 symbol_table::output_weakrefs (void)
2075 symtab_node *node;
2076 FOR_EACH_SYMBOL (node)
2077 if (node->alias
2078 && !TREE_ASM_WRITTEN (node->decl)
2079 && node->weakref)
2081 tree target;
2083 /* Weakrefs are special in that they do not require a target definition
2084 in the current compilation unit, so it is a bit hard to work out what
2085 we want to alias.
2086 When the alias target is defined, we need to fetch it from the symtab
2087 reference; otherwise it is pointed to by alias_target. */
2088 if (node->alias_target)
2089 target = (DECL_P (node->alias_target)
2090 ? DECL_ASSEMBLER_NAME (node->alias_target)
2091 : node->alias_target);
2092 else if (node->analyzed)
2093 target = DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl);
2094 else
2096 gcc_unreachable ();
2097 target = get_alias_symbol (node->decl);
2099 do_assemble_alias (node->decl, target);
2103 /* Perform simple optimizations based on the callgraph. */
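/* The overall sequence below is: run the IPA passes, remove nodes that
   became unreachable, materialize clones and run the late IPA passes,
   then emit everything either in source order (output_in_order) or in
   the order chosen by expand_all_functions.  */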
2105 void
2106 symbol_table::compile (void)
2108 if (seen_error ())
2109 return;
2111 #ifdef ENABLE_CHECKING
2112 symtab_node::verify_symtab_nodes ();
2113 #endif
2115 timevar_push (TV_CGRAPHOPT);
2116 if (pre_ipa_mem_report)
2118 fprintf (stderr, "Memory consumption before IPA\n");
2119 dump_memory_report (false);
2121 if (!quiet_flag)
2122 fprintf (stderr, "Performing interprocedural optimizations\n");
2123 state = IPA;
2125 /* If LTO is enabled, initialize the streamer hooks needed by GIMPLE. */
2126 if (flag_lto)
2127 lto_streamer_hooks_init ();
2129 /* Don't run the IPA passes if there was any error or sorry messages. */
2130 if (!seen_error ())
2131 ipa_passes ();
2133 /* Do nothing else if any IPA pass found errors or if we are just streaming LTO. */
2134 if (seen_error ()
2135 || (!in_lto_p && flag_lto && !flag_fat_lto_objects))
2137 timevar_pop (TV_CGRAPHOPT);
2138 return;
2141 /* This pass removes bodies of extern inline functions we never inlined.
2142 Do this later so other IPA passes see what is really going on.
2143 FIXME: This should be run just after inlining by the pass manager. */
2144 remove_unreachable_nodes (false, dump_file);
2145 global_info_ready = true;
2146 if (dump_file)
2148 fprintf (dump_file, "Optimized ");
2149 symtab_node::dump_table (dump_file);
2151 if (post_ipa_mem_report)
2153 fprintf (stderr, "Memory consumption after IPA\n");
2154 dump_memory_report (false);
2156 timevar_pop (TV_CGRAPHOPT);
2158 /* Output everything. */
2159 (*debug_hooks->assembly_start) ();
2160 if (!quiet_flag)
2161 fprintf (stderr, "Assembling functions:\n");
2162 #ifdef ENABLE_CHECKING
2163 symtab_node::verify_symtab_nodes ();
2164 #endif
2166 materialize_all_clones ();
2167 bitmap_obstack_initialize (NULL);
2168 execute_ipa_pass_list (g->get_passes ()->all_late_ipa_passes);
2169 bitmap_obstack_release (NULL);
2170 mark_functions_to_output ();
2172 /* When weakref support is missing, we automatically translate all
2173 references to NODE to references to its ultimate alias target.
2174 The renaming mechanism uses the flag IDENTIFIER_TRANSPARENT_ALIAS and
2175 TREE_CHAIN.
2177 Set up this mapping before we output any assembler, but only once we are
2178 sure that all symbol renaming is done.
2180 FIXME: All this ugliness can go away if we just do renaming at the gimple
2181 level by physically rewriting the IL. At the moment we can only redirect
2182 calls, so we need infrastructure for renaming references as well. */
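/* For example, given

     static int foo (void) __attribute__ ((weakref ("bar")));

   every use of FOO's assembler name is emitted as a reference to BAR
   once the transparent-alias chain below has been set up.  */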
2183 #ifndef ASM_OUTPUT_WEAKREF
2184 symtab_node *node;
2186 FOR_EACH_SYMBOL (node)
2187 if (node->alias
2188 && lookup_attribute ("weakref", DECL_ATTRIBUTES (node->decl)))
2190 IDENTIFIER_TRANSPARENT_ALIAS
2191 (DECL_ASSEMBLER_NAME (node->decl)) = 1;
2192 TREE_CHAIN (DECL_ASSEMBLER_NAME (node->decl))
2193 = (node->alias_target ? node->alias_target
2194 : DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl));
2196 #endif
2198 state = EXPANSION;
2200 if (!flag_toplevel_reorder)
2201 output_in_order (false);
2202 else
2204 /* Output asm statements and symbols marked no-reorder first. The
2205 process flag is cleared for these nodes, so we skip them later. */
2206 output_in_order (true);
2207 expand_all_functions ();
2208 output_variables ();
2211 process_new_functions ();
2212 state = FINISHED;
2213 output_weakrefs ();
2215 if (dump_file)
2217 fprintf (dump_file, "\nFinal ");
2218 symtab_node::dump_table (dump_file);
2220 #ifdef ENABLE_CHECKING
2221 symtab_node::verify_symtab_nodes ();
2222 /* Double check that all inline clones are gone and that all
2223 function bodies have been released from memory. */
2224 if (!seen_error ())
2226 cgraph_node *node;
2227 bool error_found = false;
2229 FOR_EACH_DEFINED_FUNCTION (node)
2230 if (node->global.inlined_to
2231 || gimple_has_body_p (node->decl))
2233 error_found = true;
2234 node->debug ();
2236 if (error_found)
2237 internal_error ("nodes with unreleased memory found");
2239 #endif
2243 /* Analyze the whole compilation unit once it is parsed completely. */
2245 void
2246 symbol_table::finalize_compilation_unit (void)
2248 timevar_push (TV_CGRAPH);
2250 /* If we're here there's no current function anymore. Some frontends
2251 are lazy in clearing these. */
2252 current_function_decl = NULL;
2253 set_cfun (NULL);
2255 /* Do not skip analyzing the functions if there were errors; otherwise
2256 we would miss diagnostics for the following functions. */
2258 /* Emit size functions we didn't inline. */
2259 finalize_size_functions ();
2261 /* Mark alias targets necessary and emit diagnostics. */
2262 handle_alias_pairs ();
2264 if (!quiet_flag)
2266 fprintf (stderr, "\nAnalyzing compilation unit\n");
2267 fflush (stderr);
2270 if (flag_dump_passes)
2271 dump_passes ();
2273 /* Gimplify and lower all functions, compute reachability and
2274 remove unreachable nodes. */
2275 analyze_functions ();
2277 /* Mark alias targets necessary and emit diagnostics. */
2278 handle_alias_pairs ();
2280 /* Gimplify and lower thunks. */
2281 analyze_functions ();
2283 /* Finally drive the pass manager. */
2284 compile ();
2286 timevar_pop (TV_CGRAPH);
2289 /* Create a wrapper from this cgraph_node to the TARGET node. A thunk is
2290 used for this kind of wrapper method. */
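/* The node's own body is released and replaced by a thunk that simply
   forwards its arguments to TARGET; the call edge created below is
   marked call_stmt_cannot_inline_p so the forwarding call is not
   inlined back into the wrapper.  */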
2292 void
2293 cgraph_node::create_wrapper (cgraph_node *target)
2295 /* Preserve DECL_RESULT so we get the right by-reference flag. */
2296 tree decl_result = DECL_RESULT (decl);
2298 /* Remove the function's body but keep the arguments to be reused
2299 for the thunk. */
2300 release_body (true);
2301 reset ();
2303 DECL_RESULT (decl) = decl_result;
2304 DECL_INITIAL (decl) = NULL;
2305 allocate_struct_function (decl, false);
2306 set_cfun (NULL);
2308 /* Turn the alias into a thunk and expand it into GIMPLE representation. */
2309 definition = true;
2310 thunk.thunk_p = true;
2311 thunk.this_adjusting = false;
2313 cgraph_edge *e = create_edge (target, NULL, 0, CGRAPH_FREQ_BASE);
2315 expand_thunk (false, true);
2316 e->call_stmt_cannot_inline_p = true;
2318 /* Inline summary set-up. */
2319 analyze ();
2320 inline_analyze_function (this);
2323 #include "gt-cgraphunit.h"