1 /* Driver of optimization process
2 Copyright (C) 2003-2014 Free Software Foundation, Inc.
3 Contributed by Jan Hubicka
5 This file is part of GCC.
7 GCC is free software; you can redistribute it and/or modify it under
8 the terms of the GNU General Public License as published by the Free
9 Software Foundation; either version 3, or (at your option) any later
10 version.
12 GCC is distributed in the hope that it will be useful, but WITHOUT ANY
13 WARRANTY; without even the implied warranty of MERCHANTABILITY or
14 FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
15 for more details.
17 You should have received a copy of the GNU General Public License
18 along with GCC; see the file COPYING3. If not see
19 <http://www.gnu.org/licenses/>. */
21 /* This module implements main driver of compilation process.
23 The main purpose of this file is to act as an interface between
24 tree based frontends and the backend.
26 The front-end is supposed to use the following functionality:
28 - cgraph_finalize_function
30 This function is called once the front-end has parsed the whole body of a
31 function and it is certain that neither the function body nor the declaration will change.
33 (There is one exception needed for implementing GCC extern inline
34 functions.)
36 - varpool_finalize_decl
38 This function has the same behavior as the above but is used for static
39 variables.
41 - add_asm_node
43 Inserts a new toplevel ASM statement.
45 - finalize_compilation_unit
47 This function is called once the (source level) compilation unit is finalized
48 and will no longer change.
50 The symbol table is constructed starting from the trivially needed
51 symbols finalized by the frontend. Functions are lowered into
52 GIMPLE representation and callgraph/reference lists are constructed.
53 Those are used to discover other necessary functions and variables.
55 At the end the bodies of unreachable functions are removed.
57 The function can be called multiple times when multiple source level
58 compilation units are combined.
60 - compile
62 This passes control to the back-end. Optimizations are performed and
63 final assembler is generated. This is done in the following way. Note
64 that with link time optimization the process is split into three
65 stages (compile time, linktime analysis and parallel linktime as
66 indicated below).
68 Compile time:
70 1) Inter-procedural optimization.
71 (ipa_passes)
73 This part is further split into:
75 a) early optimizations. These are local passes executed in
76 the topological order on the callgraph.
78 The purpose of early optimizations is to optimize away simple
79 things that may otherwise confuse IP analysis. Very simple
80 propagation across the callgraph is done, i.e. to discover
81 functions without side effects, and simple inlining is performed.
83 b) early small interprocedural passes.
85 Those are interprocedural passes executed only at compilation
86 time. These include, for example, transactional memory lowering,
87 unreachable code removal and other simple transformations.
89 c) IP analysis stage. All interprocedural passes do their
90 analysis.
92 Interprocedural passes differ from small interprocedural
93 passes by their ability to operate across the whole program
94 at linktime. Their analysis stage is performed early to
95 both reduce linking times and linktime memory usage by
96 not having to represent the whole program in memory.
98 d) LTO streaming. When doing LTO, everything important gets
99 streamed into the object file.
101 Compile time and/or linktime analysis stage (WPA):
103 At linktime the units get streamed back and the symbol table is
104 merged. Function bodies are not streamed in and are not
105 available.
106 e) IP propagation stage. All IP passes execute their
107 IP propagation. This is done based on the earlier analysis
108 without having function bodies at hand.
109 f) Ltrans streaming. When doing WHOPR LTO, the program
110 is partitioned and streamed into multiple object files.
112 Compile time and/or parallel linktime stage (ltrans)
114 Each of the object files is streamed back and compiled
115 separately. Now the function bodies become available
116 again.
118 2) Virtual clone materialization
119 (cgraph_materialize_clone)
121 IP passes can produce copies of existing functions (such
122 as versioned clones or inline clones) without actually
123 manipulating their bodies by creating virtual clones in
124 the callgraph. At this time the virtual clones are
125 turned into real functions.
126 3) IP transformation
128 All IP passes transform function bodies based on earlier
129 decisions of the IP propagation.
131 4) late small IP passes
133 Simple IP passes working within a single program partition.
135 5) Expansion
136 (expand_all_functions)
138 At this stage functions that need to be output into
139 assembler are identified and compiled in topological order.
140 6) Output of variables and aliases
141 Now it is known which variable references were not optimized
142 out and thus all variables are output to the file.
144 Note that with -fno-toplevel-reorder passes 5 and 6
145 are combined together in cgraph_output_in_order.
147 Finally there are functions to manipulate the callgraph from
148 the backend.
149 - cgraph_add_new_function is used to add backend-produced
150 functions introduced after the unit is finalized.
151 The functions are enqueued for later processing and inserted
152 into the callgraph with cgraph_process_new_functions.
154 - cgraph_function_versioning
156 produces a copy of a function into a new one (a version)
157 and applies simple transformations. */
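/* Illustrative sketch, not part of this file: how a front end might drive
   the interface documented in the comment above.  The parsing helpers
   (have_more_input, parse_next_function, parse_next_global_variable) are
   hypothetical; the cgraph/varpool entry points are the ones named above.

     static void
     hypothetical_frontend_compile_unit (void)
     {
       while (have_more_input ())
         {
           tree fndecl = parse_next_function ();
           if (fndecl)
             // Body and declaration are final; hand the function over.
             cgraph_finalize_function (fndecl, false);

           tree var = parse_next_global_variable ();
           if (var)
             varpool_finalize_decl (var);
         }

       // After the whole translation unit has been seen, let the middle
       // end construct the symbol table, remove unreachable code and
       // pass control to the backend.
       finalize_compilation_unit ();
     }
*/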
160 #include "config.h"
161 #include "system.h"
162 #include "coretypes.h"
163 #include "tm.h"
164 #include "tree.h"
165 #include "varasm.h"
166 #include "stor-layout.h"
167 #include "stringpool.h"
168 #include "output.h"
169 #include "rtl.h"
170 #include "basic-block.h"
171 #include "tree-ssa-alias.h"
172 #include "internal-fn.h"
173 #include "gimple-fold.h"
174 #include "gimple-expr.h"
175 #include "is-a.h"
176 #include "gimple.h"
177 #include "gimplify.h"
178 #include "gimple-iterator.h"
179 #include "gimplify-me.h"
180 #include "gimple-ssa.h"
181 #include "tree-cfg.h"
182 #include "tree-into-ssa.h"
183 #include "tree-ssa.h"
184 #include "tree-inline.h"
185 #include "langhooks.h"
186 #include "toplev.h"
187 #include "flags.h"
188 #include "debug.h"
189 #include "target.h"
190 #include "diagnostic.h"
191 #include "params.h"
192 #include "fibheap.h"
193 #include "intl.h"
194 #include "function.h"
195 #include "ipa-prop.h"
196 #include "tree-iterator.h"
197 #include "tree-pass.h"
198 #include "tree-dump.h"
199 #include "gimple-pretty-print.h"
200 #include "output.h"
201 #include "coverage.h"
202 #include "plugin.h"
203 #include "ipa-inline.h"
204 #include "ipa-utils.h"
205 #include "lto-streamer.h"
206 #include "except.h"
207 #include "cfgloop.h"
208 #include "regset.h" /* FIXME: For reg_obstack. */
209 #include "context.h"
210 #include "pass_manager.h"
211 #include "tree-nested.h"
212 #include "gimplify.h"
213 #include "dbgcnt.h"
215 /* Queue of cgraph nodes scheduled to be added into cgraph. This is a
216 secondary queue used during optimization to accommodate passes that
217 may generate new functions that need to be optimized and expanded. */
218 cgraph_node_set cgraph_new_nodes;
220 static void expand_all_functions (void);
221 static void mark_functions_to_output (void);
222 static void expand_function (struct cgraph_node *);
223 static void handle_alias_pairs (void);
225 FILE *cgraph_dump_file;
227 /* Linked list of cgraph asm nodes. */
228 struct asm_node *asm_nodes;
230 /* Last node in cgraph_asm_nodes. */
231 static GTY(()) struct asm_node *asm_last_node;
233 /* Used for vtable lookup in thunk adjusting. */
234 static GTY (()) tree vtable_entry_type;
236 /* Determine if symbol DECL is needed. That is, visible to something
237 either outside this translation unit or something magic in the system
238 configury. */
239 bool
240 decide_is_symbol_needed (symtab_node *node)
242 tree decl = node->decl;
244 /* Double check that no one output the function into assembly file
245 early. */
246 gcc_checking_assert (!DECL_ASSEMBLER_NAME_SET_P (decl)
247 || !TREE_SYMBOL_REFERENCED (DECL_ASSEMBLER_NAME (decl)));
249 if (!node->definition)
250 return false;
252 if (DECL_EXTERNAL (decl))
253 return false;
255 /* If the user told us it is used, then it must be so. */
256 if (node->force_output)
257 return true;
259 /* ABI forced symbols are needed when they are external. */
260 if (node->forced_by_abi && TREE_PUBLIC (decl))
261 return true;
263 /* Keep constructors, destructors and virtual functions. */
264 if (TREE_CODE (decl) == FUNCTION_DECL
265 && (DECL_STATIC_CONSTRUCTOR (decl) || DECL_STATIC_DESTRUCTOR (decl)))
266 return true;
268 /* Externally visible variables must be output. The exception is
269 COMDAT variables that must be output only when they are needed. */
270 if (TREE_PUBLIC (decl) && !DECL_COMDAT (decl))
271 return true;
273 return false;
276 /* Head and terminator of the queue of nodes to be processed while building
277 callgraph. */
279 static symtab_node symtab_terminator;
280 static symtab_node *queued_nodes = &symtab_terminator;
282 /* Add NODE to queue starting at QUEUED_NODES.
283 The queue is linked via AUX pointers and terminated by &symtab_terminator. */
285 static void
286 enqueue_node (symtab_node *node)
288 if (node->aux)
289 return;
290 gcc_checking_assert (queued_nodes);
291 node->aux = queued_nodes;
292 queued_nodes = node;
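/* Illustrative sketch, not part of this file: the intrusive work-list idiom
   used by enqueue_node above, shown on a hypothetical node type.  A sentinel
   object serves as the list terminator and a non-NULL aux field marks a node
   as already queued, so every node is enqueued at most once.

     struct work_item { void *aux; };

     static work_item sentinel;
     static work_item *worklist = &sentinel;

     static void
     push_once (work_item *n)
     {
       if (n->aux)
         return;                          // already queued
       n->aux = worklist;
       worklist = n;
     }

     static work_item *
     pop (void)
     {
       if (worklist == &sentinel)
         return NULL;                     // queue is empty
       work_item *n = worklist;
       worklist = (work_item *) n->aux;
       return n;
     }
*/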
295 /* Process CGRAPH_NEW_FUNCTIONS and perform actions necessary to add these
296 functions into the callgraph so that they look like ordinary reachable
297 functions inserted into the callgraph already at construction time. */
299 void
300 cgraph_process_new_functions (void)
302 tree fndecl;
303 struct cgraph_node *node;
304 cgraph_node_set_iterator csi;
306 if (!cgraph_new_nodes)
307 return;
308 handle_alias_pairs ();
309 /* Note that this queue may grow as it is being processed, as the new
310 functions may generate new ones. */
311 for (csi = csi_start (cgraph_new_nodes); !csi_end_p (csi); csi_next (&csi))
313 node = csi_node (csi);
314 fndecl = node->decl;
315 switch (cgraph_state)
317 case CGRAPH_STATE_CONSTRUCTION:
318 /* At construction time we just need to finalize the function and move
319 it into the reachable functions list. */
321 cgraph_finalize_function (fndecl, false);
322 node->call_function_insertion_hooks ();
323 enqueue_node (node);
324 break;
326 case CGRAPH_STATE_IPA:
327 case CGRAPH_STATE_IPA_SSA:
328 /* When IPA optimization already started, do all essential
329 transformations that have already been performed on the whole
330 cgraph but not on this function. */
332 gimple_register_cfg_hooks ();
333 if (!node->analyzed)
334 node->analyze ();
335 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
336 if (cgraph_state == CGRAPH_STATE_IPA_SSA
337 && !gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
338 g->get_passes ()->execute_early_local_passes ();
339 else if (inline_summary_vec != NULL)
340 compute_inline_parameters (node, true);
341 free_dominance_info (CDI_POST_DOMINATORS);
342 free_dominance_info (CDI_DOMINATORS);
343 pop_cfun ();
344 node->call_function_insertion_hooks ();
345 break;
347 case CGRAPH_STATE_EXPANSION:
348 /* Functions created during expansion shall be compiled
349 directly. */
350 node->process = 0;
351 node->call_function_insertion_hooks ();
352 expand_function (node);
353 break;
355 default:
356 gcc_unreachable ();
357 break;
360 free_cgraph_node_set (cgraph_new_nodes);
361 cgraph_new_nodes = NULL;
364 /* As a GCC extension we allow redefinition of the function. The
365 semantics when the two bodies differ are not well defined.
366 We replace the old body with the new body, so in unit-at-a-time mode
367 we always use the new body, while in normal mode we may end up with
368 the old body inlined into some functions and the new body expanded and
369 inlined in others.
371 ??? It may make more sense to use one body for inlining and the other
372 body for expanding the function but this is difficult to do. */
374 void
375 cgraph_node::reset (void)
377 /* If process is set, then we have already begun whole-unit analysis.
378 This is *not* testing for whether we've already emitted the function.
379 That case can be sort-of legitimately seen with real function redefinition
380 errors. I would argue that the front end should never present us with
381 such a case, but don't enforce that for now. */
382 gcc_assert (!process);
384 /* Reset our data structures so we can analyze the function again. */
385 memset (&local, 0, sizeof (local));
386 memset (&global, 0, sizeof (global));
387 memset (&rtl, 0, sizeof (rtl));
388 analyzed = false;
389 definition = false;
390 alias = false;
391 weakref = false;
392 cpp_implicit_alias = false;
394 remove_callees ();
395 remove_all_references ();
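/* Illustrative sketch, not part of this file: roughly the redefinition
   pattern (the GNU extern inline extension, e.g. under gnu89 inline
   semantics or the gnu_inline attribute) that the comment above
   cgraph_node::reset refers to.  The first body is an inline-only
   candidate; the second definition replaces it in the callgraph.

     extern inline int
     twice (int x)        // inline-only body, never emitted by itself
     {
       return x + x;
     }

     int
     twice (int x)        // out-of-line definition; the node is reset
     {                    // and re-finalized with this body
       return 2 * x;
     }
*/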
398 /* Return true when there are references to NODE. */
400 static bool
401 referred_to_p (symtab_node *node)
403 struct ipa_ref *ref = NULL;
405 /* See if there are any references at all. */
406 if (node->iterate_referring (0, ref))
407 return true;
408 /* For functions check also calls. */
409 cgraph_node *cn = dyn_cast <cgraph_node *> (node);
410 if (cn && cn->callers)
411 return true;
412 return false;
415 /* DECL has been parsed. Take it, queue it, compile it at the whim of the
416 logic in effect. If NO_COLLECT is true, then our caller cannot stand to have
417 the garbage collector run at the moment. We would need to either create
418 a new GC context, or just not compile right now. */
420 void
421 cgraph_finalize_function (tree decl, bool no_collect)
423 struct cgraph_node *node = cgraph_node::get_create (decl);
425 if (node->definition)
427 /* Nested functions should only be defined once. */
428 gcc_assert (!DECL_CONTEXT (decl)
429 || TREE_CODE (DECL_CONTEXT (decl)) != FUNCTION_DECL);
430 node->reset ();
431 node->local.redefined_extern_inline = true;
434 notice_global_symbol (decl);
435 node->definition = true;
436 node->lowered = DECL_STRUCT_FUNCTION (decl)->cfg != NULL;
438 /* With -fkeep-inline-functions we are keeping all inline functions except
439 for extern inline ones. */
440 if (flag_keep_inline_functions
441 && DECL_DECLARED_INLINE_P (decl)
442 && !DECL_EXTERNAL (decl)
443 && !DECL_DISREGARD_INLINE_LIMITS (decl))
444 node->force_output = 1;
446 /* When not optimizing, also output the static functions. (see
447 PR24561), but don't do so for always_inline functions, functions
448 declared inline and nested functions. These were optimized out
449 in the original implementation and it is unclear whether we want
450 to change the behavior here. */
451 if ((!optimize
452 && !node->cpp_implicit_alias
453 && !DECL_DISREGARD_INLINE_LIMITS (decl)
454 && !DECL_DECLARED_INLINE_P (decl)
455 && !(DECL_CONTEXT (decl)
456 && TREE_CODE (DECL_CONTEXT (decl)) == FUNCTION_DECL))
457 && !DECL_COMDAT (decl) && !DECL_EXTERNAL (decl))
458 node->force_output = 1;
460 /* If we've not yet emitted decl, tell the debug info about it. */
461 if (!TREE_ASM_WRITTEN (decl))
462 (*debug_hooks->deferred_inline_function) (decl);
464 /* Possibly warn about unused parameters. */
465 if (warn_unused_parameter)
466 do_warn_unused_parameter (decl);
468 if (!no_collect)
469 ggc_collect ();
471 if (cgraph_state == CGRAPH_STATE_CONSTRUCTION
472 && (decide_is_symbol_needed (node)
473 || referred_to_p (node)))
474 enqueue_node (node);
477 /* Add the function FNDECL to the call graph.
478 Unlike cgraph_finalize_function, this function is intended to be used
479 by the middle end and allows insertion of new functions at arbitrary points
480 of compilation. The function can be either in high, low or SSA form
481 GIMPLE.
483 The function is assumed to be reachable and have address taken (so no
484 API breaking optimizations are performed on it).
486 The main work done by this function is to enqueue the function for later
487 processing, to avoid the need for the passes to be re-entrant. */
489 void
490 cgraph_node::add_new_function (tree fndecl, bool lowered)
492 gcc::pass_manager *passes = g->get_passes ();
493 struct cgraph_node *node;
494 switch (cgraph_state)
496 case CGRAPH_STATE_PARSING:
497 cgraph_finalize_function (fndecl, false);
498 break;
499 case CGRAPH_STATE_CONSTRUCTION:
500 /* Just enqueue the function to be processed at the nearest occurrence. */
501 node = cgraph_node::get_create (fndecl);
502 if (lowered)
503 node->lowered = true;
504 if (!cgraph_new_nodes)
505 cgraph_new_nodes = cgraph_node_set_new ();
506 cgraph_node_set_add (cgraph_new_nodes, node);
507 break;
509 case CGRAPH_STATE_IPA:
510 case CGRAPH_STATE_IPA_SSA:
511 case CGRAPH_STATE_EXPANSION:
512 /* Bring the function into a finalized state and enqueue it for later
513 analysis and compilation. */
514 node = cgraph_node::get_create (fndecl);
515 node->local.local = false;
516 node->definition = true;
517 node->force_output = true;
518 if (!lowered && cgraph_state == CGRAPH_STATE_EXPANSION)
520 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
521 gimple_register_cfg_hooks ();
522 bitmap_obstack_initialize (NULL);
523 execute_pass_list (cfun, passes->all_lowering_passes);
524 passes->execute_early_local_passes ();
525 bitmap_obstack_release (NULL);
526 pop_cfun ();
528 lowered = true;
530 if (lowered)
531 node->lowered = true;
532 if (!cgraph_new_nodes)
533 cgraph_new_nodes = cgraph_node_set_new ();
534 cgraph_node_set_add (cgraph_new_nodes, node);
535 break;
537 case CGRAPH_STATE_FINISHED:
538 /* At the very end of compilation we have to do all the work up
539 to expansion. */
540 node = cgraph_node::create (fndecl);
541 if (lowered)
542 node->lowered = true;
543 node->definition = true;
544 node->analyze ();
545 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
546 gimple_register_cfg_hooks ();
547 bitmap_obstack_initialize (NULL);
548 if (!gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
549 g->get_passes ()->execute_early_local_passes ();
550 bitmap_obstack_release (NULL);
551 pop_cfun ();
552 expand_function (node);
553 break;
555 default:
556 gcc_unreachable ();
559 /* Set a personality if required and we already passed EH lowering. */
560 if (lowered
561 && (function_needs_eh_personality (DECL_STRUCT_FUNCTION (fndecl))
562 == eh_personality_lang))
563 DECL_FUNCTION_PERSONALITY (fndecl) = lang_hooks.eh_personality ();
566 /* Add a top-level asm statement to the list. */
568 struct asm_node *
569 add_asm_node (tree asm_str)
571 struct asm_node *node;
573 node = ggc_cleared_alloc<asm_node> ();
574 node->asm_str = asm_str;
575 node->order = symtab_order++;
576 node->next = NULL;
577 if (asm_nodes == NULL)
578 asm_nodes = node;
579 else
580 asm_last_node->next = node;
581 asm_last_node = node;
582 return node;
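/* Illustrative sketch, not part of this file: the kind of source construct
   that reaches add_asm_node.  A file-scope asm statement is recorded here in
   source order and later emitted by output_asm_statements or output_in_order,
   e.g. the classic symbol-versioning idiom:

     int foo_v1 (void) { return 1; }
     asm (".symver foo_v1, foo@VERS_1");
*/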
585 /* Output all asm statements we have stored up to be output. */
587 static void
588 output_asm_statements (void)
590 struct asm_node *can;
592 if (seen_error ())
593 return;
595 for (can = asm_nodes; can; can = can->next)
596 assemble_asm (can->asm_str);
597 asm_nodes = NULL;
600 /* Analyze the function scheduled to be output. */
601 void
602 cgraph_node::analyze (void)
604 tree decl = this->decl;
605 location_t saved_loc = input_location;
606 input_location = DECL_SOURCE_LOCATION (decl);
608 if (thunk.thunk_p)
610 create_edge (cgraph_node::get (thunk.alias),
611 NULL, 0, CGRAPH_FREQ_BASE);
612 if (!expand_thunk (false, false))
614 thunk.alias = NULL;
615 analyzed = true;
616 return;
618 thunk.alias = NULL;
620 if (alias)
621 resolve_alias (cgraph_node::get (alias_target));
622 else if (dispatcher_function)
624 /* Generate the dispatcher body of multi-versioned functions. */
625 struct cgraph_function_version_info *dispatcher_version_info
626 = function_version ();
627 if (dispatcher_version_info != NULL
628 && (dispatcher_version_info->dispatcher_resolver
629 == NULL_TREE))
631 tree resolver = NULL_TREE;
632 gcc_assert (targetm.generate_version_dispatcher_body);
633 resolver = targetm.generate_version_dispatcher_body (this);
634 gcc_assert (resolver != NULL_TREE);
637 else
639 push_cfun (DECL_STRUCT_FUNCTION (decl));
641 assign_assembler_name_if_neeeded (decl);
643 /* Make sure to gimplify bodies only once. During analyzing a
644 function we lower it, which will require gimplified nested
645 functions, so we can end up here with an already gimplified
646 body. */
647 if (!gimple_has_body_p (decl))
648 gimplify_function_tree (decl);
649 dump_function (TDI_generic, decl);
651 /* Lower the function. */
652 if (!lowered)
654 if (nested)
655 lower_nested_functions (decl);
656 gcc_assert (!nested);
658 gimple_register_cfg_hooks ();
659 bitmap_obstack_initialize (NULL);
660 execute_pass_list (cfun, g->get_passes ()->all_lowering_passes);
661 free_dominance_info (CDI_POST_DOMINATORS);
662 free_dominance_info (CDI_DOMINATORS);
663 compact_blocks ();
664 bitmap_obstack_release (NULL);
665 lowered = true;
668 pop_cfun ();
670 analyzed = true;
672 input_location = saved_loc;
675 /* The C++ frontend produces same-body aliases all over the place, even before PCH
676 gets streamed out. It relies on us linking the aliases with their function
677 in order to do the fixups, but ipa-ref is not PCH safe. Consequently we
678 first produce aliases without links, but once the C++ FE is sure it won't stream
679 PCH we build the links via this function. */
681 void
682 cgraph_process_same_body_aliases (void)
684 symtab_node *node;
685 FOR_EACH_SYMBOL (node)
686 if (node->cpp_implicit_alias && !node->analyzed)
687 node->resolve_alias
688 (TREE_CODE (node->alias_target) == VAR_DECL
689 ? (symtab_node *)varpool_node::get_create (node->alias_target)
690 : (symtab_node *)cgraph_node::get_create (node->alias_target));
691 cpp_implicit_aliases_done = true;
694 /* Process attributes common for vars and functions. */
696 static void
697 process_common_attributes (tree decl)
699 tree weakref = lookup_attribute ("weakref", DECL_ATTRIBUTES (decl));
701 if (weakref && !lookup_attribute ("alias", DECL_ATTRIBUTES (decl)))
703 warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
704 "%<weakref%> attribute should be accompanied with"
705 " an %<alias%> attribute");
706 DECL_WEAK (decl) = 0;
707 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
708 DECL_ATTRIBUTES (decl));
712 /* Look for externally_visible and used attributes and mark cgraph nodes
713 accordingly.
715 We cannot mark the nodes at the point the attributes are processed (in
716 handle_*_attribute) because the copy of the declarations available at that
717 point may not be canonical. For example, in:
719 void f();
720 void f() __attribute__((used));
722 the declaration we see in handle_used_attribute will be the second
723 declaration -- but the front end will subsequently merge that declaration
724 with the original declaration and discard the second declaration.
726 Furthermore, we can't mark these nodes in cgraph_finalize_function because:
728 void f() {}
729 void f() __attribute__((externally_visible));
731 is valid.
733 So, we walk the nodes at the end of the translation unit, applying the
734 attributes at that point. */
736 static void
737 process_function_and_variable_attributes (struct cgraph_node *first,
738 varpool_node *first_var)
740 struct cgraph_node *node;
741 varpool_node *vnode;
743 for (node = cgraph_first_function (); node != first;
744 node = cgraph_next_function (node))
746 tree decl = node->decl;
747 if (DECL_PRESERVE_P (decl))
748 node->mark_force_output ();
749 else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
751 if (! TREE_PUBLIC (node->decl))
752 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
753 "%<externally_visible%>"
754 " attribute have effect only on public objects");
756 if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
757 && (node->definition && !node->alias))
759 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
760 "%<weakref%> attribute ignored"
761 " because function is defined");
762 DECL_WEAK (decl) = 0;
763 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
764 DECL_ATTRIBUTES (decl));
767 if (lookup_attribute ("always_inline", DECL_ATTRIBUTES (decl))
768 && !DECL_DECLARED_INLINE_P (decl)
769 /* redefining extern inline function makes it DECL_UNINLINABLE. */
770 && !DECL_UNINLINABLE (decl))
771 warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
772 "always_inline function might not be inlinable");
774 process_common_attributes (decl);
776 for (vnode = varpool_first_variable (); vnode != first_var;
777 vnode = varpool_next_variable (vnode))
779 tree decl = vnode->decl;
780 if (DECL_EXTERNAL (decl)
781 && DECL_INITIAL (decl))
782 varpool_node::finalize_decl (decl);
783 if (DECL_PRESERVE_P (decl))
784 vnode->force_output = true;
785 else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
787 if (! TREE_PUBLIC (vnode->decl))
788 warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
789 "%<externally_visible%>"
790 " attribute have effect only on public objects");
792 if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
793 && vnode->definition
794 && DECL_INITIAL (decl))
796 warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
797 "%<weakref%> attribute ignored"
798 " because variable is initialized");
799 DECL_WEAK (decl) = 0;
800 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
801 DECL_ATTRIBUTES (decl));
803 process_common_attributes (decl);
807 /* Mark DECL as finalized. By finalizing the declaration, the frontend instructs the
808 middle end to output the variable to the asm file, if needed or externally
809 visible. */
811 void
812 varpool_node::finalize_decl (tree decl)
814 varpool_node *node = varpool_node::get_create (decl);
816 gcc_assert (TREE_STATIC (decl) || DECL_EXTERNAL (decl));
818 if (node->definition)
819 return;
820 notice_global_symbol (decl);
821 node->definition = true;
822 if (TREE_THIS_VOLATILE (decl) || DECL_PRESERVE_P (decl)
823 /* Traditionally we do not eliminate static variables when not
824 optimizing and when not doing toplevel reorder. */
825 || (!flag_toplevel_reorder && !DECL_COMDAT (node->decl)
826 && !DECL_ARTIFICIAL (node->decl)))
827 node->force_output = true;
829 if (cgraph_state == CGRAPH_STATE_CONSTRUCTION
830 && (decide_is_symbol_needed (node)
831 || referred_to_p (node)))
832 enqueue_node (node);
833 if (cgraph_state >= CGRAPH_STATE_IPA_SSA)
834 node->analyze ();
835 /* Some frontends produce various interface variables after compilation
836 has finished. */
837 if (cgraph_state == CGRAPH_STATE_FINISHED
838 || (!flag_toplevel_reorder && cgraph_state == CGRAPH_STATE_EXPANSION))
839 node->assemble_decl ();
842 /* EDGE is a polymorphic call. Mark all possible targets as reachable
843 and if there is only one target, perform trivial devirtualization.
844 REACHABLE_CALL_TARGETS collects target lists we already walked to
845 avoid duplicate work. */
847 static void
848 walk_polymorphic_call_targets (pointer_set_t *reachable_call_targets,
849 struct cgraph_edge *edge)
851 unsigned int i;
852 void *cache_token;
853 bool final;
854 vec <cgraph_node *>targets
855 = possible_polymorphic_call_targets
856 (edge, &final, &cache_token);
858 if (!pointer_set_insert (reachable_call_targets,
859 cache_token))
861 if (cgraph_dump_file)
862 dump_possible_polymorphic_call_targets
863 (cgraph_dump_file, edge);
865 for (i = 0; i < targets.length (); i++)
867 /* Do not bother to mark virtual methods in anonymous namespace;
868 either we will find use of virtual table defining it, or it is
869 unused. */
870 if (targets[i]->definition
871 && TREE_CODE
872 (TREE_TYPE (targets[i]->decl))
873 == METHOD_TYPE
874 && !type_in_anonymous_namespace_p
875 (method_class_type
876 (TREE_TYPE (targets[i]->decl))))
877 enqueue_node (targets[i]);
881 /* Very trivial devirtualization; when the type is
882 final or anonymous (so we know all its derivation)
883 and there is only one possible virtual call target,
884 make the edge direct. */
885 if (final)
887 if (targets.length () <= 1 && dbg_cnt (devirt))
889 cgraph_node *target;
890 if (targets.length () == 1)
891 target = targets[0];
892 else
893 target = cgraph_node::create
894 (builtin_decl_implicit (BUILT_IN_UNREACHABLE));
896 if (cgraph_dump_file)
898 fprintf (cgraph_dump_file,
899 "Devirtualizing call: ");
900 print_gimple_stmt (cgraph_dump_file,
901 edge->call_stmt, 0,
902 TDF_SLIM);
904 if (dump_enabled_p ())
906 location_t locus = gimple_location_safe (edge->call_stmt);
907 dump_printf_loc (MSG_OPTIMIZED_LOCATIONS, locus,
908 "devirtualizing call in %s to %s\n",
909 edge->caller->name (), target->name ());
912 cgraph_make_edge_direct (edge, target);
913 cgraph_redirect_edge_call_stmt_to_callee (edge);
914 if (cgraph_dump_file)
916 fprintf (cgraph_dump_file,
917 "Devirtualized as: ");
918 print_gimple_stmt (cgraph_dump_file,
919 edge->call_stmt, 0,
920 TDF_SLIM);
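/* Illustrative sketch, not part of this file: a C++ call that the trivial
   devirtualization above can make direct.  Because the class is final, the
   type inheritance graph knows the single possible target, so the indirect
   edge is turned into a direct call to S::f (and with no targets at all it
   would be redirected to __builtin_unreachable).

     struct S final
     {
       virtual int f () { return 1; }
     };

     int
     call (S *p)
     {
       return p->f ();   // only possible target is S::f
     }
*/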
927 /* Discover all functions and variables that are trivially needed, analyze
928 them as well as all functions and variables referred to by them. */
930 static void
931 analyze_functions (void)
933 /* Keep track of already processed nodes when called multiple times for
934 intermodule optimization. */
935 static struct cgraph_node *first_analyzed;
936 struct cgraph_node *first_handled = first_analyzed;
937 static varpool_node *first_analyzed_var;
938 varpool_node *first_handled_var = first_analyzed_var;
939 struct pointer_set_t *reachable_call_targets = pointer_set_create ();
941 symtab_node *node;
942 symtab_node *next;
943 int i;
944 struct ipa_ref *ref;
945 bool changed = true;
946 location_t saved_loc = input_location;
948 bitmap_obstack_initialize (NULL);
949 cgraph_state = CGRAPH_STATE_CONSTRUCTION;
950 input_location = UNKNOWN_LOCATION;
952 /* Ugly, but the fixup cannot happen at the time the same-body alias is created;
953 the C++ FE is confused about the COMDAT groups being right. */
954 if (cpp_implicit_aliases_done)
955 FOR_EACH_SYMBOL (node)
956 if (node->cpp_implicit_alias)
957 node->fixup_same_cpp_alias_visibility (node->get_alias_target ());
958 if (optimize && flag_devirtualize)
959 build_type_inheritance_graph ();
961 /* Analysis adds static variables that in turn add references to new functions.
962 So we need to iterate the process until it stabilizes. */
963 while (changed)
965 changed = false;
966 process_function_and_variable_attributes (first_analyzed,
967 first_analyzed_var);
969 /* First identify the trivially needed symbols. */
970 for (node = symtab_nodes;
971 node != first_analyzed
972 && node != first_analyzed_var; node = node->next)
974 /* Convert COMDAT group designators to IDENTIFIER_NODEs. */
975 node->get_comdat_group_id ();
976 if (decide_is_symbol_needed (node))
978 enqueue_node (node);
979 if (!changed && cgraph_dump_file)
980 fprintf (cgraph_dump_file, "Trivially needed symbols:");
981 changed = true;
982 if (cgraph_dump_file)
983 fprintf (cgraph_dump_file, " %s", node->asm_name ());
984 if (!changed && cgraph_dump_file)
985 fprintf (cgraph_dump_file, "\n");
987 if (node == first_analyzed
988 || node == first_analyzed_var)
989 break;
991 cgraph_process_new_functions ();
992 first_analyzed_var = varpool_first_variable ();
993 first_analyzed = cgraph_first_function ();
995 if (changed && cgraph_dump_file)
996 fprintf (cgraph_dump_file, "\n");
998 /* Lower representation, build callgraph edges and references for all trivially
999 needed symbols and all symbols referred by them. */
1000 while (queued_nodes != &symtab_terminator)
1002 changed = true;
1003 node = queued_nodes;
1004 queued_nodes = (symtab_node *)queued_nodes->aux;
1005 cgraph_node *cnode = dyn_cast <cgraph_node *> (node);
1006 if (cnode && cnode->definition)
1008 struct cgraph_edge *edge;
1009 tree decl = cnode->decl;
1011 /* ??? It is possible to create an extern inline function
1012 and later use the weak alias attribute to kill its body.
1013 See gcc.c-torture/compile/20011119-1.c */
1014 if (!DECL_STRUCT_FUNCTION (decl)
1015 && !cnode->alias
1016 && !cnode->thunk.thunk_p
1017 && !cnode->dispatcher_function)
1019 cnode->reset ();
1020 cnode->local.redefined_extern_inline = true;
1021 continue;
1024 if (!cnode->analyzed)
1025 cnode->analyze ();
1027 for (edge = cnode->callees; edge; edge = edge->next_callee)
1028 if (edge->callee->definition)
1029 enqueue_node (edge->callee);
1030 if (optimize && flag_devirtualize)
1032 struct cgraph_edge *next;
1034 for (edge = cnode->indirect_calls; edge; edge = next)
1036 next = edge->next_callee;
1037 if (edge->indirect_info->polymorphic)
1038 walk_polymorphic_call_targets (reachable_call_targets,
1039 edge);
1043 /* If decl is a clone of an abstract function,
1044 mark that abstract function so that we don't release its body.
1045 The DECL_INITIAL() of that abstract function declaration
1046 will be later needed to output debug info. */
1047 if (DECL_ABSTRACT_ORIGIN (decl))
1049 struct cgraph_node *origin_node
1050 = cgraph_node::get_create (DECL_ABSTRACT_ORIGIN (decl));
1051 origin_node->used_as_abstract_origin = true;
1054 else
1056 varpool_node *vnode = dyn_cast <varpool_node *> (node);
1057 if (vnode && vnode->definition && !vnode->analyzed)
1058 vnode->analyze ();
1061 if (node->same_comdat_group)
1063 symtab_node *next;
1064 for (next = node->same_comdat_group;
1065 next != node;
1066 next = next->same_comdat_group)
1067 enqueue_node (next);
1069 for (i = 0; node->iterate_reference (i, ref); i++)
1070 if (ref->referred->definition)
1071 enqueue_node (ref->referred);
1072 cgraph_process_new_functions ();
1075 if (optimize && flag_devirtualize)
1076 update_type_inheritance_graph ();
1078 /* Collect entry points to the unit. */
1079 if (cgraph_dump_file)
1081 fprintf (cgraph_dump_file, "\n\nInitial ");
1082 symtab_node::dump_table (cgraph_dump_file);
1085 if (cgraph_dump_file)
1086 fprintf (cgraph_dump_file, "\nRemoving unused symbols:");
1088 for (node = symtab_nodes;
1089 node != first_handled
1090 && node != first_handled_var; node = next)
1092 next = node->next;
1093 if (!node->aux && !referred_to_p (node))
1095 if (cgraph_dump_file)
1096 fprintf (cgraph_dump_file, " %s", node->name ());
1097 node->remove ();
1098 continue;
1100 if (cgraph_node *cnode = dyn_cast <cgraph_node *> (node))
1102 tree decl = node->decl;
1104 if (cnode->definition && !gimple_has_body_p (decl)
1105 && !cnode->alias
1106 && !cnode->thunk.thunk_p)
1107 cnode->reset ();
1109 gcc_assert (!cnode->definition || cnode->thunk.thunk_p
1110 || cnode->alias
1111 || gimple_has_body_p (decl));
1112 gcc_assert (cnode->analyzed == cnode->definition);
1114 node->aux = NULL;
1116 for (;node; node = node->next)
1117 node->aux = NULL;
1118 first_analyzed = cgraph_first_function ();
1119 first_analyzed_var = varpool_first_variable ();
1120 if (cgraph_dump_file)
1122 fprintf (cgraph_dump_file, "\n\nReclaimed ");
1123 symtab_node::dump_table (cgraph_dump_file);
1125 bitmap_obstack_release (NULL);
1126 pointer_set_destroy (reachable_call_targets);
1127 ggc_collect ();
1128 /* Initialize assembler name hash, in particular we want to trigger C++
1129 mangling and same body alias creation before we free DECL_ARGUMENTS
1130 used by it. */
1131 if (!seen_error ())
1132 symtab_initialize_asm_name_hash ();
1134 input_location = saved_loc;
1137 /* Translate the ugly representation of aliases as alias pairs into the nice
1138 representation in the callgraph. We don't handle all cases yet,
1139 unfortunately. */
1141 static void
1142 handle_alias_pairs (void)
1144 alias_pair *p;
1145 unsigned i;
1147 for (i = 0; alias_pairs && alias_pairs->iterate (i, &p);)
1149 symtab_node *target_node = symtab_node_for_asm (p->target);
1151 /* Weakrefs with a target not defined in the current unit are easy to handle:
1152 they behave just like external variables except that we need to note the
1153 alias flag to later output the weakref pseudo op into the asm file. */
1154 if (!target_node
1155 && lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)) != NULL)
1157 symtab_node *node = symtab_node::get (p->decl);
1158 if (node)
1160 node->alias_target = p->target;
1161 node->weakref = true;
1162 node->alias = true;
1164 alias_pairs->unordered_remove (i);
1165 continue;
1167 else if (!target_node)
1169 error ("%q+D aliased to undefined symbol %qE", p->decl, p->target);
1170 symtab_node *node = symtab_node::get (p->decl);
1171 if (node)
1172 node->alias = false;
1173 alias_pairs->unordered_remove (i);
1174 continue;
1177 if (DECL_EXTERNAL (target_node->decl)
1178 /* We use local aliases for C++ thunks to force the tailcall
1179 to bind locally. This is a hack - to keep it working do
1180 the following (which is not strictly correct). */
1181 && (TREE_CODE (target_node->decl) != FUNCTION_DECL
1182 || ! DECL_VIRTUAL_P (target_node->decl))
1183 && ! lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)))
1185 error ("%q+D aliased to external symbol %qE",
1186 p->decl, p->target);
1189 if (TREE_CODE (p->decl) == FUNCTION_DECL
1190 && target_node && is_a <cgraph_node *> (target_node))
1192 struct cgraph_node *src_node = cgraph_node::get (p->decl);
1193 if (src_node && src_node->definition)
1194 src_node->reset ();
1195 cgraph_node::create_alias (p->decl, target_node->decl);
1196 alias_pairs->unordered_remove (i);
1198 else if (TREE_CODE (p->decl) == VAR_DECL
1199 && target_node && is_a <varpool_node *> (target_node))
1201 varpool_node::create_alias (p->decl, target_node->decl);
1202 alias_pairs->unordered_remove (i);
1204 else
1206 error ("%q+D alias in between function and variable is not supported",
1207 p->decl);
1208 warning (0, "%q+D aliased declaration",
1209 target_node->decl);
1210 alias_pairs->unordered_remove (i);
1213 vec_free (alias_pairs);
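/* Illustrative sketch, not part of this file: source-level attributes that
   are recorded as alias pairs and resolved by handle_alias_pairs above.

     void impl (void) { }

     // Alias to a symbol defined in this unit: becomes a cgraph alias.
     void f (void) __attribute__ ((alias ("impl")));

     // Weakref to a possibly undefined symbol: kept as an external with
     // the alias/weakref flags set and emitted as a .weakref directive.
     static void wf (void) __attribute__ ((weakref ("external_target")));
*/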
1217 /* Figure out what functions we want to assemble. */
1219 static void
1220 mark_functions_to_output (void)
1222 struct cgraph_node *node;
1223 #ifdef ENABLE_CHECKING
1224 bool check_same_comdat_groups = false;
1226 FOR_EACH_FUNCTION (node)
1227 gcc_assert (!node->process);
1228 #endif
1230 FOR_EACH_FUNCTION (node)
1232 tree decl = node->decl;
1234 gcc_assert (!node->process || node->same_comdat_group);
1235 if (node->process)
1236 continue;
1238 /* We need to output all local functions that are used and not
1239 always inlined, as well as those that are reachable from
1240 outside the current compilation unit. */
1241 if (node->analyzed
1242 && !node->thunk.thunk_p
1243 && !node->alias
1244 && !node->global.inlined_to
1245 && !TREE_ASM_WRITTEN (decl)
1246 && !DECL_EXTERNAL (decl))
1248 node->process = 1;
1249 if (node->same_comdat_group)
1251 struct cgraph_node *next;
1252 for (next = dyn_cast<cgraph_node *> (node->same_comdat_group);
1253 next != node;
1254 next = dyn_cast<cgraph_node *> (next->same_comdat_group))
1255 if (!next->thunk.thunk_p && !next->alias
1256 && !next->comdat_local_p ())
1257 next->process = 1;
1260 else if (node->same_comdat_group)
1262 #ifdef ENABLE_CHECKING
1263 check_same_comdat_groups = true;
1264 #endif
1266 else
1268 /* We should've reclaimed all functions that are not needed. */
1269 #ifdef ENABLE_CHECKING
1270 if (!node->global.inlined_to
1271 && gimple_has_body_p (decl)
1272 /* FIXME: in an ltrans unit when the offline copy is outside a partition but inline
1273 copies are inside a partition, we can end up not removing the body since we no
1274 longer have an analyzed node pointing to it. */
1275 && !node->in_other_partition
1276 && !node->alias
1277 && !node->clones
1278 && !DECL_EXTERNAL (decl))
1280 node->debug ();
1281 internal_error ("failed to reclaim unneeded function");
1283 #endif
1284 gcc_assert (node->global.inlined_to
1285 || !gimple_has_body_p (decl)
1286 || node->in_other_partition
1287 || node->clones
1288 || DECL_ARTIFICIAL (decl)
1289 || DECL_EXTERNAL (decl));
1294 #ifdef ENABLE_CHECKING
1295 if (check_same_comdat_groups)
1296 FOR_EACH_FUNCTION (node)
1297 if (node->same_comdat_group && !node->process)
1299 tree decl = node->decl;
1300 if (!node->global.inlined_to
1301 && gimple_has_body_p (decl)
1302 /* FIXME: in an ltrans unit when the offline copy is outside a
1303 partition but inline copies are inside a partition, we can
1304 end up not removing the body since we no longer have an
1305 analyzed node pointing to it. */
1306 && !node->in_other_partition
1307 && !node->clones
1308 && !DECL_EXTERNAL (decl))
1310 node->debug ();
1311 internal_error ("failed to reclaim unneeded function in same "
1312 "comdat group");
1315 #endif
1318 /* DECL is a FUNCTION_DECL. Initialize datastructures so DECL is a function
1319 in lowered gimple form. IN_SSA is true if the gimple is in SSA.
1321 Set current_function_decl and cfun to a newly constructed empty function body.
1322 Return the basic block in the function body. */
1324 basic_block
1325 init_lowered_empty_function (tree decl, bool in_ssa)
1327 basic_block bb;
1329 current_function_decl = decl;
1330 allocate_struct_function (decl, false);
1331 gimple_register_cfg_hooks ();
1332 init_empty_tree_cfg ();
1334 if (in_ssa)
1336 init_tree_ssa (cfun);
1337 init_ssa_operands (cfun);
1338 cfun->gimple_df->in_ssa_p = true;
1339 cfun->curr_properties |= PROP_ssa;
1342 DECL_INITIAL (decl) = make_node (BLOCK);
1344 DECL_SAVED_TREE (decl) = error_mark_node;
1345 cfun->curr_properties |= (PROP_gimple_lcf | PROP_gimple_leh | PROP_gimple_any
1346 | PROP_cfg | PROP_loops);
1348 set_loops_for_fn (cfun, ggc_cleared_alloc<loops> ());
1349 init_loops_structure (cfun, loops_for_fn (cfun), 1);
1350 loops_for_fn (cfun)->state |= LOOPS_MAY_HAVE_MULTIPLE_LATCHES;
1352 /* Create BB for body of the function and connect it properly. */
1353 bb = create_basic_block (NULL, (void *) 0, ENTRY_BLOCK_PTR_FOR_FN (cfun));
1354 make_edge (ENTRY_BLOCK_PTR_FOR_FN (cfun), bb, EDGE_FALLTHRU);
1355 make_edge (bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
1356 add_bb_to_loop (bb, ENTRY_BLOCK_PTR_FOR_FN (cfun)->loop_father);
1358 return bb;
1361 /* Adjust PTR by the constant FIXED_OFFSET, and by the vtable
1362 offset indicated by VIRTUAL_OFFSET, if that is
1363 non-null. THIS_ADJUSTING is nonzero for a this adjusting thunk and
1364 zero for a result adjusting thunk. */
1366 static tree
1367 thunk_adjust (gimple_stmt_iterator * bsi,
1368 tree ptr, bool this_adjusting,
1369 HOST_WIDE_INT fixed_offset, tree virtual_offset)
1371 gimple stmt;
1372 tree ret;
1374 if (this_adjusting
1375 && fixed_offset != 0)
1377 stmt = gimple_build_assign
1378 (ptr, fold_build_pointer_plus_hwi_loc (input_location,
1379 ptr,
1380 fixed_offset));
1381 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1384 /* If there's a virtual offset, look up that value in the vtable and
1385 adjust the pointer again. */
1386 if (virtual_offset)
1388 tree vtabletmp;
1389 tree vtabletmp2;
1390 tree vtabletmp3;
1392 if (!vtable_entry_type)
1394 tree vfunc_type = make_node (FUNCTION_TYPE);
1395 TREE_TYPE (vfunc_type) = integer_type_node;
1396 TYPE_ARG_TYPES (vfunc_type) = NULL_TREE;
1397 layout_type (vfunc_type);
1399 vtable_entry_type = build_pointer_type (vfunc_type);
1402 vtabletmp =
1403 create_tmp_reg (build_pointer_type
1404 (build_pointer_type (vtable_entry_type)), "vptr");
1406 /* The vptr is always at offset zero in the object. */
1407 stmt = gimple_build_assign (vtabletmp,
1408 build1 (NOP_EXPR, TREE_TYPE (vtabletmp),
1409 ptr));
1410 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1412 /* Form the vtable address. */
1413 vtabletmp2 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp)),
1414 "vtableaddr");
1415 stmt = gimple_build_assign (vtabletmp2,
1416 build_simple_mem_ref (vtabletmp));
1417 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1419 /* Find the entry with the vcall offset. */
1420 stmt = gimple_build_assign (vtabletmp2,
1421 fold_build_pointer_plus_loc (input_location,
1422 vtabletmp2,
1423 virtual_offset));
1424 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1426 /* Get the offset itself. */
1427 vtabletmp3 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp2)),
1428 "vcalloffset");
1429 stmt = gimple_build_assign (vtabletmp3,
1430 build_simple_mem_ref (vtabletmp2));
1431 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1433 /* Adjust the `this' pointer. */
1434 ptr = fold_build_pointer_plus_loc (input_location, ptr, vtabletmp3);
1435 ptr = force_gimple_operand_gsi (bsi, ptr, true, NULL_TREE, false,
1436 GSI_CONTINUE_LINKING);
1439 if (!this_adjusting
1440 && fixed_offset != 0)
1441 /* Adjust the pointer by the constant. */
1443 tree ptrtmp;
1445 if (TREE_CODE (ptr) == VAR_DECL)
1446 ptrtmp = ptr;
1447 else
1449 ptrtmp = create_tmp_reg (TREE_TYPE (ptr), "ptr");
1450 stmt = gimple_build_assign (ptrtmp, ptr);
1451 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1453 ptr = fold_build_pointer_plus_hwi_loc (input_location,
1454 ptrtmp, fixed_offset);
1457 /* Emit the statement and gimplify the adjustment expression. */
1458 ret = create_tmp_reg (TREE_TYPE (ptr), "adjusted_this");
1459 stmt = gimple_build_assign (ret, ptr);
1460 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1462 return ret;
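/* Illustrative sketch, not part of this file: a C++ hierarchy that needs a
   this-adjusting thunk of the kind built by thunk_adjust and expand_thunk.
   Calling f through a B* first has to convert `this' from B* back to C*
   (a constant FIXED_OFFSET adjustment) before entering C::f; with virtual
   bases the extra adjustment is loaded from the vtable, which is the
   VIRTUAL_OFFSET path handled above.

     struct A { virtual void f (); int a; };
     struct B { virtual void f (); int b; };
     struct C : A, B { void f () override; };

     void
     call (B *p)
     {
       p->f ();   // dispatches through the thunk, which adjusts `this'
     }
*/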
1465 /* Expand thunk NODE to gimple if possible.
1466 When FORCE_GIMPLE_THUNK is true, gimple thunk is created and
1467 no assembler is produced.
1468 When OUTPUT_ASM_THUNK is true, also produce assembler for
1469 thunks that are not lowered. */
1471 bool
1472 cgraph_node::expand_thunk (bool output_asm_thunks, bool force_gimple_thunk)
1474 bool this_adjusting = thunk.this_adjusting;
1475 HOST_WIDE_INT fixed_offset = thunk.fixed_offset;
1476 HOST_WIDE_INT virtual_value = thunk.virtual_value;
1477 tree virtual_offset = NULL;
1478 tree alias = callees->callee->decl;
1479 tree thunk_fndecl = decl;
1480 tree a;
1483 if (!force_gimple_thunk && this_adjusting
1484 && targetm.asm_out.can_output_mi_thunk (thunk_fndecl, fixed_offset,
1485 virtual_value, alias))
1487 const char *fnname;
1488 tree fn_block;
1489 tree restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1491 if (!output_asm_thunks)
1492 return false;
1494 if (in_lto_p)
1495 get_body ();
1496 a = DECL_ARGUMENTS (thunk_fndecl);
1498 current_function_decl = thunk_fndecl;
1500 /* Ensure thunks are emitted in their correct sections. */
1501 resolve_unique_section (thunk_fndecl, 0, flag_function_sections);
1503 DECL_RESULT (thunk_fndecl)
1504 = build_decl (DECL_SOURCE_LOCATION (thunk_fndecl),
1505 RESULT_DECL, 0, restype);
1506 DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
1507 fnname = IDENTIFIER_POINTER (DECL_ASSEMBLER_NAME (thunk_fndecl));
1509 /* The back end expects DECL_INITIAL to contain a BLOCK, so we
1510 create one. */
1511 fn_block = make_node (BLOCK);
1512 BLOCK_VARS (fn_block) = a;
1513 DECL_INITIAL (thunk_fndecl) = fn_block;
1514 init_function_start (thunk_fndecl);
1515 cfun->is_thunk = 1;
1516 insn_locations_init ();
1517 set_curr_insn_location (DECL_SOURCE_LOCATION (thunk_fndecl));
1518 prologue_location = curr_insn_location ();
1519 assemble_start_function (thunk_fndecl, fnname);
1521 targetm.asm_out.output_mi_thunk (asm_out_file, thunk_fndecl,
1522 fixed_offset, virtual_value, alias);
1524 assemble_end_function (thunk_fndecl, fnname);
1525 insn_locations_finalize ();
1526 init_insn_lengths ();
1527 free_after_compilation (cfun);
1528 set_cfun (NULL);
1529 TREE_ASM_WRITTEN (thunk_fndecl) = 1;
1530 thunk.thunk_p = false;
1531 analyzed = false;
1533 else
1535 tree restype;
1536 basic_block bb, then_bb, else_bb, return_bb;
1537 gimple_stmt_iterator bsi;
1538 int nargs = 0;
1539 tree arg;
1540 int i;
1541 tree resdecl;
1542 tree restmp = NULL;
1544 gimple call;
1545 gimple ret;
1547 if (in_lto_p)
1548 get_body ();
1549 a = DECL_ARGUMENTS (thunk_fndecl);
1551 current_function_decl = thunk_fndecl;
1553 /* Ensure thunks are emitted in their correct sections. */
1554 resolve_unique_section (thunk_fndecl, 0, flag_function_sections);
1556 DECL_IGNORED_P (thunk_fndecl) = 1;
1557 bitmap_obstack_initialize (NULL);
1559 if (thunk.virtual_offset_p)
1560 virtual_offset = size_int (virtual_value);
1562 /* Build the return declaration for the function. */
1563 restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1564 if (DECL_RESULT (thunk_fndecl) == NULL_TREE)
1566 resdecl = build_decl (input_location, RESULT_DECL, 0, restype);
1567 DECL_ARTIFICIAL (resdecl) = 1;
1568 DECL_IGNORED_P (resdecl) = 1;
1569 DECL_RESULT (thunk_fndecl) = resdecl;
1570 DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
1572 else
1573 resdecl = DECL_RESULT (thunk_fndecl);
1575 bb = then_bb = else_bb = return_bb = init_lowered_empty_function (thunk_fndecl, true);
1577 bsi = gsi_start_bb (bb);
1579 /* Build call to the function being thunked. */
1580 if (!VOID_TYPE_P (restype))
1582 if (DECL_BY_REFERENCE (resdecl))
1583 restmp = gimple_fold_indirect_ref (resdecl);
1584 else if (!is_gimple_reg_type (restype))
1586 restmp = resdecl;
1587 add_local_decl (cfun, restmp);
1588 BLOCK_VARS (DECL_INITIAL (current_function_decl)) = restmp;
1590 else
1591 restmp = create_tmp_reg (restype, "retval");
1594 for (arg = a; arg; arg = DECL_CHAIN (arg))
1595 nargs++;
1596 auto_vec<tree> vargs (nargs);
1597 if (this_adjusting)
1598 vargs.quick_push (thunk_adjust (&bsi, a, 1, fixed_offset,
1599 virtual_offset));
1600 else if (nargs)
1601 vargs.quick_push (a);
1603 if (nargs)
1604 for (i = 1, arg = DECL_CHAIN (a); i < nargs; i++, arg = DECL_CHAIN (arg))
1606 tree tmp = arg;
1607 if (!is_gimple_val (arg))
1609 tmp = create_tmp_reg (TYPE_MAIN_VARIANT
1610 (TREE_TYPE (arg)), "arg");
1611 gimple stmt = gimple_build_assign (tmp, arg);
1612 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1614 vargs.quick_push (tmp);
1616 call = gimple_build_call_vec (build_fold_addr_expr_loc (0, alias), vargs);
1617 callees->call_stmt = call;
1618 gimple_call_set_from_thunk (call, true);
1619 if (restmp)
1621 gimple_call_set_lhs (call, restmp);
1622 gcc_assert (useless_type_conversion_p (TREE_TYPE (restmp),
1623 TREE_TYPE (TREE_TYPE (alias))));
1625 gsi_insert_after (&bsi, call, GSI_NEW_STMT);
1626 if (!(gimple_call_flags (call) & ECF_NORETURN))
1628 if (restmp && !this_adjusting
1629 && (fixed_offset || virtual_offset))
1631 tree true_label = NULL_TREE;
1633 if (TREE_CODE (TREE_TYPE (restmp)) == POINTER_TYPE)
1635 gimple stmt;
1636 /* If the return type is a pointer, we need to
1637 protect against NULL. We know there will be an
1638 adjustment, because that's why we're emitting a
1639 thunk. */
1640 then_bb = create_basic_block (NULL, (void *) 0, bb);
1641 return_bb = create_basic_block (NULL, (void *) 0, then_bb);
1642 else_bb = create_basic_block (NULL, (void *) 0, else_bb);
1643 add_bb_to_loop (then_bb, bb->loop_father);
1644 add_bb_to_loop (return_bb, bb->loop_father);
1645 add_bb_to_loop (else_bb, bb->loop_father);
1646 remove_edge (single_succ_edge (bb));
1647 true_label = gimple_block_label (then_bb);
1648 stmt = gimple_build_cond (NE_EXPR, restmp,
1649 build_zero_cst (TREE_TYPE (restmp)),
1650 NULL_TREE, NULL_TREE);
1651 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1652 make_edge (bb, then_bb, EDGE_TRUE_VALUE);
1653 make_edge (bb, else_bb, EDGE_FALSE_VALUE);
1654 make_edge (return_bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
1655 make_edge (then_bb, return_bb, EDGE_FALLTHRU);
1656 make_edge (else_bb, return_bb, EDGE_FALLTHRU);
1657 bsi = gsi_last_bb (then_bb);
1660 restmp = thunk_adjust (&bsi, restmp, /*this_adjusting=*/0,
1661 fixed_offset, virtual_offset);
1662 if (true_label)
1664 gimple stmt;
1665 bsi = gsi_last_bb (else_bb);
1666 stmt = gimple_build_assign (restmp,
1667 build_zero_cst (TREE_TYPE (restmp)));
1668 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1669 bsi = gsi_last_bb (return_bb);
1672 else
1673 gimple_call_set_tail (call, true);
1675 /* Build return value. */
1676 ret = gimple_build_return (restmp);
1677 gsi_insert_after (&bsi, ret, GSI_NEW_STMT);
1679 else
1681 gimple_call_set_tail (call, true);
1682 remove_edge (single_succ_edge (bb));
1685 cfun->gimple_df->in_ssa_p = true;
1686 /* FIXME: C++ FE should stop setting TREE_ASM_WRITTEN on thunks. */
1687 TREE_ASM_WRITTEN (thunk_fndecl) = false;
1688 delete_unreachable_blocks ();
1689 update_ssa (TODO_update_ssa);
1690 #ifdef ENABLE_CHECKING
1691 verify_flow_info ();
1692 #endif
1693 free_dominance_info (CDI_DOMINATORS);
1695 /* Since we want to emit the thunk, we explicitly mark its name as
1696 referenced. */
1697 thunk.thunk_p = false;
1698 lowered = true;
1699 bitmap_obstack_release (NULL);
1701 current_function_decl = NULL;
1702 set_cfun (NULL);
1703 return true;
1706 /* Assemble thunks and aliases associated with NODE. */
1708 static void
1709 assemble_thunks_and_aliases (struct cgraph_node *node)
1711 struct cgraph_edge *e;
1712 struct ipa_ref *ref;
1714 for (e = node->callers; e;)
1715 if (e->caller->thunk.thunk_p)
1717 struct cgraph_node *thunk = e->caller;
1719 e = e->next_caller;
1720 thunk->expand_thunk (true, false);
1721 assemble_thunks_and_aliases (thunk);
1723 else
1724 e = e->next_caller;
1726 FOR_EACH_ALIAS (node, ref)
1728 struct cgraph_node *alias = dyn_cast <cgraph_node *> (ref->referring);
1729 bool saved_written = TREE_ASM_WRITTEN (node->decl);
1731 /* Force assemble_alias to really output the alias this time instead
1732 of buffering it in same alias pairs. */
1733 TREE_ASM_WRITTEN (node->decl) = 1;
1734 do_assemble_alias (alias->decl,
1735 DECL_ASSEMBLER_NAME (node->decl));
1736 assemble_thunks_and_aliases (alias);
1737 TREE_ASM_WRITTEN (node->decl) = saved_written;
1741 /* Expand function specified by NODE. */
1743 static void
1744 expand_function (struct cgraph_node *node)
1746 tree decl = node->decl;
1747 location_t saved_loc;
1749 /* We ought to not compile any inline clones. */
1750 gcc_assert (!node->global.inlined_to);
1752 announce_function (decl);
1753 node->process = 0;
1754 gcc_assert (node->lowered);
1755 node->get_body ();
1757 /* Generate RTL for the body of DECL. */
1759 timevar_push (TV_REST_OF_COMPILATION);
1761 gcc_assert (cgraph_global_info_ready);
1763 /* Initialize the default bitmap obstack. */
1764 bitmap_obstack_initialize (NULL);
1766 /* Initialize the RTL code for the function. */
1767 current_function_decl = decl;
1768 saved_loc = input_location;
1769 input_location = DECL_SOURCE_LOCATION (decl);
1770 init_function_start (decl);
1772 gimple_register_cfg_hooks ();
1774 bitmap_obstack_initialize (&reg_obstack); /* FIXME, only at RTL generation*/
1776 execute_all_ipa_transforms ();
1778 /* Perform all tree transforms and optimizations. */
1780 /* Signal the start of passes. */
1781 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_START, NULL);
1783 execute_pass_list (cfun, g->get_passes ()->all_passes);
1785 /* Signal the end of passes. */
1786 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_END, NULL);
1788 bitmap_obstack_release (&reg_obstack);
1790 /* Release the default bitmap obstack. */
1791 bitmap_obstack_release (NULL);
1793 /* If requested, warn about function definitions where the function will
1794 return a value (usually of some struct or union type) which itself will
1795 take up a lot of stack space. */
1796 if (warn_larger_than && !DECL_EXTERNAL (decl) && TREE_TYPE (decl))
1798 tree ret_type = TREE_TYPE (TREE_TYPE (decl));
1800 if (ret_type && TYPE_SIZE_UNIT (ret_type)
1801 && TREE_CODE (TYPE_SIZE_UNIT (ret_type)) == INTEGER_CST
1802 && 0 < compare_tree_int (TYPE_SIZE_UNIT (ret_type),
1803 larger_than_size))
1805 unsigned int size_as_int
1806 = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (ret_type));
1808 if (compare_tree_int (TYPE_SIZE_UNIT (ret_type), size_as_int) == 0)
1809 warning (OPT_Wlarger_than_, "size of return value of %q+D is %u bytes",
1810 decl, size_as_int);
1811 else
1812 warning (OPT_Wlarger_than_, "size of return value of %q+D is larger than %wd bytes",
1813 decl, larger_than_size);
1817 gimple_set_body (decl, NULL);
1818 if (DECL_STRUCT_FUNCTION (decl) == 0
1819 && !cgraph_node::get (decl)->origin)
1821 /* Stop pointing to the local nodes about to be freed.
1822 But DECL_INITIAL must remain nonzero so we know this
1823 was an actual function definition.
1824 For a nested function, this is done in c_pop_function_context.
1825 If rest_of_compilation set this to 0, leave it 0. */
1826 if (DECL_INITIAL (decl) != 0)
1827 DECL_INITIAL (decl) = error_mark_node;
1830 input_location = saved_loc;
1832 ggc_collect ();
1833 timevar_pop (TV_REST_OF_COMPILATION);
1835 /* Make sure that BE didn't give up on compiling. */
1836 gcc_assert (TREE_ASM_WRITTEN (decl));
1837 set_cfun (NULL);
1838 current_function_decl = NULL;
1840 /* It would make a lot more sense to output thunks before the function body to get more
1841 forward and fewer backward jumps. This however would need solving the problem
1842 with comdats. See PR48668. Also aliases must come after the function itself to
1843 make one-pass assemblers, like the one on AIX, happy. See PR 50689.
1844 FIXME: Perhaps thunks should be moved before the function IFF they are not in comdat
1845 groups. */
1846 assemble_thunks_and_aliases (node);
1847 node->release_body ();
1848 /* Eliminate all call edges. This is important so the GIMPLE_CALL no longer
1849 points to the dead function body. */
1850 node->remove_callees ();
1851 node->remove_all_references ();
1854 /* Node comparer that is responsible for the order that corresponds
1855 to the time when a function was executed for the first time. */
1857 static int
1858 node_cmp (const void *pa, const void *pb)
1860 const struct cgraph_node *a = *(const struct cgraph_node * const *) pa;
1861 const struct cgraph_node *b = *(const struct cgraph_node * const *) pb;
1863 /* Functions with a time profile must come before those without one. */
1864 if (!a->tp_first_run || !b->tp_first_run)
1865 return a->tp_first_run - b->tp_first_run;
1867 return a->tp_first_run != b->tp_first_run
1868 ? b->tp_first_run - a->tp_first_run
1869 : b->order - a->order;
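/* Worked example (added for exposition, not in the original source): given
   nodes A (order 1, tp_first_run 0), B (order 2, tp_first_run 7) and
   C (order 3, tp_first_run 3), node_cmp sorts the array as A, B, C --
   unprofiled nodes first, then profiled ones by decreasing tp_first_run.
   The backward walk in expand_all_functions below therefore expands C, then
   B, then A: profiled functions in increasing time of first execution,
   unprofiled functions last.  */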
1872 /* Expand all functions that must be output.
1874 Attempt to topologically sort the nodes so that a function is output only
1875 after all the functions it calls have been assembled, allowing data to be
1876 propagated across the callgraph. Use a stack to get a smaller distance
1877 between a function and its callees (later we may choose to use a more
1878 sophisticated algorithm for function reordering; we will likely want
1879 to use subsections to make the output functions appear in top-down
1880 order). */
1882 static void
1883 expand_all_functions (void)
1885 struct cgraph_node *node;
1886 struct cgraph_node **order = XCNEWVEC (struct cgraph_node *, cgraph_n_nodes);
1887 unsigned int expanded_func_count = 0, profiled_func_count = 0;
1888 int order_pos, new_order_pos = 0;
1889 int i;
1891 order_pos = ipa_reverse_postorder (order);
1892 gcc_assert (order_pos == cgraph_n_nodes);
1894 /* The garbage collector may remove inline clones that we eliminate during
1895 optimization, so we must be sure not to reference them. */
1896 for (i = 0; i < order_pos; i++)
1897 if (order[i]->process)
1898 order[new_order_pos++] = order[i];
1900 if (flag_profile_reorder_functions)
1901 qsort (order, new_order_pos, sizeof (struct cgraph_node *), node_cmp);
1903 for (i = new_order_pos - 1; i >= 0; i--)
1905 node = order[i];
1907 if (node->process)
1909 expanded_func_count++;
1910 if (node->tp_first_run)
1911 profiled_func_count++;
1913 if (cgraph_dump_file)
1914 fprintf (cgraph_dump_file, "Time profile order in expand_all_functions:%s:%d\n", node->asm_name (), node->tp_first_run);
1916 node->process = 0;
1917 expand_function (node);
1921 if (dump_file)
1922 fprintf (dump_file, "Expanded functions with time profile (%s):%u/%u\n",
1923 main_input_filename, profiled_func_count, expanded_func_count);
1925 if (cgraph_dump_file && flag_profile_reorder_functions)
1926 fprintf (cgraph_dump_file, "Expanded functions with time profile:%u/%u\n",
1927 profiled_func_count, expanded_func_count);
1929 cgraph_process_new_functions ();
1930 free_gimplify_stack ();
1932 free (order);
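/* Illustrative example (added for exposition, not in the original source):
   for a unit in which main calls helper and helper calls leaf, the reverse
   postorder computed above makes the backward walk expand leaf first, then
   helper, then main (absent -fprofile-reorder-functions), so every caller is
   assembled only after its callees.  */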
1935 /* This is used to sort the nodes by their cgraph order number. */
1937 enum cgraph_order_sort_kind
1939 ORDER_UNDEFINED = 0,
1940 ORDER_FUNCTION,
1941 ORDER_VAR,
1942 ORDER_ASM
1945 struct cgraph_order_sort
1947 enum cgraph_order_sort_kind kind;
1948 union
1950 struct cgraph_node *f;
1951 varpool_node *v;
1952 struct asm_node *a;
1953 } u;
1956 /* Output all functions, variables, and asm statements in the order
1957 given by their order fields, which is the order in which they
1958 appeared in the source file. This implements -fno-toplevel-reorder.
1959 In this mode we may output functions and variables that don't really
1960 need to be output. */
1962 static void
1963 output_in_order (void)
1965 int max;
1966 struct cgraph_order_sort *nodes;
1967 int i;
1968 struct cgraph_node *pf;
1969 varpool_node *pv;
1970 struct asm_node *pa;
1972 max = symtab_order;
1973 nodes = XCNEWVEC (struct cgraph_order_sort, max);
1975 FOR_EACH_DEFINED_FUNCTION (pf)
1977 if (pf->process && !pf->thunk.thunk_p && !pf->alias)
1979 i = pf->order;
1980 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1981 nodes[i].kind = ORDER_FUNCTION;
1982 nodes[i].u.f = pf;
1986 FOR_EACH_DEFINED_VARIABLE (pv)
1987 if (!DECL_EXTERNAL (pv->decl))
1989 i = pv->order;
1990 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1991 nodes[i].kind = ORDER_VAR;
1992 nodes[i].u.v = pv;
1995 for (pa = asm_nodes; pa; pa = pa->next)
1997 i = pa->order;
1998 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1999 nodes[i].kind = ORDER_ASM;
2000 nodes[i].u.a = pa;
2003 /* In -fno-toplevel-reorder mode we output all statics; mark them as needed. */
2005 for (i = 0; i < max; ++i)
2006 if (nodes[i].kind == ORDER_VAR)
2007 nodes[i].u.v->finalize_named_section_flags ();
2009 for (i = 0; i < max; ++i)
2011 switch (nodes[i].kind)
2013 case ORDER_FUNCTION:
2014 nodes[i].u.f->process = 0;
2015 expand_function (nodes[i].u.f);
2016 break;
2018 case ORDER_VAR:
2019 nodes[i].u.v->assemble_decl ();
2020 break;
2022 case ORDER_ASM:
2023 assemble_asm (nodes[i].u.a->asm_str);
2024 break;
2026 case ORDER_UNDEFINED:
2027 break;
2029 default:
2030 gcc_unreachable ();
2034 asm_nodes = NULL;
2035 free (nodes);
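/* Illustrative example (added for exposition, not in the original source;
   the declarations below are hypothetical user code): under
   -fno-toplevel-reorder a unit such as

     asm ("# prologue marker");
     int counter;
     int bump (void) { return ++counter; }

   is emitted by output_in_order with the toplevel asm, the variable and the
   function in exactly this textual order, as recorded in their order fields
   when they were finalized.  */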
2038 static void
2039 ipa_passes (void)
2041 gcc::pass_manager *passes = g->get_passes ();
2043 set_cfun (NULL);
2044 current_function_decl = NULL;
2045 gimple_register_cfg_hooks ();
2046 bitmap_obstack_initialize (NULL);
2048 invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_START, NULL);
2050 if (!in_lto_p)
2052 execute_ipa_pass_list (passes->all_small_ipa_passes);
2053 if (seen_error ())
2054 return;
2057 /* We never run removal of unreachable nodes after early passes. This is
2058 because TODO is run before the subpasses. It is important to remove
2059 the unreachable functions to save work at the IPA level and to get the
2060 LTO symbol tables right. */
2061 symtab_remove_unreachable_nodes (true, cgraph_dump_file);
2063 /* If pass_all_early_optimizations was not scheduled, the state of
2064 the cgraph will not be properly updated. Update it now. */
2065 if (cgraph_state < CGRAPH_STATE_IPA_SSA)
2066 cgraph_state = CGRAPH_STATE_IPA_SSA;
2068 if (!in_lto_p)
2070 /* Generate coverage variables and constructors. */
2071 coverage_finish ();
2073 /* Process new functions added. */
2074 set_cfun (NULL);
2075 current_function_decl = NULL;
2076 cgraph_process_new_functions ();
2078 execute_ipa_summary_passes
2079 ((ipa_opt_pass_d *) passes->all_regular_ipa_passes);
2082 /* Some targets need to handle LTO assembler output specially. */
2083 if (flag_generate_lto)
2084 targetm.asm_out.lto_start ();
2086 if (!in_lto_p)
2087 ipa_write_summaries ();
2089 if (flag_generate_lto)
2090 targetm.asm_out.lto_end ();
2092 if (!flag_ltrans && (in_lto_p || !flag_lto || flag_fat_lto_objects))
2093 execute_ipa_pass_list (passes->all_regular_ipa_passes);
2094 invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_END, NULL);
2096 bitmap_obstack_release (NULL);
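/* Note added for exposition (not in the original source): with the guard on
   all_regular_ipa_passes above, a plain non-LTO compilation and a
   -ffat-lto-objects compilation run those passes here, a slim -flto
   compilation only streams their summaries and leaves them to the WPA stage
   of the link, and an ltrans compilation skips them entirely (it only
   applies the recorded transforms during expansion).  */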
2100 /* Return the identifier naming the target that DECL's "alias" attribute refers to. */
2102 static tree
2103 get_alias_symbol (tree decl)
2105 tree alias = lookup_attribute ("alias", DECL_ATTRIBUTES (decl));
2106 return get_identifier (TREE_STRING_POINTER
2107 (TREE_VALUE (TREE_VALUE (alias))));
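/* Illustrative example (added for exposition, not in the original source;
   impl and fn are hypothetical): for

     void impl (void) { }
     void fn (void) __attribute__ ((alias ("impl")));

   the "alias" attribute of fn carries the string "impl", and
   get_alias_symbol returns the corresponding identifier.  */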
2111 /* Weakrefs may be associated with external decls and thus not output
2112 at expansion time. Emit all necessary aliases. */
2114 static void
2115 output_weakrefs (void)
2117 symtab_node *node;
2118 FOR_EACH_SYMBOL (node)
2119 if (node->alias
2120 && !TREE_ASM_WRITTEN (node->decl)
2121 && node->weakref)
2123 tree target;
2125 /* Weakrefs are special in that they do not require the target to be defined
2126 in the current compilation unit, so it is a bit hard to work out what we
2127 want to alias.
2128 When the alias target is defined, we need to fetch it from the symtab
2129 reference; otherwise it is pointed to by alias_target. */
2130 if (node->alias_target)
2131 target = (DECL_P (node->alias_target)
2132 ? DECL_ASSEMBLER_NAME (node->alias_target)
2133 : node->alias_target);
2134 else if (node->analyzed)
2135 target = DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl);
2136 else
2138 gcc_unreachable ();
2139 target = get_alias_symbol (node->decl);
2141 do_assemble_alias (node->decl, target);
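/* Illustrative example (added for exposition, not in the original source;
   the declarations are hypothetical): a weakref such as

     extern void ext_target (void);
     static void local_name (void) __attribute__ ((weakref ("ext_target")));

   does not require ext_target to be defined in this unit, so local_name may
   never go through expansion; the alias for it is emitted here instead.  */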
2145 /* Initialize callgraph dump file. */
2147 void
2148 init_cgraph (void)
2150 if (!cgraph_dump_file)
2151 cgraph_dump_file = dump_begin (TDI_cgraph, NULL);
2155 /* Perform simple optimizations based on callgraph. */
2157 void
2158 compile (void)
2160 if (seen_error ())
2161 return;
2163 #ifdef ENABLE_CHECKING
2164 symtab_node::verify_symtab_nodes ();
2165 #endif
2167 timevar_push (TV_CGRAPHOPT);
2168 if (pre_ipa_mem_report)
2170 fprintf (stderr, "Memory consumption before IPA\n");
2171 dump_memory_report (false);
2173 if (!quiet_flag)
2174 fprintf (stderr, "Performing interprocedural optimizations\n");
2175 cgraph_state = CGRAPH_STATE_IPA;
2177 /* If LTO is enabled, initialize the streamer hooks needed by GIMPLE. */
2178 if (flag_lto)
2179 lto_streamer_hooks_init ();
2181 /* Don't run the IPA passes if there were any error or sorry messages. */
2182 if (!seen_error ())
2183 ipa_passes ();
2185 /* Do nothing else if any IPA pass found errors or if we are just streaming LTO. */
2186 if (seen_error ()
2187 || (!in_lto_p && flag_lto && !flag_fat_lto_objects))
2189 timevar_pop (TV_CGRAPHOPT);
2190 return;
2193 /* This pass removes the bodies of extern inline functions we never inlined.
2194 Do this late so other IPA passes see what is really going on. */
2195 symtab_remove_unreachable_nodes (false, dump_file);
2196 cgraph_global_info_ready = true;
2197 if (cgraph_dump_file)
2199 fprintf (cgraph_dump_file, "Optimized ");
2200 symtab_node::dump_table (cgraph_dump_file);
2202 if (post_ipa_mem_report)
2204 fprintf (stderr, "Memory consumption after IPA\n");
2205 dump_memory_report (false);
2207 timevar_pop (TV_CGRAPHOPT);
2209 /* Output everything. */
2210 (*debug_hooks->assembly_start) ();
2211 if (!quiet_flag)
2212 fprintf (stderr, "Assembling functions:\n");
2213 #ifdef ENABLE_CHECKING
2214 symtab_node::verify_symtab_nodes ();
2215 #endif
2217 cgraph_materialize_all_clones ();
2218 bitmap_obstack_initialize (NULL);
2219 execute_ipa_pass_list (g->get_passes ()->all_late_ipa_passes);
2220 symtab_remove_unreachable_nodes (true, dump_file);
2221 #ifdef ENABLE_CHECKING
2222 symtab_node::verify_symtab_nodes ();
2223 #endif
2224 bitmap_obstack_release (NULL);
2225 mark_functions_to_output ();
2227 /* When weakref support is missing, we automatically translate all
2228 references to NODE to references to its ultimate alias target.
2229 The renaming mechanism uses the flag IDENTIFIER_TRANSPARENT_ALIAS and
2230 TREE_CHAIN.
2232 Set up this mapping before we output any assembler, but only once we are
2233 sure that all symbol renaming is done.
2235 FIXME: All this ugliness can go away if we just do renaming at the gimple
2236 level by physically rewriting the IL. At the moment we can only redirect
2237 calls, so we need infrastructure for renaming references as well. */
2238 #ifndef ASM_OUTPUT_WEAKREF
2239 symtab_node *node;
2241 FOR_EACH_SYMBOL (node)
2242 if (node->alias
2243 && lookup_attribute ("weakref", DECL_ATTRIBUTES (node->decl)))
2245 IDENTIFIER_TRANSPARENT_ALIAS
2246 (DECL_ASSEMBLER_NAME (node->decl)) = 1;
2247 TREE_CHAIN (DECL_ASSEMBLER_NAME (node->decl))
2248 = (node->alias_target ? node->alias_target
2249 : DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl));
2251 #endif
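/* Sketch added for exposition (not in the original source): for the
   hypothetical weakref

     static void local_name (void) __attribute__ ((weakref ("ext_target")));

   the loop above marks the assembler name "local_name" as
   IDENTIFIER_TRANSPARENT_ALIAS and chains it to "ext_target", so any
   remaining reference to local_name is transparently written out as
   ext_target even though no .weakref directive is available.  */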
2253 cgraph_state = CGRAPH_STATE_EXPANSION;
2255 if (!flag_toplevel_reorder)
2256 output_in_order ();
2257 else
2259 output_asm_statements ();
2261 expand_all_functions ();
2262 varpool_node::output_variables ();
2265 cgraph_process_new_functions ();
2266 cgraph_state = CGRAPH_STATE_FINISHED;
2267 output_weakrefs ();
2269 if (cgraph_dump_file)
2271 fprintf (cgraph_dump_file, "\nFinal ");
2272 symtab_node::dump_table (cgraph_dump_file);
2274 #ifdef ENABLE_CHECKING
2275 symtab_node::verify_symtab_nodes ();
2276 /* Double check that all inline clones are gone and that all
2277 function bodies have been released from memory. */
2278 if (!seen_error ())
2280 struct cgraph_node *node;
2281 bool error_found = false;
2283 FOR_EACH_DEFINED_FUNCTION (node)
2284 if (node->global.inlined_to
2285 || gimple_has_body_p (node->decl))
2287 error_found = true;
2288 node->debug ();
2290 if (error_found)
2291 internal_error ("nodes with unreleased memory found");
2293 #endif
2297 /* Analyze the whole compilation unit once it is parsed completely. */
2299 void
2300 finalize_compilation_unit (void)
2302 timevar_push (TV_CGRAPH);
2304 /* If we're here there's no current function anymore. Some frontends
2305 are lazy in clearing these. */
2306 current_function_decl = NULL;
2307 set_cfun (NULL);
2309 /* Do not skip analyzing the functions if there were errors; otherwise
2310 we would miss diagnostics for the following functions. */
2312 /* Emit size functions we didn't inline. */
2313 finalize_size_functions ();
2315 /* Mark alias targets necessary and emit diagnostics. */
2316 handle_alias_pairs ();
2318 if (!quiet_flag)
2320 fprintf (stderr, "\nAnalyzing compilation unit\n");
2321 fflush (stderr);
2324 if (flag_dump_passes)
2325 dump_passes ();
2327 /* Gimplify and lower all functions, compute reachability and
2328 remove unreachable nodes. */
2329 analyze_functions ();
2331 /* Mark alias targets necessary and emit diagnostics. */
2332 handle_alias_pairs ();
2334 /* Gimplify and lower thunks. */
2335 analyze_functions ();
2337 /* Finally drive the pass manager. */
2338 compile ();
2340 timevar_pop (TV_CGRAPH);
2343 /* Create a wrapper from this cgraph_node to the TARGET node. A thunk is used
2344 for this kind of wrapper method. */
2346 void
2347 cgraph_node::create_wrapper (struct cgraph_node *target)
2349 /* Preserve DECL_RESULT so we get the right return-by-reference flag. */
2350 tree decl_result = DECL_RESULT (decl);
2352 /* Remove the function's body. */
2353 release_body ();
2354 reset ();
2356 DECL_RESULT (decl) = decl_result;
2357 DECL_INITIAL (decl) = NULL;
2358 allocate_struct_function (decl, false);
2359 set_cfun (NULL);
2361 /* Turn alias into thunk and expand it into GIMPLE representation. */
2362 definition = true;
2363 thunk.thunk_p = true;
2364 thunk.this_adjusting = false;
2366 struct cgraph_edge *e = create_edge (target, NULL, 0, CGRAPH_FREQ_BASE);
2368 if (!expand_thunk (false, true))
2369 analyzed = true;
2371 e->call_stmt_cannot_inline_p = true;
2373 /* Inline summary set-up. */
2374 analyze ();
2375 inline_analyze_function (this);
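/* Sketch added for exposition (not in the original source; wrapper and
   target are hypothetical): after create_wrapper (target), this node behaves
   like the source-level forwarder

     int wrapper (int x) { return target (x); }

   but is represented as a non-this-adjusting thunk; the edge to TARGET is
   created with call_stmt_cannot_inline_p set, so the forwarded call is not
   inlined back into the wrapper.  */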
2378 #include "gt-cgraphunit.h"