gcc/cgraphunit.c
1 /* Driver of optimization process
2 Copyright (C) 2003-2014 Free Software Foundation, Inc.
3 Contributed by Jan Hubicka
5 This file is part of GCC.
7 GCC is free software; you can redistribute it and/or modify it under
8 the terms of the GNU General Public License as published by the Free
9 Software Foundation; either version 3, or (at your option) any later
10 version.
12 GCC is distributed in the hope that it will be useful, but WITHOUT ANY
13 WARRANTY; without even the implied warranty of MERCHANTABILITY or
14 FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
15 for more details.
17 You should have received a copy of the GNU General Public License
18 along with GCC; see the file COPYING3. If not see
19 <http://www.gnu.org/licenses/>. */
21 /* This module implements the main driver of the compilation process.
23 The main scope of this file is to act as an interface between the
24 tree-based front ends and the back end.
26 The front end is supposed to use the following functionality:
28 - cgraph_finalize_function
30 This function is called once the front end has parsed the whole body of the
31 function and it is certain that neither the function body nor the declaration will change.
33 (There is one exception, needed for implementing GCC extern inline
34 functions.)
36 - varpool_finalize_decl
38 This function has the same behavior as the above but is used for static
39 variables.
41 - add_asm_node
43 Inserts a new toplevel ASM statement.
45 - finalize_compilation_unit
47 This function is called once the (source level) compilation unit is
48 finalized and it will no longer change.
50 The symbol table is constructed starting from the trivially needed
51 symbols finalized by the frontend. Functions are lowered into
52 GIMPLE representation and callgraph/reference lists are constructed.
53 Those are used to discover other necessary functions and variables.
55 At the end the bodies of unreachable functions are removed.
57 The function can be called multiple times when multiple source level
58 compilation units are combined.
60 - compile
62 This passes control to the back-end. Optimizations are performed and
63 final assembler is generated. This is done in the following way. Note
64 that with link time optimization the process is split into three
65 stages (compile time, linktime analysis and parallel linktime as
66 indicated below).
68 Compile time:
70 1) Inter-procedural optimization.
71 (ipa_passes)
73 This part is further split into:
75 a) early optimizations. These are local passes executed in
76 the topological order on the callgraph.
78 The purpose of early optimizations is to optimize away simple
79 things that may otherwise confuse IP analysis. Very simple
80 propagation across the callgraph is done, i.e. to discover
81 functions without side effects and simple inlining is performed.
83 b) early small interprocedural passes.
85 Those are interprocedural passes executed only at compilation
86 time. These include, for example, transactional memory lowering,
87 unreachable code removal and other simple transformations.
89 c) IP analysis stage. All interprocedural passes do their
90 analysis.
92 Interprocedural passes differ from small interprocedural
93 passes by their ability to operate across the whole program
94 at linktime. Their analysis stage is performed early to
95 both reduce linking times and linktime memory usage by
96 not having to represent the whole program in memory.
98 d) LTO streaming. When doing LTO, everything important gets
99 streamed into the object file.
101 Compile time and/or linktime analysis stage (WPA):
103 At linktime the units get streamed back and the symbol table is
104 merged. Function bodies are not streamed in and not
105 available.
106 e) IP propagation stage. All IP passes execute their
107 IP propagation. This is done based on the earlier analysis
108 without having function bodies at hand.
109 f) Ltrans streaming. When doing WHOPR LTO, the program
110 is partitioned and streamed into multiple object files.
112 Compile time and/or parallel linktime stage (ltrans)
114 Each of the object files is streamed back and compiled
115 separately. Now the function bodies become available
116 again.
118 2) Virtual clone materialization
119 (cgraph_materialize_clone)
121 IP passes can produce copies of existing functions (such
122 as versioned clones or inline clones) without actually
123 manipulating their bodies by creating virtual clones in
124 the callgraph. At this time the virtual clones are
125 turned into real functions.
126 3) IP transformation
128 All IP passes transform function bodies based on earlier
129 decisions of the IP propagation.
131 4) late small IP passes
133 Simple IP passes working within single program partition.
135 5) Expansion
136 (expand_all_functions)
138 At this stage functions that need to be output into
139 assembler are identified and compiled in topological order.
140 6) Output of variables and aliases
141 Now it is known which variable references were not optimized
142 out and thus all variables are output to the file.
144 Note that with -fno-toplevel-reorder passes 5 and 6
145 are combined together in output_in_order.
147 Finally there are functions to manipulate the callgraph from
148 backend.
149 - cgraph_add_new_function is used to add backend produced
150 functions introduced after the unit is finalized.
151 The functions are enqueued for later processing and inserted
152 into the callgraph with cgraph_process_new_functions.
154 - cgraph_function_versioning
156 produces a copy of a function into a new one (a version)
157 and applies simple transformations.  */
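/* As a rough illustration only (a sketch, not a verified sequence; the decl
   names are placeholders), a front end typically drives the interface above
   along these lines:

       cgraph_finalize_function (fndecl, false);   // for each parsed function body
       varpool_finalize_decl (vardecl);            // for each finalized static variable
       add_asm_node (asm_str);                     // for each toplevel asm statement
       finalize_compilation_unit ();               // once the whole unit is parsed

   after which control eventually reaches compile () for the IPA and expansion
   stages described above.  */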
160 #include "config.h"
161 #include "system.h"
162 #include "coretypes.h"
163 #include "tm.h"
164 #include "tree.h"
165 #include "varasm.h"
166 #include "stor-layout.h"
167 #include "stringpool.h"
168 #include "output.h"
169 #include "rtl.h"
170 #include "basic-block.h"
171 #include "tree-ssa-alias.h"
172 #include "internal-fn.h"
173 #include "gimple-fold.h"
174 #include "gimple-expr.h"
175 #include "is-a.h"
176 #include "gimple.h"
177 #include "gimplify.h"
178 #include "gimple-iterator.h"
179 #include "gimplify-me.h"
180 #include "gimple-ssa.h"
181 #include "tree-cfg.h"
182 #include "tree-into-ssa.h"
183 #include "tree-ssa.h"
184 #include "tree-inline.h"
185 #include "langhooks.h"
186 #include "toplev.h"
187 #include "flags.h"
188 #include "debug.h"
189 #include "target.h"
190 #include "diagnostic.h"
191 #include "params.h"
192 #include "fibheap.h"
193 #include "intl.h"
194 #include "function.h"
195 #include "ipa-prop.h"
196 #include "tree-iterator.h"
197 #include "tree-pass.h"
198 #include "tree-dump.h"
199 #include "gimple-pretty-print.h"
200 #include "output.h"
201 #include "coverage.h"
202 #include "plugin.h"
203 #include "ipa-inline.h"
204 #include "ipa-utils.h"
205 #include "lto-streamer.h"
206 #include "except.h"
207 #include "cfgloop.h"
208 #include "regset.h" /* FIXME: For reg_obstack. */
209 #include "context.h"
210 #include "pass_manager.h"
211 #include "tree-nested.h"
212 #include "gimplify.h"
214 /* Queue of cgraph nodes scheduled to be added into cgraph. This is a
215 secondary queue used during optimization to accommodate passes that
216 may generate new functions that need to be optimized and expanded. */
217 cgraph_node_set cgraph_new_nodes;
219 static void expand_all_functions (void);
220 static void mark_functions_to_output (void);
221 static void expand_function (struct cgraph_node *);
222 static void analyze_function (struct cgraph_node *);
223 static void handle_alias_pairs (void);
225 FILE *cgraph_dump_file;
227 /* Linked list of cgraph asm nodes. */
228 struct asm_node *asm_nodes;
230 /* Last node in cgraph_asm_nodes. */
231 static GTY(()) struct asm_node *asm_last_node;
233 /* Used for vtable lookup in thunk adjusting. */
234 static GTY (()) tree vtable_entry_type;
236 /* Determine if symbol DECL is needed. That is, visible to something
237 either outside this translation unit or to something magic in the system
238 configury.  */
239 bool
240 decide_is_symbol_needed (symtab_node *node)
242 tree decl = node->decl;
244 /* Double check that no one has output the function into the assembly
245 file early. */
246 gcc_checking_assert (!DECL_ASSEMBLER_NAME_SET_P (decl)
247 || !TREE_SYMBOL_REFERENCED (DECL_ASSEMBLER_NAME (decl)));
249 if (!node->definition)
250 return false;
252 if (DECL_EXTERNAL (decl))
253 return false;
255 /* If the user told us it is used, then it must be so. */
256 if (node->force_output)
257 return true;
259 /* ABI forced symbols are needed when they are external. */
260 if (node->forced_by_abi && TREE_PUBLIC (decl))
261 return true;
263 /* Keep constructors, destructors and virtual functions. */
264 if (TREE_CODE (decl) == FUNCTION_DECL
265 && (DECL_STATIC_CONSTRUCTOR (decl) || DECL_STATIC_DESTRUCTOR (decl)))
266 return true;
268 /* Externally visible variables must be output. The exception is
269 COMDAT variables that must be output only when they are needed. */
270 if (TREE_PUBLIC (decl) && !DECL_COMDAT (decl))
271 return true;
273 return false;
276 /* Head and terminator of the queue of nodes to be processed while building
277 callgraph. */
279 static symtab_node symtab_terminator;
280 static symtab_node *queued_nodes = &symtab_terminator;
282 /* Add NODE to the queue starting at QUEUED_NODES. The queue is linked
283 via AUX pointers and terminated by a pointer to the symtab_terminator sentinel. */
285 static void
286 enqueue_node (symtab_node *node)
288 if (node->aux)
289 return;
290 gcc_checking_assert (queued_nodes);
291 node->aux = queued_nodes;
292 queued_nodes = node;
295 /* Process CGRAPH_NEW_FUNCTIONS and perform the actions necessary to add these
296 functions into the callgraph in such a way that they look like ordinary reachable
297 functions inserted into the callgraph already at construction time. */
299 void
300 cgraph_process_new_functions (void)
302 tree fndecl;
303 struct cgraph_node *node;
304 cgraph_node_set_iterator csi;
306 if (!cgraph_new_nodes)
307 return;
308 handle_alias_pairs ();
309 /* Note that this queue may grow as it is being processed, as the new
310 functions may generate new ones. */
311 for (csi = csi_start (cgraph_new_nodes); !csi_end_p (csi); csi_next (&csi))
313 node = csi_node (csi);
314 fndecl = node->decl;
315 switch (cgraph_state)
317 case CGRAPH_STATE_CONSTRUCTION:
318 /* At construction time we just need to finalize the function and move
319 it into the reachable functions list. */
321 cgraph_finalize_function (fndecl, false);
322 cgraph_call_function_insertion_hooks (node);
323 enqueue_node (node);
324 break;
326 case CGRAPH_STATE_IPA:
327 case CGRAPH_STATE_IPA_SSA:
328 /* When IPA optimization has already started, do all essential
329 transformations that have already been performed on the whole
330 cgraph but not on this function. */
332 gimple_register_cfg_hooks ();
333 if (!node->analyzed)
334 analyze_function (node);
335 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
336 if (cgraph_state == CGRAPH_STATE_IPA_SSA
337 && !gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
338 g->get_passes ()->execute_early_local_passes ();
339 else if (inline_summary_vec != NULL)
340 compute_inline_parameters (node, true);
341 free_dominance_info (CDI_POST_DOMINATORS);
342 free_dominance_info (CDI_DOMINATORS);
343 pop_cfun ();
344 cgraph_call_function_insertion_hooks (node);
345 break;
347 case CGRAPH_STATE_EXPANSION:
348 /* Functions created during expansion shall be compiled
349 directly. */
350 node->process = 0;
351 cgraph_call_function_insertion_hooks (node);
352 expand_function (node);
353 break;
355 default:
356 gcc_unreachable ();
357 break;
360 free_cgraph_node_set (cgraph_new_nodes);
361 cgraph_new_nodes = NULL;
364 /* As a GCC extension we allow redefinition of the function. The
365 semantics when the two bodies differ are not well defined.
366 We replace the old body with the new body, so in unit-at-a-time mode
367 we always use the new body, while in normal mode we may end up with the
368 old body inlined into some functions and the new body expanded and
369 inlined in others.
371 ??? It may make more sense to use one body for inlining and the other
372 body for expanding the function, but this is difficult to do. */
374 void
375 cgraph_reset_node (struct cgraph_node *node)
377 /* If node->process is set, then we have already begun whole-unit analysis.
378 This is *not* testing for whether we've already emitted the function.
379 That case can be sort-of legitimately seen with real function redefinition
380 errors. I would argue that the front end should never present us with
381 such a case, but don't enforce that for now. */
382 gcc_assert (!node->process);
384 /* Reset our data structures so we can analyze the function again. */
385 memset (&node->local, 0, sizeof (node->local));
386 memset (&node->global, 0, sizeof (node->global));
387 memset (&node->rtl, 0, sizeof (node->rtl));
388 node->analyzed = false;
389 node->definition = false;
390 node->alias = false;
391 node->weakref = false;
392 node->cpp_implicit_alias = false;
394 cgraph_node_remove_callees (node);
395 ipa_remove_all_references (&node->ref_list);
398 /* Return true when there are references to NODE. */
400 static bool
401 referred_to_p (symtab_node *node)
403 struct ipa_ref *ref;
405 /* See if there are any references at all. */
406 if (ipa_ref_list_referring_iterate (&node->ref_list, 0, ref))
407 return true;
408 /* For functions check also calls. */
409 cgraph_node *cn = dyn_cast <cgraph_node> (node);
410 if (cn && cn->callers)
411 return true;
412 return false;
415 /* DECL has been parsed. Take it, queue it, compile it at the whim of the
416 logic in effect. If NO_COLLECT is true, then our caller cannot stand to have
417 the garbage collector run at the moment. We would need to either create
418 a new GC context, or just not compile right now. */
420 void
421 cgraph_finalize_function (tree decl, bool no_collect)
423 struct cgraph_node *node = cgraph_get_create_node (decl);
425 if (node->definition)
427 /* Nested functions should only be defined once. */
428 gcc_assert (!DECL_CONTEXT (decl)
429 || TREE_CODE (DECL_CONTEXT (decl)) != FUNCTION_DECL);
430 cgraph_reset_node (node);
431 node->local.redefined_extern_inline = true;
434 notice_global_symbol (decl);
435 node->definition = true;
436 node->lowered = DECL_STRUCT_FUNCTION (decl)->cfg != NULL;
438 /* With -fkeep-inline-functions we are keeping all inline functions except
439 for extern inline ones. */
440 if (flag_keep_inline_functions
441 && DECL_DECLARED_INLINE_P (decl)
442 && !DECL_EXTERNAL (decl)
443 && !DECL_DISREGARD_INLINE_LIMITS (decl))
444 node->force_output = 1;
446 /* When not optimizing, also output the static functions (see
447 PR24561), but don't do so for always_inline functions, functions
448 declared inline and nested functions. These were optimized out
449 in the original implementation and it is unclear whether we want
450 to change the behavior here. */
451 if ((!optimize
452 && !node->cpp_implicit_alias
453 && !DECL_DISREGARD_INLINE_LIMITS (decl)
454 && !DECL_DECLARED_INLINE_P (decl)
455 && !(DECL_CONTEXT (decl)
456 && TREE_CODE (DECL_CONTEXT (decl)) == FUNCTION_DECL))
457 && !DECL_COMDAT (decl) && !DECL_EXTERNAL (decl))
458 node->force_output = 1;
460 /* If we've not yet emitted decl, tell the debug info about it. */
461 if (!TREE_ASM_WRITTEN (decl))
462 (*debug_hooks->deferred_inline_function) (decl);
464 /* Possibly warn about unused parameters. */
465 if (warn_unused_parameter)
466 do_warn_unused_parameter (decl);
468 if (!no_collect)
469 ggc_collect ();
471 if (cgraph_state == CGRAPH_STATE_CONSTRUCTION
472 && (decide_is_symbol_needed (node)
473 || referred_to_p (node)))
474 enqueue_node (node);
477 /* Add the function FNDECL to the call graph.
478 Unlike cgraph_finalize_function, this function is intended to be used
479 by the middle end and allows insertion of new functions at an arbitrary point
480 of compilation. The function can be either in high, low or SSA form
481 GIMPLE.
483 The function is assumed to be reachable and to have its address taken (so no
484 API-breaking optimizations are performed on it).
486 The main work done by this function is to enqueue the function for later
487 processing, to avoid the need for passes to be re-entrant. */
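/* Roughly, the handling below depends on the current CGRAPH_STATE: while
   parsing, the function is simply finalized; during callgraph construction it
   is only queued; once IPA or expansion has started it is brought into
   finalized state, lowered if necessary and queued; and when compilation has
   otherwise finished it is analyzed and expanded immediately.  */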
489 void
490 cgraph_add_new_function (tree fndecl, bool lowered)
492 gcc::pass_manager *passes = g->get_passes ();
493 struct cgraph_node *node;
494 switch (cgraph_state)
496 case CGRAPH_STATE_PARSING:
497 cgraph_finalize_function (fndecl, false);
498 break;
499 case CGRAPH_STATE_CONSTRUCTION:
500 /* Just enqueue the function to be processed at the nearest occasion. */
501 node = cgraph_create_node (fndecl);
502 if (lowered)
503 node->lowered = true;
504 if (!cgraph_new_nodes)
505 cgraph_new_nodes = cgraph_node_set_new ();
506 cgraph_node_set_add (cgraph_new_nodes, node);
507 break;
509 case CGRAPH_STATE_IPA:
510 case CGRAPH_STATE_IPA_SSA:
511 case CGRAPH_STATE_EXPANSION:
512 /* Bring the function into finalized state and enqueue it for later
513 analysis and compilation. */
514 node = cgraph_get_create_node (fndecl);
515 node->local.local = false;
516 node->definition = true;
517 node->force_output = true;
518 if (!lowered && cgraph_state == CGRAPH_STATE_EXPANSION)
520 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
521 gimple_register_cfg_hooks ();
522 bitmap_obstack_initialize (NULL);
523 execute_pass_list (passes->all_lowering_passes);
524 passes->execute_early_local_passes ();
525 bitmap_obstack_release (NULL);
526 pop_cfun ();
528 lowered = true;
530 if (lowered)
531 node->lowered = true;
532 if (!cgraph_new_nodes)
533 cgraph_new_nodes = cgraph_node_set_new ();
534 cgraph_node_set_add (cgraph_new_nodes, node);
535 break;
537 case CGRAPH_STATE_FINISHED:
538 /* At the very end of compilation we have to do all the work up
539 to expansion. */
540 node = cgraph_create_node (fndecl);
541 if (lowered)
542 node->lowered = true;
543 node->definition = true;
544 analyze_function (node);
545 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
546 gimple_register_cfg_hooks ();
547 bitmap_obstack_initialize (NULL);
548 if (!gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
549 g->get_passes ()->execute_early_local_passes ();
550 bitmap_obstack_release (NULL);
551 pop_cfun ();
552 expand_function (node);
553 break;
555 default:
556 gcc_unreachable ();
559 /* Set a personality if required and we already passed EH lowering. */
560 if (lowered
561 && (function_needs_eh_personality (DECL_STRUCT_FUNCTION (fndecl))
562 == eh_personality_lang))
563 DECL_FUNCTION_PERSONALITY (fndecl) = lang_hooks.eh_personality ();
566 /* Add a top-level asm statement to the list. */
568 struct asm_node *
569 add_asm_node (tree asm_str)
571 struct asm_node *node;
573 node = ggc_alloc_cleared_asm_node ();
574 node->asm_str = asm_str;
575 node->order = symtab_order++;
576 node->next = NULL;
577 if (asm_nodes == NULL)
578 asm_nodes = node;
579 else
580 asm_last_node->next = node;
581 asm_last_node = node;
582 return node;
585 /* Output all asm statements we have stored up to be output. */
587 static void
588 output_asm_statements (void)
590 struct asm_node *can;
592 if (seen_error ())
593 return;
595 for (can = asm_nodes; can; can = can->next)
596 assemble_asm (can->asm_str);
597 asm_nodes = NULL;
600 /* Analyze the function scheduled to be output. */
601 static void
602 analyze_function (struct cgraph_node *node)
604 tree decl = node->decl;
605 location_t saved_loc = input_location;
606 input_location = DECL_SOURCE_LOCATION (decl);
608 if (node->thunk.thunk_p)
610 cgraph_create_edge (node, cgraph_get_node (node->thunk.alias),
611 NULL, 0, CGRAPH_FREQ_BASE);
612 if (!expand_thunk (node, false))
614 node->thunk.alias = NULL;
615 node->analyzed = true;
616 return;
618 node->thunk.alias = NULL;
620 if (node->alias)
621 symtab_resolve_alias
622 (node, cgraph_get_node (node->alias_target));
623 else if (node->dispatcher_function)
625 /* Generate the dispatcher body of multi-versioned functions. */
626 struct cgraph_function_version_info *dispatcher_version_info
627 = get_cgraph_node_version (node);
628 if (dispatcher_version_info != NULL
629 && (dispatcher_version_info->dispatcher_resolver
630 == NULL_TREE))
632 tree resolver = NULL_TREE;
633 gcc_assert (targetm.generate_version_dispatcher_body);
634 resolver = targetm.generate_version_dispatcher_body (node);
635 gcc_assert (resolver != NULL_TREE);
638 else
640 push_cfun (DECL_STRUCT_FUNCTION (decl));
642 assign_assembler_name_if_neeeded (node->decl);
644 /* Make sure to gimplify bodies only once. While analyzing a
645 function we lower it, which will require gimplified nested
646 functions, so we can end up here with an already gimplified
647 body. */
648 if (!gimple_has_body_p (decl))
649 gimplify_function_tree (decl);
650 dump_function (TDI_generic, decl);
652 /* Lower the function. */
653 if (!node->lowered)
655 if (node->nested)
656 lower_nested_functions (node->decl);
657 gcc_assert (!node->nested);
659 gimple_register_cfg_hooks ();
660 bitmap_obstack_initialize (NULL);
661 execute_pass_list (g->get_passes ()->all_lowering_passes);
662 free_dominance_info (CDI_POST_DOMINATORS);
663 free_dominance_info (CDI_DOMINATORS);
664 compact_blocks ();
665 bitmap_obstack_release (NULL);
666 node->lowered = true;
669 pop_cfun ();
671 node->analyzed = true;
673 input_location = saved_loc;
676 /* The C++ front end produces same-body aliases all over the place, even before PCH
677 gets streamed out. It relies on us linking the aliases with their functions
678 in order to do the fixups, but ipa-ref is not PCH safe. Consequently we
679 first produce aliases without links, and once the C++ FE is sure it won't stream
680 PCH we build the links via this function. */
682 void
683 cgraph_process_same_body_aliases (void)
685 symtab_node *node;
686 FOR_EACH_SYMBOL (node)
687 if (node->cpp_implicit_alias && !node->analyzed)
688 symtab_resolve_alias
689 (node,
690 TREE_CODE (node->alias_target) == VAR_DECL
691 ? (symtab_node *)varpool_node_for_decl (node->alias_target)
692 : (symtab_node *)cgraph_get_create_node (node->alias_target));
693 cpp_implicit_aliases_done = true;
696 /* Process attributes common for vars and functions. */
698 static void
699 process_common_attributes (tree decl)
701 tree weakref = lookup_attribute ("weakref", DECL_ATTRIBUTES (decl));
703 if (weakref && !lookup_attribute ("alias", DECL_ATTRIBUTES (decl)))
705 warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
706 "%<weakref%> attribute should be accompanied with"
707 " an %<alias%> attribute");
708 DECL_WEAK (decl) = 0;
709 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
710 DECL_ATTRIBUTES (decl));
714 /* Look for externally_visible and used attributes and mark cgraph nodes
715 accordingly.
717 We cannot mark the nodes at the point the attributes are processed (in
718 handle_*_attribute) because the copy of the declarations available at that
719 point may not be canonical. For example, in:
721 void f();
722 void f() __attribute__((used));
724 the declaration we see in handle_used_attribute will be the second
725 declaration -- but the front end will subsequently merge that declaration
726 with the original declaration and discard the second declaration.
728 Furthermore, we can't mark these nodes in cgraph_finalize_function because:
730 void f() {}
731 void f() __attribute__((externally_visible));
733 is valid.
735 So, we walk the nodes at the end of the translation unit, applying the
736 attributes at that point. */
738 static void
739 process_function_and_variable_attributes (struct cgraph_node *first,
740 varpool_node *first_var)
742 struct cgraph_node *node;
743 varpool_node *vnode;
745 for (node = cgraph_first_function (); node != first;
746 node = cgraph_next_function (node))
748 tree decl = node->decl;
749 if (DECL_PRESERVE_P (decl))
750 cgraph_mark_force_output_node (node);
751 else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
753 if (! TREE_PUBLIC (node->decl))
754 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
755 "%<externally_visible%>"
756 " attribute have effect only on public objects");
758 if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
759 && (node->definition && !node->alias))
761 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
762 "%<weakref%> attribute ignored"
763 " because function is defined");
764 DECL_WEAK (decl) = 0;
765 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
766 DECL_ATTRIBUTES (decl));
769 if (lookup_attribute ("always_inline", DECL_ATTRIBUTES (decl))
770 && !DECL_DECLARED_INLINE_P (decl)
771 /* redefining extern inline function makes it DECL_UNINLINABLE. */
772 && !DECL_UNINLINABLE (decl))
773 warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
774 "always_inline function might not be inlinable");
776 process_common_attributes (decl);
778 for (vnode = varpool_first_variable (); vnode != first_var;
779 vnode = varpool_next_variable (vnode))
781 tree decl = vnode->decl;
782 if (DECL_EXTERNAL (decl)
783 && DECL_INITIAL (decl))
784 varpool_finalize_decl (decl);
785 if (DECL_PRESERVE_P (decl))
786 vnode->force_output = true;
787 else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
789 if (! TREE_PUBLIC (vnode->decl))
790 warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
791 "%<externally_visible%>"
792 " attribute have effect only on public objects");
794 if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
795 && vnode->definition
796 && DECL_INITIAL (decl))
798 warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
799 "%<weakref%> attribute ignored"
800 " because variable is initialized");
801 DECL_WEAK (decl) = 0;
802 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
803 DECL_ATTRIBUTES (decl));
805 process_common_attributes (decl);
809 /* Mark DECL as finalized. By finalizing the declaration, the front end instructs the
810 middle end to output the variable to the asm file, if needed or externally
811 visible. */
813 void
814 varpool_finalize_decl (tree decl)
816 varpool_node *node = varpool_node_for_decl (decl);
818 gcc_assert (TREE_STATIC (decl) || DECL_EXTERNAL (decl));
820 if (node->definition)
821 return;
822 notice_global_symbol (decl);
823 node->definition = true;
824 if (TREE_THIS_VOLATILE (decl) || DECL_PRESERVE_P (decl)
825 /* Traditionally we do not eliminate static variables when not
826 optimizing and when not doing toplevel reorder. */
827 || (!flag_toplevel_reorder && !DECL_COMDAT (node->decl)
828 && !DECL_ARTIFICIAL (node->decl)))
829 node->force_output = true;
831 if (cgraph_state == CGRAPH_STATE_CONSTRUCTION
832 && (decide_is_symbol_needed (node)
833 || referred_to_p (node)))
834 enqueue_node (node);
835 if (cgraph_state >= CGRAPH_STATE_IPA_SSA)
836 varpool_analyze_node (node);
837 /* Some front ends produce various interface variables after compilation
838 has finished. */
839 if (cgraph_state == CGRAPH_STATE_FINISHED
840 || (!flag_toplevel_reorder && cgraph_state == CGRAPH_STATE_EXPANSION))
841 varpool_assemble_decl (node);
844 /* EDGE is a polymorphic call. Mark all possible targets as reachable,
845 and if there is only one target, perform trivial devirtualization.
846 REACHABLE_CALL_TARGETS collects target lists we already walked to
847 avoid duplicate work. */
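/* A brief summary of the logic below: when the target list is known to be
   complete (FINAL) and contains exactly one method, the indirect call is
   turned into a direct call to it; when it is complete and empty, the call
   can never happen, so it is redirected to __builtin_unreachable.  */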
849 static void
850 walk_polymorphic_call_targets (pointer_set_t *reachable_call_targets,
851 struct cgraph_edge *edge)
853 unsigned int i;
854 void *cache_token;
855 bool final;
856 vec <cgraph_node *>targets
857 = possible_polymorphic_call_targets
858 (edge, &final, &cache_token);
860 if (!pointer_set_insert (reachable_call_targets,
861 cache_token))
863 if (cgraph_dump_file)
864 dump_possible_polymorphic_call_targets
865 (cgraph_dump_file, edge);
867 for (i = 0; i < targets.length (); i++)
869 /* Do not bother to mark virtual methods in an anonymous namespace;
870 either we will find a use of the virtual table defining them, or they are
871 unused. */
872 if (targets[i]->definition
873 && TREE_CODE
874 (TREE_TYPE (targets[i]->decl))
875 == METHOD_TYPE
876 && !type_in_anonymous_namespace_p
877 (method_class_type
878 (TREE_TYPE (targets[i]->decl))))
879 enqueue_node (targets[i]);
883 /* Very trivial devirtualization; when the type is
884 final or anonymous (so we know all of its derivations)
885 and there is only one possible virtual call target,
886 make the edge direct. */
887 if (final)
889 if (targets.length () <= 1)
891 cgraph_node *target;
892 if (targets.length () == 1)
893 target = targets[0];
894 else
895 target = cgraph_get_create_node
896 (builtin_decl_implicit (BUILT_IN_UNREACHABLE));
898 if (cgraph_dump_file)
900 fprintf (cgraph_dump_file,
901 "Devirtualizing call: ");
902 print_gimple_stmt (cgraph_dump_file,
903 edge->call_stmt, 0,
904 TDF_SLIM);
906 cgraph_make_edge_direct (edge, target);
907 cgraph_redirect_edge_call_stmt_to_callee (edge);
908 if (cgraph_dump_file)
910 fprintf (cgraph_dump_file,
911 "Devirtualized as: ");
912 print_gimple_stmt (cgraph_dump_file,
913 edge->call_stmt, 0,
914 TDF_SLIM);
921 /* Discover all functions and variables that are trivially needed; analyze
922 them as well as all functions and variables referred to by them.  */
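/* In outline (a rough sketch of the code below): seed the work list with the
   symbols accepted by decide_is_symbol_needed, then repeatedly take a node off
   the list, analyze and lower it, and enqueue every defined symbol it calls or
   references, until the list drains; finally remove the symbols that were
   never reached.  */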
924 static void
925 analyze_functions (void)
927 /* Keep track of already processed nodes when called multiple times for
928 intermodule optimization. */
929 static struct cgraph_node *first_analyzed;
930 struct cgraph_node *first_handled = first_analyzed;
931 static varpool_node *first_analyzed_var;
932 varpool_node *first_handled_var = first_analyzed_var;
933 struct pointer_set_t *reachable_call_targets = pointer_set_create ();
935 symtab_node *node;
936 symtab_node *next;
937 int i;
938 struct ipa_ref *ref;
939 bool changed = true;
940 location_t saved_loc = input_location;
942 bitmap_obstack_initialize (NULL);
943 cgraph_state = CGRAPH_STATE_CONSTRUCTION;
944 input_location = UNKNOWN_LOCATION;
946 /* Ugly, but the fixup cannot happen at the time the same-body alias is created;
947 the C++ FE is confused about the COMDAT groups being right. */
948 if (cpp_implicit_aliases_done)
949 FOR_EACH_SYMBOL (node)
950 if (node->cpp_implicit_alias)
951 fixup_same_cpp_alias_visibility (node, symtab_alias_target (node));
952 if (optimize && flag_devirtualize)
953 build_type_inheritance_graph ();
955 /* Analysis adds static variables that in turn add references to new functions.
956 So we need to iterate the process until it stabilizes. */
957 while (changed)
959 changed = false;
960 process_function_and_variable_attributes (first_analyzed,
961 first_analyzed_var);
963 /* First identify the trivially needed symbols. */
964 for (node = symtab_nodes;
965 node != first_analyzed
966 && node != first_analyzed_var; node = node->next)
968 if (decide_is_symbol_needed (node))
970 enqueue_node (node);
971 if (!changed && cgraph_dump_file)
972 fprintf (cgraph_dump_file, "Trivially needed symbols:");
973 changed = true;
974 if (cgraph_dump_file)
975 fprintf (cgraph_dump_file, " %s", node->asm_name ());
976 if (!changed && cgraph_dump_file)
977 fprintf (cgraph_dump_file, "\n");
979 if (node == first_analyzed
980 || node == first_analyzed_var)
981 break;
983 cgraph_process_new_functions ();
984 first_analyzed_var = varpool_first_variable ();
985 first_analyzed = cgraph_first_function ();
987 if (changed && dump_file)
988 fprintf (cgraph_dump_file, "\n");
990 /* Lower the representation, build callgraph edges and references for all trivially
991 needed symbols and all symbols referred to by them. */
992 while (queued_nodes != &symtab_terminator)
994 changed = true;
995 node = queued_nodes;
996 queued_nodes = (symtab_node *)queued_nodes->aux;
997 cgraph_node *cnode = dyn_cast <cgraph_node> (node);
998 if (cnode && cnode->definition)
1000 struct cgraph_edge *edge;
1001 tree decl = cnode->decl;
1003 /* ??? It is possible to create an extern inline function
1004 and later use the weak alias attribute to kill its body.
1005 See gcc.c-torture/compile/20011119-1.c */
1006 if (!DECL_STRUCT_FUNCTION (decl)
1007 && !cnode->alias
1008 && !cnode->thunk.thunk_p
1009 && !cnode->dispatcher_function)
1011 cgraph_reset_node (cnode);
1012 cnode->local.redefined_extern_inline = true;
1013 continue;
1016 if (!cnode->analyzed)
1017 analyze_function (cnode);
1019 for (edge = cnode->callees; edge; edge = edge->next_callee)
1020 if (edge->callee->definition)
1021 enqueue_node (edge->callee);
1022 if (optimize && flag_devirtualize)
1024 struct cgraph_edge *next;
1026 for (edge = cnode->indirect_calls; edge; edge = next)
1028 next = edge->next_callee;
1029 if (edge->indirect_info->polymorphic)
1030 walk_polymorphic_call_targets (reachable_call_targets,
1031 edge);
1035 /* If decl is a clone of an abstract function,
1036 mark that abstract function so that we don't release its body.
1037 The DECL_INITIAL() of that abstract function declaration
1038 will later be needed to output debug info. */
1039 if (DECL_ABSTRACT_ORIGIN (decl))
1041 struct cgraph_node *origin_node
1042 = cgraph_get_node (DECL_ABSTRACT_ORIGIN (decl));
1043 origin_node->used_as_abstract_origin = true;
1046 else
1048 varpool_node *vnode = dyn_cast <varpool_node> (node);
1049 if (vnode && vnode->definition && !vnode->analyzed)
1050 varpool_analyze_node (vnode);
1053 if (node->same_comdat_group)
1055 symtab_node *next;
1056 for (next = node->same_comdat_group;
1057 next != node;
1058 next = next->same_comdat_group)
1059 enqueue_node (next);
1061 for (i = 0; ipa_ref_list_reference_iterate (&node->ref_list, i, ref); i++)
1062 if (ref->referred->definition)
1063 enqueue_node (ref->referred);
1064 cgraph_process_new_functions ();
1067 if (optimize && flag_devirtualize)
1068 update_type_inheritance_graph ();
1070 /* Collect entry points to the unit. */
1071 if (cgraph_dump_file)
1073 fprintf (cgraph_dump_file, "\n\nInitial ");
1074 dump_symtab (cgraph_dump_file);
1077 if (cgraph_dump_file)
1078 fprintf (cgraph_dump_file, "\nRemoving unused symbols:");
1080 for (node = symtab_nodes;
1081 node != first_handled
1082 && node != first_handled_var; node = next)
1084 next = node->next;
1085 if (!node->aux && !referred_to_p (node))
1087 if (cgraph_dump_file)
1088 fprintf (cgraph_dump_file, " %s", node->name ());
1089 symtab_remove_node (node);
1090 continue;
1092 if (cgraph_node *cnode = dyn_cast <cgraph_node> (node))
1094 tree decl = node->decl;
1096 if (cnode->definition && !gimple_has_body_p (decl)
1097 && !cnode->alias
1098 && !cnode->thunk.thunk_p)
1099 cgraph_reset_node (cnode);
1101 gcc_assert (!cnode->definition || cnode->thunk.thunk_p
1102 || cnode->alias
1103 || gimple_has_body_p (decl));
1104 gcc_assert (cnode->analyzed == cnode->definition);
1106 node->aux = NULL;
1108 for (;node; node = node->next)
1109 node->aux = NULL;
1110 first_analyzed = cgraph_first_function ();
1111 first_analyzed_var = varpool_first_variable ();
1112 if (cgraph_dump_file)
1114 fprintf (cgraph_dump_file, "\n\nReclaimed ");
1115 dump_symtab (cgraph_dump_file);
1117 bitmap_obstack_release (NULL);
1118 pointer_set_destroy (reachable_call_targets);
1119 ggc_collect ();
1120 /* Initialize the assembler name hash; in particular we want to trigger C++
1121 mangling and same-body alias creation before we free the DECL_ARGUMENTS
1122 used by it. */
1123 if (!seen_error ())
1124 symtab_initialize_asm_name_hash ();
1126 input_location = saved_loc;
1129 /* Translate the ugly representation of aliases as alias pairs into a nice
1130 representation in the callgraph. We don't handle all cases yet,
1131 unfortunately. */
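/* Briefly, the cases handled below: a weakref whose target is not defined in
   the current unit just records the target name; aliases to otherwise
   undefined or to external symbols are diagnosed; and function-to-function or
   variable-to-variable pairs become proper callgraph or varpool aliases.  */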
1133 static void
1134 handle_alias_pairs (void)
1136 alias_pair *p;
1137 unsigned i;
1139 for (i = 0; alias_pairs && alias_pairs->iterate (i, &p);)
1141 symtab_node *target_node = symtab_node_for_asm (p->target);
1143 /* Weakrefs with a target not defined in the current unit are easy to handle:
1144 they behave just like external variables except that we need to note the
1145 alias flag to later output the weakref pseudo-op into the asm file. */
1146 if (!target_node
1147 && lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)) != NULL)
1149 symtab_node *node = symtab_get_node (p->decl);
1150 if (node)
1152 node->alias_target = p->target;
1153 node->weakref = true;
1154 node->alias = true;
1156 alias_pairs->unordered_remove (i);
1157 continue;
1159 else if (!target_node)
1161 error ("%q+D aliased to undefined symbol %qE", p->decl, p->target);
1162 symtab_node *node = symtab_get_node (p->decl);
1163 if (node)
1164 node->alias = false;
1165 alias_pairs->unordered_remove (i);
1166 continue;
1169 if (DECL_EXTERNAL (target_node->decl)
1170 /* We use local aliases for C++ thunks to force the tailcall
1171 to bind locally. This is a hack - to keep it working do
1172 the following (which is not strictly correct). */
1173 && (TREE_CODE (target_node->decl) != FUNCTION_DECL
1174 || ! DECL_VIRTUAL_P (target_node->decl))
1175 && ! lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)))
1177 error ("%q+D aliased to external symbol %qE",
1178 p->decl, p->target);
1181 if (TREE_CODE (p->decl) == FUNCTION_DECL
1182 && target_node && is_a <cgraph_node> (target_node))
1184 struct cgraph_node *src_node = cgraph_get_node (p->decl);
1185 if (src_node && src_node->definition)
1186 cgraph_reset_node (src_node);
1187 cgraph_create_function_alias (p->decl, target_node->decl);
1188 alias_pairs->unordered_remove (i);
1190 else if (TREE_CODE (p->decl) == VAR_DECL
1191 && target_node && is_a <varpool_node> (target_node))
1193 varpool_create_variable_alias (p->decl, target_node->decl);
1194 alias_pairs->unordered_remove (i);
1196 else
1198 error ("%q+D alias in between function and variable is not supported",
1199 p->decl);
1200 warning (0, "%q+D aliased declaration",
1201 target_node->decl);
1202 alias_pairs->unordered_remove (i);
1205 vec_free (alias_pairs);
1209 /* Figure out what functions we want to assemble. */
1211 static void
1212 mark_functions_to_output (void)
1214 struct cgraph_node *node;
1215 #ifdef ENABLE_CHECKING
1216 bool check_same_comdat_groups = false;
1218 FOR_EACH_FUNCTION (node)
1219 gcc_assert (!node->process);
1220 #endif
1222 FOR_EACH_FUNCTION (node)
1224 tree decl = node->decl;
1226 gcc_assert (!node->process || node->same_comdat_group);
1227 if (node->process)
1228 continue;
1230 /* We need to output all local functions that are used and not
1231 always inlined, as well as those that are reachable from
1232 outside the current compilation unit. */
1233 if (node->analyzed
1234 && !node->thunk.thunk_p
1235 && !node->alias
1236 && !node->global.inlined_to
1237 && !TREE_ASM_WRITTEN (decl)
1238 && !DECL_EXTERNAL (decl))
1240 node->process = 1;
1241 if (node->same_comdat_group)
1243 struct cgraph_node *next;
1244 for (next = cgraph (node->same_comdat_group);
1245 next != node;
1246 next = cgraph (next->same_comdat_group))
1247 if (!next->thunk.thunk_p && !next->alias
1248 && !symtab_comdat_local_p (next))
1249 next->process = 1;
1252 else if (node->same_comdat_group)
1254 #ifdef ENABLE_CHECKING
1255 check_same_comdat_groups = true;
1256 #endif
1258 else
1260 /* We should've reclaimed all functions that are not needed. */
1261 #ifdef ENABLE_CHECKING
1262 if (!node->global.inlined_to
1263 && gimple_has_body_p (decl)
1264 /* FIXME: in an ltrans unit, when the offline copy is outside a partition but inline copies
1265 are inside the partition, we can end up not removing the body since we no longer
1266 have an analyzed node pointing to it. */
1267 && !node->in_other_partition
1268 && !node->alias
1269 && !node->clones
1270 && !DECL_EXTERNAL (decl))
1272 dump_cgraph_node (stderr, node);
1273 internal_error ("failed to reclaim unneeded function");
1275 #endif
1276 gcc_assert (node->global.inlined_to
1277 || !gimple_has_body_p (decl)
1278 || node->in_other_partition
1279 || node->clones
1280 || DECL_ARTIFICIAL (decl)
1281 || DECL_EXTERNAL (decl));
1286 #ifdef ENABLE_CHECKING
1287 if (check_same_comdat_groups)
1288 FOR_EACH_FUNCTION (node)
1289 if (node->same_comdat_group && !node->process)
1291 tree decl = node->decl;
1292 if (!node->global.inlined_to
1293 && gimple_has_body_p (decl)
1294 /* FIXME: in an ltrans unit when the offline copy is outside a
1295 partition but inline copies are inside a partition, we can
1296 end up not removing the body since we no longer have an
1297 analyzed node pointing to it. */
1298 && !node->in_other_partition
1299 && !node->clones
1300 && !DECL_EXTERNAL (decl))
1302 dump_cgraph_node (stderr, node);
1303 internal_error ("failed to reclaim unneeded function in same "
1304 "comdat group");
1307 #endif
1310 /* DECL is a FUNCTION_DECL. Initialize data structures so that DECL is a function
1311 in lowered GIMPLE form. IN_SSA is true if the GIMPLE is in SSA form.
1313 Set current_function_decl and cfun to the newly constructed empty function body.
1314 Return the basic block in the function body. */
1316 basic_block
1317 init_lowered_empty_function (tree decl, bool in_ssa)
1319 basic_block bb;
1321 current_function_decl = decl;
1322 allocate_struct_function (decl, false);
1323 gimple_register_cfg_hooks ();
1324 init_empty_tree_cfg ();
1326 if (in_ssa)
1328 init_tree_ssa (cfun);
1329 init_ssa_operands (cfun);
1330 cfun->gimple_df->in_ssa_p = true;
1331 cfun->curr_properties |= PROP_ssa;
1334 DECL_INITIAL (decl) = make_node (BLOCK);
1336 DECL_SAVED_TREE (decl) = error_mark_node;
1337 cfun->curr_properties |= (PROP_gimple_lcf | PROP_gimple_leh | PROP_gimple_any
1338 | PROP_cfg | PROP_loops);
1340 set_loops_for_fn (cfun, ggc_alloc_cleared_loops ());
1341 init_loops_structure (cfun, loops_for_fn (cfun), 1);
1342 loops_for_fn (cfun)->state |= LOOPS_MAY_HAVE_MULTIPLE_LATCHES;
1344 /* Create BB for body of the function and connect it properly. */
1345 bb = create_basic_block (NULL, (void *) 0, ENTRY_BLOCK_PTR_FOR_FN (cfun));
1346 make_edge (ENTRY_BLOCK_PTR_FOR_FN (cfun), bb, EDGE_FALLTHRU);
1347 make_edge (bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
1348 add_bb_to_loop (bb, ENTRY_BLOCK_PTR_FOR_FN (cfun)->loop_father);
1350 return bb;
1353 /* Adjust PTR by the constant FIXED_OFFSET, and by the vtable
1354 offset indicated by VIRTUAL_OFFSET, if that is
1355 non-null. THIS_ADJUSTING is nonzero for a this-adjusting thunk and
1356 zero for a result-adjusting thunk. */
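/* Roughly, for a this-adjusting thunk the adjustment built below amounts to
   (a sketch only; see the code for the exact GIMPLE emitted):

       ptr += fixed_offset;
       if (virtual_offset)
         ptr += *(*(vtable pointer of ptr) + virtual_offset);   // vcall offset

   while for a result-adjusting thunk the constant FIXED_OFFSET is applied
   after the virtual adjustment rather than before it.  */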
1358 static tree
1359 thunk_adjust (gimple_stmt_iterator * bsi,
1360 tree ptr, bool this_adjusting,
1361 HOST_WIDE_INT fixed_offset, tree virtual_offset)
1363 gimple stmt;
1364 tree ret;
1366 if (this_adjusting
1367 && fixed_offset != 0)
1369 stmt = gimple_build_assign
1370 (ptr, fold_build_pointer_plus_hwi_loc (input_location,
1371 ptr,
1372 fixed_offset));
1373 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1376 /* If there's a virtual offset, look up that value in the vtable and
1377 adjust the pointer again. */
1378 if (virtual_offset)
1380 tree vtabletmp;
1381 tree vtabletmp2;
1382 tree vtabletmp3;
1384 if (!vtable_entry_type)
1386 tree vfunc_type = make_node (FUNCTION_TYPE);
1387 TREE_TYPE (vfunc_type) = integer_type_node;
1388 TYPE_ARG_TYPES (vfunc_type) = NULL_TREE;
1389 layout_type (vfunc_type);
1391 vtable_entry_type = build_pointer_type (vfunc_type);
1394 vtabletmp =
1395 create_tmp_reg (build_pointer_type
1396 (build_pointer_type (vtable_entry_type)), "vptr");
1398 /* The vptr is always at offset zero in the object. */
1399 stmt = gimple_build_assign (vtabletmp,
1400 build1 (NOP_EXPR, TREE_TYPE (vtabletmp),
1401 ptr));
1402 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1404 /* Form the vtable address. */
1405 vtabletmp2 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp)),
1406 "vtableaddr");
1407 stmt = gimple_build_assign (vtabletmp2,
1408 build_simple_mem_ref (vtabletmp));
1409 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1411 /* Find the entry with the vcall offset. */
1412 stmt = gimple_build_assign (vtabletmp2,
1413 fold_build_pointer_plus_loc (input_location,
1414 vtabletmp2,
1415 virtual_offset));
1416 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1418 /* Get the offset itself. */
1419 vtabletmp3 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp2)),
1420 "vcalloffset");
1421 stmt = gimple_build_assign (vtabletmp3,
1422 build_simple_mem_ref (vtabletmp2));
1423 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1425 /* Adjust the `this' pointer. */
1426 ptr = fold_build_pointer_plus_loc (input_location, ptr, vtabletmp3);
1427 ptr = force_gimple_operand_gsi (bsi, ptr, true, NULL_TREE, false,
1428 GSI_CONTINUE_LINKING);
1431 if (!this_adjusting
1432 && fixed_offset != 0)
1433 /* Adjust the pointer by the constant. */
1435 tree ptrtmp;
1437 if (TREE_CODE (ptr) == VAR_DECL)
1438 ptrtmp = ptr;
1439 else
1441 ptrtmp = create_tmp_reg (TREE_TYPE (ptr), "ptr");
1442 stmt = gimple_build_assign (ptrtmp, ptr);
1443 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1445 ptr = fold_build_pointer_plus_hwi_loc (input_location,
1446 ptrtmp, fixed_offset);
1449 /* Emit the statement and gimplify the adjustment expression. */
1450 ret = create_tmp_reg (TREE_TYPE (ptr), "adjusted_this");
1451 stmt = gimple_build_assign (ret, ptr);
1452 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1454 return ret;
1457 /* Expand thunk NODE to gimple if possible.
1458 When OUTPUT_ASM_THUNKS is true, also produce assembler for
1459 thunks that are not lowered.  */
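/* When the target cannot emit the thunk via its output_mi_thunk hook, the
   code below builds a GIMPLE body for it; in rough outline (the names here
   are illustrative only):

       adjusted_this = thunk_adjust (this);            // this-adjusting thunks
       retval = <aliased function> (adjusted_this, other args...);
       if (<result needs adjusting> && retval != NULL)
         retval = thunk_adjust (retval);               // result-adjusting thunks
       return retval;
*/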
1461 bool
1462 expand_thunk (struct cgraph_node *node, bool output_asm_thunks)
1464 bool this_adjusting = node->thunk.this_adjusting;
1465 HOST_WIDE_INT fixed_offset = node->thunk.fixed_offset;
1466 HOST_WIDE_INT virtual_value = node->thunk.virtual_value;
1467 tree virtual_offset = NULL;
1468 tree alias = node->callees->callee->decl;
1469 tree thunk_fndecl = node->decl;
1470 tree a;
1473 if (this_adjusting
1474 && targetm.asm_out.can_output_mi_thunk (thunk_fndecl, fixed_offset,
1475 virtual_value, alias))
1477 const char *fnname;
1478 tree fn_block;
1479 tree restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1481 if (!output_asm_thunks)
1482 return false;
1484 if (in_lto_p)
1485 cgraph_get_body (node);
1486 a = DECL_ARGUMENTS (thunk_fndecl);
1488 current_function_decl = thunk_fndecl;
1490 /* Ensure thunks are emitted in their correct sections. */
1491 resolve_unique_section (thunk_fndecl, 0, flag_function_sections);
1493 DECL_RESULT (thunk_fndecl)
1494 = build_decl (DECL_SOURCE_LOCATION (thunk_fndecl),
1495 RESULT_DECL, 0, restype);
1496 DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
1497 fnname = IDENTIFIER_POINTER (DECL_ASSEMBLER_NAME (thunk_fndecl));
1499 /* The back end expects DECL_INITIAL to contain a BLOCK, so we
1500 create one. */
1501 fn_block = make_node (BLOCK);
1502 BLOCK_VARS (fn_block) = a;
1503 DECL_INITIAL (thunk_fndecl) = fn_block;
1504 init_function_start (thunk_fndecl);
1505 cfun->is_thunk = 1;
1506 insn_locations_init ();
1507 set_curr_insn_location (DECL_SOURCE_LOCATION (thunk_fndecl));
1508 prologue_location = curr_insn_location ();
1509 assemble_start_function (thunk_fndecl, fnname);
1511 targetm.asm_out.output_mi_thunk (asm_out_file, thunk_fndecl,
1512 fixed_offset, virtual_value, alias);
1514 assemble_end_function (thunk_fndecl, fnname);
1515 insn_locations_finalize ();
1516 init_insn_lengths ();
1517 free_after_compilation (cfun);
1518 set_cfun (NULL);
1519 TREE_ASM_WRITTEN (thunk_fndecl) = 1;
1520 node->thunk.thunk_p = false;
1521 node->analyzed = false;
1523 else
1525 tree restype;
1526 basic_block bb, then_bb, else_bb, return_bb;
1527 gimple_stmt_iterator bsi;
1528 int nargs = 0;
1529 tree arg;
1530 int i;
1531 tree resdecl;
1532 tree restmp = NULL;
1534 gimple call;
1535 gimple ret;
1537 if (in_lto_p)
1538 cgraph_get_body (node);
1539 a = DECL_ARGUMENTS (thunk_fndecl);
1541 current_function_decl = thunk_fndecl;
1543 /* Ensure thunks are emitted in their correct sections. */
1544 resolve_unique_section (thunk_fndecl, 0, flag_function_sections);
1546 DECL_IGNORED_P (thunk_fndecl) = 1;
1547 bitmap_obstack_initialize (NULL);
1549 if (node->thunk.virtual_offset_p)
1550 virtual_offset = size_int (virtual_value);
1552 /* Build the return declaration for the function. */
1553 restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1554 if (DECL_RESULT (thunk_fndecl) == NULL_TREE)
1556 resdecl = build_decl (input_location, RESULT_DECL, 0, restype);
1557 DECL_ARTIFICIAL (resdecl) = 1;
1558 DECL_IGNORED_P (resdecl) = 1;
1559 DECL_RESULT (thunk_fndecl) = resdecl;
1560 DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
1562 else
1563 resdecl = DECL_RESULT (thunk_fndecl);
1565 bb = then_bb = else_bb = return_bb = init_lowered_empty_function (thunk_fndecl, true);
1567 bsi = gsi_start_bb (bb);
1569 /* Build call to the function being thunked. */
1570 if (!VOID_TYPE_P (restype))
1572 if (DECL_BY_REFERENCE (resdecl))
1573 restmp = gimple_fold_indirect_ref (resdecl);
1574 else if (!is_gimple_reg_type (restype))
1576 restmp = resdecl;
1577 add_local_decl (cfun, restmp);
1578 BLOCK_VARS (DECL_INITIAL (current_function_decl)) = restmp;
1580 else
1581 restmp = create_tmp_reg (restype, "retval");
1584 for (arg = a; arg; arg = DECL_CHAIN (arg))
1585 nargs++;
1586 auto_vec<tree> vargs (nargs);
1587 if (this_adjusting)
1588 vargs.quick_push (thunk_adjust (&bsi, a, 1, fixed_offset,
1589 virtual_offset));
1590 else if (nargs)
1591 vargs.quick_push (a);
1593 if (nargs)
1594 for (i = 1, arg = DECL_CHAIN (a); i < nargs; i++, arg = DECL_CHAIN (arg))
1596 tree tmp = arg;
1597 if (!is_gimple_val (arg))
1599 tmp = create_tmp_reg (TYPE_MAIN_VARIANT
1600 (TREE_TYPE (arg)), "arg");
1601 gimple stmt = gimple_build_assign (tmp, arg);
1602 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1604 vargs.quick_push (tmp);
1606 call = gimple_build_call_vec (build_fold_addr_expr_loc (0, alias), vargs);
1607 node->callees->call_stmt = call;
1608 gimple_call_set_from_thunk (call, true);
1609 if (restmp)
1611 gimple_call_set_lhs (call, restmp);
1612 gcc_assert (useless_type_conversion_p (TREE_TYPE (restmp),
1613 TREE_TYPE (TREE_TYPE (alias))));
1615 gsi_insert_after (&bsi, call, GSI_NEW_STMT);
1616 if (!(gimple_call_flags (call) & ECF_NORETURN))
1618 if (restmp && !this_adjusting
1619 && (fixed_offset || virtual_offset))
1621 tree true_label = NULL_TREE;
1623 if (TREE_CODE (TREE_TYPE (restmp)) == POINTER_TYPE)
1625 gimple stmt;
1626 /* If the return type is a pointer, we need to
1627 protect against NULL. We know there will be an
1628 adjustment, because that's why we're emitting a
1629 thunk. */
1630 then_bb = create_basic_block (NULL, (void *) 0, bb);
1631 return_bb = create_basic_block (NULL, (void *) 0, then_bb);
1632 else_bb = create_basic_block (NULL, (void *) 0, else_bb);
1633 add_bb_to_loop (then_bb, bb->loop_father);
1634 add_bb_to_loop (return_bb, bb->loop_father);
1635 add_bb_to_loop (else_bb, bb->loop_father);
1636 remove_edge (single_succ_edge (bb));
1637 true_label = gimple_block_label (then_bb);
1638 stmt = gimple_build_cond (NE_EXPR, restmp,
1639 build_zero_cst (TREE_TYPE (restmp)),
1640 NULL_TREE, NULL_TREE);
1641 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1642 make_edge (bb, then_bb, EDGE_TRUE_VALUE);
1643 make_edge (bb, else_bb, EDGE_FALSE_VALUE);
1644 make_edge (return_bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
1645 make_edge (then_bb, return_bb, EDGE_FALLTHRU);
1646 make_edge (else_bb, return_bb, EDGE_FALLTHRU);
1647 bsi = gsi_last_bb (then_bb);
1650 restmp = thunk_adjust (&bsi, restmp, /*this_adjusting=*/0,
1651 fixed_offset, virtual_offset);
1652 if (true_label)
1654 gimple stmt;
1655 bsi = gsi_last_bb (else_bb);
1656 stmt = gimple_build_assign (restmp,
1657 build_zero_cst (TREE_TYPE (restmp)));
1658 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1659 bsi = gsi_last_bb (return_bb);
1662 else
1663 gimple_call_set_tail (call, true);
1665 /* Build return value. */
1666 ret = gimple_build_return (restmp);
1667 gsi_insert_after (&bsi, ret, GSI_NEW_STMT);
1669 else
1671 gimple_call_set_tail (call, true);
1672 remove_edge (single_succ_edge (bb));
1675 cfun->gimple_df->in_ssa_p = true;
1676 /* FIXME: C++ FE should stop setting TREE_ASM_WRITTEN on thunks. */
1677 TREE_ASM_WRITTEN (thunk_fndecl) = false;
1678 delete_unreachable_blocks ();
1679 update_ssa (TODO_update_ssa);
1680 #ifdef ENABLE_CHECKING
1681 verify_flow_info ();
1682 #endif
1683 free_dominance_info (CDI_DOMINATORS);
1685 /* Since we want to emit the thunk, we explicitly mark its name as
1686 referenced. */
1687 node->thunk.thunk_p = false;
1688 node->lowered = true;
1689 bitmap_obstack_release (NULL);
1691 current_function_decl = NULL;
1692 set_cfun (NULL);
1693 return true;
1698 /* Assemble thunks and aliases associated with NODE. */
1698 static void
1699 assemble_thunks_and_aliases (struct cgraph_node *node)
1701 struct cgraph_edge *e;
1702 int i;
1703 struct ipa_ref *ref;
1705 for (e = node->callers; e;)
1706 if (e->caller->thunk.thunk_p)
1708 struct cgraph_node *thunk = e->caller;
1710 e = e->next_caller;
1711 assemble_thunks_and_aliases (thunk);
1712 expand_thunk (thunk, true);
1714 else
1715 e = e->next_caller;
1716 for (i = 0; ipa_ref_list_referring_iterate (&node->ref_list,
1717 i, ref); i++)
1718 if (ref->use == IPA_REF_ALIAS)
1720 struct cgraph_node *alias = ipa_ref_referring_node (ref);
1721 bool saved_written = TREE_ASM_WRITTEN (node->decl);
1723 /* Force assemble_alias to really output the alias this time instead
1724 of buffering it in same alias pairs. */
1725 TREE_ASM_WRITTEN (node->decl) = 1;
1726 do_assemble_alias (alias->decl,
1727 DECL_ASSEMBLER_NAME (node->decl));
1728 assemble_thunks_and_aliases (alias);
1729 TREE_ASM_WRITTEN (node->decl) = saved_written;
1733 /* Expand function specified by NODE. */
1735 static void
1736 expand_function (struct cgraph_node *node)
1738 tree decl = node->decl;
1739 location_t saved_loc;
1741 /* We ought to not compile any inline clones. */
1742 gcc_assert (!node->global.inlined_to);
1744 announce_function (decl);
1745 node->process = 0;
1746 gcc_assert (node->lowered);
1747 cgraph_get_body (node);
1749 /* Generate RTL for the body of DECL. */
1751 timevar_push (TV_REST_OF_COMPILATION);
1753 gcc_assert (cgraph_global_info_ready);
1755 /* Initialize the default bitmap obstack. */
1756 bitmap_obstack_initialize (NULL);
1758 /* Initialize the RTL code for the function. */
1759 current_function_decl = decl;
1760 saved_loc = input_location;
1761 input_location = DECL_SOURCE_LOCATION (decl);
1762 init_function_start (decl);
1764 gimple_register_cfg_hooks ();
1766 bitmap_obstack_initialize (&reg_obstack); /* FIXME, only at RTL generation*/
1768 execute_all_ipa_transforms ();
1770 /* Perform all tree transforms and optimizations. */
1772 /* Signal the start of passes. */
1773 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_START, NULL);
1775 execute_pass_list (g->get_passes ()->all_passes);
1777 /* Signal the end of passes. */
1778 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_END, NULL);
1780 bitmap_obstack_release (&reg_obstack);
1782 /* Release the default bitmap obstack. */
1783 bitmap_obstack_release (NULL);
1785 /* If requested, warn about function definitions where the function will
1786 return a value (usually of some struct or union type) which itself will
1787 take up a lot of stack space. */
1788 if (warn_larger_than && !DECL_EXTERNAL (decl) && TREE_TYPE (decl))
1790 tree ret_type = TREE_TYPE (TREE_TYPE (decl));
1792 if (ret_type && TYPE_SIZE_UNIT (ret_type)
1793 && TREE_CODE (TYPE_SIZE_UNIT (ret_type)) == INTEGER_CST
1794 && 0 < compare_tree_int (TYPE_SIZE_UNIT (ret_type),
1795 larger_than_size))
1797 unsigned int size_as_int
1798 = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (ret_type));
1800 if (compare_tree_int (TYPE_SIZE_UNIT (ret_type), size_as_int) == 0)
1801 warning (OPT_Wlarger_than_, "size of return value of %q+D is %u bytes",
1802 decl, size_as_int);
1803 else
1804 warning (OPT_Wlarger_than_, "size of return value of %q+D is larger than %wd bytes",
1805 decl, larger_than_size);
1809 gimple_set_body (decl, NULL);
1810 if (DECL_STRUCT_FUNCTION (decl) == 0
1811 && !cgraph_get_node (decl)->origin)
1813 /* Stop pointing to the local nodes about to be freed.
1814 But DECL_INITIAL must remain nonzero so we know this
1815 was an actual function definition.
1816 For a nested function, this is done in c_pop_function_context.
1817 If rest_of_compilation set this to 0, leave it 0. */
1818 if (DECL_INITIAL (decl) != 0)
1819 DECL_INITIAL (decl) = error_mark_node;
1822 input_location = saved_loc;
1824 ggc_collect ();
1825 timevar_pop (TV_REST_OF_COMPILATION);
1827 /* Make sure that BE didn't give up on compiling. */
1828 gcc_assert (TREE_ASM_WRITTEN (decl));
1829 set_cfun (NULL);
1830 current_function_decl = NULL;
1832 /* It would make a lot more sense to output thunks before the function body to get more
1833 forward and fewer backward jumps. This however would require solving a problem
1834 with comdats. See PR48668. Also aliases must come after the function itself to
1835 make one-pass assemblers, like the one on AIX, happy. See PR 50689.
1836 FIXME: Perhaps thunks should be moved before the function iff they are not in comdat
1837 groups. */
1838 assemble_thunks_and_aliases (node);
1839 cgraph_release_function_body (node);
1840 /* Eliminate all call edges. This is important so the GIMPLE_CALL no longer
1841 points to the dead function body. */
1842 cgraph_node_remove_callees (node);
1843 ipa_remove_all_references (&node->ref_list);
1846 /* Node comparator that orders functions by the time they were first
1847 executed (their time profile).  */
1849 static int
1850 node_cmp (const void *pa, const void *pb)
1852 const struct cgraph_node *a = *(const struct cgraph_node * const *) pa;
1853 const struct cgraph_node *b = *(const struct cgraph_node * const *) pb;
1855 /* Functions with a time profile must come before those without one.  */
1856 if (!a->tp_first_run || !b->tp_first_run)
1857 return a->tp_first_run - b->tp_first_run;
1859 return a->tp_first_run != b->tp_first_run
1860 ? b->tp_first_run - a->tp_first_run
1861 : b->order - a->order;
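/* The descending order produced above is walked back to front by
   expand_all_functions, so functions with the smallest tp_first_run are
   expanded first and functions without a profile come last.  */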
1864 /* Expand all functions that must be output.
1866 Attempt to topologically sort the nodes so that a function is output when
1867 all called functions are already assembled, allowing data to be
1868 propagated across the callgraph.  Use a stack to get a smaller distance
1869 between a function and its callees (later we may choose to use a more
1870 sophisticated algorithm for function reordering; we will likely want
1871 to use subsections to make the output functions appear in top-down
1872 order).  */
1874 static void
1875 expand_all_functions (void)
1877 struct cgraph_node *node;
1878 struct cgraph_node **order = XCNEWVEC (struct cgraph_node *, cgraph_n_nodes);
1879 unsigned int expanded_func_count = 0, profiled_func_count = 0;
1880 int order_pos, new_order_pos = 0;
1881 int i;
1883 order_pos = ipa_reverse_postorder (order);
1884 gcc_assert (order_pos == cgraph_n_nodes);
1886 /* The garbage collector may remove the inline clones we eliminate during
1887 optimization, so we must be sure not to reference them.  */
1888 for (i = 0; i < order_pos; i++)
1889 if (order[i]->process)
1890 order[new_order_pos++] = order[i];
1892 if (flag_profile_reorder_functions)
1893 qsort (order, new_order_pos, sizeof (struct cgraph_node *), node_cmp);
1895 for (i = new_order_pos - 1; i >= 0; i--)
1897 node = order[i];
1899 if (node->process)
1901 expanded_func_count++;
1902 if (node->tp_first_run)
1903 profiled_func_count++;
1905 if (cgraph_dump_file)
1906 fprintf (cgraph_dump_file, "Time profile order in expand_all_functions:%s:%d\n", node->asm_name (), node->tp_first_run);
1908 node->process = 0;
1909 expand_function (node);
1913 if (dump_file)
1914 fprintf (dump_file, "Expanded functions with time profile (%s):%u/%u\n",
1915 main_input_filename, profiled_func_count, expanded_func_count);
1917 if (cgraph_dump_file && flag_profile_reorder_functions)
1918 fprintf (cgraph_dump_file, "Expanded functions with time profile:%u/%u\n",
1919 profiled_func_count, expanded_func_count);
1921 cgraph_process_new_functions ();
1922 free_gimplify_stack ();
1924 free (order);
1927 /* Kind tag used when sorting symbol table entries by their cgraph order number.  */
1929 enum cgraph_order_sort_kind
1931 ORDER_UNDEFINED = 0,
1932 ORDER_FUNCTION,
1933 ORDER_VAR,
1934 ORDER_ASM
1937 struct cgraph_order_sort
1939 enum cgraph_order_sort_kind kind;
1940 union
1942 struct cgraph_node *f;
1943 varpool_node *v;
1944 struct asm_node *a;
1945 } u;
1948 /* Output all functions, variables, and asm statements according to their
1949 order fields, which is the order in which they appeared in the source
1950 file.  This implements -fno-toplevel-reorder.  In this mode we may
1951 output functions and variables which don't really
1952 need to be output.  */
1954 static void
1955 output_in_order (void)
1957 int max;
1958 struct cgraph_order_sort *nodes;
1959 int i;
1960 struct cgraph_node *pf;
1961 varpool_node *pv;
1962 struct asm_node *pa;
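/* Every symbol carries a unique order number smaller than symtab_order, so
   an array of symtab_order slots indexed directly by that number is large
   enough to hold them all.  */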
1964 max = symtab_order;
1965 nodes = XCNEWVEC (struct cgraph_order_sort, max);
1967 FOR_EACH_DEFINED_FUNCTION (pf)
1969 if (pf->process && !pf->thunk.thunk_p && !pf->alias)
1971 i = pf->order;
1972 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1973 nodes[i].kind = ORDER_FUNCTION;
1974 nodes[i].u.f = pf;
1978 FOR_EACH_DEFINED_VARIABLE (pv)
1979 if (!DECL_EXTERNAL (pv->decl))
1981 i = pv->order;
1982 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1983 nodes[i].kind = ORDER_VAR;
1984 nodes[i].u.v = pv;
1987 for (pa = asm_nodes; pa; pa = pa->next)
1989 i = pa->order;
1990 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1991 nodes[i].kind = ORDER_ASM;
1992 nodes[i].u.a = pa;
1995 /* In toplevel reorder mode we output all statics; mark them as needed. */
1997 for (i = 0; i < max; ++i)
1998 if (nodes[i].kind == ORDER_VAR)
1999 varpool_finalize_named_section_flags (nodes[i].u.v);
2001 for (i = 0; i < max; ++i)
2003 switch (nodes[i].kind)
2005 case ORDER_FUNCTION:
2006 nodes[i].u.f->process = 0;
2007 expand_function (nodes[i].u.f);
2008 break;
2010 case ORDER_VAR:
2011 varpool_assemble_decl (nodes[i].u.v);
2012 break;
2014 case ORDER_ASM:
2015 assemble_asm (nodes[i].u.a->asm_str);
2016 break;
2018 case ORDER_UNDEFINED:
2019 break;
2021 default:
2022 gcc_unreachable ();
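/* The toplevel asm statements have just been output; clear the list so they
   are not emitted again later.  */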
2026 asm_nodes = NULL;
2027 free (nodes);
2030 static void
2031 ipa_passes (void)
2033 gcc::pass_manager *passes = g->get_passes ();
2035 set_cfun (NULL);
2036 current_function_decl = NULL;
2037 gimple_register_cfg_hooks ();
2038 bitmap_obstack_initialize (NULL);
2040 invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_START, NULL);
2042 if (!in_lto_p)
2044 execute_ipa_pass_list (passes->all_small_ipa_passes);
2045 if (seen_error ())
2046 return;
2049 /* We never run removal of unreachable nodes after early passes.  This is
2050 because TODO is run before the subpasses.  It is important to remove
2051 the unreachable functions to save work at the IPA level and to get LTO
2052 symbol tables right.  */
2053 symtab_remove_unreachable_nodes (true, cgraph_dump_file);
2055 /* If pass_all_early_optimizations was not scheduled, the state of
2056 the cgraph will not be properly updated. Update it now. */
2057 if (cgraph_state < CGRAPH_STATE_IPA_SSA)
2058 cgraph_state = CGRAPH_STATE_IPA_SSA;
2060 if (!in_lto_p)
2062 /* Generate coverage variables and constructors. */
2063 coverage_finish ();
2065 /* Process new functions added. */
2066 set_cfun (NULL);
2067 current_function_decl = NULL;
2068 cgraph_process_new_functions ();
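/* Run only the analysis (summary generation) stage of the regular IPA
   passes here; their execution stage runs later, possibly at link time
   when doing LTO.  */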
2070 execute_ipa_summary_passes
2071 ((ipa_opt_pass_d *) passes->all_regular_ipa_passes);
2074 /* Some targets need to handle LTO assembler output specially. */
2075 if (flag_generate_lto)
2076 targetm.asm_out.lto_start ();
2078 if (!in_lto_p)
2079 ipa_write_summaries ();
2081 if (flag_generate_lto)
2082 targetm.asm_out.lto_end ();
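/* Execute the regular IPA passes now unless we are only producing a slim
   LTO object (then they run at link time); in an LTRANS unit the IPA
   decisions were already made at WPA time, so only the transforms are
   applied later.  */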
2084 if (!flag_ltrans && (in_lto_p || !flag_lto || flag_fat_lto_objects))
2085 execute_ipa_pass_list (passes->all_regular_ipa_passes);
2086 invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_END, NULL);
2088 bitmap_obstack_release (NULL);
2092 /* Return the identifier of the symbol DECL is an alias of, as given by its "alias" attribute.  */
2094 static tree
2095 get_alias_symbol (tree decl)
2097 tree alias = lookup_attribute ("alias", DECL_ATTRIBUTES (decl));
2098 return get_identifier (TREE_STRING_POINTER
2099 (TREE_VALUE (TREE_VALUE (alias))));
2103 /* Weakrefs may be associated with external decls and thus not output
2104 at expansion time.  Emit all necessary aliases.  */
2106 static void
2107 output_weakrefs (void)
2109 symtab_node *node;
2110 FOR_EACH_SYMBOL (node)
2111 if (node->alias
2112 && !TREE_ASM_WRITTEN (node->decl)
2113 && node->weakref)
2115 tree target;
2117 /* Weakrefs are special in not requiring a target definition in the current
2118 compilation unit, so it is a bit hard to work out what we want to
2119 alias.
2120 When the alias target is defined, we need to fetch it from the symtab reference;
2121 otherwise it is pointed to by alias_target.  */
2122 if (node->alias_target)
2123 target = (DECL_P (node->alias_target)
2124 ? DECL_ASSEMBLER_NAME (node->alias_target)
2125 : node->alias_target);
2126 else if (node->analyzed)
2127 target = DECL_ASSEMBLER_NAME (symtab_alias_target (node)->decl);
2128 else
2130 gcc_unreachable ();
2131 target = get_alias_symbol (node->decl);
2133 do_assemble_alias (node->decl, target);
2137 /* Initialize callgraph dump file. */
2139 void
2140 init_cgraph (void)
2142 if (!cgraph_dump_file)
2143 cgraph_dump_file = dump_begin (TDI_cgraph, NULL);
2147 /* Perform simple optimizations based on callgraph. */
2149 void
2150 compile (void)
2152 if (seen_error ())
2153 return;
2155 #ifdef ENABLE_CHECKING
2156 verify_symtab ();
2157 #endif
2159 timevar_push (TV_CGRAPHOPT);
2160 if (pre_ipa_mem_report)
2162 fprintf (stderr, "Memory consumption before IPA\n");
2163 dump_memory_report (false);
2165 if (!quiet_flag)
2166 fprintf (stderr, "Performing interprocedural optimizations\n");
2167 cgraph_state = CGRAPH_STATE_IPA;
2169 /* If LTO is enabled, initialize the streamer hooks needed by GIMPLE. */
2170 if (flag_lto)
2171 lto_streamer_hooks_init ();
2173 /* Don't run the IPA passes if there was any error or sorry messages. */
2174 if (!seen_error ())
2175 ipa_passes ();
2177 /* Do nothing else if any IPA pass found errors or if we are just streaming LTO. */
2178 if (seen_error ()
2179 || (!in_lto_p && flag_lto && !flag_fat_lto_objects))
2181 timevar_pop (TV_CGRAPHOPT);
2182 return;
2185 /* This pass removes the bodies of extern inline functions we never inlined.
2186 Do this late so other IPA passes see what is really going on.  */
2187 symtab_remove_unreachable_nodes (false, dump_file);
2188 cgraph_global_info_ready = true;
2189 if (cgraph_dump_file)
2191 fprintf (cgraph_dump_file, "Optimized ");
2192 dump_symtab (cgraph_dump_file);
2194 if (post_ipa_mem_report)
2196 fprintf (stderr, "Memory consumption after IPA\n");
2197 dump_memory_report (false);
2199 timevar_pop (TV_CGRAPHOPT);
2201 /* Output everything. */
2202 (*debug_hooks->assembly_start) ();
2203 if (!quiet_flag)
2204 fprintf (stderr, "Assembling functions:\n");
2205 #ifdef ENABLE_CHECKING
2206 verify_symtab ();
2207 #endif
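/* Until now inline clones existed only as callgraph nodes sharing their
   origin's body; give each surviving clone its own GIMPLE body so it can be
   expanded on its own.  */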
2209 cgraph_materialize_all_clones ();
2210 bitmap_obstack_initialize (NULL);
2211 execute_ipa_pass_list (g->get_passes ()->all_late_ipa_passes);
2212 symtab_remove_unreachable_nodes (true, dump_file);
2213 #ifdef ENABLE_CHECKING
2214 verify_symtab ();
2215 #endif
2216 bitmap_obstack_release (NULL);
2217 mark_functions_to_output ();
2219 /* When weakref support is missing, we automatically translate all
2220 references to NODE to references to its ultimate alias target.
2221 The renaming mechanism uses the flag IDENTIFIER_TRANSPARENT_ALIAS and
2222 TREE_CHAIN.
2224 Set up this mapping before we output any assembler, but once we are sure
2225 that all symbol renaming is done.
2227 FIXME: All this ugliness can go away if we just do renaming at the GIMPLE
2228 level by physically rewriting the IL.  At the moment we can only redirect
2229 calls, so we need infrastructure for renaming references as well.  */
2230 #ifndef ASM_OUTPUT_WEAKREF
2231 symtab_node *node;
2233 FOR_EACH_SYMBOL (node)
2234 if (node->alias
2235 && lookup_attribute ("weakref", DECL_ATTRIBUTES (node->decl)))
2237 IDENTIFIER_TRANSPARENT_ALIAS
2238 (DECL_ASSEMBLER_NAME (node->decl)) = 1;
2239 TREE_CHAIN (DECL_ASSEMBLER_NAME (node->decl))
2240 = (node->alias_target ? node->alias_target
2241 : DECL_ASSEMBLER_NAME (symtab_alias_target (node)->decl));
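/* The assembler-name output machinery follows these transparent alias
   chains, so every later use of the weakref's own name is emitted as the
   target's name instead.  */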
2243 #endif
2245 cgraph_state = CGRAPH_STATE_EXPANSION;
2247 if (!flag_toplevel_reorder)
2248 output_in_order ();
2249 else
2251 output_asm_statements ();
2253 expand_all_functions ();
2254 varpool_output_variables ();
2257 cgraph_process_new_functions ();
2258 cgraph_state = CGRAPH_STATE_FINISHED;
2259 output_weakrefs ();
2261 if (cgraph_dump_file)
2263 fprintf (cgraph_dump_file, "\nFinal ");
2264 dump_symtab (cgraph_dump_file);
2266 #ifdef ENABLE_CHECKING
2267 verify_symtab ();
2268 /* Double check that all inline clones are gone and that all
2269 function bodies have been released from memory. */
2270 if (!seen_error ())
2272 struct cgraph_node *node;
2273 bool error_found = false;
2275 FOR_EACH_DEFINED_FUNCTION (node)
2276 if (node->global.inlined_to
2277 || gimple_has_body_p (node->decl))
2279 error_found = true;
2280 dump_cgraph_node (stderr, node);
2282 if (error_found)
2283 internal_error ("nodes with unreleased memory found");
2285 #endif
2289 /* Analyze the whole compilation unit once it is parsed completely. */
2291 void
2292 finalize_compilation_unit (void)
2294 timevar_push (TV_CGRAPH);
2296 /* If we're here there's no current function anymore. Some frontends
2297 are lazy in clearing these. */
2298 current_function_decl = NULL;
2299 set_cfun (NULL);
2301 /* Do not skip analyzing the functions if there were errors; we would
2302 miss diagnostics for subsequent functions otherwise.  */
2304 /* Emit size functions we didn't inline. */
2305 finalize_size_functions ();
2307 /* Mark alias targets necessary and emit diagnostics. */
2308 handle_alias_pairs ();
2310 if (!quiet_flag)
2312 fprintf (stderr, "\nAnalyzing compilation unit\n");
2313 fflush (stderr);
2316 if (flag_dump_passes)
2317 dump_passes ();
2319 /* Gimplify and lower all functions, compute reachability and
2320 remove unreachable nodes. */
2321 analyze_functions ();
2323 /* Mark alias targets necessary and emit diagnostics. */
2324 handle_alias_pairs ();
2326 /* Gimplify and lower thunks. */
2327 analyze_functions ();
2329 /* Finally drive the pass manager. */
2330 compile ();
2332 timevar_pop (TV_CGRAPH);
2336 #include "gt-cgraphunit.h"