/* Driver of optimization process
   Copyright (C) 2003-2015 Free Software Foundation, Inc.
   Contributed by Jan Hubicka

This file is part of GCC.

GCC is free software; you can redistribute it and/or modify it under
the terms of the GNU General Public License as published by the Free
Software Foundation; either version 3, or (at your option) any later
version.

GCC is distributed in the hope that it will be useful, but WITHOUT ANY
WARRANTY; without even the implied warranty of MERCHANTABILITY or
FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License
for more details.

You should have received a copy of the GNU General Public License
along with GCC; see the file COPYING3.  If not see
<http://www.gnu.org/licenses/>.  */
/* This module implements main driver of compilation process.

   The main scope of this file is to act as an interface in between
   tree based frontends and the backend.

   The front-end is supposed to use the following functionality:

    - finalize_function

      This function is called once the front-end has parsed the whole body
      of a function and it is certain that neither the function body nor the
      declaration will change.

      (There is one exception needed for implementing GCC extern inline
       function.)

    - varpool_finalize_decl

      This function has the same behavior as the above but is used for static
      variables.

    - add_asm_node

      Inserts a new toplevel ASM statement.

    - finalize_compilation_unit

      This function is called once the (source level) compilation unit is
      finalized and it will no longer change.

      The symbol table is constructed starting from the trivially needed
      symbols finalized by the frontend.  Functions are lowered into
      GIMPLE representation and callgraph/reference lists are constructed.
      Those are used to discover other necessary functions and variables.

      At the end the bodies of unreachable functions are removed.

      The function can be called multiple times when multiple source level
      compilation units are combined.

    - compile

      This passes control to the back-end.  Optimizations are performed and
      final assembler is generated.  This is done in the following way.  Note
      that with link time optimization the process is split into three
      stages (compile time, linktime analysis and parallel linktime as
      indicated below).

      Compile time:

	1) Inter-procedural optimization.
	   (ipa_passes)

	   This part is further split into:

	   a) early optimizations.  These are local passes executed in
	      the topological order on the callgraph.

	      The purpose of early optimizations is to optimize away simple
	      things that may otherwise confuse IP analysis.  Very simple
	      propagation across the callgraph is done, e.g. to discover
	      functions without side effects, and simple inlining is performed.

	   b) early small interprocedural passes.

	      Those are interprocedural passes executed only at compilation
	      time.  These include, for example, transactional memory lowering,
	      unreachable code removal and other simple transformations.

	   c) IP analysis stage.  All interprocedural passes do their
	      analysis.

	      Interprocedural passes differ from small interprocedural
	      passes by their ability to operate across the whole program
	      at linktime.  Their analysis stage is performed early to
	      both reduce linking times and linktime memory usage by
	      not having to represent the whole program in memory.

	   d) LTO streaming.  When doing LTO, everything important gets
	      streamed into the object file.

      Compile time and/or linktime analysis stage (WPA):

	      At linktime units get streamed back and the symbol table is
	      merged.  Function bodies are not streamed in and not
	      available.
	   e) IP propagation stage.  All IP passes execute their
	      IP propagation.  This is done based on the earlier analysis
	      without having function bodies at hand.
	   f) Ltrans streaming.  When doing WHOPR LTO, the program
	      is partitioned and streamed into multiple object files.

      Compile time and/or parallel linktime stage (ltrans)

	      Each of the object files is streamed back and compiled
	      separately.  Now the function bodies become available
	      again.

	 2) Virtual clone materialization
	    (cgraph_materialize_clone)

	    IP passes can produce copies of existing functions (such
	    as versioned clones or inline clones) without actually
	    manipulating their bodies by creating virtual clones in
	    the callgraph.  At this time the virtual clones are
	    turned into real functions.

	 3) IP transformation

	    All IP passes transform function bodies based on earlier
	    decisions of the IP propagation.

	 4) late small IP passes

	    Simple IP passes working within a single program partition.

	 5) Expansion
	    (expand_all_functions)

	    At this stage functions that need to be output into
	    assembler are identified and compiled in topological order.

	 6) Output of variables and aliases
	    Now it is known which variable references were not optimized
	    out and thus all variables are output to the file.

	    Note that with -fno-toplevel-reorder passes 5 and 6
	    are combined together in cgraph_output_in_order.

   Finally there are functions to manipulate the callgraph from
   the backend.

    - cgraph_add_new_function is used to add backend produced
      functions introduced after the unit is finalized.
      The functions are enqueued for later processing and inserted
      into the callgraph with cgraph_process_new_functions.

    - cgraph_function_versioning

      produces a copy of a function into a new one (a version)
      and applies simple transformations.  */
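
/* Illustrative sketch only (not part of the driver): under the interface
   described above, a front end that has finished parsing would typically
   do, for each definition it produced,

       cgraph_node::finalize_function (fndecl, false);
       varpool_node::finalize_decl (vardecl);

   and then hand the whole translation unit over with

       symtab->finalize_compilation_unit ();

   which runs analyze_functions and eventually the compile stage sketched
   above.  FNDECL and VARDECL here stand for already parsed declarations;
   they are placeholders, not names used in this file.  */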
#include "config.h"
#include "system.h"
#include "coretypes.h"
#include "backend.h"
#include "cfghooks.h"
#include "tree.h"
#include "gimple.h"
#include "rtl.h"
#include "alias.h"
#include "fold-const.h"
#include "varasm.h"
#include "stor-layout.h"
#include "stringpool.h"
#include "gimple-ssa.h"
#include "output.h"
#include "cfgcleanup.h"
#include "internal-fn.h"
#include "gimple-fold.h"
#include "gimplify.h"
#include "gimple-iterator.h"
#include "gimplify-me.h"
#include "tree-cfg.h"
#include "tree-into-ssa.h"
#include "tree-ssa.h"
#include "tree-inline.h"
#include "langhooks.h"
#include "toplev.h"
#include "flags.h"
#include "debug.h"
#include "target.h"
#include "diagnostic.h"
#include "params.h"
#include "intl.h"
#include "cgraph.h"
#include "alloc-pool.h"
#include "symbol-summary.h"
#include "ipa-prop.h"
#include "tree-iterator.h"
#include "tree-pass.h"
#include "tree-dump.h"
#include "gimple-pretty-print.h"
#include "output.h"
#include "coverage.h"
#include "plugin.h"
#include "ipa-inline.h"
#include "ipa-utils.h"
#include "lto-streamer.h"
#include "except.h"
#include "cfgloop.h"
#include "regset.h"     /* FIXME: For reg_obstack.  */
#include "context.h"
#include "pass_manager.h"
#include "tree-nested.h"
#include "gimplify.h"
#include "dbgcnt.h"
#include "tree-chkp.h"
#include "lto-section-names.h"
#include "omp-low.h"
#include "print-tree.h"
/* Queue of cgraph nodes scheduled to be added into cgraph.  This is a
   secondary queue used during optimization to accommodate passes that
   may generate new functions that need to be optimized and expanded.  */
vec<cgraph_node *> cgraph_new_nodes;

static void expand_all_functions (void);
static void mark_functions_to_output (void);
static void handle_alias_pairs (void);

/* Used for vtable lookup in thunk adjusting.  */
static GTY (()) tree vtable_entry_type;
/* Determine if symbol declaration is needed.  That is, visible to something
   either outside this translation unit, something magic in the system
   configury.  */
bool
symtab_node::needed_p (void)
{
  /* Double check that no one output the function into assembly file
     early.  */
  gcc_checking_assert (!DECL_ASSEMBLER_NAME_SET_P (decl)
		       || !TREE_SYMBOL_REFERENCED (DECL_ASSEMBLER_NAME (decl)));

  if (!definition)
    return false;

  if (DECL_EXTERNAL (decl))
    return false;

  /* If the user told us it is used, then it must be so.  */
  if (force_output)
    return true;

  /* ABI forced symbols are needed when they are external.  */
  if (forced_by_abi && TREE_PUBLIC (decl))
    return true;

  /* Keep constructors, destructors and virtual functions.  */
  if (TREE_CODE (decl) == FUNCTION_DECL
      && (DECL_STATIC_CONSTRUCTOR (decl) || DECL_STATIC_DESTRUCTOR (decl)))
    return true;

  /* Externally visible variables must be output.  The exception is
     COMDAT variables that must be output only when they are needed.  */
  if (TREE_PUBLIC (decl) && !DECL_COMDAT (decl))
    return true;

  return false;
}
/* Head and terminator of the queue of nodes to be processed while building
   callgraph.  */

static symtab_node symtab_terminator;
static symtab_node *queued_nodes = &symtab_terminator;

/* Add NODE to queue starting at QUEUED_NODES.
   The queue is linked via AUX pointers and terminated by pointer to 1.  */

static void
enqueue_node (symtab_node *node)
{
  if (node->aux)
    return;
  gcc_checking_assert (queued_nodes);
  node->aux = queued_nodes;
  queued_nodes = node;
}
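
/* Illustrative sketch only: the queue built by enqueue_node is drained in
   analyze_functions with the matching pop, roughly

       while (queued_nodes != &symtab_terminator)
	 {
	   symtab_node *node = queued_nodes;
	   queued_nodes = (symtab_node *) node->aux;
	   ...
	 }

   Because NODE->aux stays non-NULL once queued, enqueue_node is idempotent
   and a node is processed at most once per walk.  */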
/* Process CGRAPH_NEW_FUNCTIONS and perform actions necessary to add these
   functions into callgraph in a way so they look like ordinary reachable
   functions inserted into callgraph already at construction time.  */

void
symbol_table::process_new_functions (void)
{
  tree fndecl;

  if (!cgraph_new_nodes.exists ())
    return;

  handle_alias_pairs ();
  /* Note that this queue may grow as it's being processed, as the new
     functions may generate new ones.  */
  for (unsigned i = 0; i < cgraph_new_nodes.length (); i++)
    {
      cgraph_node *node = cgraph_new_nodes[i];
      fndecl = node->decl;
      switch (state)
	{
	case CONSTRUCTION:
	  /* At construction time we just need to finalize function and move
	     it into reachable functions list.  */

	  cgraph_node::finalize_function (fndecl, false);
	  call_cgraph_insertion_hooks (node);
	  enqueue_node (node);
	  break;

	case IPA:
	case IPA_SSA:
	case IPA_SSA_AFTER_INLINING:
	  /* When IPA optimization already started, do all essential
	     transformations that have been already performed on the whole
	     cgraph but not on this function.  */

	  gimple_register_cfg_hooks ();
	  if (!node->analyzed)
	    node->analyze ();
	  push_cfun (DECL_STRUCT_FUNCTION (fndecl));
	  if ((state == IPA_SSA || state == IPA_SSA_AFTER_INLINING)
	      && !gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
	    g->get_passes ()->execute_early_local_passes ();
	  else if (inline_summaries != NULL)
	    compute_inline_parameters (node, true);
	  free_dominance_info (CDI_POST_DOMINATORS);
	  free_dominance_info (CDI_DOMINATORS);
	  pop_cfun ();
	  call_cgraph_insertion_hooks (node);
	  break;

	case EXPANSION:
	  /* Functions created during expansion shall be compiled
	     directly.  */
	  node->process = 0;
	  call_cgraph_insertion_hooks (node);
	  node->expand ();
	  break;

	default:
	  gcc_unreachable ();
	  break;
	}
    }

  cgraph_new_nodes.release ();
}
/* As a GCC extension we allow redefinition of the function.  The
   semantics when both copies of bodies differ is not well defined.
   We replace the old body with new body so in unit at a time mode
   we always use new body, while in normal mode we may end up with
   old body inlined into some functions and new body expanded and
   inlined in others.

   ??? It may make more sense to use one body for inlining and other
   body for expanding the function but this is difficult to do.  */

void
cgraph_node::reset (void)
{
  /* If process is set, then we have already begun whole-unit analysis.
     This is *not* testing for whether we've already emitted the function.
     That case can be sort-of legitimately seen with real function redefinition
     errors.  I would argue that the front end should never present us with
     such a case, but don't enforce that for now.  */
  gcc_assert (!process);

  /* Reset our data structures so we can analyze the function again.  */
  memset (&local, 0, sizeof (local));
  memset (&global, 0, sizeof (global));
  memset (&rtl, 0, sizeof (rtl));
  analyzed = false;
  definition = false;
  alias = false;
  weakref = false;
  cpp_implicit_alias = false;

  remove_callees ();
  remove_all_references ();
}
/* Return true when there are references to the node.  INCLUDE_SELF is
   true if a self reference counts as a reference.  */

bool
symtab_node::referred_to_p (bool include_self)
{
  ipa_ref *ref = NULL;

  /* See if there are any references at all.  */
  if (iterate_referring (0, ref))
    return true;
  /* For functions check also calls.  */
  cgraph_node *cn = dyn_cast <cgraph_node *> (this);
  if (cn && cn->callers)
    {
      if (include_self)
	return true;
      for (cgraph_edge *e = cn->callers; e; e = e->next_caller)
	if (e->caller != this)
	  return true;
    }
  return false;
}
/* DECL has been parsed.  Take it, queue it, compile it at the whim of the
   logic in effect.  If NO_COLLECT is true, then our caller cannot stand to have
   the garbage collector run at the moment.  We would need to either create
   a new GC context, or just not compile right now.  */

void
cgraph_node::finalize_function (tree decl, bool no_collect)
{
  cgraph_node *node = cgraph_node::get_create (decl);

  if (node->definition)
    {
      /* Nested functions should only be defined once.  */
      gcc_assert (!DECL_CONTEXT (decl)
		  || TREE_CODE (DECL_CONTEXT (decl)) != FUNCTION_DECL);
      node->reset ();
      node->local.redefined_extern_inline = true;
    }

  /* Set definition first before calling notice_global_symbol so that
     it is available to notice_global_symbol.  */
  node->definition = true;
  notice_global_symbol (decl);
  node->lowered = DECL_STRUCT_FUNCTION (decl)->cfg != NULL;

  /* With -fkeep-inline-functions we are keeping all inline functions except
     for extern inline ones.  */
  if (flag_keep_inline_functions
      && DECL_DECLARED_INLINE_P (decl)
      && !DECL_EXTERNAL (decl)
      && !DECL_DISREGARD_INLINE_LIMITS (decl))
    node->force_output = 1;

  /* When not optimizing, also output the static functions.  (See
     PR24561.)  But don't do so for always_inline functions, functions
     declared inline and nested functions.  These were optimized out
     in the original implementation and it is unclear whether we want
     to change the behavior here.  */
  if (((!opt_for_fn (decl, optimize) || flag_keep_static_functions)
       && !node->cpp_implicit_alias
       && !DECL_DISREGARD_INLINE_LIMITS (decl)
       && !DECL_DECLARED_INLINE_P (decl)
       && !(DECL_CONTEXT (decl)
	    && TREE_CODE (DECL_CONTEXT (decl)) == FUNCTION_DECL))
      && !DECL_COMDAT (decl) && !DECL_EXTERNAL (decl))
    node->force_output = 1;

  /* If we've not yet emitted decl, tell the debug info about it.  */
  if (!TREE_ASM_WRITTEN (decl))
    (*debug_hooks->deferred_inline_function) (decl);

  if (!no_collect)
    ggc_collect ();

  if (symtab->state == CONSTRUCTION
      && (node->needed_p () || node->referred_to_p ()))
    enqueue_node (node);
}
/* Add the function FNDECL to the call graph.
   Unlike finalize_function, this function is intended to be used
   by middle end and allows insertion of new function at arbitrary point
   of compilation.  The function can be either in high, low or SSA form
   GIMPLE.

   The function is assumed to be reachable and have address taken (so no
   API breaking optimizations are performed on it).

   Main work done by this function is to enqueue the function for later
   processing to avoid needing the passes to be re-entrant.  */

void
cgraph_node::add_new_function (tree fndecl, bool lowered)
{
  gcc::pass_manager *passes = g->get_passes ();
  cgraph_node *node;

  if (dump_file)
    {
      struct function *fn = DECL_STRUCT_FUNCTION (fndecl);
      const char *function_type = ((gimple_has_body_p (fndecl))
				   ? (lowered
				      ? (gimple_in_ssa_p (fn)
					 ? "ssa gimple"
					 : "low gimple")
				      : "high gimple")
				   : "to-be-gimplified");
      fprintf (dump_file,
	       "Added new %s function %s to callgraph\n",
	       function_type,
	       fndecl_name (fndecl));
    }

  switch (symtab->state)
    {
    case PARSING:
      cgraph_node::finalize_function (fndecl, false);
      break;
    case CONSTRUCTION:
      /* Just enqueue function to be processed at nearest occurrence.  */
      node = cgraph_node::get_create (fndecl);
      if (lowered)
	node->lowered = true;
      cgraph_new_nodes.safe_push (node);
      break;

    case IPA:
    case IPA_SSA:
    case IPA_SSA_AFTER_INLINING:
    case EXPANSION:
      /* Bring the function into finalized state and enqueue for later
	 analyzing and compilation.  */
      node = cgraph_node::get_create (fndecl);
      node->local.local = false;
      node->definition = true;
      node->force_output = true;
      if (!lowered && symtab->state == EXPANSION)
	{
	  push_cfun (DECL_STRUCT_FUNCTION (fndecl));
	  gimple_register_cfg_hooks ();
	  bitmap_obstack_initialize (NULL);
	  execute_pass_list (cfun, passes->all_lowering_passes);
	  passes->execute_early_local_passes ();
	  bitmap_obstack_release (NULL);
	  pop_cfun ();

	  lowered = true;
	}
      if (lowered)
	node->lowered = true;
      cgraph_new_nodes.safe_push (node);
      break;

    case FINISHED:
      /* At the very end of compilation we have to do all the work up
	 to expansion.  */
      node = cgraph_node::create (fndecl);
      if (lowered)
	node->lowered = true;
      node->definition = true;
      node->analyze ();
      push_cfun (DECL_STRUCT_FUNCTION (fndecl));
      gimple_register_cfg_hooks ();
      bitmap_obstack_initialize (NULL);
      if (!gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
	g->get_passes ()->execute_early_local_passes ();
      bitmap_obstack_release (NULL);
      pop_cfun ();
      node->expand ();
      break;

    default:
      gcc_unreachable ();
    }

  /* Set a personality if required and we already passed EH lowering.  */
  if (lowered
      && (function_needs_eh_personality (DECL_STRUCT_FUNCTION (fndecl))
	  == eh_personality_lang))
    DECL_FUNCTION_PERSONALITY (fndecl) = lang_hooks.eh_personality ();
}
/* Analyze the function scheduled to be output.  */
void
cgraph_node::analyze (void)
{
  tree decl = this->decl;
  location_t saved_loc = input_location;
  input_location = DECL_SOURCE_LOCATION (decl);

  if (thunk.thunk_p)
    {
      cgraph_node *t = cgraph_node::get (thunk.alias);

      create_edge (t, NULL, 0, CGRAPH_FREQ_BASE);
      /* Target code in expand_thunk may need the thunk's target
	 to be analyzed, so recurse here.  */
      if (!t->analyzed)
	t->analyze ();
      if (t->alias)
	{
	  t = t->get_alias_target ();
	  if (!t->analyzed)
	    t->analyze ();
	}
      if (!expand_thunk (false, false))
	{
	  thunk.alias = NULL;
	  return;
	}
      thunk.alias = NULL;
    }
  if (alias)
    resolve_alias (cgraph_node::get (alias_target));
  else if (dispatcher_function)
    {
      /* Generate the dispatcher body of multi-versioned functions.  */
      cgraph_function_version_info *dispatcher_version_info
	= function_version ();
      if (dispatcher_version_info != NULL
	  && (dispatcher_version_info->dispatcher_resolver
	      == NULL_TREE))
	{
	  tree resolver = NULL_TREE;
	  gcc_assert (targetm.generate_version_dispatcher_body);
	  resolver = targetm.generate_version_dispatcher_body (this);
	  gcc_assert (resolver != NULL_TREE);
	}
    }
  else
    {
      push_cfun (DECL_STRUCT_FUNCTION (decl));

      assign_assembler_name_if_neeeded (decl);

      /* Make sure to gimplify bodies only once.  During analyzing a
	 function we lower it, which will require gimplified nested
	 functions, so we can end up here with an already gimplified
	 body.  */
      if (!gimple_has_body_p (decl))
	gimplify_function_tree (decl);

      /* Lower the function.  */
      if (!lowered)
	{
	  if (nested)
	    lower_nested_functions (decl);
	  gcc_assert (!nested);

	  gimple_register_cfg_hooks ();
	  bitmap_obstack_initialize (NULL);
	  execute_pass_list (cfun, g->get_passes ()->all_lowering_passes);
	  free_dominance_info (CDI_POST_DOMINATORS);
	  free_dominance_info (CDI_DOMINATORS);
	  compact_blocks ();
	  bitmap_obstack_release (NULL);
	  lowered = true;
	}

      pop_cfun ();
    }
  analyzed = true;

  input_location = saved_loc;
}
/* The C++ frontend produces same body aliases all over the place, even before
   PCH gets streamed out.  It relies on us linking the aliases with their
   function in order to do the fixups, but ipa-ref is not PCH safe.
   Consequently we first produce aliases without links, but once the C++ FE is
   sure it won't stream PCH we build the links via this function.  */

void
symbol_table::process_same_body_aliases (void)
{
  symtab_node *node;
  FOR_EACH_SYMBOL (node)
    if (node->cpp_implicit_alias && !node->analyzed)
      node->resolve_alias
	(TREE_CODE (node->alias_target) == VAR_DECL
	 ? (symtab_node *)varpool_node::get_create (node->alias_target)
	 : (symtab_node *)cgraph_node::get_create (node->alias_target));
  cpp_implicit_aliases_done = true;
}
/* Process attributes common for vars and functions.  */

static void
process_common_attributes (symtab_node *node, tree decl)
{
  tree weakref = lookup_attribute ("weakref", DECL_ATTRIBUTES (decl));

  if (weakref && !lookup_attribute ("alias", DECL_ATTRIBUTES (decl)))
    {
      warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
		  "%<weakref%> attribute should be accompanied with"
		  " an %<alias%> attribute");
      DECL_WEAK (decl) = 0;
      DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
						 DECL_ATTRIBUTES (decl));
    }

  if (lookup_attribute ("no_reorder", DECL_ATTRIBUTES (decl)))
    node->no_reorder = 1;
}
/* Look for externally_visible and used attributes and mark cgraph nodes
   accordingly.

   We cannot mark the nodes at the point the attributes are processed (in
   handle_*_attribute) because the copy of the declarations available at that
   point may not be canonical.  For example, in:

    void f();
    void f() __attribute__((used));

   the declaration we see in handle_used_attribute will be the second
   declaration -- but the front end will subsequently merge that declaration
   with the original declaration and discard the second declaration.

   Furthermore, we can't mark these nodes in finalize_function because:

    void f() {}
    void f() __attribute__((externally_visible));

   is valid.

   So, we walk the nodes at the end of the translation unit, applying the
   attributes at that point.  */

static void
process_function_and_variable_attributes (cgraph_node *first,
					  varpool_node *first_var)
{
  cgraph_node *node;
  varpool_node *vnode;

  for (node = symtab->first_function (); node != first;
       node = symtab->next_function (node))
    {
      tree decl = node->decl;
      if (DECL_PRESERVE_P (decl))
	node->mark_force_output ();
      else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
	{
	  if (! TREE_PUBLIC (node->decl))
	    warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
			"%<externally_visible%>"
			" attribute have effect only on public objects");
	}
      if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
	  && (node->definition && !node->alias))
	{
	  warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
		      "%<weakref%> attribute ignored"
		      " because function is defined");
	  DECL_WEAK (decl) = 0;
	  DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
						     DECL_ATTRIBUTES (decl));
	}

      if (lookup_attribute ("always_inline", DECL_ATTRIBUTES (decl))
	  && !DECL_DECLARED_INLINE_P (decl)
	  /* redefining extern inline function makes it DECL_UNINLINABLE.  */
	  && !DECL_UNINLINABLE (decl))
	warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
		    "always_inline function might not be inlinable");

      process_common_attributes (node, decl);
    }
  for (vnode = symtab->first_variable (); vnode != first_var;
       vnode = symtab->next_variable (vnode))
    {
      tree decl = vnode->decl;
      if (DECL_EXTERNAL (decl)
	  && DECL_INITIAL (decl))
	varpool_node::finalize_decl (decl);
      if (DECL_PRESERVE_P (decl))
	vnode->force_output = true;
      else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
	{
	  if (! TREE_PUBLIC (vnode->decl))
	    warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
			"%<externally_visible%>"
			" attribute have effect only on public objects");
	}
      if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
	  && vnode->definition
	  && DECL_INITIAL (decl))
	{
	  warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
		      "%<weakref%> attribute ignored"
		      " because variable is initialized");
	  DECL_WEAK (decl) = 0;
	  DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
						     DECL_ATTRIBUTES (decl));
	}
      process_common_attributes (vnode, decl);
    }
}
/* Mark DECL as finalized.  By finalizing the declaration, the frontend
   instructs the middle end to output the variable to the asm file, if needed
   or externally visible.  */

void
varpool_node::finalize_decl (tree decl)
{
  varpool_node *node = varpool_node::get_create (decl);

  gcc_assert (TREE_STATIC (decl) || DECL_EXTERNAL (decl));

  if (node->definition)
    return;
  /* Set definition first before calling notice_global_symbol so that
     it is available to notice_global_symbol.  */
  node->definition = true;
  notice_global_symbol (decl);
  if (TREE_THIS_VOLATILE (decl) || DECL_PRESERVE_P (decl)
      /* Traditionally we do not eliminate static variables when not
	 optimizing and when not doing toplevel reorder.  */
      || node->no_reorder
      || ((!flag_toplevel_reorder
	   && !DECL_COMDAT (node->decl)
	   && !DECL_ARTIFICIAL (node->decl))))
    node->force_output = true;

  if (symtab->state == CONSTRUCTION
      && (node->needed_p () || node->referred_to_p ()))
    enqueue_node (node);
  if (symtab->state >= IPA_SSA)
    node->analyze ();
  /* Some frontends produce various interface variables after compilation
     finished.  */
  if (symtab->state == FINISHED
      || (!flag_toplevel_reorder
	  && symtab->state == EXPANSION))
    node->assemble_decl ();

  if (DECL_INITIAL (decl))
    chkp_register_var_initializer (decl);
}
/* EDGE is a polymorphic call.  Mark all possible targets as reachable
   and if there is only one target, perform trivial devirtualization.
   REACHABLE_CALL_TARGETS collects target lists we already walked to
   avoid duplicate work.  */

static void
walk_polymorphic_call_targets (hash_set<void *> *reachable_call_targets,
			       cgraph_edge *edge)
{
  unsigned int i;
  void *cache_token;
  bool final;
  vec <cgraph_node *>targets
    = possible_polymorphic_call_targets
	(edge, &final, &cache_token);

  if (!reachable_call_targets->add (cache_token))
    {
      if (symtab->dump_file)
	dump_possible_polymorphic_call_targets
	  (symtab->dump_file, edge);

      for (i = 0; i < targets.length (); i++)
	{
	  /* Do not bother to mark virtual methods in anonymous namespace;
	     either we will find use of virtual table defining it, or it is
	     unused.  */
	  if (targets[i]->definition
	      && TREE_CODE
		  (TREE_TYPE (targets[i]->decl))
		  == METHOD_TYPE
	      && !type_in_anonymous_namespace_p
		   (TYPE_METHOD_BASETYPE (TREE_TYPE (targets[i]->decl))))
	    enqueue_node (targets[i]);
	}
    }

  /* Very trivial devirtualization; when the type is
     final or anonymous (so we know all its derivation)
     and there is only one possible virtual call target,
     make the edge direct.  */
  if (final)
    {
      if (targets.length () <= 1 && dbg_cnt (devirt))
	{
	  cgraph_node *target;
	  if (targets.length () == 1)
	    target = targets[0];
	  else
	    target = cgraph_node::create
		       (builtin_decl_implicit (BUILT_IN_UNREACHABLE));

	  if (symtab->dump_file)
	    {
	      fprintf (symtab->dump_file,
		       "Devirtualizing call: ");
	      print_gimple_stmt (symtab->dump_file,
				 edge->call_stmt, 0,
				 TDF_SLIM);
	    }
	  if (dump_enabled_p ())
	    {
	      location_t locus = gimple_location_safe (edge->call_stmt);
	      dump_printf_loc (MSG_OPTIMIZED_LOCATIONS, locus,
			       "devirtualizing call in %s to %s\n",
			       edge->caller->name (), target->name ());
	    }

	  edge->make_direct (target);
	  edge->redirect_call_stmt_to_callee ();

	  /* Call to __builtin_unreachable shouldn't be instrumented.  */
	  if (!targets.length ())
	    gimple_call_set_with_bounds (edge->call_stmt, false);

	  if (symtab->dump_file)
	    {
	      fprintf (symtab->dump_file,
		       "Devirtualized as: ");
	      print_gimple_stmt (symtab->dump_file,
				 edge->call_stmt, 0,
				 TDF_SLIM);
	    }
	}
    }
}
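
/* Illustrative sketch only: a case the trivial devirtualization above can
   catch is

       struct A { virtual int f (); };
       struct B final : A { int f () { return 1; } };
       int g (A *a) { return a->f (); }

   when analysis proves the dynamic type of *a is the final type B, the
   target list contains only B::f and the indirect edge is made direct.  */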
/* Issue appropriate warnings for the global declaration DECL.  */

static void
check_global_declaration (symtab_node *snode)
{
  tree decl = snode->decl;

  /* Warn about any function declared static but not defined.  We don't
     warn about variables, because many programs have static variables
     that exist only to get some text into the object file.  */
  if (TREE_CODE (decl) == FUNCTION_DECL
      && DECL_INITIAL (decl) == 0
      && DECL_EXTERNAL (decl)
      && ! DECL_ARTIFICIAL (decl)
      && ! TREE_NO_WARNING (decl)
      && ! TREE_PUBLIC (decl)
      && (warn_unused_function
	  || snode->referred_to_p (/*include_self=*/false)))
    {
      if (snode->referred_to_p (/*include_self=*/false))
	pedwarn (input_location, 0, "%q+F used but never defined", decl);
      else
	warning (OPT_Wunused_function, "%q+F declared %<static%> but never defined", decl);
      /* This symbol is effectively an "extern" declaration now.  */
      TREE_PUBLIC (decl) = 1;
    }

  /* Warn about static fns or vars defined but not used.  */
  if (((warn_unused_function && TREE_CODE (decl) == FUNCTION_DECL)
       || (((warn_unused_variable && ! TREE_READONLY (decl))
	    || (warn_unused_const_variable && TREE_READONLY (decl)))
	   && TREE_CODE (decl) == VAR_DECL))
      && ! DECL_IN_SYSTEM_HEADER (decl)
      && ! snode->referred_to_p (/*include_self=*/false)
      /* This TREE_USED check is needed in addition to referred_to_p
	 above, because the `__unused__' attribute is not being
	 considered for referred_to_p.  */
      && ! TREE_USED (decl)
      /* The TREE_USED bit for file-scope decls is kept in the identifier,
	 to handle multiple external decls in different scopes.  */
      && ! (DECL_NAME (decl) && TREE_USED (DECL_NAME (decl)))
      && ! DECL_EXTERNAL (decl)
      && ! DECL_ARTIFICIAL (decl)
      && ! DECL_ABSTRACT_ORIGIN (decl)
      && ! TREE_PUBLIC (decl)
      /* A volatile variable might be used in some non-obvious way.  */
      && ! TREE_THIS_VOLATILE (decl)
      /* Global register variables must be declared to reserve them.  */
      && ! (TREE_CODE (decl) == VAR_DECL && DECL_REGISTER (decl))
      /* Global ctors and dtors are called by the runtime.  */
      && (TREE_CODE (decl) != FUNCTION_DECL
	  || (!DECL_STATIC_CONSTRUCTOR (decl)
	      && !DECL_STATIC_DESTRUCTOR (decl)))
      /* Otherwise, ask the language.  */
      && lang_hooks.decls.warn_unused_global (decl))
    warning_at (DECL_SOURCE_LOCATION (decl),
		(TREE_CODE (decl) == FUNCTION_DECL)
		? OPT_Wunused_function
		: (TREE_READONLY (decl)
		   ? OPT_Wunused_const_variable
		   : OPT_Wunused_variable),
		"%qD defined but not used", decl);
}
/* Discover all functions and variables that are trivially needed, analyze
   them as well as all functions and variables referred by them.  */
static cgraph_node *first_analyzed;
static varpool_node *first_analyzed_var;

/* FIRST_TIME is set to TRUE for the first time we are called for a
   translation unit from finalize_compilation_unit() or false
   otherwise.  */

static void
analyze_functions (bool first_time)
{
  /* Keep track of already processed nodes when called multiple times for
     intermodule optimization.  */
  cgraph_node *first_handled = first_analyzed;
  varpool_node *first_handled_var = first_analyzed_var;
  hash_set<void *> reachable_call_targets;

  symtab_node *node;
  symtab_node *next;
  int i;
  ipa_ref *ref;
  bool changed = true;
  location_t saved_loc = input_location;

  bitmap_obstack_initialize (NULL);
  symtab->state = CONSTRUCTION;
  input_location = UNKNOWN_LOCATION;

  /* Ugly, but the fixup cannot happen at the time the same body alias is
     created; the C++ FE is confused about the COMDAT groups being right.  */
  if (symtab->cpp_implicit_aliases_done)
    FOR_EACH_SYMBOL (node)
      if (node->cpp_implicit_alias)
	node->fixup_same_cpp_alias_visibility (node->get_alias_target ());
  build_type_inheritance_graph ();

  /* Analysis adds static variables that in turn add references to new
     functions.  So we need to iterate the process until it stabilizes.  */
  while (changed)
    {
      changed = false;
      process_function_and_variable_attributes (first_analyzed,
						first_analyzed_var);

      /* First identify the trivially needed symbols.  */
      for (node = symtab->first_symbol ();
	   node != first_analyzed
	   && node != first_analyzed_var; node = node->next)
	{
	  /* Convert COMDAT group designators to IDENTIFIER_NODEs.  */
	  node->get_comdat_group_id ();
	  if (node->needed_p ())
	    {
	      enqueue_node (node);
	      if (!changed && symtab->dump_file)
		fprintf (symtab->dump_file, "Trivially needed symbols:");
	      changed = true;
	      if (symtab->dump_file)
		fprintf (symtab->dump_file, " %s", node->asm_name ());
	      if (!changed && symtab->dump_file)
		fprintf (symtab->dump_file, "\n");
	    }
	  if (node == first_analyzed
	      || node == first_analyzed_var)
	    break;
	}
      symtab->process_new_functions ();
      first_analyzed_var = symtab->first_variable ();
      first_analyzed = symtab->first_function ();

      if (changed && symtab->dump_file)
	fprintf (symtab->dump_file, "\n");

      /* Lower representation, build callgraph edges and references for all
	 trivially needed symbols and all symbols referred by them.  */
      while (queued_nodes != &symtab_terminator)
	{
	  changed = true;
	  node = queued_nodes;
	  queued_nodes = (symtab_node *)queued_nodes->aux;
	  cgraph_node *cnode = dyn_cast <cgraph_node *> (node);
	  if (cnode && cnode->definition)
	    {
	      cgraph_edge *edge;
	      tree decl = cnode->decl;

	      /* ??? It is possible to create an extern inline function and
		 later use the weak alias attribute to kill its body.
		 See gcc.c-torture/compile/20011119-1.c  */
	      if (!DECL_STRUCT_FUNCTION (decl)
		  && !cnode->alias
		  && !cnode->thunk.thunk_p
		  && !cnode->dispatcher_function)
		{
		  cnode->reset ();
		  cnode->local.redefined_extern_inline = true;
		  continue;
		}

	      if (!cnode->analyzed)
		cnode->analyze ();

	      for (edge = cnode->callees; edge; edge = edge->next_callee)
		if (edge->callee->definition
		    && (!DECL_EXTERNAL (edge->callee->decl)
			/* When not optimizing, do not try to analyze extern
			   inline functions.  Doing so is pointless.  */
			|| opt_for_fn (edge->callee->decl, optimize)
			/* Weakrefs need to be preserved.  */
			|| edge->callee->alias
			/* always_inline functions are inlined even at -O0.  */
			|| lookup_attribute
				 ("always_inline",
			          DECL_ATTRIBUTES (edge->callee->decl))
			/* Multiversioned functions need the dispatcher to
			   be produced locally even for extern functions.  */
			|| edge->callee->function_version ()))
		  enqueue_node (edge->callee);
	      if (opt_for_fn (cnode->decl, optimize)
		  && opt_for_fn (cnode->decl, flag_devirtualize))
		{
		  cgraph_edge *next;

		  for (edge = cnode->indirect_calls; edge; edge = next)
		    {
		      next = edge->next_callee;
		      if (edge->indirect_info->polymorphic)
			walk_polymorphic_call_targets (&reachable_call_targets,
						       edge);
		    }
		}

	      /* If decl is a clone of an abstract function,
		 mark that abstract function so that we don't release its body.
		 The DECL_INITIAL() of that abstract function declaration
		 will be later needed to output debug info.  */
	      if (DECL_ABSTRACT_ORIGIN (decl))
		{
		  cgraph_node *origin_node
		    = cgraph_node::get_create (DECL_ABSTRACT_ORIGIN (decl));
		  origin_node->used_as_abstract_origin = true;
		}
	    }
	  else
	    {
	      varpool_node *vnode = dyn_cast <varpool_node *> (node);
	      if (vnode && vnode->definition && !vnode->analyzed)
		vnode->analyze ();
	    }

	  if (node->same_comdat_group)
	    {
	      symtab_node *next;
	      for (next = node->same_comdat_group;
		   next != node;
		   next = next->same_comdat_group)
		if (!next->comdat_local_p ())
		  enqueue_node (next);
	    }
	  for (i = 0; node->iterate_reference (i, ref); i++)
	    if (ref->referred->definition
		&& (!DECL_EXTERNAL (ref->referred->decl)
		    || ((TREE_CODE (ref->referred->decl) != FUNCTION_DECL
			 && optimize)
			|| (TREE_CODE (ref->referred->decl) == FUNCTION_DECL
			    && opt_for_fn (ref->referred->decl, optimize))
		    || node->alias
		    || ref->referred->alias)))
	      enqueue_node (ref->referred);
	  symtab->process_new_functions ();
	}
    }
  update_type_inheritance_graph ();

  /* Collect entry points to the unit.  */
  if (symtab->dump_file)
    {
      fprintf (symtab->dump_file, "\n\nInitial ");
      symtab_node::dump_table (symtab->dump_file);
    }

  if (first_time)
    {
      symtab_node *snode;
      FOR_EACH_SYMBOL (snode)
	check_global_declaration (snode);
    }

  if (symtab->dump_file)
    fprintf (symtab->dump_file, "\nRemoving unused symbols:");

  for (node = symtab->first_symbol ();
       node != first_handled
       && node != first_handled_var; node = next)
    {
      next = node->next;
      if (!node->aux && !node->referred_to_p ())
	{
	  if (symtab->dump_file)
	    fprintf (symtab->dump_file, " %s", node->name ());

	  /* See if the debugger can use anything before the DECL
	     passes away.  Perhaps it can notice a DECL that is now a
	     constant and can tag the early DIE with an appropriate
	     attribute.

	     Otherwise, this is the last chance the debug_hooks have
	     at looking at optimized away DECLs, since
	     late_global_decl will subsequently be called from the
	     contents of the now pruned symbol table.  */
	  if (!decl_function_context (node->decl))
	    (*debug_hooks->late_global_decl) (node->decl);

	  node->remove ();
	  continue;
	}
      if (cgraph_node *cnode = dyn_cast <cgraph_node *> (node))
	{
	  tree decl = node->decl;

	  if (cnode->definition && !gimple_has_body_p (decl)
	      && !cnode->alias
	      && !cnode->thunk.thunk_p)
	    cnode->reset ();

	  gcc_assert (!cnode->definition || cnode->thunk.thunk_p
		      || cnode->alias
		      || gimple_has_body_p (decl));
	  gcc_assert (cnode->analyzed == cnode->definition);
	}
      node->aux = NULL;
    }
  for (; node; node = node->next)
    node->aux = NULL;
  first_analyzed = symtab->first_function ();
  first_analyzed_var = symtab->first_variable ();
  if (symtab->dump_file)
    {
      fprintf (symtab->dump_file, "\n\nReclaimed ");
      symtab_node::dump_table (symtab->dump_file);
    }
  bitmap_obstack_release (NULL);
  ggc_collect ();
  /* Initialize assembler name hash, in particular we want to trigger C++
     mangling and same body alias creation before we free DECL_ARGUMENTS
     used by it.  */
  if (!seen_error ())
    symtab->symtab_initialize_asm_name_hash ();

  input_location = saved_loc;
}
/* Translate the ugly representation of aliases as alias pairs into nice
   representation in callgraph.  We don't handle all cases yet,
   unfortunately.  */

static void
handle_alias_pairs (void)
{
  alias_pair *p;
  unsigned i;

  for (i = 0; alias_pairs && alias_pairs->iterate (i, &p);)
    {
      symtab_node *target_node = symtab_node::get_for_asmname (p->target);

      /* Weakrefs with target not defined in current unit are easy to handle:
	 they behave just as external variables except we need to note the
	 alias flag to later output the weakref pseudo op into asm file.  */
      if (!target_node
	  && lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)) != NULL)
	{
	  symtab_node *node = symtab_node::get (p->decl);
	  if (node)
	    {
	      node->alias_target = p->target;
	      node->weakref = true;
	      node->alias = true;
	    }
	  alias_pairs->unordered_remove (i);
	  continue;
	}
      else if (!target_node)
	{
	  error ("%q+D aliased to undefined symbol %qE", p->decl, p->target);
	  symtab_node *node = symtab_node::get (p->decl);
	  if (node)
	    node->alias = false;
	  alias_pairs->unordered_remove (i);
	  continue;
	}

      if (DECL_EXTERNAL (target_node->decl)
	  /* We use local aliases for C++ thunks to force the tailcall
	     to bind locally.  This is a hack - to keep it working do
	     the following (which is not strictly correct).  */
	  && (TREE_CODE (target_node->decl) != FUNCTION_DECL
	      || ! DECL_VIRTUAL_P (target_node->decl))
	  && ! lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)))
	{
	  error ("%q+D aliased to external symbol %qE",
		 p->decl, p->target);
	}

      if (TREE_CODE (p->decl) == FUNCTION_DECL
	  && target_node && is_a <cgraph_node *> (target_node))
	{
	  cgraph_node *src_node = cgraph_node::get (p->decl);
	  if (src_node && src_node->definition)
	    src_node->reset ();
	  cgraph_node::create_alias (p->decl, target_node->decl);
	  alias_pairs->unordered_remove (i);
	}
      else if (TREE_CODE (p->decl) == VAR_DECL
	       && target_node && is_a <varpool_node *> (target_node))
	{
	  varpool_node::create_alias (p->decl, target_node->decl);
	  alias_pairs->unordered_remove (i);
	}
      else
	{
	  error ("%q+D alias in between function and variable is not supported",
		 p->decl);
	  warning (0, "%q+D aliased declaration",
		   target_node->decl);
	  alias_pairs->unordered_remove (i);
	}
    }
  vec_free (alias_pairs);
}
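
/* Illustrative sketch only: an alias such as

       int target_fn (void) { return 0; }
       int alias_fn (void) __attribute__ ((alias ("target_fn")));

   reaches handle_alias_pairs as the pair (decl = alias_fn,
   target = "target_fn") and is turned into a cgraph alias by the
   FUNCTION_DECL branch above; target_fn and alias_fn are made-up names
   used only for this example.  */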
/* Figure out what functions we want to assemble.  */

static void
mark_functions_to_output (void)
{
  bool check_same_comdat_groups = false;
  cgraph_node *node;

  if (flag_checking)
    FOR_EACH_FUNCTION (node)
      gcc_assert (!node->process);

  FOR_EACH_FUNCTION (node)
    {
      tree decl = node->decl;

      gcc_assert (!node->process || node->same_comdat_group);
      if (node->process)
	continue;

      /* We need to output all local functions that are used and not
	 always inlined, as well as those that are reachable from
	 outside the current compilation unit.  */
      if (node->analyzed
	  && !node->thunk.thunk_p
	  && !node->alias
	  && !node->global.inlined_to
	  && !TREE_ASM_WRITTEN (decl)
	  && !DECL_EXTERNAL (decl))
	{
	  node->process = 1;
	  if (node->same_comdat_group)
	    {
	      cgraph_node *next;
	      for (next = dyn_cast<cgraph_node *> (node->same_comdat_group);
		   next != node;
		   next = dyn_cast<cgraph_node *> (next->same_comdat_group))
		if (!next->thunk.thunk_p && !next->alias
		    && !next->comdat_local_p ())
		  next->process = 1;
	    }
	}
      else if (node->same_comdat_group)
	{
	  if (flag_checking)
	    check_same_comdat_groups = true;
	}
      else
	{
	  /* We should've reclaimed all functions that are not needed.  */
	  if (flag_checking
	      && !node->global.inlined_to
	      && gimple_has_body_p (decl)
	      /* FIXME: in an ltrans unit when the offline copy is outside a
		 partition but inline copies are inside a partition, we can
		 end up not removing the body since we no longer have an
		 analyzed node pointing to it.  */
	      && !node->in_other_partition
	      && !node->alias
	      && !node->clones
	      && !DECL_EXTERNAL (decl))
	    {
	      node->debug ();
	      internal_error ("failed to reclaim unneeded function");
	    }
	  gcc_assert (node->global.inlined_to
		      || !gimple_has_body_p (decl)
		      || node->in_other_partition
		      || node->clones
		      || DECL_ARTIFICIAL (decl)
		      || DECL_EXTERNAL (decl));
	}
    }
  if (flag_checking && check_same_comdat_groups)
    FOR_EACH_FUNCTION (node)
      if (node->same_comdat_group && !node->process)
	{
	  tree decl = node->decl;
	  if (!node->global.inlined_to
	      && gimple_has_body_p (decl)
	      /* FIXME: in an ltrans unit when the offline copy is outside a
		 partition but inline copies are inside a partition, we can
		 end up not removing the body since we no longer have an
		 analyzed node pointing to it.  */
	      && !node->in_other_partition
	      && !node->clones
	      && !DECL_EXTERNAL (decl))
	    {
	      node->debug ();
	      internal_error ("failed to reclaim unneeded function in same "
			      "comdat group");
	    }
	}
}
/* DECL is a FUNCTION_DECL.  Initialize datastructures so DECL is a function
   in lowered gimple form.  IN_SSA is true if the gimple is in SSA.

   Set current_function_decl and cfun to the newly constructed empty function
   body.  Return the basic block in the function body.  */

basic_block
init_lowered_empty_function (tree decl, bool in_ssa, gcov_type count)
{
  basic_block bb;
  edge e;

  current_function_decl = decl;
  allocate_struct_function (decl, false);
  gimple_register_cfg_hooks ();
  init_empty_tree_cfg ();

  if (in_ssa)
    {
      init_tree_ssa (cfun);
      init_ssa_operands (cfun);
      cfun->gimple_df->in_ssa_p = true;
      cfun->curr_properties |= PROP_ssa;
    }

  DECL_INITIAL (decl) = make_node (BLOCK);

  DECL_SAVED_TREE (decl) = error_mark_node;
  cfun->curr_properties |= (PROP_gimple_lcf | PROP_gimple_leh | PROP_gimple_any
			    | PROP_cfg | PROP_loops);

  set_loops_for_fn (cfun, ggc_cleared_alloc<loops> ());
  init_loops_structure (cfun, loops_for_fn (cfun), 1);
  loops_for_fn (cfun)->state |= LOOPS_MAY_HAVE_MULTIPLE_LATCHES;

  /* Create BB for body of the function and connect it properly.  */
  ENTRY_BLOCK_PTR_FOR_FN (cfun)->count = count;
  ENTRY_BLOCK_PTR_FOR_FN (cfun)->frequency = REG_BR_PROB_BASE;
  EXIT_BLOCK_PTR_FOR_FN (cfun)->count = count;
  EXIT_BLOCK_PTR_FOR_FN (cfun)->frequency = REG_BR_PROB_BASE;
  bb = create_basic_block (NULL, ENTRY_BLOCK_PTR_FOR_FN (cfun));
  bb->count = count;
  bb->frequency = BB_FREQ_MAX;
  e = make_edge (ENTRY_BLOCK_PTR_FOR_FN (cfun), bb, EDGE_FALLTHRU);
  e->count = count;
  e->probability = REG_BR_PROB_BASE;
  e = make_edge (bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
  e->count = count;
  e->probability = REG_BR_PROB_BASE;
  add_bb_to_loop (bb, ENTRY_BLOCK_PTR_FOR_FN (cfun)->loop_father);

  return bb;
}
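
/* Illustrative sketch only: the function produced above has the minimal CFG

       ENTRY --(fallthru)--> bb --> EXIT

   with all counts set to COUNT and BB left empty, so callers such as
   expand_thunk below can insert statements into BB and grow the CFG from
   there.  */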
/* Adjust PTR by the constant FIXED_OFFSET, and by the vtable
   offset indicated by VIRTUAL_OFFSET, if that is
   non-null.  THIS_ADJUSTING is nonzero for a this adjusting thunk and
   zero for a result adjusting thunk.  */

static tree
thunk_adjust (gimple_stmt_iterator * bsi,
	      tree ptr, bool this_adjusting,
	      HOST_WIDE_INT fixed_offset, tree virtual_offset)
{
  gassign *stmt;
  tree ret;

  if (this_adjusting
      && fixed_offset != 0)
    {
      stmt = gimple_build_assign
	       (ptr, fold_build_pointer_plus_hwi_loc (input_location,
						      ptr,
						      fixed_offset));
      gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
    }

  /* If there's a virtual offset, look up that value in the vtable and
     adjust the pointer again.  */
  if (virtual_offset)
    {
      tree vtabletmp;
      tree vtabletmp2;
      tree vtabletmp3;

      if (!vtable_entry_type)
	{
	  tree vfunc_type = make_node (FUNCTION_TYPE);
	  TREE_TYPE (vfunc_type) = integer_type_node;
	  TYPE_ARG_TYPES (vfunc_type) = NULL_TREE;
	  layout_type (vfunc_type);

	  vtable_entry_type = build_pointer_type (vfunc_type);
	}

      vtabletmp =
	create_tmp_reg (build_pointer_type
			  (build_pointer_type (vtable_entry_type)), "vptr");

      /* The vptr is always at offset zero in the object.  */
      stmt = gimple_build_assign (vtabletmp,
				  build1 (NOP_EXPR, TREE_TYPE (vtabletmp),
					  ptr));
      gsi_insert_after (bsi, stmt, GSI_NEW_STMT);

      /* Form the vtable address.  */
      vtabletmp2 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp)),
				   "vtableaddr");
      stmt = gimple_build_assign (vtabletmp2,
				  build_simple_mem_ref (vtabletmp));
      gsi_insert_after (bsi, stmt, GSI_NEW_STMT);

      /* Find the entry with the vcall offset.  */
      stmt = gimple_build_assign (vtabletmp2,
				  fold_build_pointer_plus_loc (input_location,
							       vtabletmp2,
							       virtual_offset));
      gsi_insert_after (bsi, stmt, GSI_NEW_STMT);

      /* Get the offset itself.  */
      vtabletmp3 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp2)),
				   "vcalloffset");
      stmt = gimple_build_assign (vtabletmp3,
				  build_simple_mem_ref (vtabletmp2));
      gsi_insert_after (bsi, stmt, GSI_NEW_STMT);

      /* Adjust the `this' pointer.  */
      ptr = fold_build_pointer_plus_loc (input_location, ptr, vtabletmp3);
      ptr = force_gimple_operand_gsi (bsi, ptr, true, NULL_TREE, false,
				      GSI_CONTINUE_LINKING);
    }

  if (!this_adjusting
      && fixed_offset != 0)
    /* Adjust the pointer by the constant.  */
    {
      tree ptrtmp;

      if (TREE_CODE (ptr) == VAR_DECL)
	ptrtmp = ptr;
      else
	{
	  ptrtmp = create_tmp_reg (TREE_TYPE (ptr), "ptr");
	  stmt = gimple_build_assign (ptrtmp, ptr);
	  gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
	}
      ptr = fold_build_pointer_plus_hwi_loc (input_location,
					     ptrtmp, fixed_offset);
    }

  /* Emit the statement and gimplify the adjustment expression.  */
  ret = create_tmp_reg (TREE_TYPE (ptr), "adjusted_this");
  stmt = gimple_build_assign (ret, ptr);
  gsi_insert_after (bsi, stmt, GSI_NEW_STMT);

  return ret;
}
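
/* Illustrative sketch only: the adjustment built by thunk_adjust amounts
   roughly to

       ptr += fixed_offset;              (constant part, done first for
					  this-adjusting thunks)
       if (virtual_offset)
	 ptr += *(vptr + virtual_offset); (vcall offset read from the vtable)

   where vptr is the vtable pointer stored at offset zero in the object and
   VIRTUAL_OFFSET is a byte offset into that vtable, as set up with
   vtable_entry_type above.  */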
/* Expand thunk NODE to gimple if possible.
   When FORCE_GIMPLE_THUNK is true, gimple thunk is created and
   no assembler is produced.
   When OUTPUT_ASM_THUNK is true, also produce assembler for
   thunks that are not lowered.  */

bool
cgraph_node::expand_thunk (bool output_asm_thunks, bool force_gimple_thunk)
{
  bool this_adjusting = thunk.this_adjusting;
  HOST_WIDE_INT fixed_offset = thunk.fixed_offset;
  HOST_WIDE_INT virtual_value = thunk.virtual_value;
  tree virtual_offset = NULL;
  tree alias = callees->callee->decl;
  tree thunk_fndecl = decl;
  tree a;

  /* Instrumentation thunk is the same function with
     a different signature.  Never need to expand it.  */
  if (thunk.add_pointer_bounds_args)
    return false;

  if (!force_gimple_thunk && this_adjusting
      && targetm.asm_out.can_output_mi_thunk (thunk_fndecl, fixed_offset,
					      virtual_value, alias))
    {
      const char *fnname;
      tree fn_block;
      tree restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));

      if (!output_asm_thunks)
	{
	  analyzed = true;
	  return false;
	}

      if (in_lto_p)
	get_untransformed_body ();
      a = DECL_ARGUMENTS (thunk_fndecl);

      current_function_decl = thunk_fndecl;

      /* Ensure thunks are emitted in their correct sections.  */
      resolve_unique_section (thunk_fndecl, 0,
			      flag_function_sections);

      DECL_RESULT (thunk_fndecl)
	= build_decl (DECL_SOURCE_LOCATION (thunk_fndecl),
		      RESULT_DECL, 0, restype);
      DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
      fnname = IDENTIFIER_POINTER (DECL_ASSEMBLER_NAME (thunk_fndecl));

      /* The back end expects DECL_INITIAL to contain a BLOCK, so we
	 create one.  */
      fn_block = make_node (BLOCK);
      BLOCK_VARS (fn_block) = a;
      DECL_INITIAL (thunk_fndecl) = fn_block;
      init_function_start (thunk_fndecl);
      cfun->is_thunk = 1;
      insn_locations_init ();
      set_curr_insn_location (DECL_SOURCE_LOCATION (thunk_fndecl));
      prologue_location = curr_insn_location ();
      assemble_start_function (thunk_fndecl, fnname);

      targetm.asm_out.output_mi_thunk (asm_out_file, thunk_fndecl,
				       fixed_offset, virtual_value, alias);

      assemble_end_function (thunk_fndecl, fnname);
      insn_locations_finalize ();
      init_insn_lengths ();
      free_after_compilation (cfun);
      set_cfun (NULL);
      TREE_ASM_WRITTEN (thunk_fndecl) = 1;
      thunk.thunk_p = false;
      analyzed = false;
    }
  else if (stdarg_p (TREE_TYPE (thunk_fndecl)))
    {
      error ("generic thunk code fails for method %qD which uses %<...%>",
	     thunk_fndecl);
      TREE_ASM_WRITTEN (thunk_fndecl) = 1;
      analyzed = true;
      return false;
    }
  else
    {
      tree restype;
      basic_block bb, then_bb, else_bb, return_bb;
      gimple_stmt_iterator bsi;
      int nargs = 0;
      tree arg;
      int i;
      tree resdecl;
      tree restmp = NULL;
      tree resbnd = NULL;

      gcall *call;
      greturn *ret;
      bool alias_is_noreturn = TREE_THIS_VOLATILE (alias);

      if (in_lto_p)
	get_untransformed_body ();
      a = DECL_ARGUMENTS (thunk_fndecl);

      current_function_decl = thunk_fndecl;

      /* Ensure thunks are emitted in their correct sections.  */
      resolve_unique_section (thunk_fndecl, 0,
			      flag_function_sections);

      DECL_IGNORED_P (thunk_fndecl) = 1;
      bitmap_obstack_initialize (NULL);

      if (thunk.virtual_offset_p)
	virtual_offset = size_int (virtual_value);

      /* Build the return declaration for the function.  */
      restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
      if (DECL_RESULT (thunk_fndecl) == NULL_TREE)
	{
	  resdecl = build_decl (input_location, RESULT_DECL, 0, restype);
	  DECL_ARTIFICIAL (resdecl) = 1;
	  DECL_IGNORED_P (resdecl) = 1;
	  DECL_RESULT (thunk_fndecl) = resdecl;
	  DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
	}
      else
	resdecl = DECL_RESULT (thunk_fndecl);

      bb = then_bb = else_bb = return_bb
	= init_lowered_empty_function (thunk_fndecl, true, count);

      bsi = gsi_start_bb (bb);

      /* Build call to the function being thunked.  */
      if (!VOID_TYPE_P (restype) && !alias_is_noreturn)
	{
	  if (DECL_BY_REFERENCE (resdecl))
	    {
	      restmp = gimple_fold_indirect_ref (resdecl);
	      if (!restmp)
		restmp = build2 (MEM_REF,
				 TREE_TYPE (TREE_TYPE (DECL_RESULT (alias))),
				 resdecl,
				 build_int_cst (TREE_TYPE
						(DECL_RESULT (alias)), 0));
	    }
	  else if (!is_gimple_reg_type (restype))
	    {
	      if (aggregate_value_p (resdecl, TREE_TYPE (thunk_fndecl)))
		{
		  restmp = resdecl;

		  if (TREE_CODE (restmp) == VAR_DECL)
		    add_local_decl (cfun, restmp);
		  BLOCK_VARS (DECL_INITIAL (current_function_decl)) = restmp;
		}
	      else
		restmp = create_tmp_var (restype, "retval");
	    }
	  else
	    restmp = create_tmp_reg (restype, "retval");
	}

      for (arg = a; arg; arg = DECL_CHAIN (arg))
	nargs++;
      auto_vec<tree> vargs (nargs);
      i = 0;
      arg = a;
      if (this_adjusting)
	{
	  vargs.quick_push (thunk_adjust (&bsi, a, 1, fixed_offset,
					  virtual_offset));
	  arg = DECL_CHAIN (a);
	  i = 1;
	}

      if (nargs)
	for (; i < nargs; i++, arg = DECL_CHAIN (arg))
	  {
	    tree tmp = arg;
	    if (!is_gimple_val (arg))
	      {
		tmp = create_tmp_reg (TYPE_MAIN_VARIANT
				      (TREE_TYPE (arg)), "arg");
		gimple *stmt = gimple_build_assign (tmp, arg);
		gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
	      }
	    vargs.quick_push (tmp);
	  }
      call = gimple_build_call_vec (build_fold_addr_expr_loc (0, alias), vargs);
      callees->call_stmt = call;
      gimple_call_set_from_thunk (call, true);
      gimple_call_set_with_bounds (call, instrumentation_clone);

      /* Return slot optimization is always possible and in fact required to
	 return values with DECL_BY_REFERENCE.  */
      if (aggregate_value_p (resdecl, TREE_TYPE (thunk_fndecl))
	  && (!is_gimple_reg_type (TREE_TYPE (resdecl))
	      || DECL_BY_REFERENCE (resdecl)))
	gimple_call_set_return_slot_opt (call, true);

      if (restmp && !alias_is_noreturn)
	{
	  gimple_call_set_lhs (call, restmp);
	  gcc_assert (useless_type_conversion_p (TREE_TYPE (restmp),
						 TREE_TYPE (TREE_TYPE (alias))));
	}
      gsi_insert_after (&bsi, call, GSI_NEW_STMT);
      if (!alias_is_noreturn)
	{
	  if (instrumentation_clone
	      && !DECL_BY_REFERENCE (resdecl)
	      && restmp
	      && BOUNDED_P (restmp))
	    {
	      resbnd = chkp_insert_retbnd_call (NULL, restmp, &bsi);
	      create_edge (get_create (gimple_call_fndecl (gsi_stmt (bsi))),
			   as_a <gcall *> (gsi_stmt (bsi)),
			   callees->count, callees->frequency);
	    }

	  if (restmp && !this_adjusting
	      && (fixed_offset || virtual_offset))
	    {
	      tree true_label = NULL_TREE;

	      if (TREE_CODE (TREE_TYPE (restmp)) == POINTER_TYPE)
		{
		  gimple *stmt;
		  edge e;
		  /* If the return type is a pointer, we need to
		     protect against NULL.  We know there will be an
		     adjustment, because that's why we're emitting a
		     thunk.  */
		  then_bb = create_basic_block (NULL, bb);
		  then_bb->count = count - count / 16;
		  then_bb->frequency = BB_FREQ_MAX - BB_FREQ_MAX / 16;
		  return_bb = create_basic_block (NULL, then_bb);
		  return_bb->count = count;
		  return_bb->frequency = BB_FREQ_MAX;
		  else_bb = create_basic_block (NULL, else_bb);
		  then_bb->count = count / 16;
		  then_bb->frequency = BB_FREQ_MAX / 16;
		  add_bb_to_loop (then_bb, bb->loop_father);
		  add_bb_to_loop (return_bb, bb->loop_father);
		  add_bb_to_loop (else_bb, bb->loop_father);
		  remove_edge (single_succ_edge (bb));
		  true_label = gimple_block_label (then_bb);
		  stmt = gimple_build_cond (NE_EXPR, restmp,
					    build_zero_cst (TREE_TYPE (restmp)),
					    NULL_TREE, NULL_TREE);
		  gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
		  e = make_edge (bb, then_bb, EDGE_TRUE_VALUE);
		  e->probability = REG_BR_PROB_BASE - REG_BR_PROB_BASE / 16;
		  e->count = count - count / 16;
		  e = make_edge (bb, else_bb, EDGE_FALSE_VALUE);
		  e->probability = REG_BR_PROB_BASE / 16;
		  e->count = count / 16;
		  e = make_edge (return_bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
		  e->probability = REG_BR_PROB_BASE;
		  e->count = count;
		  e = make_edge (then_bb, return_bb, EDGE_FALLTHRU);
		  e->probability = REG_BR_PROB_BASE;
		  e->count = count - count / 16;
		  e = make_edge (else_bb, return_bb, EDGE_FALLTHRU);
		  e->probability = REG_BR_PROB_BASE;
		  e->count = count / 16;
		  bsi = gsi_last_bb (then_bb);
		}

	      restmp = thunk_adjust (&bsi, restmp, /*this_adjusting=*/0,
				     fixed_offset, virtual_offset);
	      if (true_label)
		{
		  gimple *stmt;
		  bsi = gsi_last_bb (else_bb);
		  stmt = gimple_build_assign (restmp,
					      build_zero_cst (TREE_TYPE (restmp)));
		  gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
		  bsi = gsi_last_bb (return_bb);
		}
	    }
	  else
	    gimple_call_set_tail (call, true);

	  /* Build return value.  */
	  if (!DECL_BY_REFERENCE (resdecl))
	    ret = gimple_build_return (restmp);
	  else
	    ret = gimple_build_return (resdecl);
	  gimple_return_set_retbnd (ret, resbnd);

	  gsi_insert_after (&bsi, ret, GSI_NEW_STMT);
	}
      else
	{
	  gimple_call_set_tail (call, true);
	  remove_edge (single_succ_edge (bb));
	}

      cfun->gimple_df->in_ssa_p = true;
      profile_status_for_fn (cfun)
	= count ? PROFILE_READ : PROFILE_GUESSED;
      /* FIXME: C++ FE should stop setting TREE_ASM_WRITTEN on thunks.  */
      TREE_ASM_WRITTEN (thunk_fndecl) = false;
      delete_unreachable_blocks ();
      update_ssa (TODO_update_ssa);
      checking_verify_flow_info ();
      free_dominance_info (CDI_DOMINATORS);

      /* Since we want to emit the thunk, we explicitly mark its name as
	 referenced.  */
      thunk.thunk_p = false;
      lowered = true;
      bitmap_obstack_release (NULL);
    }
  current_function_decl = NULL;
  set_cfun (NULL);
  return true;
}
1899 /* Assemble thunks and aliases associated to node. */
1901 void
1902 cgraph_node::assemble_thunks_and_aliases (void)
1904 cgraph_edge *e;
1905 ipa_ref *ref;
1907 for (e = callers; e;)
1908 if (e->caller->thunk.thunk_p
1909 && !e->caller->thunk.add_pointer_bounds_args)
1911 cgraph_node *thunk = e->caller;
1913 e = e->next_caller;
1914 thunk->expand_thunk (true, false);
1915 thunk->assemble_thunks_and_aliases ();
1917 else
1918 e = e->next_caller;
1920 FOR_EACH_ALIAS (this, ref)
1922 cgraph_node *alias = dyn_cast <cgraph_node *> (ref->referring);
1923 bool saved_written = TREE_ASM_WRITTEN (decl);
1925 /* Force assemble_alias to really output the alias this time instead
1926 of buffering it in the same alias pairs. */
1927 TREE_ASM_WRITTEN (decl) = 1;
1928 do_assemble_alias (alias->decl,
1929 DECL_ASSEMBLER_NAME (decl));
1930 alias->assemble_thunks_and_aliases ();
1931 TREE_ASM_WRITTEN (decl) = saved_written;
1935 /* Expand function specified by node. */
1937 void
1938 cgraph_node::expand (void)
1940 location_t saved_loc;
1942 /* We ought not to compile any inline clones. */
1943 gcc_assert (!global.inlined_to);
1945 announce_function (decl);
1946 process = 0;
1947 gcc_assert (lowered);
1948 get_untransformed_body ();
1950 /* Generate RTL for the body of DECL. */
1952 timevar_push (TV_REST_OF_COMPILATION);
1954 gcc_assert (symtab->global_info_ready);
1956 /* Initialize the default bitmap obstack. */
1957 bitmap_obstack_initialize (NULL);
1959 /* Initialize the RTL code for the function. */
1960 current_function_decl = decl;
1961 saved_loc = input_location;
1962 input_location = DECL_SOURCE_LOCATION (decl);
1963 init_function_start (decl);
1965 gimple_register_cfg_hooks ();
1967 bitmap_obstack_initialize (&reg_obstack); /* FIXME, only at RTL generation*/
1969 execute_all_ipa_transforms ();
1971 /* Perform all tree transforms and optimizations. */
1973 /* Signal the start of passes. */
1974 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_START, NULL);
1976 execute_pass_list (cfun, g->get_passes ()->all_passes);
1978 /* Signal the end of passes. */
1979 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_END, NULL);
1981 bitmap_obstack_release (&reg_obstack);
1983 /* Release the default bitmap obstack. */
1984 bitmap_obstack_release (NULL);
1986 /* If requested, warn about function definitions where the function will
1987 return a value (usually of some struct or union type) which itself will
1988 take up a lot of stack space. */
1989 if (warn_larger_than && !DECL_EXTERNAL (decl) && TREE_TYPE (decl))
1991 tree ret_type = TREE_TYPE (TREE_TYPE (decl));
1993 if (ret_type && TYPE_SIZE_UNIT (ret_type)
1994 && TREE_CODE (TYPE_SIZE_UNIT (ret_type)) == INTEGER_CST
1995 && 0 < compare_tree_int (TYPE_SIZE_UNIT (ret_type),
1996 larger_than_size))
1998 unsigned int size_as_int
1999 = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (ret_type));
2001 if (compare_tree_int (TYPE_SIZE_UNIT (ret_type), size_as_int) == 0)
2002 warning (OPT_Wlarger_than_, "size of return value of %q+D is %u bytes",
2003 decl, size_as_int);
2004 else
2005 warning (OPT_Wlarger_than_, "size of return value of %q+D is larger than %wd bytes",
2006 decl, larger_than_size);
2010 gimple_set_body (decl, NULL);
2011 if (DECL_STRUCT_FUNCTION (decl) == 0
2012 && !cgraph_node::get (decl)->origin)
2014 /* Stop pointing to the local nodes about to be freed.
2015 But DECL_INITIAL must remain nonzero so we know this
2016 was an actual function definition.
2017 For a nested function, this is done in c_pop_function_context.
2018 If rest_of_compilation set this to 0, leave it 0. */
2019 if (DECL_INITIAL (decl) != 0)
2020 DECL_INITIAL (decl) = error_mark_node;
2023 input_location = saved_loc;
2025 ggc_collect ();
2026 timevar_pop (TV_REST_OF_COMPILATION);
2028 /* Make sure that the back end didn't give up on compiling. */
2029 gcc_assert (TREE_ASM_WRITTEN (decl));
2030 set_cfun (NULL);
2031 current_function_decl = NULL;
2033 /* It would make a lot more sense to output thunks before the function body to get
2034 more forward and fewer backward jumps. This however would require solving a problem
2035 with comdats. See PR48668. Also aliases must come after the function itself to
2036 make one-pass assemblers, like the one on AIX, happy. See PR 50689.
2037 FIXME: Perhaps thunks should be moved before the function IFF they are not in comdat
2038 groups. */
2039 assemble_thunks_and_aliases ();
2040 release_body ();
2041 /* Eliminate all call edges. This is important so the GIMPLE_CALL no longer
2042 points to the dead function body. */
2043 remove_callees ();
2044 remove_all_references ();
2047 /* Node comparer responsible for producing the order that corresponds
2048 to the time when a function was first executed. */
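   /* Used via qsort from expand_all_functions when
      -fprofile-reorder-functions is in effect.  */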
2050 static int
2051 node_cmp (const void *pa, const void *pb)
2053 const cgraph_node *a = *(const cgraph_node * const *) pa;
2054 const cgraph_node *b = *(const cgraph_node * const *) pb;
2056 /* Functions with a time profile must come before those without one. */
2057 if (!a->tp_first_run || !b->tp_first_run)
2058 return a->tp_first_run - b->tp_first_run;
2060 return a->tp_first_run != b->tp_first_run
2061 ? b->tp_first_run - a->tp_first_run
2062 : b->order - a->order;
2065 /* Expand all functions that must be output.
2067 Attempt to topologically sort the nodes so a function is output when
2068 all called functions are already assembled to allow data to be
2069 propagated across the callgraph. Use a stack to get smaller distance
2070 between a function and its callees (later we may choose to use a more
2071 sophisticated algorithm for function reordering; we will likely want
2072 to use subsections to make the output functions appear in top-down
2073 order). */
2075 static void
2076 expand_all_functions (void)
2078 cgraph_node *node;
2079 cgraph_node **order = XCNEWVEC (cgraph_node *,
2080 symtab->cgraph_count);
2081 unsigned int expanded_func_count = 0, profiled_func_count = 0;
2082 int order_pos, new_order_pos = 0;
2083 int i;
2085 order_pos = ipa_reverse_postorder (order);
2086 gcc_assert (order_pos == symtab->cgraph_count);
2088 /* The garbage collector may remove inline clones we eliminate during
2089 optimization, so we must be sure not to reference them. */
2090 for (i = 0; i < order_pos; i++)
2091 if (order[i]->process)
2092 order[new_order_pos++] = order[i];
2094 if (flag_profile_reorder_functions)
2095 qsort (order, new_order_pos, sizeof (cgraph_node *), node_cmp);
2097 for (i = new_order_pos - 1; i >= 0; i--)
2099 node = order[i];
2101 if (node->process)
2103 expanded_func_count++;
2104 if(node->tp_first_run)
2105 profiled_func_count++;
2107 if (symtab->dump_file)
2108 fprintf (symtab->dump_file,
2109 "Time profile order in expand_all_functions:%s:%d\n",
2110 node->asm_name (), node->tp_first_run);
2111 node->process = 0;
2112 node->expand ();
2116 if (dump_file)
2117 fprintf (dump_file, "Expanded functions with time profile (%s):%u/%u\n",
2118 main_input_filename, profiled_func_count, expanded_func_count);
2120 if (symtab->dump_file && flag_profile_reorder_functions)
2121 fprintf (symtab->dump_file, "Expanded functions with time profile:%u/%u\n",
2122 profiled_func_count, expanded_func_count);
2124 symtab->process_new_functions ();
2125 free_gimplify_stack ();
2127 free (order);
2130 /* This is used to sort the node types by the cgraph order number. */
2132 enum cgraph_order_sort_kind
2134 ORDER_UNDEFINED = 0,
2135 ORDER_FUNCTION,
2136 ORDER_VAR,
2137 ORDER_ASM
2140 struct cgraph_order_sort
2142 enum cgraph_order_sort_kind kind;
2143 union
2145 cgraph_node *f;
2146 varpool_node *v;
2147 asm_node *a;
2148 } u;
2151 /* Output all functions, variables, and asm statements in the order
2152 according to their order fields, which is the order in which they
2153 appeared in the file. This implements -fno-toplevel-reorder. In
2154 this mode we may output functions and variables which don't really
2155 need to be output.
2156 When NO_REORDER is true only do this for symbols marked no reorder. */
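   /* For example (an illustrative sketch): with -fno-toplevel-reorder a unit

	int a;
	asm ("# boundary");
	void f (void) {}
	int b;

      is emitted as A, the asm, F, B, preserving the source order.  */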
2158 static void
2159 output_in_order (bool no_reorder)
2161 int max;
2162 cgraph_order_sort *nodes;
2163 int i;
2164 cgraph_node *pf;
2165 varpool_node *pv;
2166 asm_node *pa;
2167 max = symtab->order;
2168 nodes = XCNEWVEC (cgraph_order_sort, max);
2170 FOR_EACH_DEFINED_FUNCTION (pf)
2172 if (pf->process && !pf->thunk.thunk_p && !pf->alias)
2174 if (no_reorder && !pf->no_reorder)
2175 continue;
2176 i = pf->order;
2177 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
2178 nodes[i].kind = ORDER_FUNCTION;
2179 nodes[i].u.f = pf;
2183 FOR_EACH_DEFINED_VARIABLE (pv)
2184 if (!DECL_EXTERNAL (pv->decl))
2186 if (no_reorder && !pv->no_reorder)
2187 continue;
2188 i = pv->order;
2189 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
2190 nodes[i].kind = ORDER_VAR;
2191 nodes[i].u.v = pv;
2194 for (pa = symtab->first_asm_symbol (); pa; pa = pa->next)
2196 i = pa->order;
2197 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
2198 nodes[i].kind = ORDER_ASM;
2199 nodes[i].u.a = pa;
2202 /* In toplevel reorder mode we output all statics; mark them as needed. */
2204 for (i = 0; i < max; ++i)
2205 if (nodes[i].kind == ORDER_VAR)
2206 nodes[i].u.v->finalize_named_section_flags ();
2208 for (i = 0; i < max; ++i)
2210 switch (nodes[i].kind)
2212 case ORDER_FUNCTION:
2213 nodes[i].u.f->process = 0;
2214 nodes[i].u.f->expand ();
2215 break;
2217 case ORDER_VAR:
2218 nodes[i].u.v->assemble_decl ();
2219 break;
2221 case ORDER_ASM:
2222 assemble_asm (nodes[i].u.a->asm_str);
2223 break;
2225 case ORDER_UNDEFINED:
2226 break;
2228 default:
2229 gcc_unreachable ();
2233 symtab->clear_asm_symbols ();
2235 free (nodes);
2238 static void
2239 ipa_passes (void)
2241 gcc::pass_manager *passes = g->get_passes ();
2243 set_cfun (NULL);
2244 current_function_decl = NULL;
2245 gimple_register_cfg_hooks ();
2246 bitmap_obstack_initialize (NULL);
2248 invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_START, NULL);
2250 if (!in_lto_p)
2252 execute_ipa_pass_list (passes->all_small_ipa_passes);
2253 if (seen_error ())
2254 return;
2257 /* This extra symtab_remove_unreachable_nodes pass tends to catch some
2258 devirtualization and other cases where removal needs to iterate. */
2259 symtab->remove_unreachable_nodes (symtab->dump_file);
2261 /* If pass_all_early_optimizations was not scheduled, the state of
2262 the cgraph will not be properly updated. Update it now. */
2263 if (symtab->state < IPA_SSA)
2264 symtab->state = IPA_SSA;
2266 if (!in_lto_p)
2268 /* Generate coverage variables and constructors. */
2269 coverage_finish ();
2271 /* Process new functions added. */
2272 set_cfun (NULL);
2273 current_function_decl = NULL;
2274 symtab->process_new_functions ();
2276 execute_ipa_summary_passes
2277 ((ipa_opt_pass_d *) passes->all_regular_ipa_passes);
2280 /* Some targets need to handle LTO assembler output specially. */
2281 if (flag_generate_lto || flag_generate_offload)
2282 targetm.asm_out.lto_start ();
2284 if (!in_lto_p)
2286 if (g->have_offload)
2288 section_name_prefix = OFFLOAD_SECTION_NAME_PREFIX;
2289 lto_stream_offload_p = true;
2290 ipa_write_summaries ();
2291 lto_stream_offload_p = false;
2293 if (flag_lto)
2295 section_name_prefix = LTO_SECTION_NAME_PREFIX;
2296 lto_stream_offload_p = false;
2297 ipa_write_summaries ();
2301 if (flag_generate_lto || flag_generate_offload)
2302 targetm.asm_out.lto_end ();
2304 if (!flag_ltrans && (in_lto_p || !flag_lto || flag_fat_lto_objects))
2305 execute_ipa_pass_list (passes->all_regular_ipa_passes);
2306 invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_END, NULL);
2308 bitmap_obstack_release (NULL);
2312 /* Return the symbol name (as an identifier) that the alias DECL is an alias of. */
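   /* For instance (illustrative), for a declaration such as

	void f (void) __attribute__ ((alias ("g")));

      this returns the identifier "g".  */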
2314 static tree
2315 get_alias_symbol (tree decl)
2317 tree alias = lookup_attribute ("alias", DECL_ATTRIBUTES (decl));
2318 return get_identifier (TREE_STRING_POINTER
2319 (TREE_VALUE (TREE_VALUE (alias))));
2323 /* Weakrefs may be associated with external decls and thus not output
2324 at expansion time. Emit all necessary aliases. */
2326 void
2327 symbol_table::output_weakrefs (void)
2329 symtab_node *node;
2330 cgraph_node *cnode;
2331 FOR_EACH_SYMBOL (node)
2332 if (node->alias
2333 && !TREE_ASM_WRITTEN (node->decl)
2334 && (!(cnode = dyn_cast <cgraph_node *> (node))
2335 || !cnode->instrumented_version
2336 || !TREE_ASM_WRITTEN (cnode->instrumented_version->decl))
2337 && node->weakref)
2339 tree target;
2341 /* Weakrefs are special in not requiring a target definition in the current
2342 compilation unit. It is thus a bit hard to work out what we want to
2343 alias.
2344 When the alias target is defined, we need to fetch it from the symtab
2345 reference; otherwise it is pointed to by alias_target. */
2346 if (node->alias_target)
2347 target = (DECL_P (node->alias_target)
2348 ? DECL_ASSEMBLER_NAME (node->alias_target)
2349 : node->alias_target);
2350 else if (node->analyzed)
2351 target = DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl);
2352 else
2354 gcc_unreachable ();
2355 target = get_alias_symbol (node->decl);
2357 do_assemble_alias (node->decl, target);
2361 /* Perform simple optimizations based on callgraph. */
2363 void
2364 symbol_table::compile (void)
2366 if (seen_error ())
2367 return;
2369 symtab_node::checking_verify_symtab_nodes ();
2371 timevar_push (TV_CGRAPHOPT);
2372 if (pre_ipa_mem_report)
2374 fprintf (stderr, "Memory consumption before IPA\n");
2375 dump_memory_report (false);
2377 if (!quiet_flag)
2378 fprintf (stderr, "Performing interprocedural optimizations\n");
2379 state = IPA;
2381 /* Offloading requires LTO infrastructure. */
2382 if (!in_lto_p && g->have_offload)
2383 flag_generate_offload = 1;
2385 /* If LTO is enabled, initialize the streamer hooks needed by GIMPLE. */
2386 if (flag_generate_lto || flag_generate_offload)
2387 lto_streamer_hooks_init ();
2389 /* Don't run the IPA passes if there were any errors or sorry messages. */
2390 if (!seen_error ())
2391 ipa_passes ();
2393 /* Do nothing else if any IPA pass found errors or if we are just streaming LTO. */
2394 if (seen_error ()
2395 || (!in_lto_p && flag_lto && !flag_fat_lto_objects))
2397 timevar_pop (TV_CGRAPHOPT);
2398 return;
2401 global_info_ready = true;
2402 if (dump_file)
2404 fprintf (dump_file, "Optimized ");
2405 symtab_node::dump_table (dump_file);
2407 if (post_ipa_mem_report)
2409 fprintf (stderr, "Memory consumption after IPA\n");
2410 dump_memory_report (false);
2412 timevar_pop (TV_CGRAPHOPT);
2414 /* Output everything. */
2415 (*debug_hooks->assembly_start) ();
2416 if (!quiet_flag)
2417 fprintf (stderr, "Assembling functions:\n");
2418 symtab_node::checking_verify_symtab_nodes ();
2420 materialize_all_clones ();
2421 bitmap_obstack_initialize (NULL);
2422 execute_ipa_pass_list (g->get_passes ()->all_late_ipa_passes);
2423 bitmap_obstack_release (NULL);
2424 mark_functions_to_output ();
2426 /* When weakref support is missing, we automatically translate all
2427 references to NODE to references to its ultimate alias target.
2428 The renaming mechanism uses the flag IDENTIFIER_TRANSPARENT_ALIAS and
2429 TREE_CHAIN.
2431 Set up this mapping before we output any assembler, but once we are sure
2432 that all symbol renaming is done.
2434 FIXME: All this ugliness can go away if we just do renaming at the gimple
2435 level by physically rewriting the IL. At the moment we can only redirect
2436 calls, so we need infrastructure for renaming references as well. */
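/* For example (illustrative), for a declaration such as

     static int f (void) __attribute__ ((weakref ("target")));

   a target lacking ASM_OUTPUT_WEAKREF will have every reference to F
   redirected to TARGET through the transparent-alias chain set up below.  */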
2437 #ifndef ASM_OUTPUT_WEAKREF
2438 symtab_node *node;
2440 FOR_EACH_SYMBOL (node)
2441 if (node->alias
2442 && lookup_attribute ("weakref", DECL_ATTRIBUTES (node->decl)))
2444 IDENTIFIER_TRANSPARENT_ALIAS
2445 (DECL_ASSEMBLER_NAME (node->decl)) = 1;
2446 TREE_CHAIN (DECL_ASSEMBLER_NAME (node->decl))
2447 = (node->alias_target ? node->alias_target
2448 : DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl));
2450 #endif
2452 state = EXPANSION;
2454 if (!flag_toplevel_reorder)
2455 output_in_order (false);
2456 else
2458 /* First output asm statements and anything ordered. The process
2459 flag is cleared for these nodes, so we skip them later. */
2460 output_in_order (true);
2461 expand_all_functions ();
2462 output_variables ();
2465 process_new_functions ();
2466 state = FINISHED;
2467 output_weakrefs ();
2469 if (dump_file)
2471 fprintf (dump_file, "\nFinal ");
2472 symtab_node::dump_table (dump_file);
2474 if (!flag_checking)
2475 return;
2476 symtab_node::verify_symtab_nodes ();
2477 /* Double check that all inline clones are gone and that all
2478 function bodies have been released from memory. */
2479 if (!seen_error ())
2481 cgraph_node *node;
2482 bool error_found = false;
2484 FOR_EACH_DEFINED_FUNCTION (node)
2485 if (node->global.inlined_to
2486 || gimple_has_body_p (node->decl))
2488 error_found = true;
2489 node->debug ();
2491 if (error_found)
2492 internal_error ("nodes with unreleased memory found");
2497 /* Analyze the whole compilation unit once it is parsed completely. */
2499 void
2500 symbol_table::finalize_compilation_unit (void)
2502 timevar_push (TV_CGRAPH);
2504 /* If we're here there is no current function anymore. Some frontends
2505 are lazy about clearing these. */
2506 current_function_decl = NULL;
2507 set_cfun (NULL);
2509 /* Do not skip analyzing the functions if there were errors; otherwise
2510 we miss diagnostics for the following functions. */
2512 /* Emit size functions we didn't inline. */
2513 finalize_size_functions ();
2515 /* Mark alias targets necessary and emit diagnostics. */
2516 handle_alias_pairs ();
2518 if (!quiet_flag)
2520 fprintf (stderr, "\nAnalyzing compilation unit\n");
2521 fflush (stderr);
2524 if (flag_dump_passes)
2525 dump_passes ();
2527 /* Gimplify and lower all functions, compute reachability and
2528 remove unreachable nodes. */
2529 analyze_functions (/*first_time=*/true);
2531 /* Mark alias targets necessary and emit diagnostics. */
2532 handle_alias_pairs ();
2534 /* Gimplify and lower thunks. */
2535 analyze_functions (/*first_time=*/false);
2537 if (!seen_error ())
2539 /* Emit early debug for reachable functions, and by consequence,
2540 locally scoped symbols. */
2541 struct cgraph_node *cnode;
2542 FOR_EACH_FUNCTION_WITH_GIMPLE_BODY (cnode)
2543 (*debug_hooks->early_global_decl) (cnode->decl);
2545 /* Clean up anything that needs cleaning up after initial debug
2546 generation. */
2547 (*debug_hooks->early_finish) ();
2550 /* Finally drive the pass manager. */
2551 compile ();
2553 timevar_pop (TV_CGRAPH);
2556 /* Reset all state within cgraphunit.c so that we can rerun the compiler
2557 within the same process. For use by toplev::finalize. */
2559 void
2560 cgraphunit_c_finalize (void)
2562 gcc_assert (cgraph_new_nodes.length () == 0);
2563 cgraph_new_nodes.truncate (0);
2565 vtable_entry_type = NULL;
2566 queued_nodes = &symtab_terminator;
2568 first_analyzed = NULL;
2569 first_analyzed_var = NULL;
2572 /* Create a wrapper from this cgraph_node to the TARGET node. A thunk is used
2573 for this kind of wrapper method. */
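   /* This is used, for instance, by IPA identical code folding (ipa-icf) when a
      function is proven semantically equivalent to TARGET and is turned into a
      thin forwarding thunk.  */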
2575 void
2576 cgraph_node::create_wrapper (cgraph_node *target)
2578 /* Preserve DECL_RESULT so we get the right by-reference flag. */
2579 tree decl_result = DECL_RESULT (decl);
2581 /* Remove the function's body but keep its arguments to be reused
2582 for the thunk. */
2583 release_body (true);
2584 reset ();
2586 DECL_UNINLINABLE (decl) = false;
2587 DECL_RESULT (decl) = decl_result;
2588 DECL_INITIAL (decl) = NULL;
2589 allocate_struct_function (decl, false);
2590 set_cfun (NULL);
2592 /* Turn the alias into a thunk and expand it into GIMPLE representation. */
2593 definition = true;
2595 memset (&thunk, 0, sizeof (cgraph_thunk_info));
2596 thunk.thunk_p = true;
2597 create_edge (target, NULL, count, CGRAPH_FREQ_BASE);
2598 callees->can_throw_external = !TREE_NOTHROW (target->decl);
2600 tree arguments = DECL_ARGUMENTS (decl);
2602 while (arguments)
2604 TREE_ADDRESSABLE (arguments) = false;
2605 arguments = TREE_CHAIN (arguments);
2608 expand_thunk (false, true);
2610 /* Inline summary set-up. */
2611 analyze ();
2612 inline_analyze_function (this);
2615 #include "gt-cgraphunit.h"