[official-gcc.git] / gcc / cgraphunit.c
1 /* Driver of optimization process
2 Copyright (C) 2003-2018 Free Software Foundation, Inc.
3 Contributed by Jan Hubicka
5 This file is part of GCC.
7 GCC is free software; you can redistribute it and/or modify it under
8 the terms of the GNU General Public License as published by the Free
9 Software Foundation; either version 3, or (at your option) any later
10 version.
12 GCC is distributed in the hope that it will be useful, but WITHOUT ANY
13 WARRANTY; without even the implied warranty of MERCHANTABILITY or
14 FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
15 for more details.
17 You should have received a copy of the GNU General Public License
18 along with GCC; see the file COPYING3. If not see
19 <http://www.gnu.org/licenses/>. */
21 /* This module implements the main driver of the compilation process.
23 The main purpose of this file is to act as an interface between
24 tree-based frontends and the backend.
26 The front-end is supposed to use the following functionality:
28 - finalize_function
30 This function is called once the front-end has parsed the whole body of a
31 function and it is certain that neither the function body nor the declaration will change.
33 (There is one exception needed for implementing GCC extern inline
34 function.)
36 - varpool_finalize_decl
38 This function has the same behavior as the above but is used for static
39 variables.
41 - add_asm_node
43 Inserts a new toplevel ASM statement.
45 - finalize_compilation_unit
47 This function is called once the (source level) compilation unit is finalized
48 and it will no longer change.
50 The symbol table is constructed starting from the trivially needed
51 symbols finalized by the frontend. Functions are lowered into
52 GIMPLE representation and callgraph/reference lists are constructed.
53 Those are used to discover other necessary functions and variables.
55 At the end the bodies of unreachable functions are removed.
57 The function can be called multiple times when multiple source level
58 compilation units are combined.
60 - compile
62 This passes control to the back-end. Optimizations are performed and
63 final assembler is generated. This is done in the following way. Note
64 that with link time optimization the process is split into three
65 stages (compile time, linktime analysis and parallel linktime as
66 indicated below).
68 Compile time:
70 1) Inter-procedural optimization.
71 (ipa_passes)
73 This part is further split into:
75 a) early optimizations. These are local passes executed in
76 the topological order on the callgraph.
78 The purpose of early optimizations is to optimize away simple
79 things that may otherwise confuse IP analysis. Very simple
80 propagation across the callgraph is done, i.e., to discover
81 functions without side effects and simple inlining is performed.
83 b) early small interprocedural passes.
85 Those are interprocedural passes executed only at compilation
86 time. These include, for example, transactional memory lowering,
87 unreachable code removal and other simple transformations.
89 c) IP analysis stage. All interprocedural passes do their
90 analysis.
92 Interprocedural passes differ from small interprocedural
93 passes by their ability to operate across the whole program
94 at linktime. Their analysis stage is performed early to
95 reduce both linking times and linktime memory usage by
96 not having to represent the whole program in memory.
98 d) LTO streaming. When doing LTO, everything important gets
99 streamed into the object file.
101 Compile time and/or linktime analysis stage (WPA):
103 At linktime the units get streamed back and the symbol table is
104 merged. Function bodies are not streamed in and are not
105 available.
106 e) IP propagation stage. All IP passes execute their
107 IP propagation. This is done based on the earlier analysis
108 without having function bodies at hand.
109 f) Ltrans streaming. When doing WHOPR LTO, the program
110 is partitioned and streamed into multiple object files.
112 Compile time and/or parallel linktime stage (ltrans)
114 Each of the object files is streamed back and compiled
115 separately. Now the function bodies become available
116 again.
118 2) Virtual clone materialization
119 (cgraph_materialize_clone)
121 IP passes can produce copies of existing functions (such
122 as versioned clones or inline clones) without actually
123 manipulating their bodies by creating virtual clones in
124 the callgraph. At this time the virtual clones are
125 turned into real functions.
126 3) IP transformation
128 All IP passes transform function bodies based on earlier
129 decisions of the IP propagation.
131 4) late small IP passes
133 Simple IP passes working within a single program partition.
135 5) Expansion
136 (expand_all_functions)
138 At this stage functions that need to be output into
139 assembler are identified and compiled in topological order
140 6) Output of variables and aliases
141 Now it is known what variable references were not optimized
142 out and thus all variables are output to the file.
144 Note that with -fno-toplevel-reorder passes 5 and 6
145 are combined together in cgraph_output_in_order.
147 Finally there are functions to manipulate the callgraph from the
148 backend.
149 - cgraph_add_new_function is used to add backend produced
150 functions introduced after the unit is finalized.
151 The functions are enqueued for later processing and inserted
152 into callgraph with cgraph_process_new_functions.
154 - cgraph_function_versioning
156 produces a copy of a function into a new one (a version)
157 and applies simple transformations.
158 */
160 #include "config.h"
161 #include "system.h"
162 #include "coretypes.h"
163 #include "backend.h"
164 #include "target.h"
165 #include "rtl.h"
166 #include "tree.h"
167 #include "gimple.h"
168 #include "cfghooks.h"
169 #include "regset.h" /* FIXME: For reg_obstack. */
170 #include "alloc-pool.h"
171 #include "tree-pass.h"
172 #include "stringpool.h"
173 #include "gimple-ssa.h"
174 #include "cgraph.h"
175 #include "coverage.h"
176 #include "lto-streamer.h"
177 #include "fold-const.h"
178 #include "varasm.h"
179 #include "stor-layout.h"
180 #include "output.h"
181 #include "cfgcleanup.h"
182 #include "gimple-fold.h"
183 #include "gimplify.h"
184 #include "gimple-iterator.h"
185 #include "gimplify-me.h"
186 #include "tree-cfg.h"
187 #include "tree-into-ssa.h"
188 #include "tree-ssa.h"
189 #include "langhooks.h"
190 #include "toplev.h"
191 #include "debug.h"
192 #include "symbol-summary.h"
193 #include "tree-vrp.h"
194 #include "ipa-prop.h"
195 #include "gimple-pretty-print.h"
196 #include "plugin.h"
197 #include "ipa-fnsummary.h"
198 #include "ipa-utils.h"
199 #include "except.h"
200 #include "cfgloop.h"
201 #include "context.h"
202 #include "pass_manager.h"
203 #include "tree-nested.h"
204 #include "dbgcnt.h"
205 #include "lto-section-names.h"
206 #include "stringpool.h"
207 #include "attribs.h"
209 /* Queue of cgraph nodes scheduled to be added into cgraph. This is a
210 secondary queue used during optimization to accommodate passes that
211 may generate new functions that need to be optimized and expanded. */
212 vec<cgraph_node *> cgraph_new_nodes;
214 static void expand_all_functions (void);
215 static void mark_functions_to_output (void);
216 static void handle_alias_pairs (void);
218 /* Used for vtable lookup in thunk adjusting. */
219 static GTY (()) tree vtable_entry_type;
221 /* Return true if this symbol is a function from the C frontend specified
222 directly in RTL form (with "__RTL"). */
224 bool
225 symtab_node::native_rtl_p () const
227 if (TREE_CODE (decl) != FUNCTION_DECL)
228 return false;
229 if (!DECL_STRUCT_FUNCTION (decl))
230 return false;
231 return DECL_STRUCT_FUNCTION (decl)->curr_properties & PROP_rtl;
234 /* Determine if the symbol declaration is needed. That is, whether it is
235 visible to something outside this translation unit or to something
236 magic in the system configury. */
237 bool
238 symtab_node::needed_p (void)
240 /* Double check that no one output the function into assembly file
241 early. */
242 if (!native_rtl_p ())
243 gcc_checking_assert
244 (!DECL_ASSEMBLER_NAME_SET_P (decl)
245 || !TREE_SYMBOL_REFERENCED (DECL_ASSEMBLER_NAME (decl)));
247 if (!definition)
248 return false;
250 if (DECL_EXTERNAL (decl))
251 return false;
253 /* If the user told us it is used, then it must be so. */
254 if (force_output)
255 return true;
257 /* ABI forced symbols are needed when they are external. */
258 if (forced_by_abi && TREE_PUBLIC (decl))
259 return true;
261 /* Keep constructors, destructors and virtual functions. */
262 if (TREE_CODE (decl) == FUNCTION_DECL
263 && (DECL_STATIC_CONSTRUCTOR (decl) || DECL_STATIC_DESTRUCTOR (decl)))
264 return true;
266 /* Externally visible variables must be output. The exception is
267 COMDAT variables that must be output only when they are needed. */
268 if (TREE_PUBLIC (decl) && !DECL_COMDAT (decl))
269 return true;
271 return false;
274 /* Head and terminator of the queue of nodes to be processed while building
275 callgraph. */
277 static symtab_node symtab_terminator;
278 static symtab_node *queued_nodes = &symtab_terminator;
280 /* Add NODE to the queue starting at QUEUED_NODES.
281 The queue is linked via AUX pointers and terminated by a pointer to
the symtab_terminator sentinel. */
283 static void
284 enqueue_node (symtab_node *node)
286 if (node->aux)
287 return;
288 gcc_checking_assert (queued_nodes);
289 node->aux = queued_nodes;
290 queued_nodes = node;
293 /* Process CGRAPH_NEW_FUNCTIONS and perform the actions necessary to add these
294 functions into the callgraph so that they look like ordinary reachable
295 functions inserted into the callgraph at construction time. */
297 void
298 symbol_table::process_new_functions (void)
300 tree fndecl;
302 if (!cgraph_new_nodes.exists ())
303 return;
305 handle_alias_pairs ();
306 /* Note that this queue may grow as it is being processed, as the new
307 functions may generate new ones. */
308 for (unsigned i = 0; i < cgraph_new_nodes.length (); i++)
310 cgraph_node *node = cgraph_new_nodes[i];
311 fndecl = node->decl;
312 switch (state)
314 case CONSTRUCTION:
315 /* At construction time we just need to finalize function and move
316 it into reachable functions list. */
318 cgraph_node::finalize_function (fndecl, false);
319 call_cgraph_insertion_hooks (node);
320 enqueue_node (node);
321 break;
323 case IPA:
324 case IPA_SSA:
325 case IPA_SSA_AFTER_INLINING:
326 /* When IPA optimization has already started, do all essential
327 transformations that have already been performed on the whole
328 cgraph but not on this function. */
330 gimple_register_cfg_hooks ();
331 if (!node->analyzed)
332 node->analyze ();
333 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
334 if ((state == IPA_SSA || state == IPA_SSA_AFTER_INLINING)
335 && !gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
337 bool summaried_computed = ipa_fn_summaries != NULL;
338 g->get_passes ()->execute_early_local_passes ();
339 /* Early passes compute inline parameters to do inlining
340 and splitting. This is redundant for functions added late.
341 Just throw away whatever it did. */
342 if (!summaried_computed)
343 ipa_free_fn_summary ();
345 else if (ipa_fn_summaries != NULL)
346 compute_fn_summary (node, true);
347 free_dominance_info (CDI_POST_DOMINATORS);
348 free_dominance_info (CDI_DOMINATORS);
349 pop_cfun ();
350 call_cgraph_insertion_hooks (node);
351 break;
353 case EXPANSION:
354 /* Functions created during expansion shall be compiled
355 directly. */
356 node->process = 0;
357 call_cgraph_insertion_hooks (node);
358 node->expand ();
359 break;
361 default:
362 gcc_unreachable ();
363 break;
367 cgraph_new_nodes.release ();
370 /* As a GCC extension we allow redefinition of the function. The
371 semantics when both copies of the body differ is not well defined.
372 We replace the old body with the new body, so in unit-at-a-time mode
373 we always use the new body, while in normal mode we may end up with
374 the old body inlined into some functions and the new body expanded and
375 inlined in others.
377 ??? It may make more sense to use one body for inlining and other
378 body for expanding the function but this is difficult to do. */
380 void
381 cgraph_node::reset (void)
383 /* If process is set, then we have already begun whole-unit analysis.
384 This is *not* testing for whether we've already emitted the function.
385 That case can be sort-of legitimately seen with real function redefinition
386 errors. I would argue that the front end should never present us with
387 such a case, but don't enforce that for now. */
388 gcc_assert (!process);
390 /* Reset our data structures so we can analyze the function again. */
391 memset (&local, 0, sizeof (local));
392 memset (&global, 0, sizeof (global));
393 memset (&rtl, 0, sizeof (rtl));
394 analyzed = false;
395 definition = false;
396 alias = false;
397 transparent_alias = false;
398 weakref = false;
399 cpp_implicit_alias = false;
401 remove_callees ();
402 remove_all_references ();
405 /* Return true when there are references to the node. INCLUDE_SELF is
406 true if a self reference counts as a reference. */
408 bool
409 symtab_node::referred_to_p (bool include_self)
411 ipa_ref *ref = NULL;
413 /* See if there are any references at all. */
414 if (iterate_referring (0, ref))
415 return true;
416 /* For functions check also calls. */
417 cgraph_node *cn = dyn_cast <cgraph_node *> (this);
418 if (cn && cn->callers)
420 if (include_self)
421 return true;
422 for (cgraph_edge *e = cn->callers; e; e = e->next_caller)
423 if (e->caller != this)
424 return true;
426 return false;
429 /* DECL has been parsed. Take it, queue it, compile it at the whim of the
430 logic in effect. If NO_COLLECT is true, then our caller cannot stand to have
431 the garbage collector run at the moment. We would need to either create
432 a new GC context, or just not compile right now. */
434 void
435 cgraph_node::finalize_function (tree decl, bool no_collect)
437 cgraph_node *node = cgraph_node::get_create (decl);
439 if (node->definition)
441 /* Nested functions should only be defined once. */
442 gcc_assert (!DECL_CONTEXT (decl)
443 || TREE_CODE (DECL_CONTEXT (decl)) != FUNCTION_DECL);
444 node->reset ();
445 node->local.redefined_extern_inline = true;
448 /* Set definition first before calling notice_global_symbol so that
449 it is available to notice_global_symbol. */
450 node->definition = true;
451 notice_global_symbol (decl);
452 node->lowered = DECL_STRUCT_FUNCTION (decl)->cfg != NULL;
453 if (!flag_toplevel_reorder)
454 node->no_reorder = true;
456 /* With -fkeep-inline-functions we are keeping all inline functions except
457 for extern inline ones. */
458 if (flag_keep_inline_functions
459 && DECL_DECLARED_INLINE_P (decl)
460 && !DECL_EXTERNAL (decl)
461 && !DECL_DISREGARD_INLINE_LIMITS (decl))
462 node->force_output = 1;
464 /* __RTL functions were already output as soon as they were parsed (due
465 to the large amount of global state in the backend).
466 Mark such functions as "force_output" to reflect the fact that they
467 will be in the asm file when considering the symbols they reference.
468 The attempt to output them later on will bail out immediately. */
469 if (node->native_rtl_p ())
470 node->force_output = 1;
472 /* When not optimizing, also output the static functions. (see
473 PR24561), but don't do so for always_inline functions, functions
474 declared inline and nested functions. These were optimized out
475 in the original implementation and it is unclear whether we want
476 to change the behavior here. */
477 if (((!opt_for_fn (decl, optimize) || flag_keep_static_functions
478 || node->no_reorder)
479 && !node->cpp_implicit_alias
480 && !DECL_DISREGARD_INLINE_LIMITS (decl)
481 && !DECL_DECLARED_INLINE_P (decl)
482 && !(DECL_CONTEXT (decl)
483 && TREE_CODE (DECL_CONTEXT (decl)) == FUNCTION_DECL))
484 && !DECL_COMDAT (decl) && !DECL_EXTERNAL (decl))
485 node->force_output = 1;
487 /* If we've not yet emitted decl, tell the debug info about it. */
488 if (!TREE_ASM_WRITTEN (decl))
489 (*debug_hooks->deferred_inline_function) (decl);
491 if (!no_collect)
492 ggc_collect ();
494 if (symtab->state == CONSTRUCTION
495 && (node->needed_p () || node->referred_to_p ()))
496 enqueue_node (node);
499 /* Add the function FNDECL to the call graph.
500 Unlike finalize_function, this function is intended to be used
501 by the middle end and allows insertion of a new function at an
502 arbitrary point of compilation. The function can be in high, low
503 or SSA form GIMPLE.
505 The function is assumed to be reachable and to have its address taken (so no
506 API-breaking optimizations are performed on it).
508 The main work done by this function is to enqueue the function for later
509 processing, avoiding the need for the passes to be re-entrant. */
511 void
512 cgraph_node::add_new_function (tree fndecl, bool lowered)
514 gcc::pass_manager *passes = g->get_passes ();
515 cgraph_node *node;
517 if (dump_file)
519 struct function *fn = DECL_STRUCT_FUNCTION (fndecl);
520 const char *function_type = ((gimple_has_body_p (fndecl))
521 ? (lowered
522 ? (gimple_in_ssa_p (fn)
523 ? "ssa gimple"
524 : "low gimple")
525 : "high gimple")
526 : "to-be-gimplified");
527 fprintf (dump_file,
528 "Added new %s function %s to callgraph\n",
529 function_type,
530 fndecl_name (fndecl));
533 switch (symtab->state)
535 case PARSING:
536 cgraph_node::finalize_function (fndecl, false);
537 break;
538 case CONSTRUCTION:
539 /* Just enqueue the function to be processed at the nearest opportunity. */
540 node = cgraph_node::get_create (fndecl);
541 if (lowered)
542 node->lowered = true;
543 cgraph_new_nodes.safe_push (node);
544 break;
546 case IPA:
547 case IPA_SSA:
548 case IPA_SSA_AFTER_INLINING:
549 case EXPANSION:
550 /* Bring the function into finalized state and enqueue it for later
551 analysis and compilation. */
552 node = cgraph_node::get_create (fndecl);
553 node->local.local = false;
554 node->definition = true;
555 node->force_output = true;
556 if (TREE_PUBLIC (fndecl))
557 node->externally_visible = true;
558 if (!lowered && symtab->state == EXPANSION)
560 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
561 gimple_register_cfg_hooks ();
562 bitmap_obstack_initialize (NULL);
563 execute_pass_list (cfun, passes->all_lowering_passes);
564 passes->execute_early_local_passes ();
565 bitmap_obstack_release (NULL);
566 pop_cfun ();
568 lowered = true;
570 if (lowered)
571 node->lowered = true;
572 cgraph_new_nodes.safe_push (node);
573 break;
575 case FINISHED:
576 /* At the very end of compilation we have to do all the work up
577 to expansion. */
578 node = cgraph_node::create (fndecl);
579 if (lowered)
580 node->lowered = true;
581 node->definition = true;
582 node->analyze ();
583 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
584 gimple_register_cfg_hooks ();
585 bitmap_obstack_initialize (NULL);
586 if (!gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
587 g->get_passes ()->execute_early_local_passes ();
588 bitmap_obstack_release (NULL);
589 pop_cfun ();
590 node->expand ();
591 break;
593 default:
594 gcc_unreachable ();
597 /* Set a personality if required and we already passed EH lowering. */
598 if (lowered
599 && (function_needs_eh_personality (DECL_STRUCT_FUNCTION (fndecl))
600 == eh_personality_lang))
601 DECL_FUNCTION_PERSONALITY (fndecl) = lang_hooks.eh_personality ();
604 /* Analyze the function scheduled to be output. */
605 void
606 cgraph_node::analyze (void)
608 if (native_rtl_p ())
610 analyzed = true;
611 return;
614 tree decl = this->decl;
615 location_t saved_loc = input_location;
616 input_location = DECL_SOURCE_LOCATION (decl);
618 if (thunk.thunk_p)
620 cgraph_node *t = cgraph_node::get (thunk.alias);
622 create_edge (t, NULL, t->count);
623 callees->can_throw_external = !TREE_NOTHROW (t->decl);
624 /* Target code in expand_thunk may need the thunk's target
625 to be analyzed, so recurse here. */
626 if (!t->analyzed)
627 t->analyze ();
628 if (t->alias)
630 t = t->get_alias_target ();
631 if (!t->analyzed)
632 t->analyze ();
634 if (!expand_thunk (false, false))
636 thunk.alias = NULL;
637 return;
639 thunk.alias = NULL;
641 if (alias)
642 resolve_alias (cgraph_node::get (alias_target), transparent_alias);
643 else if (dispatcher_function)
645 /* Generate the dispatcher body of multi-versioned functions. */
646 cgraph_function_version_info *dispatcher_version_info
647 = function_version ();
648 if (dispatcher_version_info != NULL
649 && (dispatcher_version_info->dispatcher_resolver
650 == NULL_TREE))
652 tree resolver = NULL_TREE;
653 gcc_assert (targetm.generate_version_dispatcher_body);
654 resolver = targetm.generate_version_dispatcher_body (this);
655 gcc_assert (resolver != NULL_TREE);
658 else
660 push_cfun (DECL_STRUCT_FUNCTION (decl));
662 assign_assembler_name_if_needed (decl);
664 /* Make sure to gimplify bodies only once. While analyzing a
665 function we lower it, which will require gimplified nested
666 functions, so we can end up here with an already gimplified
667 body. */
668 if (!gimple_has_body_p (decl))
669 gimplify_function_tree (decl);
671 /* Lower the function. */
672 if (!lowered)
674 if (nested)
675 lower_nested_functions (decl);
676 gcc_assert (!nested);
678 gimple_register_cfg_hooks ();
679 bitmap_obstack_initialize (NULL);
680 execute_pass_list (cfun, g->get_passes ()->all_lowering_passes);
681 free_dominance_info (CDI_POST_DOMINATORS);
682 free_dominance_info (CDI_DOMINATORS);
683 compact_blocks ();
684 bitmap_obstack_release (NULL);
685 lowered = true;
688 pop_cfun ();
690 analyzed = true;
692 input_location = saved_loc;
695 /* The C++ frontend produces same-body aliases all over the place, even before
696 PCH gets streamed out. It relies on us linking the aliases with their
697 functions in order to do the fixups, but ipa-ref is not PCH safe.
698 Consequently we first produce aliases without links, and once the C++ FE is
699 sure it won't stream PCH we build the links via this function. */
701 void
702 symbol_table::process_same_body_aliases (void)
704 symtab_node *node;
705 FOR_EACH_SYMBOL (node)
706 if (node->cpp_implicit_alias && !node->analyzed)
707 node->resolve_alias
708 (VAR_P (node->alias_target)
709 ? (symtab_node *)varpool_node::get_create (node->alias_target)
710 : (symtab_node *)cgraph_node::get_create (node->alias_target));
711 cpp_implicit_aliases_done = true;
714 /* Process attributes common for vars and functions. */
716 static void
717 process_common_attributes (symtab_node *node, tree decl)
719 tree weakref = lookup_attribute ("weakref", DECL_ATTRIBUTES (decl));
721 if (weakref && !lookup_attribute ("alias", DECL_ATTRIBUTES (decl)))
723 warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
724 "%<weakref%> attribute should be accompanied with"
725 " an %<alias%> attribute");
726 DECL_WEAK (decl) = 0;
727 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
728 DECL_ATTRIBUTES (decl));
731 if (lookup_attribute ("no_reorder", DECL_ATTRIBUTES (decl)))
732 node->no_reorder = 1;
735 /* Look for externally_visible and used attributes and mark cgraph nodes
736 accordingly.
738 We cannot mark the nodes at the point the attributes are processed (in
739 handle_*_attribute) because the copy of the declarations available at that
740 point may not be canonical. For example, in:
742 void f();
743 void f() __attribute__((used));
745 the declaration we see in handle_used_attribute will be the second
746 declaration -- but the front end will subsequently merge that declaration
747 with the original declaration and discard the second declaration.
749 Furthermore, we can't mark these nodes in finalize_function because:
751 void f() {}
752 void f() __attribute__((externally_visible));
754 is valid.
756 So, we walk the nodes at the end of the translation unit, applying the
757 attributes at that point. */
759 static void
760 process_function_and_variable_attributes (cgraph_node *first,
761 varpool_node *first_var)
763 cgraph_node *node;
764 varpool_node *vnode;
766 for (node = symtab->first_function (); node != first;
767 node = symtab->next_function (node))
769 tree decl = node->decl;
770 if (DECL_PRESERVE_P (decl))
771 node->mark_force_output ();
772 else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
774 if (! TREE_PUBLIC (node->decl))
775 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
776 "%<externally_visible%>"
777 " attribute has effect only on public objects");
779 if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
780 && (node->definition && !node->alias))
782 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
783 "%<weakref%> attribute ignored"
784 " because function is defined");
785 DECL_WEAK (decl) = 0;
786 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
787 DECL_ATTRIBUTES (decl));
790 if (lookup_attribute ("always_inline", DECL_ATTRIBUTES (decl))
791 && !DECL_DECLARED_INLINE_P (decl)
792 /* redefining extern inline function makes it DECL_UNINLINABLE. */
793 && !DECL_UNINLINABLE (decl))
794 warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
795 "always_inline function might not be inlinable");
797 process_common_attributes (node, decl);
799 for (vnode = symtab->first_variable (); vnode != first_var;
800 vnode = symtab->next_variable (vnode))
802 tree decl = vnode->decl;
803 if (DECL_EXTERNAL (decl)
804 && DECL_INITIAL (decl))
805 varpool_node::finalize_decl (decl);
806 if (DECL_PRESERVE_P (decl))
807 vnode->force_output = true;
808 else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
810 if (! TREE_PUBLIC (vnode->decl))
811 warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
812 "%<externally_visible%>"
813 " attribute has effect only on public objects");
815 if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
816 && vnode->definition
817 && DECL_INITIAL (decl))
819 warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
820 "%<weakref%> attribute ignored"
821 " because variable is initialized");
822 DECL_WEAK (decl) = 0;
823 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
824 DECL_ATTRIBUTES (decl));
826 process_common_attributes (vnode, decl);
830 /* Mark DECL as finalized. By finalizing the declaration, the frontend
831 instructs the middle end to output the variable to the asm file if it is
832 needed or externally visible. */
834 void
835 varpool_node::finalize_decl (tree decl)
837 varpool_node *node = varpool_node::get_create (decl);
839 gcc_assert (TREE_STATIC (decl) || DECL_EXTERNAL (decl));
841 if (node->definition)
842 return;
843 /* Set definition first before calling notice_global_symbol so that
844 it is available to notice_global_symbol. */
845 node->definition = true;
846 notice_global_symbol (decl);
847 if (!flag_toplevel_reorder)
848 node->no_reorder = true;
849 if (TREE_THIS_VOLATILE (decl) || DECL_PRESERVE_P (decl)
850 /* Traditionally we do not eliminate static variables when not
851 optimizing and when not doing toplevel reorder. */
852 || (node->no_reorder && !DECL_COMDAT (node->decl)
853 && !DECL_ARTIFICIAL (node->decl)))
854 node->force_output = true;
856 if (symtab->state == CONSTRUCTION
857 && (node->needed_p () || node->referred_to_p ()))
858 enqueue_node (node);
859 if (symtab->state >= IPA_SSA)
860 node->analyze ();
861 /* Some frontends produce various interface variables after compilation
862 has finished. */
863 if (symtab->state == FINISHED
864 || (node->no_reorder
865 && symtab->state == EXPANSION))
866 node->assemble_decl ();
869 /* EDGE is a polymorphic call. Mark all possible targets as reachable
870 and, if there is only one target, perform trivial devirtualization.
871 REACHABLE_CALL_TARGETS collects target lists we have already walked to
872 avoid duplicate work. */
874 static void
875 walk_polymorphic_call_targets (hash_set<void *> *reachable_call_targets,
876 cgraph_edge *edge)
878 unsigned int i;
879 void *cache_token;
880 bool final;
881 vec <cgraph_node *>targets
882 = possible_polymorphic_call_targets
883 (edge, &final, &cache_token);
885 if (!reachable_call_targets->add (cache_token))
887 if (symtab->dump_file)
888 dump_possible_polymorphic_call_targets
889 (symtab->dump_file, edge);
891 for (i = 0; i < targets.length (); i++)
893 /* Do not bother to mark virtual methods in an anonymous namespace;
894 either we will find a use of the virtual table defining it, or it is
895 unused. */
896 if (targets[i]->definition
897 && TREE_CODE
898 (TREE_TYPE (targets[i]->decl))
899 == METHOD_TYPE
900 && !type_in_anonymous_namespace_p
901 (TYPE_METHOD_BASETYPE (TREE_TYPE (targets[i]->decl))))
902 enqueue_node (targets[i]);
906 /* Very trivial devirtualization; when the type is
907 final or anonymous (so we know all its derivations)
908 and there is only one possible virtual call target,
909 make the edge direct. */
910 if (final)
912 if (targets.length () <= 1 && dbg_cnt (devirt))
914 cgraph_node *target;
915 if (targets.length () == 1)
916 target = targets[0];
917 else
918 target = cgraph_node::create
919 (builtin_decl_implicit (BUILT_IN_UNREACHABLE));
921 if (symtab->dump_file)
923 fprintf (symtab->dump_file,
924 "Devirtualizing call: ");
925 print_gimple_stmt (symtab->dump_file,
926 edge->call_stmt, 0,
927 TDF_SLIM);
929 if (dump_enabled_p ())
931 dump_printf_loc (MSG_OPTIMIZED_LOCATIONS, edge->call_stmt,
932 "devirtualizing call in %s to %s\n",
933 edge->caller->name (), target->name ());
936 edge->make_direct (target);
937 edge->redirect_call_stmt_to_callee ();
939 if (symtab->dump_file)
941 fprintf (symtab->dump_file,
942 "Devirtualized as: ");
943 print_gimple_stmt (symtab->dump_file,
944 edge->call_stmt, 0,
945 TDF_SLIM);
951 /* Issue appropriate warnings for the global declaration DECL. */
953 static void
954 check_global_declaration (symtab_node *snode)
956 const char *decl_file;
957 tree decl = snode->decl;
959 /* Warn about any function declared static but not defined. We don't
960 warn about variables, because many programs have static variables
961 that exist only to get some text into the object file. */
962 if (TREE_CODE (decl) == FUNCTION_DECL
963 && DECL_INITIAL (decl) == 0
964 && DECL_EXTERNAL (decl)
965 && ! DECL_ARTIFICIAL (decl)
966 && ! TREE_NO_WARNING (decl)
967 && ! TREE_PUBLIC (decl)
968 && (warn_unused_function
969 || snode->referred_to_p (/*include_self=*/false)))
971 if (snode->referred_to_p (/*include_self=*/false))
972 pedwarn (input_location, 0, "%q+F used but never defined", decl);
973 else
974 warning (OPT_Wunused_function, "%q+F declared %<static%> but never defined", decl);
975 /* This symbol is effectively an "extern" declaration now. */
976 TREE_PUBLIC (decl) = 1;
979 /* Warn about static fns or vars defined but not used. */
980 if (((warn_unused_function && TREE_CODE (decl) == FUNCTION_DECL)
981 || (((warn_unused_variable && ! TREE_READONLY (decl))
982 || (warn_unused_const_variable > 0 && TREE_READONLY (decl)
983 && (warn_unused_const_variable == 2
984 || (main_input_filename != NULL
985 && (decl_file = DECL_SOURCE_FILE (decl)) != NULL
986 && filename_cmp (main_input_filename,
987 decl_file) == 0))))
988 && VAR_P (decl)))
989 && ! DECL_IN_SYSTEM_HEADER (decl)
990 && ! snode->referred_to_p (/*include_self=*/false)
991 /* This TREE_USED check is needed in addition to referred_to_p
992 above, because the `__unused__' attribute is not being
993 considered for referred_to_p. */
994 && ! TREE_USED (decl)
995 /* The TREE_USED bit for file-scope decls is kept in the identifier,
996 to handle multiple external decls in different scopes. */
997 && ! (DECL_NAME (decl) && TREE_USED (DECL_NAME (decl)))
998 && ! DECL_EXTERNAL (decl)
999 && ! DECL_ARTIFICIAL (decl)
1000 && ! DECL_ABSTRACT_ORIGIN (decl)
1001 && ! TREE_PUBLIC (decl)
1002 /* A volatile variable might be used in some non-obvious way. */
1003 && (! VAR_P (decl) || ! TREE_THIS_VOLATILE (decl))
1004 /* Global register variables must be declared to reserve them. */
1005 && ! (VAR_P (decl) && DECL_REGISTER (decl))
1006 /* Global ctors and dtors are called by the runtime. */
1007 && (TREE_CODE (decl) != FUNCTION_DECL
1008 || (!DECL_STATIC_CONSTRUCTOR (decl)
1009 && !DECL_STATIC_DESTRUCTOR (decl)))
1010 /* Otherwise, ask the language. */
1011 && lang_hooks.decls.warn_unused_global (decl))
1012 warning_at (DECL_SOURCE_LOCATION (decl),
1013 (TREE_CODE (decl) == FUNCTION_DECL)
1014 ? OPT_Wunused_function
1015 : (TREE_READONLY (decl)
1016 ? OPT_Wunused_const_variable_
1017 : OPT_Wunused_variable),
1018 "%qD defined but not used", decl);
1021 /* Discover all functions and variables that are trivially needed, analyze
1022 them as well as all functions and variables referred by them */
1023 static cgraph_node *first_analyzed;
1024 static varpool_node *first_analyzed_var;
1026 /* FIRST_TIME is true the first time we are called for a
1027 translation unit from finalize_compilation_unit (), and false
1028 otherwise. */
1030 static void
1031 analyze_functions (bool first_time)
1033 /* Keep track of already processed nodes when called multiple times for
1034 intermodule optimization. */
1035 cgraph_node *first_handled = first_analyzed;
1036 varpool_node *first_handled_var = first_analyzed_var;
1037 hash_set<void *> reachable_call_targets;
1039 symtab_node *node;
1040 symtab_node *next;
1041 int i;
1042 ipa_ref *ref;
1043 bool changed = true;
1044 location_t saved_loc = input_location;
1046 bitmap_obstack_initialize (NULL);
1047 symtab->state = CONSTRUCTION;
1048 input_location = UNKNOWN_LOCATION;
1050 /* Ugly, but the fixup cannot happen at the time the same-body alias is
1051 created; at that point the C++ FE does not yet have the COMDAT groups right. */
1052 if (symtab->cpp_implicit_aliases_done)
1053 FOR_EACH_SYMBOL (node)
1054 if (node->cpp_implicit_alias)
1055 node->fixup_same_cpp_alias_visibility (node->get_alias_target ());
1056 build_type_inheritance_graph ();
1058 /* Analysis adds static variables that in turn add references to new functions,
1059 so we need to iterate the process until it stabilizes. */
1060 while (changed)
1062 changed = false;
1063 process_function_and_variable_attributes (first_analyzed,
1064 first_analyzed_var);
1066 /* First identify the trivially needed symbols. */
1067 for (node = symtab->first_symbol ();
1068 node != first_analyzed
1069 && node != first_analyzed_var; node = node->next)
1071 /* Convert COMDAT group designators to IDENTIFIER_NODEs. */
1072 node->get_comdat_group_id ();
1073 if (node->needed_p ())
1075 enqueue_node (node);
1076 if (!changed && symtab->dump_file)
1077 fprintf (symtab->dump_file, "Trivially needed symbols:");
1078 changed = true;
1079 if (symtab->dump_file)
1080 fprintf (symtab->dump_file, " %s", node->asm_name ());
1081 if (!changed && symtab->dump_file)
1082 fprintf (symtab->dump_file, "\n");
1084 if (node == first_analyzed
1085 || node == first_analyzed_var)
1086 break;
1088 symtab->process_new_functions ();
1089 first_analyzed_var = symtab->first_variable ();
1090 first_analyzed = symtab->first_function ();
1092 if (changed && symtab->dump_file)
1093 fprintf (symtab->dump_file, "\n");
1095 /* Lower the representation, then build callgraph edges and references for all
1096 trivially needed symbols and all symbols referred to by them. */
1097 while (queued_nodes != &symtab_terminator)
1099 changed = true;
1100 node = queued_nodes;
1101 queued_nodes = (symtab_node *)queued_nodes->aux;
1102 cgraph_node *cnode = dyn_cast <cgraph_node *> (node);
1103 if (cnode && cnode->definition)
1105 cgraph_edge *edge;
1106 tree decl = cnode->decl;
1108 /* ??? It is possible to create an extern inline function
1109 and later use the weak alias attribute to kill its body.
1110 See gcc.c-torture/compile/20011119-1.c */
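The construct this comment describes can be sketched as follows (a hedged illustration, not the cited testcase; it assumes GCC with the `gnu_inline`, `weak` and `alias` attributes, i.e. an ELF target):

```c
/* GNU89-style extern inline: the body below is only an inline
   substitute; no symbol is emitted for it.  */
extern __inline__ __attribute__ ((__gnu_inline__)) int
foo (void)
{
  return 1;
}

static int
bar (void)
{
  return 2;
}

/* A weak alias now supplies the real definition of foo, killing the
   extern inline body above.  cgraphunit copes by resetting the node
   and setting local.redefined_extern_inline.  */
extern int foo (void) __attribute__ ((weak, alias ("bar")));
```

Once the alias is recorded the inline body is dead: all calls to foo reach bar.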
1111 if (!DECL_STRUCT_FUNCTION (decl)
1112 && !cnode->alias
1113 && !cnode->thunk.thunk_p
1114 && !cnode->dispatcher_function)
1116 cnode->reset ();
1117 cnode->local.redefined_extern_inline = true;
1118 continue;
1121 if (!cnode->analyzed)
1122 cnode->analyze ();
1124 for (edge = cnode->callees; edge; edge = edge->next_callee)
1125 if (edge->callee->definition
1126 && (!DECL_EXTERNAL (edge->callee->decl)
1127 /* When not optimizing, do not try to analyze extern
1128 inline functions. Doing so is pointless. */
1129 || opt_for_fn (edge->callee->decl, optimize)
1130 /* Weakrefs need to be preserved. */
1131 || edge->callee->alias
1132 /* always_inline functions are inlined even at -O0. */
1133 || lookup_attribute
1134 ("always_inline",
1135 DECL_ATTRIBUTES (edge->callee->decl))
1136 /* Multiversioned functions need the dispatcher to
1137 be produced locally even for extern functions. */
1138 || edge->callee->function_version ()))
1139 enqueue_node (edge->callee);
1140 if (opt_for_fn (cnode->decl, optimize)
1141 && opt_for_fn (cnode->decl, flag_devirtualize))
1143 cgraph_edge *next;
1145 for (edge = cnode->indirect_calls; edge; edge = next)
1147 next = edge->next_callee;
1148 if (edge->indirect_info->polymorphic)
1149 walk_polymorphic_call_targets (&reachable_call_targets,
1150 edge);
1154 /* If decl is a clone of an abstract function,
1155 mark that abstract function so that we don't release its body.
1156 The DECL_INITIAL() of that abstract function declaration
1157 will be later needed to output debug info. */
1158 if (DECL_ABSTRACT_ORIGIN (decl))
1160 cgraph_node *origin_node
1161 = cgraph_node::get_create (DECL_ABSTRACT_ORIGIN (decl));
1162 origin_node->used_as_abstract_origin = true;
1164 /* Preserve the node of a function's enclosing function context. It will
1165 later be needed to output debug info. */
1166 if (tree fn = decl_function_context (decl))
1168 cgraph_node *origin_node = cgraph_node::get_create (fn);
1169 enqueue_node (origin_node);
1172 else
1174 varpool_node *vnode = dyn_cast <varpool_node *> (node);
1175 if (vnode && vnode->definition && !vnode->analyzed)
1176 vnode->analyze ();
1179 if (node->same_comdat_group)
1181 symtab_node *next;
1182 for (next = node->same_comdat_group;
1183 next != node;
1184 next = next->same_comdat_group)
1185 if (!next->comdat_local_p ())
1186 enqueue_node (next);
1188 for (i = 0; node->iterate_reference (i, ref); i++)
1189 if (ref->referred->definition
1190 && (!DECL_EXTERNAL (ref->referred->decl)
1191 || ((TREE_CODE (ref->referred->decl) != FUNCTION_DECL
1192 && optimize)
1193 || (TREE_CODE (ref->referred->decl) == FUNCTION_DECL
1194 && opt_for_fn (ref->referred->decl, optimize))
1195 || node->alias
1196 || ref->referred->alias)))
1197 enqueue_node (ref->referred);
1198 symtab->process_new_functions ();
1201 update_type_inheritance_graph ();
1203 /* Collect entry points to the unit. */
1204 if (symtab->dump_file)
1206 fprintf (symtab->dump_file, "\n\nInitial ");
1207 symtab->dump (symtab->dump_file);
1210 if (first_time)
1212 symtab_node *snode;
1213 FOR_EACH_SYMBOL (snode)
1214 check_global_declaration (snode);
1217 if (symtab->dump_file)
1218 fprintf (symtab->dump_file, "\nRemoving unused symbols:");
1220 for (node = symtab->first_symbol ();
1221 node != first_handled
1222 && node != first_handled_var; node = next)
1224 next = node->next;
1225 if (!node->aux && !node->referred_to_p ())
1227 if (symtab->dump_file)
1228 fprintf (symtab->dump_file, " %s", node->name ());
1230 /* See if the debugger can use anything before the DECL
1231 passes away. Perhaps it can notice a DECL that is now a
1232 constant and can tag the early DIE with an appropriate
1233 attribute.
1235 Otherwise, this is the last chance the debug_hooks have
1236 at looking at optimized away DECLs, since
1237 late_global_decl will subsequently be called from the
1238 contents of the now pruned symbol table. */
1239 if (VAR_P (node->decl)
1240 && !decl_function_context (node->decl))
1242 /* We are reclaiming totally unreachable code and variables
1243 so they effectively appear as readonly. Show that to
1244 the debug machinery. */
1245 TREE_READONLY (node->decl) = 1;
1246 node->definition = false;
1247 (*debug_hooks->late_global_decl) (node->decl);
1250 node->remove ();
1251 continue;
1253 if (cgraph_node *cnode = dyn_cast <cgraph_node *> (node))
1255 tree decl = node->decl;
1257 if (cnode->definition && !gimple_has_body_p (decl)
1258 && !cnode->alias
1259 && !cnode->thunk.thunk_p)
1260 cnode->reset ();
1262 gcc_assert (!cnode->definition || cnode->thunk.thunk_p
1263 || cnode->alias
1264 || gimple_has_body_p (decl)
1265 || cnode->native_rtl_p ());
1266 gcc_assert (cnode->analyzed == cnode->definition);
1268 node->aux = NULL;
1270 for (;node; node = node->next)
1271 node->aux = NULL;
1272 first_analyzed = symtab->first_function ();
1273 first_analyzed_var = symtab->first_variable ();
1274 if (symtab->dump_file)
1276 fprintf (symtab->dump_file, "\n\nReclaimed ");
1277 symtab->dump (symtab->dump_file);
1279 bitmap_obstack_release (NULL);
1280 ggc_collect ();
1281 /* Initialize the assembler name hash; in particular we want to trigger C++
1282 mangling and same-body alias creation before we free the DECL_ARGUMENTS
1283 they rely on. */
1284 if (!seen_error ())
1285 symtab->symtab_initialize_asm_name_hash ();
1287 input_location = saved_loc;
1290 /* Check declaration of the type of ALIAS for compatibility with its TARGET
1291 (which may be an ifunc resolver) and issue a diagnostic when they are
1292 not compatible according to language rules (plus a C++ extension for
1293 non-static member functions). */
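For reference, a well-formed ifunc in GNU C looks roughly like this (a hedged sketch assuming GCC on a glibc/ELF target, where attribute `ifunc` is supported); the function below diagnoses resolvers whose declared return type is not a pointer to the aliased function's own type:

```c
/* The implementation the resolver will select; a real resolver would
   typically pick between several candidates based on CPU features.  */
static int
impl_generic (void)
{
  return 1;
}

/* An ifunc resolver runs once at load time and must return a pointer
   to a function with foo's own type; a mismatched return type is what
   maybe_diag_incompatible_alias warns (or, for METHOD_TYPE aliases,
   errors) about.  */
static int (*resolve_foo (void)) (void)
{
  return impl_generic;
}

int foo (void) __attribute__ ((ifunc ("resolve_foo")));
```

Calls to foo dispatch through the dynamic linker to whatever the resolver returned.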
1295 static void
1296 maybe_diag_incompatible_alias (tree alias, tree target)
1298 tree altype = TREE_TYPE (alias);
1299 tree targtype = TREE_TYPE (target);
1301 bool ifunc = cgraph_node::get (alias)->ifunc_resolver;
1302 tree funcptr = altype;
1304 if (ifunc)
1306 /* Handle attribute ifunc first. */
1307 if (TREE_CODE (altype) == METHOD_TYPE)
1309 /* Set FUNCPTR to the type of the alias target. If the type
1310 is a non-static member function of class C, construct a type
1311 of an ordinary function taking C* as the first argument,
1312 followed by the member function argument list, and use it
1313 instead to check for incompatibility. This conversion is
1314 not defined by the language but an extension provided by
1315 G++. */
1317 tree rettype = TREE_TYPE (altype);
1318 tree args = TYPE_ARG_TYPES (altype);
1319 altype = build_function_type (rettype, args);
1320 funcptr = altype;
1323 targtype = TREE_TYPE (targtype);
1325 if (POINTER_TYPE_P (targtype))
1327 targtype = TREE_TYPE (targtype);
1329 /* Only issue -Wattribute-alias for conversions to void* with
1330 -Wextra. */
1331 if (VOID_TYPE_P (targtype) && !extra_warnings)
1332 return;
1334 /* Proceed to handle incompatible ifunc resolvers below. */
1336 else
1338 funcptr = build_pointer_type (funcptr);
1340 error_at (DECL_SOURCE_LOCATION (target),
1341 "%<ifunc%> resolver for %qD must return %qT",
1342 alias, funcptr);
1343 inform (DECL_SOURCE_LOCATION (alias),
1344 "resolver indirect function declared here");
1345 return;
1349 if ((!FUNC_OR_METHOD_TYPE_P (targtype)
1350 || (prototype_p (altype)
1351 && prototype_p (targtype)
1352 && !types_compatible_p (altype, targtype))))
1354 /* Warn for incompatibilities. Avoid warning for functions
1355 without a prototype to make it possible to declare aliases
1356 without knowing the exact type, as libstdc++ does. */
1357 if (ifunc)
1359 funcptr = build_pointer_type (funcptr);
1361 auto_diagnostic_group d;
1362 if (warning_at (DECL_SOURCE_LOCATION (target),
1363 OPT_Wattribute_alias,
1364 "%<ifunc%> resolver for %qD should return %qT",
1365 alias, funcptr))
1366 inform (DECL_SOURCE_LOCATION (alias),
1367 "resolver indirect function declared here");
1369 else
1371 auto_diagnostic_group d;
1372 if (warning_at (DECL_SOURCE_LOCATION (alias),
1373 OPT_Wattribute_alias,
1374 "%qD alias between functions of incompatible "
1375 "types %qT and %qT", alias, altype, targtype))
1376 inform (DECL_SOURCE_LOCATION (target),
1377 "aliased declaration here");
1382 /* Translate the ugly representation of aliases as alias pairs into the
1383 nicer representation in the callgraph. We don't handle all cases yet,
1384 unfortunately. */
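Concretely, an alias pair originates from the `alias` attribute; a minimal sketch (assuming GCC on a target that supports aliases, e.g. ELF):

```c
static int
impl (void)
{
  return 42;
}

/* The front end records (extra_name, "impl") as an alias pair, which
   handle_alias_pairs later turns into a proper cgraph alias.  Aliases
   to undefined symbols, to external symbols, or between a function and
   a variable are diagnosed instead of being created.  */
int extra_name (void) __attribute__ ((alias ("impl")));
```

Both names then refer to the same definition in the callgraph and in the emitted assembly.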
1386 static void
1387 handle_alias_pairs (void)
1389 alias_pair *p;
1390 unsigned i;
1392 for (i = 0; alias_pairs && alias_pairs->iterate (i, &p);)
1394 symtab_node *target_node = symtab_node::get_for_asmname (p->target);
1396 /* Weakrefs with a target not defined in the current unit are easy to handle:
1397 they behave just like external variables, except that we need to note the
1398 alias flag so we can later output the weakref pseudo-op into the asm file. */
1399 if (!target_node
1400 && lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)) != NULL)
1402 symtab_node *node = symtab_node::get (p->decl);
1403 if (node)
1405 node->alias_target = p->target;
1406 node->weakref = true;
1407 node->alias = true;
1408 node->transparent_alias = true;
1410 alias_pairs->unordered_remove (i);
1411 continue;
1413 else if (!target_node)
1415 error ("%q+D aliased to undefined symbol %qE", p->decl, p->target);
1416 symtab_node *node = symtab_node::get (p->decl);
1417 if (node)
1418 node->alias = false;
1419 alias_pairs->unordered_remove (i);
1420 continue;
1423 if (DECL_EXTERNAL (target_node->decl)
1424 /* We use local aliases for C++ thunks to force the tailcall
1425 to bind locally. This is a hack; to keep it working, do
1426 the following (which is not strictly correct). */
1427 && (TREE_CODE (target_node->decl) != FUNCTION_DECL
1428 || ! DECL_VIRTUAL_P (target_node->decl))
1429 && ! lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)))
1431 error ("%q+D aliased to external symbol %qE",
1432 p->decl, p->target);
1435 if (TREE_CODE (p->decl) == FUNCTION_DECL
1436 && target_node && is_a <cgraph_node *> (target_node))
1438 maybe_diag_incompatible_alias (p->decl, target_node->decl);
1440 cgraph_node *src_node = cgraph_node::get (p->decl);
1441 if (src_node && src_node->definition)
1442 src_node->reset ();
1443 cgraph_node::create_alias (p->decl, target_node->decl);
1444 alias_pairs->unordered_remove (i);
1446 else if (VAR_P (p->decl)
1447 && target_node && is_a <varpool_node *> (target_node))
1449 varpool_node::create_alias (p->decl, target_node->decl);
1450 alias_pairs->unordered_remove (i);
1452 else
1454 error ("%q+D alias between function and variable is not supported",
1455 p->decl);
1456 inform (DECL_SOURCE_LOCATION (target_node->decl),
1457 "aliased declaration here");
1459 alias_pairs->unordered_remove (i);
1462 vec_free (alias_pairs);
1466 /* Figure out what functions we want to assemble. */
1468 static void
1469 mark_functions_to_output (void)
1471 bool check_same_comdat_groups = false;
1472 cgraph_node *node;
1474 if (flag_checking)
1475 FOR_EACH_FUNCTION (node)
1476 gcc_assert (!node->process);
1478 FOR_EACH_FUNCTION (node)
1480 tree decl = node->decl;
1482 gcc_assert (!node->process || node->same_comdat_group);
1483 if (node->process)
1484 continue;
1486 /* We need to output all local functions that are used and not
1487 always inlined, as well as those that are reachable from
1488 outside the current compilation unit. */
1489 if (node->analyzed
1490 && !node->thunk.thunk_p
1491 && !node->alias
1492 && !node->global.inlined_to
1493 && !TREE_ASM_WRITTEN (decl)
1494 && !DECL_EXTERNAL (decl))
1496 node->process = 1;
1497 if (node->same_comdat_group)
1499 cgraph_node *next;
1500 for (next = dyn_cast<cgraph_node *> (node->same_comdat_group);
1501 next != node;
1502 next = dyn_cast<cgraph_node *> (next->same_comdat_group))
1503 if (!next->thunk.thunk_p && !next->alias
1504 && !next->comdat_local_p ())
1505 next->process = 1;
1508 else if (node->same_comdat_group)
1510 if (flag_checking)
1511 check_same_comdat_groups = true;
1513 else
1515 /* We should've reclaimed all functions that are not needed. */
1516 if (flag_checking
1517 && !node->global.inlined_to
1518 && gimple_has_body_p (decl)
1519 /* FIXME: in an ltrans unit, when the offline copy is outside a partition
1520 but inline copies are inside it, we can end up not removing the body
1521 since we no longer have an analyzed node pointing to it. */
1522 && !node->in_other_partition
1523 && !node->alias
1524 && !node->clones
1525 && !DECL_EXTERNAL (decl))
1527 node->debug ();
1528 internal_error ("failed to reclaim unneeded function");
1530 gcc_assert (node->global.inlined_to
1531 || !gimple_has_body_p (decl)
1532 || node->in_other_partition
1533 || node->clones
1534 || DECL_ARTIFICIAL (decl)
1535 || DECL_EXTERNAL (decl));
1540 if (flag_checking && check_same_comdat_groups)
1541 FOR_EACH_FUNCTION (node)
1542 if (node->same_comdat_group && !node->process)
1544 tree decl = node->decl;
1545 if (!node->global.inlined_to
1546 && gimple_has_body_p (decl)
1547 /* FIXME: in an ltrans unit when the offline copy is outside a
1548 partition but inline copies are inside a partition, we can
1549 end up not removing the body since we no longer have an
1550 analyzed node pointing to it. */
1551 && !node->in_other_partition
1552 && !node->clones
1553 && !DECL_EXTERNAL (decl))
1555 node->debug ();
1556 internal_error ("failed to reclaim unneeded function in same "
1557 "comdat group");
1562 /* DECL is a FUNCTION_DECL. Initialize datastructures so DECL is a function
1563 in lowered gimple form. IN_SSA is true if the gimple is in SSA.
1565 Set current_function_decl and cfun to the newly constructed empty
1566 function body. Return the basic block in the function body. */
1568 basic_block
1569 init_lowered_empty_function (tree decl, bool in_ssa, profile_count count)
1571 basic_block bb;
1572 edge e;
1574 current_function_decl = decl;
1575 allocate_struct_function (decl, false);
1576 gimple_register_cfg_hooks ();
1577 init_empty_tree_cfg ();
1578 init_tree_ssa (cfun);
1580 if (in_ssa)
1582 init_ssa_operands (cfun);
1583 cfun->gimple_df->in_ssa_p = true;
1584 cfun->curr_properties |= PROP_ssa;
1587 DECL_INITIAL (decl) = make_node (BLOCK);
1588 BLOCK_SUPERCONTEXT (DECL_INITIAL (decl)) = decl;
1590 DECL_SAVED_TREE (decl) = error_mark_node;
1591 cfun->curr_properties |= (PROP_gimple_lcf | PROP_gimple_leh | PROP_gimple_any
1592 | PROP_cfg | PROP_loops);
1594 set_loops_for_fn (cfun, ggc_cleared_alloc<loops> ());
1595 init_loops_structure (cfun, loops_for_fn (cfun), 1);
1596 loops_for_fn (cfun)->state |= LOOPS_MAY_HAVE_MULTIPLE_LATCHES;
1598 /* Create BB for body of the function and connect it properly. */
1599 ENTRY_BLOCK_PTR_FOR_FN (cfun)->count = count;
1600 EXIT_BLOCK_PTR_FOR_FN (cfun)->count = count;
1601 bb = create_basic_block (NULL, ENTRY_BLOCK_PTR_FOR_FN (cfun));
1602 bb->count = count;
1603 e = make_edge (ENTRY_BLOCK_PTR_FOR_FN (cfun), bb, EDGE_FALLTHRU);
1604 e->probability = profile_probability::always ();
1605 e = make_edge (bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
1606 e->probability = profile_probability::always ();
1607 add_bb_to_loop (bb, ENTRY_BLOCK_PTR_FOR_FN (cfun)->loop_father);
1609 return bb;
1612 /* Adjust PTR by the constant FIXED_OFFSET, and by the vtable
1613 offset indicated by VIRTUAL_OFFSET, if that is
1614 non-null. THIS_ADJUSTING is nonzero for a this-adjusting thunk and
1615 zero for a result-adjusting thunk. */
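In plain C, the adjustment sequence this function emits corresponds roughly to the following model (an illustrative sketch, not GCC code; the vtable layout and the helper's name are hypothetical):

```c
#include <stddef.h>

/* Model of the GIMPLE that thunk_adjust builds: add FIXED_OFFSET to
   the pointer; then, if a virtual offset is present, fetch the vtable
   pointer stored at offset zero of the object, read the extra
   adjustment found VIRTUAL_OFFSET bytes into the vtable, and apply
   that as well.  */
static void *
adjust_pointer (void *ptr, ptrdiff_t fixed_offset,
                int have_virtual_offset, ptrdiff_t virtual_offset)
{
  ptr = (char *) ptr + fixed_offset;
  if (have_virtual_offset)
    {
      char *vtable = *(char **) ptr;   /* The vptr lives at offset 0.  */
      ptrdiff_t delta = *(ptrdiff_t *) (vtable + virtual_offset);
      ptr = (char *) ptr + delta;
    }
  return ptr;
}
```

The real code builds the same loads and pointer-plus statements as GIMPLE via gimple_build_assign and fold_build_pointer_plus.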
1617 tree
1618 thunk_adjust (gimple_stmt_iterator * bsi,
1619 tree ptr, bool this_adjusting,
1620 HOST_WIDE_INT fixed_offset, tree virtual_offset)
1622 gassign *stmt;
1623 tree ret;
1625 if (this_adjusting
1626 && fixed_offset != 0)
1628 stmt = gimple_build_assign
1629 (ptr, fold_build_pointer_plus_hwi_loc (input_location,
1630 ptr,
1631 fixed_offset));
1632 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1635 /* If there's a virtual offset, look up that value in the vtable and
1636 adjust the pointer again. */
1637 if (virtual_offset)
1639 tree vtabletmp;
1640 tree vtabletmp2;
1641 tree vtabletmp3;
1643 if (!vtable_entry_type)
1645 tree vfunc_type = make_node (FUNCTION_TYPE);
1646 TREE_TYPE (vfunc_type) = integer_type_node;
1647 TYPE_ARG_TYPES (vfunc_type) = NULL_TREE;
1648 layout_type (vfunc_type);
1650 vtable_entry_type = build_pointer_type (vfunc_type);
1653 vtabletmp =
1654 create_tmp_reg (build_pointer_type
1655 (build_pointer_type (vtable_entry_type)), "vptr");
1657 /* The vptr is always at offset zero in the object. */
1658 stmt = gimple_build_assign (vtabletmp,
1659 build1 (NOP_EXPR, TREE_TYPE (vtabletmp),
1660 ptr));
1661 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1663 /* Form the vtable address. */
1664 vtabletmp2 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp)),
1665 "vtableaddr");
1666 stmt = gimple_build_assign (vtabletmp2,
1667 build_simple_mem_ref (vtabletmp));
1668 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1670 /* Find the entry with the vcall offset. */
1671 stmt = gimple_build_assign (vtabletmp2,
1672 fold_build_pointer_plus_loc (input_location,
1673 vtabletmp2,
1674 virtual_offset));
1675 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1677 /* Get the offset itself. */
1678 vtabletmp3 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp2)),
1679 "vcalloffset");
1680 stmt = gimple_build_assign (vtabletmp3,
1681 build_simple_mem_ref (vtabletmp2));
1682 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1684 /* Adjust the `this' pointer. */
1685 ptr = fold_build_pointer_plus_loc (input_location, ptr, vtabletmp3);
1686 ptr = force_gimple_operand_gsi (bsi, ptr, true, NULL_TREE, false,
1687 GSI_CONTINUE_LINKING);
1690 if (!this_adjusting
1691 && fixed_offset != 0)
1692 /* Adjust the pointer by the constant. */
1694 tree ptrtmp;
1696 if (VAR_P (ptr))
1697 ptrtmp = ptr;
1698 else
1700 ptrtmp = create_tmp_reg (TREE_TYPE (ptr), "ptr");
1701 stmt = gimple_build_assign (ptrtmp, ptr);
1702 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1704 ptr = fold_build_pointer_plus_hwi_loc (input_location,
1705 ptrtmp, fixed_offset);
1708 /* Emit the statement and gimplify the adjustment expression. */
1709 ret = create_tmp_reg (TREE_TYPE (ptr), "adjusted_this");
1710 stmt = gimple_build_assign (ret, ptr);
1711 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1713 return ret;
1716 /* Expand thunk NODE to gimple if possible.
1717 When FORCE_GIMPLE_THUNK is true, a gimple thunk is created and
1718 no assembler is produced.
1719 When OUTPUT_ASM_THUNKS is true, also produce assembler for
1720 thunks that are not lowered. */
1722 bool
1723 cgraph_node::expand_thunk (bool output_asm_thunks, bool force_gimple_thunk)
1725 bool this_adjusting = thunk.this_adjusting;
1726 HOST_WIDE_INT fixed_offset = thunk.fixed_offset;
1727 HOST_WIDE_INT virtual_value = thunk.virtual_value;
1728 tree virtual_offset = NULL;
1729 tree alias = callees->callee->decl;
1730 tree thunk_fndecl = decl;
1731 tree a;
1733 /* An instrumentation thunk is the same function with
1734 a different signature. We never need to expand it. */
1735 if (thunk.add_pointer_bounds_args)
1736 return false;
1738 if (!force_gimple_thunk && this_adjusting
1739 && targetm.asm_out.can_output_mi_thunk (thunk_fndecl, fixed_offset,
1740 virtual_value, alias))
1742 const char *fnname;
1743 tree fn_block;
1744 tree restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1746 if (!output_asm_thunks)
1748 analyzed = true;
1749 return false;
1752 if (in_lto_p)
1753 get_untransformed_body ();
1754 a = DECL_ARGUMENTS (thunk_fndecl);
1756 current_function_decl = thunk_fndecl;
1758 /* Ensure thunks are emitted in their correct sections. */
1759 resolve_unique_section (thunk_fndecl, 0,
1760 flag_function_sections);
1762 DECL_RESULT (thunk_fndecl)
1763 = build_decl (DECL_SOURCE_LOCATION (thunk_fndecl),
1764 RESULT_DECL, 0, restype);
1765 DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
1766 fnname = IDENTIFIER_POINTER (DECL_ASSEMBLER_NAME (thunk_fndecl));
1768 /* The back end expects DECL_INITIAL to contain a BLOCK, so we
1769 create one. */
1770 fn_block = make_node (BLOCK);
1771 BLOCK_VARS (fn_block) = a;
1772 DECL_INITIAL (thunk_fndecl) = fn_block;
1773 BLOCK_SUPERCONTEXT (fn_block) = thunk_fndecl;
1774 allocate_struct_function (thunk_fndecl, false);
1775 init_function_start (thunk_fndecl);
1776 cfun->is_thunk = 1;
1777 insn_locations_init ();
1778 set_curr_insn_location (DECL_SOURCE_LOCATION (thunk_fndecl));
1779 prologue_location = curr_insn_location ();
1780 assemble_start_function (thunk_fndecl, fnname);
1782 targetm.asm_out.output_mi_thunk (asm_out_file, thunk_fndecl,
1783 fixed_offset, virtual_value, alias);
1785 assemble_end_function (thunk_fndecl, fnname);
1786 insn_locations_finalize ();
1787 init_insn_lengths ();
1788 free_after_compilation (cfun);
1789 TREE_ASM_WRITTEN (thunk_fndecl) = 1;
1790 thunk.thunk_p = false;
1791 analyzed = false;
1793 else if (stdarg_p (TREE_TYPE (thunk_fndecl)))
1795 error ("generic thunk code fails for method %qD which uses %<...%>",
1796 thunk_fndecl);
1797 TREE_ASM_WRITTEN (thunk_fndecl) = 1;
1798 analyzed = true;
1799 return false;
1801 else
1803 tree restype;
1804 basic_block bb, then_bb, else_bb, return_bb;
1805 gimple_stmt_iterator bsi;
1806 int nargs = 0;
1807 tree arg;
1808 int i;
1809 tree resdecl;
1810 tree restmp = NULL;
1812 gcall *call;
1813 greturn *ret;
1814 bool alias_is_noreturn = TREE_THIS_VOLATILE (alias);
1816 /* We may be called from expand_thunk, which releases the body except for
1817 DECL_ARGUMENTS. In this case force_gimple_thunk is true. */
1818 if (in_lto_p && !force_gimple_thunk)
1819 get_untransformed_body ();
1820 a = DECL_ARGUMENTS (thunk_fndecl);
1822 current_function_decl = thunk_fndecl;
1824 /* Ensure thunks are emitted in their correct sections. */
1825 resolve_unique_section (thunk_fndecl, 0,
1826 flag_function_sections);
1828 DECL_IGNORED_P (thunk_fndecl) = 1;
1829 bitmap_obstack_initialize (NULL);
1831 if (thunk.virtual_offset_p)
1832 virtual_offset = size_int (virtual_value);
1834 /* Build the return declaration for the function. */
1835 restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1836 if (DECL_RESULT (thunk_fndecl) == NULL_TREE)
1838 resdecl = build_decl (input_location, RESULT_DECL, 0, restype);
1839 DECL_ARTIFICIAL (resdecl) = 1;
1840 DECL_IGNORED_P (resdecl) = 1;
1841 DECL_RESULT (thunk_fndecl) = resdecl;
1842 DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
1844 else
1845 resdecl = DECL_RESULT (thunk_fndecl);
1847 profile_count cfg_count = count;
1848 if (!cfg_count.initialized_p ())
1849 cfg_count = profile_count::from_gcov_type (BB_FREQ_MAX).guessed_local ();
1851 bb = then_bb = else_bb = return_bb
1852 = init_lowered_empty_function (thunk_fndecl, true, cfg_count);
1854 bsi = gsi_start_bb (bb);
1856 /* Build call to the function being thunked. */
1857 if (!VOID_TYPE_P (restype)
1858 && (!alias_is_noreturn
1859 || TREE_ADDRESSABLE (restype)
1860 || TREE_CODE (TYPE_SIZE_UNIT (restype)) != INTEGER_CST))
1862 if (DECL_BY_REFERENCE (resdecl))
1864 restmp = gimple_fold_indirect_ref (resdecl);
1865 if (!restmp)
1866 restmp = build2 (MEM_REF,
1867 TREE_TYPE (TREE_TYPE (DECL_RESULT (alias))),
1868 resdecl,
1869 build_int_cst (TREE_TYPE
1870 (DECL_RESULT (alias)), 0));
1872 else if (!is_gimple_reg_type (restype))
1874 if (aggregate_value_p (resdecl, TREE_TYPE (thunk_fndecl)))
1876 restmp = resdecl;
1878 if (VAR_P (restmp))
1879 add_local_decl (cfun, restmp);
1880 BLOCK_VARS (DECL_INITIAL (current_function_decl)) = restmp;
1882 else
1883 restmp = create_tmp_var (restype, "retval");
1885 else
1886 restmp = create_tmp_reg (restype, "retval");
1889 for (arg = a; arg; arg = DECL_CHAIN (arg))
1890 nargs++;
1891 auto_vec<tree> vargs (nargs);
1892 i = 0;
1893 arg = a;
1894 if (this_adjusting)
1896 vargs.quick_push (thunk_adjust (&bsi, a, 1, fixed_offset,
1897 virtual_offset));
1898 arg = DECL_CHAIN (a);
1899 i = 1;
1902 if (nargs)
1903 for (; i < nargs; i++, arg = DECL_CHAIN (arg))
1905 tree tmp = arg;
1906 if (VECTOR_TYPE_P (TREE_TYPE (arg))
1907 || TREE_CODE (TREE_TYPE (arg)) == COMPLEX_TYPE)
1908 DECL_GIMPLE_REG_P (arg) = 1;
1910 if (!is_gimple_val (arg))
1912 tmp = create_tmp_reg (TYPE_MAIN_VARIANT
1913 (TREE_TYPE (arg)), "arg");
1914 gimple *stmt = gimple_build_assign (tmp, arg);
1915 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1917 vargs.quick_push (tmp);
1919 call = gimple_build_call_vec (build_fold_addr_expr_loc (0, alias), vargs);
1920 callees->call_stmt = call;
1921 gimple_call_set_from_thunk (call, true);
1923 /* Return slot optimization is always possible and in fact required to
1924 return values with DECL_BY_REFERENCE. */
1925 if (aggregate_value_p (resdecl, TREE_TYPE (thunk_fndecl))
1926 && (!is_gimple_reg_type (TREE_TYPE (resdecl))
1927 || DECL_BY_REFERENCE (resdecl)))
1928 gimple_call_set_return_slot_opt (call, true);
1930 if (restmp)
1932 gimple_call_set_lhs (call, restmp);
1933 gcc_assert (useless_type_conversion_p (TREE_TYPE (restmp),
1934 TREE_TYPE (TREE_TYPE (alias))));
1936 gsi_insert_after (&bsi, call, GSI_NEW_STMT);
1937 if (!alias_is_noreturn)
1939 if (restmp && !this_adjusting
1940 && (fixed_offset || virtual_offset))
1942 tree true_label = NULL_TREE;
1944 if (TREE_CODE (TREE_TYPE (restmp)) == POINTER_TYPE)
1946 gimple *stmt;
1947 edge e;
1948 /* If the return type is a pointer, we need to
1949 protect against NULL. We know there will be an
1950 adjustment, because that's why we're emitting a
1951 thunk. */
1952 then_bb = create_basic_block (NULL, bb);
1953 then_bb->count = cfg_count - cfg_count.apply_scale (1, 16);
1954 return_bb = create_basic_block (NULL, then_bb);
1955 return_bb->count = cfg_count;
1956 else_bb = create_basic_block (NULL, else_bb);
1957 else_bb->count = cfg_count.apply_scale (1, 16);
1958 add_bb_to_loop (then_bb, bb->loop_father);
1959 add_bb_to_loop (return_bb, bb->loop_father);
1960 add_bb_to_loop (else_bb, bb->loop_father);
1961 remove_edge (single_succ_edge (bb));
1962 true_label = gimple_block_label (then_bb);
1963 stmt = gimple_build_cond (NE_EXPR, restmp,
1964 build_zero_cst (TREE_TYPE (restmp)),
1965 NULL_TREE, NULL_TREE);
1966 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1967 e = make_edge (bb, then_bb, EDGE_TRUE_VALUE);
1968 e->probability = profile_probability::guessed_always ()
1969 .apply_scale (1, 16);
1970 e = make_edge (bb, else_bb, EDGE_FALSE_VALUE);
1971 e->probability = profile_probability::guessed_always ()
1972 .apply_scale (1, 16);
1973 make_single_succ_edge (return_bb,
1974 EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
1975 make_single_succ_edge (then_bb, return_bb, EDGE_FALLTHRU);
1976 e = make_edge (else_bb, return_bb, EDGE_FALLTHRU);
1977 e->probability = profile_probability::always ();
1978 bsi = gsi_last_bb (then_bb);
1981 restmp = thunk_adjust (&bsi, restmp, /*this_adjusting=*/0,
1982 fixed_offset, virtual_offset);
1983 if (true_label)
1985 gimple *stmt;
1986 bsi = gsi_last_bb (else_bb);
1987 stmt = gimple_build_assign (restmp,
1988 build_zero_cst (TREE_TYPE (restmp)));
1989 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1990 bsi = gsi_last_bb (return_bb);
1993 else
1994 gimple_call_set_tail (call, true);
1996 /* Build return value. */
1997 if (!DECL_BY_REFERENCE (resdecl))
1998 ret = gimple_build_return (restmp);
1999 else
2000 ret = gimple_build_return (resdecl);
2002 gsi_insert_after (&bsi, ret, GSI_NEW_STMT);
2004 else
2006 gimple_call_set_tail (call, true);
2007 remove_edge (single_succ_edge (bb));
2010 cfun->gimple_df->in_ssa_p = true;
2011 update_max_bb_count ();
2012 profile_status_for_fn (cfun)
2013 = cfg_count.initialized_p () && cfg_count.ipa_p ()
2014 ? PROFILE_READ : PROFILE_GUESSED;
2015 /* FIXME: C++ FE should stop setting TREE_ASM_WRITTEN on thunks. */
2016 TREE_ASM_WRITTEN (thunk_fndecl) = false;
2017 delete_unreachable_blocks ();
2018 update_ssa (TODO_update_ssa);
2019 checking_verify_flow_info ();
2020 free_dominance_info (CDI_DOMINATORS);
2022 /* Since we want to emit the thunk, we explicitly mark its name as
2023 referenced. */
2024 thunk.thunk_p = false;
2025 lowered = true;
2026 bitmap_obstack_release (NULL);
2028 current_function_decl = NULL;
2029 set_cfun (NULL);
2030 return true;
/* Assemble thunks and aliases associated with the node.  */

void
cgraph_node::assemble_thunks_and_aliases (void)
{
  cgraph_edge *e;
  ipa_ref *ref;

  for (e = callers; e;)
    if (e->caller->thunk.thunk_p
	&& !e->caller->global.inlined_to
	&& !e->caller->thunk.add_pointer_bounds_args)
      {
	cgraph_node *thunk = e->caller;

	e = e->next_caller;
	thunk->expand_thunk (true, false);
	thunk->assemble_thunks_and_aliases ();
      }
    else
      e = e->next_caller;

  FOR_EACH_ALIAS (this, ref)
    {
      cgraph_node *alias = dyn_cast <cgraph_node *> (ref->referring);
      if (!alias->transparent_alias)
	{
	  bool saved_written = TREE_ASM_WRITTEN (decl);

	  /* Force assemble_alias to really output the alias this time instead
	     of buffering it in the alias pairs list.  */
	  TREE_ASM_WRITTEN (decl) = 1;
	  do_assemble_alias (alias->decl,
			     DECL_ASSEMBLER_NAME (decl));
	  alias->assemble_thunks_and_aliases ();
	  TREE_ASM_WRITTEN (decl) = saved_written;
	}
    }
}
/* Expand function specified by the node.  */

void
cgraph_node::expand (void)
{
  location_t saved_loc;

  /* We ought to not compile any inline clones.  */
  gcc_assert (!global.inlined_to);

  /* __RTL functions are compiled as soon as they are parsed, so don't
     do it again.  */
  if (native_rtl_p ())
    return;

  announce_function (decl);
  process = 0;
  gcc_assert (lowered);
  get_untransformed_body ();

  /* Generate RTL for the body of DECL.  */

  timevar_push (TV_REST_OF_COMPILATION);

  gcc_assert (symtab->global_info_ready);

  /* Initialize the default bitmap obstack.  */
  bitmap_obstack_initialize (NULL);

  /* Initialize the RTL code for the function.  */
  saved_loc = input_location;
  input_location = DECL_SOURCE_LOCATION (decl);

  gcc_assert (DECL_STRUCT_FUNCTION (decl));
  push_cfun (DECL_STRUCT_FUNCTION (decl));
  init_function_start (decl);

  gimple_register_cfg_hooks ();

  bitmap_obstack_initialize (&reg_obstack); /* FIXME, only at RTL generation.  */

  execute_all_ipa_transforms ();

  /* Perform all tree transforms and optimizations.  */

  /* Signal the start of passes.  */
  invoke_plugin_callbacks (PLUGIN_ALL_PASSES_START, NULL);

  execute_pass_list (cfun, g->get_passes ()->all_passes);

  /* Signal the end of passes.  */
  invoke_plugin_callbacks (PLUGIN_ALL_PASSES_END, NULL);

  bitmap_obstack_release (&reg_obstack);

  /* Release the default bitmap obstack.  */
  bitmap_obstack_release (NULL);

  /* If requested, warn about function definitions where the function will
     return a value (usually of some struct or union type) which itself will
     take up a lot of stack space.  */
  if (!DECL_EXTERNAL (decl) && TREE_TYPE (decl))
    {
      tree ret_type = TREE_TYPE (TREE_TYPE (decl));

      if (ret_type && TYPE_SIZE_UNIT (ret_type)
	  && TREE_CODE (TYPE_SIZE_UNIT (ret_type)) == INTEGER_CST
	  && compare_tree_int (TYPE_SIZE_UNIT (ret_type),
			       warn_larger_than_size) > 0)
	{
	  unsigned int size_as_int
	    = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (ret_type));

	  if (compare_tree_int (TYPE_SIZE_UNIT (ret_type), size_as_int) == 0)
	    warning (OPT_Wlarger_than_,
		     "size of return value of %q+D is %u bytes",
		     decl, size_as_int);
	  else
	    warning (OPT_Wlarger_than_,
		     "size of return value of %q+D is larger than %wu bytes",
		     decl, warn_larger_than_size);
	}
    }

  gimple_set_body (decl, NULL);
  if (DECL_STRUCT_FUNCTION (decl) == 0
      && !cgraph_node::get (decl)->origin)
    {
      /* Stop pointing to the local nodes about to be freed.
	 But DECL_INITIAL must remain nonzero so we know this
	 was an actual function definition.
	 For a nested function, this is done in c_pop_function_context.
	 If rest_of_compilation set this to 0, leave it 0.  */
      if (DECL_INITIAL (decl) != 0)
	DECL_INITIAL (decl) = error_mark_node;
    }

  input_location = saved_loc;

  ggc_collect ();
  timevar_pop (TV_REST_OF_COMPILATION);

  /* Make sure that the back end didn't give up on compiling.  */
  gcc_assert (TREE_ASM_WRITTEN (decl));
  if (cfun)
    pop_cfun ();

  /* It would make a lot more sense to output thunks before the function body
     to get more forward and fewer backward jumps.  This, however, would need
     solving the problem with comdats; see PR48668.  Also, aliases must come
     after the function itself to keep one-pass assemblers, like the one on
     AIX, happy; see PR 50689.
     FIXME: Perhaps thunks should be moved before the function IFF they are
     not in comdat groups.  */
  assemble_thunks_and_aliases ();
  release_body ();
  /* Eliminate all call edges.  This is important so the GIMPLE_CALL no longer
     points to the dead function body.  */
  remove_callees ();
  remove_all_references ();
}
/* Node comparer that is responsible for the order that corresponds
   to the time when a function was launched for the first time.  */

static int
node_cmp (const void *pa, const void *pb)
{
  const cgraph_node *a = *(const cgraph_node * const *) pa;
  const cgraph_node *b = *(const cgraph_node * const *) pb;

  /* Functions with time profile must come before those without it.  */
  if (!a->tp_first_run || !b->tp_first_run)
    return a->tp_first_run - b->tp_first_run;

  return a->tp_first_run != b->tp_first_run
	 ? b->tp_first_run - a->tp_first_run
	 : b->order - a->order;
}
/* Expand all functions that must be output.

   Attempt to topologically sort the nodes so a function is output when
   all called functions are already assembled to allow data to be
   propagated across the callgraph.  Use a stack to get smaller distance
   between a function and its callees (later we may choose to use a more
   sophisticated algorithm for function reordering; we will likely want
   to use subsections to make the output functions appear in top-down
   order).  */

static void
expand_all_functions (void)
{
  cgraph_node *node;
  cgraph_node **order = XCNEWVEC (cgraph_node *,
				  symtab->cgraph_count);
  unsigned int expanded_func_count = 0, profiled_func_count = 0;
  int order_pos, new_order_pos = 0;
  int i;

  order_pos = ipa_reverse_postorder (order);
  gcc_assert (order_pos == symtab->cgraph_count);

  /* The garbage collector may remove inline clones we eliminate during
     optimization.  So we must be sure to not reference them.  */
  for (i = 0; i < order_pos; i++)
    if (order[i]->process)
      order[new_order_pos++] = order[i];

  if (flag_profile_reorder_functions)
    qsort (order, new_order_pos, sizeof (cgraph_node *), node_cmp);

  for (i = new_order_pos - 1; i >= 0; i--)
    {
      node = order[i];

      if (node->process)
	{
	  expanded_func_count++;
	  if (node->tp_first_run)
	    profiled_func_count++;

	  if (symtab->dump_file)
	    fprintf (symtab->dump_file,
		     "Time profile order in expand_all_functions:%s:%d\n",
		     node->asm_name (), node->tp_first_run);
	  node->process = 0;
	  node->expand ();
	}
    }

  if (dump_file)
    fprintf (dump_file, "Expanded functions with time profile (%s):%u/%u\n",
	     main_input_filename, profiled_func_count, expanded_func_count);

  if (symtab->dump_file && flag_profile_reorder_functions)
    fprintf (symtab->dump_file, "Expanded functions with time profile:%u/%u\n",
	     profiled_func_count, expanded_func_count);

  symtab->process_new_functions ();
  free_gimplify_stack ();

  free (order);
}
/* This is used to sort the node types by the cgraph order number.  */

enum cgraph_order_sort_kind
{
  ORDER_UNDEFINED = 0,
  ORDER_FUNCTION,
  ORDER_VAR,
  ORDER_VAR_UNDEF,
  ORDER_ASM
};

struct cgraph_order_sort
{
  enum cgraph_order_sort_kind kind;
  union
  {
    cgraph_node *f;
    varpool_node *v;
    asm_node *a;
  } u;
};
/* Output all functions, variables, and asm statements in the order
   according to their order fields, which is the order in which they
   appeared in the file.  This implements -fno-toplevel-reorder.  In
   this mode we may output functions and variables which don't really
   need to be output.  */

static void
output_in_order (void)
{
  int max;
  cgraph_order_sort *nodes;
  int i;
  cgraph_node *pf;
  varpool_node *pv;
  asm_node *pa;
  max = symtab->order;
  nodes = XCNEWVEC (cgraph_order_sort, max);

  FOR_EACH_DEFINED_FUNCTION (pf)
    {
      if (pf->process && !pf->thunk.thunk_p && !pf->alias)
	{
	  if (!pf->no_reorder)
	    continue;
	  i = pf->order;
	  gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
	  nodes[i].kind = ORDER_FUNCTION;
	  nodes[i].u.f = pf;
	}
    }

  /* There is a similar loop in symbol_table::output_variables.
     Please keep them in sync.  */
  FOR_EACH_VARIABLE (pv)
    {
      if (!pv->no_reorder)
	continue;
      if (DECL_HARD_REGISTER (pv->decl)
	  || DECL_HAS_VALUE_EXPR_P (pv->decl))
	continue;
      i = pv->order;
      gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
      nodes[i].kind = pv->definition ? ORDER_VAR : ORDER_VAR_UNDEF;
      nodes[i].u.v = pv;
    }

  for (pa = symtab->first_asm_symbol (); pa; pa = pa->next)
    {
      i = pa->order;
      gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
      nodes[i].kind = ORDER_ASM;
      nodes[i].u.a = pa;
    }

  /* In toplevel reorder mode we output all statics; mark them as needed.  */

  for (i = 0; i < max; ++i)
    if (nodes[i].kind == ORDER_VAR)
      nodes[i].u.v->finalize_named_section_flags ();

  for (i = 0; i < max; ++i)
    {
      switch (nodes[i].kind)
	{
	case ORDER_FUNCTION:
	  nodes[i].u.f->process = 0;
	  nodes[i].u.f->expand ();
	  break;

	case ORDER_VAR:
	  nodes[i].u.v->assemble_decl ();
	  break;

	case ORDER_VAR_UNDEF:
	  assemble_undefined_decl (nodes[i].u.v->decl);
	  break;

	case ORDER_ASM:
	  assemble_asm (nodes[i].u.a->asm_str);
	  break;

	case ORDER_UNDEFINED:
	  break;

	default:
	  gcc_unreachable ();
	}
    }

  symtab->clear_asm_symbols ();

  free (nodes);
}
static void
ipa_passes (void)
{
  gcc::pass_manager *passes = g->get_passes ();

  set_cfun (NULL);
  current_function_decl = NULL;
  gimple_register_cfg_hooks ();
  bitmap_obstack_initialize (NULL);

  invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_START, NULL);

  if (!in_lto_p)
    {
      execute_ipa_pass_list (passes->all_small_ipa_passes);
      if (seen_error ())
	return;
    }

  /* This extra symtab_remove_unreachable_nodes pass tends to catch some
     devirtualization and other changes where removal iterates.  */
  symtab->remove_unreachable_nodes (symtab->dump_file);

  /* If pass_all_early_optimizations was not scheduled, the state of
     the cgraph will not be properly updated.  Update it now.  */
  if (symtab->state < IPA_SSA)
    symtab->state = IPA_SSA;

  if (!in_lto_p)
    {
      /* Generate coverage variables and constructors.  */
      coverage_finish ();

      /* Process new functions added.  */
      set_cfun (NULL);
      current_function_decl = NULL;
      symtab->process_new_functions ();

      execute_ipa_summary_passes
	((ipa_opt_pass_d *) passes->all_regular_ipa_passes);
    }

  /* Some targets need to handle LTO assembler output specially.  */
  if (flag_generate_lto || flag_generate_offload)
    targetm.asm_out.lto_start ();

  if (!in_lto_p
      || flag_incremental_link == INCREMENTAL_LINK_LTO)
    {
      if (!quiet_flag)
	fprintf (stderr, "Streaming LTO\n");
      if (g->have_offload)
	{
	  section_name_prefix = OFFLOAD_SECTION_NAME_PREFIX;
	  lto_stream_offload_p = true;
	  ipa_write_summaries ();
	  lto_stream_offload_p = false;
	}
      if (flag_lto)
	{
	  section_name_prefix = LTO_SECTION_NAME_PREFIX;
	  lto_stream_offload_p = false;
	  ipa_write_summaries ();
	}
    }

  if (flag_generate_lto || flag_generate_offload)
    targetm.asm_out.lto_end ();

  if (!flag_ltrans
      && ((in_lto_p && flag_incremental_link != INCREMENTAL_LINK_LTO)
	  || !flag_lto || flag_fat_lto_objects))
    execute_ipa_pass_list (passes->all_regular_ipa_passes);
  invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_END, NULL);

  bitmap_obstack_release (NULL);
}
/* Return the identifier naming the symbol that DECL's "alias"
   attribute makes it an alias of.  */

static tree
get_alias_symbol (tree decl)
{
  tree alias = lookup_attribute ("alias", DECL_ATTRIBUTES (decl));
  return get_identifier (TREE_STRING_POINTER
			 (TREE_VALUE (TREE_VALUE (alias))));
}
/* Weakrefs may be associated with external decls and thus not output
   at expansion time.  Emit all necessary aliases.  */

void
symbol_table::output_weakrefs (void)
{
  symtab_node *node;
  FOR_EACH_SYMBOL (node)
    if (node->alias
	&& !TREE_ASM_WRITTEN (node->decl)
	&& node->weakref)
      {
	tree target;

	/* Weakrefs are special in not requiring a target definition in the
	   current compilation unit.  It is thus a bit hard to work out what
	   we want to alias.
	   When the alias target is defined, we need to fetch it from the
	   symtab reference; otherwise it is pointed to by alias_target.  */
	if (node->alias_target)
	  target = (DECL_P (node->alias_target)
		    ? DECL_ASSEMBLER_NAME (node->alias_target)
		    : node->alias_target);
	else if (node->analyzed)
	  target = DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl);
	else
	  {
	    gcc_unreachable ();
	    target = get_alias_symbol (node->decl);
	  }
	do_assemble_alias (node->decl, target);
      }
}
/* Perform simple optimizations based on the callgraph.  */

void
symbol_table::compile (void)
{
  if (seen_error ())
    return;

  symtab_node::checking_verify_symtab_nodes ();

  timevar_push (TV_CGRAPHOPT);
  if (pre_ipa_mem_report)
    {
      fprintf (stderr, "Memory consumption before IPA\n");
      dump_memory_report (false);
    }
  if (!quiet_flag)
    fprintf (stderr, "Performing interprocedural optimizations\n");
  state = IPA;

  /* If LTO is enabled, initialize the streamer hooks needed by GIMPLE.  */
  if (flag_generate_lto || flag_generate_offload)
    lto_streamer_hooks_init ();

  /* Don't run the IPA passes if there were any error or sorry messages.  */
  if (!seen_error ())
    ipa_passes ();

  /* Do nothing else if any IPA pass found errors or if we are just streaming
     LTO.  */
  if (seen_error ()
      || ((!in_lto_p || flag_incremental_link == INCREMENTAL_LINK_LTO)
	  && flag_lto && !flag_fat_lto_objects))
    {
      timevar_pop (TV_CGRAPHOPT);
      return;
    }

  global_info_ready = true;
  if (dump_file)
    {
      fprintf (dump_file, "Optimized ");
      symtab->dump (dump_file);
    }
  if (post_ipa_mem_report)
    {
      fprintf (stderr, "Memory consumption after IPA\n");
      dump_memory_report (false);
    }
  timevar_pop (TV_CGRAPHOPT);

  /* Output everything.  */
  switch_to_section (text_section);
  (*debug_hooks->assembly_start) ();
  if (!quiet_flag)
    fprintf (stderr, "Assembling functions:\n");
  symtab_node::checking_verify_symtab_nodes ();

  bitmap_obstack_initialize (NULL);
  execute_ipa_pass_list (g->get_passes ()->all_late_ipa_passes);
  bitmap_obstack_release (NULL);
  mark_functions_to_output ();

  /* When weakref support is missing, we automatically translate all
     references to NODE to references to its ultimate alias target.
     The renaming mechanism uses the flag IDENTIFIER_TRANSPARENT_ALIAS and
     TREE_CHAIN.

     Set up this mapping before we output any assembler, but once we are sure
     that all symbol renaming is done.

     FIXME: All this ugliness can go away if we just do renaming at the GIMPLE
     level by physically rewriting the IL.  At the moment we can only redirect
     calls, so we need infrastructure for renaming references as well.  */
#ifndef ASM_OUTPUT_WEAKREF
  symtab_node *node;

  FOR_EACH_SYMBOL (node)
    if (node->alias
	&& lookup_attribute ("weakref", DECL_ATTRIBUTES (node->decl)))
      {
	IDENTIFIER_TRANSPARENT_ALIAS
	  (DECL_ASSEMBLER_NAME (node->decl)) = 1;
	TREE_CHAIN (DECL_ASSEMBLER_NAME (node->decl))
	  = (node->alias_target ? node->alias_target
	     : DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl));
      }
#endif

  state = EXPANSION;

  /* Output first asm statements and anything ordered.  The process
     flag is cleared for these nodes, so we skip them later.  */
  output_in_order ();
  expand_all_functions ();
  output_variables ();

  process_new_functions ();
  state = FINISHED;
  output_weakrefs ();

  if (dump_file)
    {
      fprintf (dump_file, "\nFinal ");
      symtab->dump (dump_file);
    }
  if (!flag_checking)
    return;
  symtab_node::verify_symtab_nodes ();
  /* Double check that all inline clones are gone and that all
     function bodies have been released from memory.  */
  if (!seen_error ())
    {
      cgraph_node *node;
      bool error_found = false;

      FOR_EACH_DEFINED_FUNCTION (node)
	if (node->global.inlined_to
	    || gimple_has_body_p (node->decl))
	  {
	    error_found = true;
	    node->debug ();
	  }
      if (error_found)
	internal_error ("nodes with unreleased memory found");
    }
}
/* Analyze the whole compilation unit once it is parsed completely.  */

void
symbol_table::finalize_compilation_unit (void)
{
  timevar_push (TV_CGRAPH);

  /* If we're here there's no current function anymore.  Some frontends
     are lazy in clearing these.  */
  current_function_decl = NULL;
  set_cfun (NULL);

  /* Do not skip analyzing the functions if there were errors; otherwise
     we would miss diagnostics for the following functions.  */

  /* Emit size functions we didn't inline.  */
  finalize_size_functions ();

  /* Mark alias targets necessary and emit diagnostics.  */
  handle_alias_pairs ();

  if (!quiet_flag)
    {
      fprintf (stderr, "\nAnalyzing compilation unit\n");
      fflush (stderr);
    }

  if (flag_dump_passes)
    dump_passes ();

  /* Gimplify and lower all functions, compute reachability and
     remove unreachable nodes.  */
  analyze_functions (/*first_time=*/true);

  /* Mark alias targets necessary and emit diagnostics.  */
  handle_alias_pairs ();

  /* Gimplify and lower thunks.  */
  analyze_functions (/*first_time=*/false);

  /* Offloading requires LTO infrastructure.  */
  if (!in_lto_p && g->have_offload)
    flag_generate_offload = 1;

  if (!seen_error ())
    {
      /* Emit early debug for reachable functions, and by consequence,
	 locally scoped symbols.  */
      struct cgraph_node *cnode;
      FOR_EACH_FUNCTION_WITH_GIMPLE_BODY (cnode)
	(*debug_hooks->early_global_decl) (cnode->decl);

      /* Clean up anything that needs cleaning up after initial debug
	 generation.  */
      (*debug_hooks->early_finish) (main_input_filename);
    }

  /* Finally drive the pass manager.  */
  compile ();

  timevar_pop (TV_CGRAPH);
}
/* Reset all state within cgraphunit.c so that we can rerun the compiler
   within the same process.  For use by toplev::finalize.  */

void
cgraphunit_c_finalize (void)
{
  gcc_assert (cgraph_new_nodes.length () == 0);
  cgraph_new_nodes.truncate (0);

  vtable_entry_type = NULL;
  queued_nodes = &symtab_terminator;

  first_analyzed = NULL;
  first_analyzed_var = NULL;
}
/* Creates a wrapper from the cgraph_node to the TARGET node.  A thunk is
   used for this kind of wrapper method.  */

void
cgraph_node::create_wrapper (cgraph_node *target)
{
  /* Preserve DECL_RESULT so we get the right by-reference flag.  */
  tree decl_result = DECL_RESULT (decl);

  /* Remove the function's body but keep the arguments to be reused
     for the thunk.  */
  release_body (true);
  reset ();

  DECL_UNINLINABLE (decl) = false;
  DECL_RESULT (decl) = decl_result;
  DECL_INITIAL (decl) = NULL;
  allocate_struct_function (decl, false);
  set_cfun (NULL);

  /* Turn the alias into a thunk and expand it into GIMPLE representation.  */
  definition = true;

  memset (&thunk, 0, sizeof (cgraph_thunk_info));
  thunk.thunk_p = true;
  create_edge (target, NULL, count);
  callees->can_throw_external = !TREE_NOTHROW (target->decl);

  tree arguments = DECL_ARGUMENTS (decl);

  while (arguments)
    {
      TREE_ADDRESSABLE (arguments) = false;
      arguments = TREE_CHAIN (arguments);
    }

  expand_thunk (false, true);

  /* Inline summary set-up.  */
  analyze ();
  inline_analyze_function (this);
}

#include "gt-cgraphunit.h"