gcc/cgraphunit.c
1 /* Driver of optimization process
2 Copyright (C) 2003-2017 Free Software Foundation, Inc.
3 Contributed by Jan Hubicka
5 This file is part of GCC.
7 GCC is free software; you can redistribute it and/or modify it under
8 the terms of the GNU General Public License as published by the Free
9 Software Foundation; either version 3, or (at your option) any later
10 version.
12 GCC is distributed in the hope that it will be useful, but WITHOUT ANY
13 WARRANTY; without even the implied warranty of MERCHANTABILITY or
14 FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
15 for more details.
17 You should have received a copy of the GNU General Public License
18 along with GCC; see the file COPYING3. If not see
19 <http://www.gnu.org/licenses/>. */
21 /* This module implements the main driver of the compilation process.
23 The main scope of this file is to act as an interface between the
24 tree-based front ends and the back end.
26 The front end is supposed to use the following functionality:
28 - finalize_function
30 This function is called once the front end has parsed the whole body of a
31 function and it is certain that neither the function body nor the declaration will change.
33 (There is one exception needed for implementing GCC extern inline
34 functions.)
36 - varpool_finalize_decl
38 This function has the same behavior as the above but is used for static
39 variables.
41 - add_asm_node
43 Inserts a new toplevel ASM statement.
45 - finalize_compilation_unit
47 This function is called once the (source level) compilation unit is
48 finalized and it will no longer change.
50 The symbol table is constructed starting from the trivially needed
51 symbols finalized by the frontend. Functions are lowered into
52 GIMPLE representation and callgraph/reference lists are constructed.
53 Those are used to discover other necessary functions and variables.
55 At the end the bodies of unreachable functions are removed.
57 The function can be called multiple times when multiple source level
58 compilation units are combined.
60 - compile
62 This passes control to the back end. Optimizations are performed and the
63 final assembler code is generated. This is done in the following way. Note
64 that with link time optimization the process is split into three
65 stages (compile time, link-time analysis and parallel link-time compilation as
66 indicated below).
68 Compile time:
70 1) Inter-procedural optimization.
71 (ipa_passes)
73 This part is further split into:
75 a) early optimizations. These are local passes executed in
76 the topological order on the callgraph.
78 The purpose of early optimizations is to optimize away simple
79 things that may otherwise confuse IP analysis. Very simple
80 propagation across the callgraph is done, e.g. to discover
81 functions without side effects, and simple inlining is performed.
83 b) early small interprocedural passes.
85 Those are interprocedural passes executed only at compilation
86 time. These include, for example, transactional memory lowering,
87 unreachable code removal and other simple transformations.
89 c) IP analysis stage. All interprocedural passes do their
90 analysis.
92 Interprocedural passes differ from small interprocedural
93 passes by their ability to operate across the whole program
94 at link time. Their analysis stage is performed early to
95 both reduce linking times and link-time memory usage by
96 not having to represent the whole program in memory.
98 d) LTO streaming. When doing LTO, everything important gets
99 streamed into the object file.
101 Compile time and/or link-time analysis stage (WPA):
103 At link time the units get streamed back and the symbol table is
104 merged. Function bodies are not streamed in and are not
105 available.
106 e) IP propagation stage. All IP passes execute their
107 IP propagation. This is done based on the earlier analysis
108 without having function bodies at hand.
109 f) Ltrans streaming. When doing WHOPR LTO, the program
110 is partitioned and streamed into multiple object files.
112 Compile time and/or parallel link-time stage (ltrans)
114 Each of the object files is streamed back and compiled
115 separately. Now the function bodies become available
116 again.
118 2) Virtual clone materialization
119 (cgraph_materialize_clone)
121 IP passes can produce copies of existing functions (such
122 as versioned clones or inline clones) without actually
123 manipulating their bodies by creating virtual clones in
124 the callgraph. At this time the virtual clones are
125 turned into real functions.
126 3) IP transformation
128 All IP passes transform function bodies based on earlier
129 decisions of the IP propagation.
131 4) late small IP passes
133 Simple IP passes working within a single program partition.
135 5) Expansion
136 (expand_all_functions)
138 At this stage functions that need to be output into
139 assembler are identified and compiled in topological order.
140 6) Output of variables and aliases
141 Now it is known which variable references were not optimized
142 out and thus all variables are output to the file.
144 Note that with -fno-toplevel-reorder passes 5 and 6
145 are combined together in cgraph_output_in_order.
147 Finally there are functions to manipulate the callgraph from
148 backend.
149 - cgraph_add_new_function is used to add backend-produced
150 functions introduced after the unit is finalized.
151 The functions are enqueued for later processing and inserted
152 into the callgraph with cgraph_process_new_functions.
154 - cgraph_function_versioning
156 produces a copy of a function into a new one (a version)
157 and applies simple transformations. */
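/* Illustrative sketch (not part of the original source): a minimal example,
   assuming a hypothetical front end, of how the interface described above
   is meant to be driven.  FN_DECL and VAR_DECL are hypothetical, fully
   parsed declarations; the entry points called are the real ones documented
   above and defined in this file.  */
#if 0
static void
hypothetical_frontend_finish_unit (tree fn_decl, tree var_decl)
{
  /* The body of FN_DECL is complete and will not change any more.  */
  cgraph_node::finalize_function (fn_decl, /*no_collect=*/false);

  /* Same for a static variable.  */
  varpool_node::finalize_decl (var_decl);

  /* Called once per (source level) compilation unit; this constructs the
     symbol table and eventually hands control to the back end.  */
  symtab->finalize_compilation_unit ();
}
#endif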
160 #include "config.h"
161 #include "system.h"
162 #include "coretypes.h"
163 #include "backend.h"
164 #include "target.h"
165 #include "rtl.h"
166 #include "tree.h"
167 #include "gimple.h"
168 #include "cfghooks.h"
169 #include "regset.h" /* FIXME: For reg_obstack. */
170 #include "alloc-pool.h"
171 #include "tree-pass.h"
172 #include "stringpool.h"
173 #include "gimple-ssa.h"
174 #include "cgraph.h"
175 #include "coverage.h"
176 #include "lto-streamer.h"
177 #include "fold-const.h"
178 #include "varasm.h"
179 #include "stor-layout.h"
180 #include "output.h"
181 #include "cfgcleanup.h"
182 #include "gimple-fold.h"
183 #include "gimplify.h"
184 #include "gimple-iterator.h"
185 #include "gimplify-me.h"
186 #include "tree-cfg.h"
187 #include "tree-into-ssa.h"
188 #include "tree-ssa.h"
189 #include "langhooks.h"
190 #include "toplev.h"
191 #include "debug.h"
192 #include "symbol-summary.h"
193 #include "tree-vrp.h"
194 #include "ipa-prop.h"
195 #include "gimple-pretty-print.h"
196 #include "plugin.h"
197 #include "ipa-fnsummary.h"
198 #include "ipa-utils.h"
199 #include "except.h"
200 #include "cfgloop.h"
201 #include "context.h"
202 #include "pass_manager.h"
203 #include "tree-nested.h"
204 #include "dbgcnt.h"
205 #include "tree-chkp.h"
206 #include "lto-section-names.h"
208 /* Queue of cgraph nodes scheduled to be added into cgraph. This is a
209 secondary queue used during optimization to accommodate passes that
210 may generate new functions that need to be optimized and expanded. */
211 vec<cgraph_node *> cgraph_new_nodes;
213 static void expand_all_functions (void);
214 static void mark_functions_to_output (void);
215 static void handle_alias_pairs (void);
217 /* Used for vtable lookup in thunk adjusting. */
218 static GTY (()) tree vtable_entry_type;
220 /* Return true if this symbol is a function from the C frontend specified
221 directly in RTL form (with "__RTL"). */
223 bool
224 symtab_node::native_rtl_p () const
226 if (TREE_CODE (decl) != FUNCTION_DECL)
227 return false;
228 if (!DECL_STRUCT_FUNCTION (decl))
229 return false;
230 return DECL_STRUCT_FUNCTION (decl)->curr_properties & PROP_rtl;
233 /* Determine if the symbol declaration is needed. That is, visible to
234 something either outside this translation unit or to something magic
235 in the system configury. */
236 bool
237 symtab_node::needed_p (void)
239 /* Double check that no one output the function into assembly file
240 early. */
241 if (!native_rtl_p ())
242 gcc_checking_assert
243 (!DECL_ASSEMBLER_NAME_SET_P (decl)
244 || !TREE_SYMBOL_REFERENCED (DECL_ASSEMBLER_NAME (decl)));
246 if (!definition)
247 return false;
249 if (DECL_EXTERNAL (decl))
250 return false;
252 /* If the user told us it is used, then it must be so. */
253 if (force_output)
254 return true;
256 /* ABI forced symbols are needed when they are external. */
257 if (forced_by_abi && TREE_PUBLIC (decl))
258 return true;
260 /* Keep constructors, destructors and virtual functions. */
261 if (TREE_CODE (decl) == FUNCTION_DECL
262 && (DECL_STATIC_CONSTRUCTOR (decl) || DECL_STATIC_DESTRUCTOR (decl)))
263 return true;
265 /* Externally visible variables must be output. The exception is
266 COMDAT variables that must be output only when they are needed. */
267 if (TREE_PUBLIC (decl) && !DECL_COMDAT (decl))
268 return true;
270 return false;
273 /* Head and terminator of the queue of nodes to be processed while building
274 callgraph. */
276 static symtab_node symtab_terminator;
277 static symtab_node *queued_nodes = &symtab_terminator;
279 /* Add NODE to the queue starting at QUEUED_NODES. The queue is linked
280 via AUX pointers and terminated by a pointer to SYMTAB_TERMINATOR. */
282 static void
283 enqueue_node (symtab_node *node)
285 if (node->aux)
286 return;
287 gcc_checking_assert (queued_nodes);
288 node->aux = queued_nodes;
289 queued_nodes = node;
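/* Illustrative sketch (not part of the original source), assuming the
   QUEUED_NODES and SYMTAB_TERMINATOR variables above: roughly how the
   AUX-linked queue built by enqueue_node is drained, mirroring the loop
   in analyze_functions below.  */
#if 0
static void
drain_queued_nodes (void)
{
  while (queued_nodes != &symtab_terminator)
    {
      symtab_node *node = queued_nodes;
      queued_nodes = (symtab_node *) node->aux;
      /* ... analyze NODE and enqueue everything it references; NODE->aux
	 is left set so the node is never enqueued twice and is cleared in
	 a separate pass afterwards ...  */
    }
}
#endif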
292 /* Process CGRAPH_NEW_FUNCTIONS and perform the actions necessary to add
293 these functions into the callgraph so that they look like ordinary
294 reachable functions inserted into the callgraph already at construction time. */
296 void
297 symbol_table::process_new_functions (void)
299 tree fndecl;
301 if (!cgraph_new_nodes.exists ())
302 return;
304 handle_alias_pairs ();
305 /* Note that this queue may grow as it is being processed, as the new
306 functions may generate new ones. */
307 for (unsigned i = 0; i < cgraph_new_nodes.length (); i++)
309 cgraph_node *node = cgraph_new_nodes[i];
310 fndecl = node->decl;
311 switch (state)
313 case CONSTRUCTION:
314 /* At construction time we just need to finalize function and move
315 it into reachable functions list. */
317 cgraph_node::finalize_function (fndecl, false);
318 call_cgraph_insertion_hooks (node);
319 enqueue_node (node);
320 break;
322 case IPA:
323 case IPA_SSA:
324 case IPA_SSA_AFTER_INLINING:
325 /* When IPA optimization has already started, do all essential
326 transformations that have already been performed on the whole
327 cgraph but not on this function. */
329 gimple_register_cfg_hooks ();
330 if (!node->analyzed)
331 node->analyze ();
332 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
333 if ((state == IPA_SSA || state == IPA_SSA_AFTER_INLINING)
334 && !gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
336 bool summaried_computed = ipa_fn_summaries != NULL;
337 g->get_passes ()->execute_early_local_passes ();
338 /* Early passes compute inline parameters to do inlining
339 and splitting. This is redundant for functions added late.
340 Just throw away whatever it did. */
341 if (!summaried_computed)
342 ipa_free_fn_summary ();
344 else if (ipa_fn_summaries != NULL)
345 compute_fn_summary (node, true);
346 free_dominance_info (CDI_POST_DOMINATORS);
347 free_dominance_info (CDI_DOMINATORS);
348 pop_cfun ();
349 call_cgraph_insertion_hooks (node);
350 break;
352 case EXPANSION:
353 /* Functions created during expansion shall be compiled
354 directly. */
355 node->process = 0;
356 call_cgraph_insertion_hooks (node);
357 node->expand ();
358 break;
360 default:
361 gcc_unreachable ();
362 break;
366 cgraph_new_nodes.release ();
369 /* As a GCC extension we allow redefinition of the function. The
370 semantics when both copies of the body differ are not well defined.
371 We replace the old body with the new body so in unit-at-a-time mode
372 we always use the new body, while in normal mode we may end up with
373 the old body inlined into some functions and the new body expanded and
374 inlined in others.
376 ??? It may make more sense to use one body for inlining and the other
377 body for expanding the function but this is difficult to do. */
379 void
380 cgraph_node::reset (void)
382 /* If process is set, then we have already begun whole-unit analysis.
383 This is *not* testing for whether we've already emitted the function.
384 That case can be sort-of legitimately seen with real function redefinition
385 errors. I would argue that the front end should never present us with
386 such a case, but don't enforce that for now. */
387 gcc_assert (!process);
389 /* Reset our data structures so we can analyze the function again. */
390 memset (&local, 0, sizeof (local));
391 memset (&global, 0, sizeof (global));
392 memset (&rtl, 0, sizeof (rtl));
393 analyzed = false;
394 definition = false;
395 alias = false;
396 transparent_alias = false;
397 weakref = false;
398 cpp_implicit_alias = false;
400 remove_callees ();
401 remove_all_references ();
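/* Illustrative sketch (not part of the original source) of the GCC extern
   inline redefinition extension that makes cgraph_node::reset necessary;
   the function name "twice" is hypothetical:

     extern inline int twice (int x) { return x + x; }   (inline-only body)
     int twice (int x) { return 2 * x; }                  (redefinition)

   When the second definition is finalized, the node built for the first
   body is reset and the new body replaces it, as described in the comment
   above reset ().  */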
404 /* Return true when there are references to the node. INCLUDE_SELF is
405 true if a self reference counts as a reference. */
407 bool
408 symtab_node::referred_to_p (bool include_self)
410 ipa_ref *ref = NULL;
412 /* See if there are any references at all. */
413 if (iterate_referring (0, ref))
414 return true;
415 /* For functions check also calls. */
416 cgraph_node *cn = dyn_cast <cgraph_node *> (this);
417 if (cn && cn->callers)
419 if (include_self)
420 return true;
421 for (cgraph_edge *e = cn->callers; e; e = e->next_caller)
422 if (e->caller != this)
423 return true;
425 return false;
428 /* DECL has been parsed. Take it, queue it, compile it at the whim of the
429 logic in effect. If NO_COLLECT is true, then our caller cannot stand to have
430 the garbage collector run at the moment. We would need to either create
431 a new GC context, or just not compile right now. */
433 void
434 cgraph_node::finalize_function (tree decl, bool no_collect)
436 cgraph_node *node = cgraph_node::get_create (decl);
438 if (node->definition)
440 /* Nested functions should only be defined once. */
441 gcc_assert (!DECL_CONTEXT (decl)
442 || TREE_CODE (DECL_CONTEXT (decl)) != FUNCTION_DECL);
443 node->reset ();
444 node->local.redefined_extern_inline = true;
447 /* Set definition first before calling notice_global_symbol so that
448 it is available to notice_global_symbol. */
449 node->definition = true;
450 notice_global_symbol (decl);
451 node->lowered = DECL_STRUCT_FUNCTION (decl)->cfg != NULL;
453 /* With -fkeep-inline-functions we are keeping all inline functions except
454 for extern inline ones. */
455 if (flag_keep_inline_functions
456 && DECL_DECLARED_INLINE_P (decl)
457 && !DECL_EXTERNAL (decl)
458 && !DECL_DISREGARD_INLINE_LIMITS (decl))
459 node->force_output = 1;
461 /* __RTL functions were already output as soon as they were parsed (due
462 to the large amount of global state in the backend).
463 Mark such functions as "force_output" to reflect the fact that they
464 will be in the asm file when considering the symbols they reference.
465 The attempt to output them later on will bail out immediately. */
466 if (node->native_rtl_p ())
467 node->force_output = 1;
469 /* When not optimizing, also output the static functions (see
470 PR24561), but don't do so for always_inline functions, functions
471 declared inline and nested functions. These were optimized out
472 in the original implementation and it is unclear whether we want
473 to change the behavior here. */
474 if (((!opt_for_fn (decl, optimize) || flag_keep_static_functions)
475 && !node->cpp_implicit_alias
476 && !DECL_DISREGARD_INLINE_LIMITS (decl)
477 && !DECL_DECLARED_INLINE_P (decl)
478 && !(DECL_CONTEXT (decl)
479 && TREE_CODE (DECL_CONTEXT (decl)) == FUNCTION_DECL))
480 && !DECL_COMDAT (decl) && !DECL_EXTERNAL (decl))
481 node->force_output = 1;
483 /* If we've not yet emitted decl, tell the debug info about it. */
484 if (!TREE_ASM_WRITTEN (decl))
485 (*debug_hooks->deferred_inline_function) (decl);
487 if (!no_collect)
488 ggc_collect ();
490 if (symtab->state == CONSTRUCTION
491 && (node->needed_p () || node->referred_to_p ()))
492 enqueue_node (node);
495 /* Add the function FNDECL to the call graph.
496 Unlike finalize_function, this function is intended to be used
497 by the middle end and allows insertion of a new function at an arbitrary
498 point of compilation. The function can be either in high, low or SSA form
499 GIMPLE.
501 The function is assumed to be reachable and to have its address taken (so no
502 API-breaking optimizations are performed on it).
504 The main work done by this function is to enqueue the function for later
505 processing to avoid the need for the passes to be re-entrant. */
507 void
508 cgraph_node::add_new_function (tree fndecl, bool lowered)
510 gcc::pass_manager *passes = g->get_passes ();
511 cgraph_node *node;
513 if (dump_file)
515 struct function *fn = DECL_STRUCT_FUNCTION (fndecl);
516 const char *function_type = ((gimple_has_body_p (fndecl))
517 ? (lowered
518 ? (gimple_in_ssa_p (fn)
519 ? "ssa gimple"
520 : "low gimple")
521 : "high gimple")
522 : "to-be-gimplified");
523 fprintf (dump_file,
524 "Added new %s function %s to callgraph\n",
525 function_type,
526 fndecl_name (fndecl));
529 switch (symtab->state)
531 case PARSING:
532 cgraph_node::finalize_function (fndecl, false);
533 break;
534 case CONSTRUCTION:
535 /* Just enqueue function to be processed at nearest occurrence. */
536 node = cgraph_node::get_create (fndecl);
537 if (lowered)
538 node->lowered = true;
539 cgraph_new_nodes.safe_push (node);
540 break;
542 case IPA:
543 case IPA_SSA:
544 case IPA_SSA_AFTER_INLINING:
545 case EXPANSION:
546 /* Bring the function into finalized state and enqueue for later
547 analyzing and compilation. */
548 node = cgraph_node::get_create (fndecl);
549 node->local.local = false;
550 node->definition = true;
551 node->force_output = true;
552 if (TREE_PUBLIC (fndecl))
553 node->externally_visible = true;
554 if (!lowered && symtab->state == EXPANSION)
556 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
557 gimple_register_cfg_hooks ();
558 bitmap_obstack_initialize (NULL);
559 execute_pass_list (cfun, passes->all_lowering_passes);
560 passes->execute_early_local_passes ();
561 bitmap_obstack_release (NULL);
562 pop_cfun ();
564 lowered = true;
566 if (lowered)
567 node->lowered = true;
568 cgraph_new_nodes.safe_push (node);
569 break;
571 case FINISHED:
572 /* At the very end of compilation we have to do all the work up
573 to expansion. */
574 node = cgraph_node::create (fndecl);
575 if (lowered)
576 node->lowered = true;
577 node->definition = true;
578 node->analyze ();
579 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
580 gimple_register_cfg_hooks ();
581 bitmap_obstack_initialize (NULL);
582 if (!gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
583 g->get_passes ()->execute_early_local_passes ();
584 bitmap_obstack_release (NULL);
585 pop_cfun ();
586 node->expand ();
587 break;
589 default:
590 gcc_unreachable ();
593 /* Set a personality if required and we already passed EH lowering. */
594 if (lowered
595 && (function_needs_eh_personality (DECL_STRUCT_FUNCTION (fndecl))
596 == eh_personality_lang))
597 DECL_FUNCTION_PERSONALITY (fndecl) = lang_hooks.eh_personality ();
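/* Illustrative sketch (not part of the original source), assuming a
   hypothetical middle-end pass that has just synthesized NEW_FNDECL with a
   lowered GIMPLE body: roughly how such a pass hands the function to the
   callgraph using the entry point defined above.  */
#if 0
static void
register_synthesized_function (tree new_fndecl)
{
  /* LOWERED is true because the body is already in lowered GIMPLE; the
     node is queued and later picked up by
     symbol_table::process_new_functions.  */
  cgraph_node::add_new_function (new_fndecl, /*lowered=*/true);
}
#endif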
600 /* Analyze the function scheduled to be output. */
601 void
602 cgraph_node::analyze (void)
604 if (native_rtl_p ())
606 analyzed = true;
607 return;
610 tree decl = this->decl;
611 location_t saved_loc = input_location;
612 input_location = DECL_SOURCE_LOCATION (decl);
614 if (thunk.thunk_p)
616 cgraph_node *t = cgraph_node::get (thunk.alias);
618 create_edge (t, NULL, 0, CGRAPH_FREQ_BASE);
619 callees->can_throw_external = !TREE_NOTHROW (t->decl);
620 /* Target code in expand_thunk may need the thunk's target
621 to be analyzed, so recurse here. */
622 if (!t->analyzed)
623 t->analyze ();
624 if (t->alias)
626 t = t->get_alias_target ();
627 if (!t->analyzed)
628 t->analyze ();
630 if (!expand_thunk (false, false))
632 thunk.alias = NULL;
633 return;
635 thunk.alias = NULL;
637 if (alias)
638 resolve_alias (cgraph_node::get (alias_target), transparent_alias);
639 else if (dispatcher_function)
641 /* Generate the dispatcher body of multi-versioned functions. */
642 cgraph_function_version_info *dispatcher_version_info
643 = function_version ();
644 if (dispatcher_version_info != NULL
645 && (dispatcher_version_info->dispatcher_resolver
646 == NULL_TREE))
648 tree resolver = NULL_TREE;
649 gcc_assert (targetm.generate_version_dispatcher_body);
650 resolver = targetm.generate_version_dispatcher_body (this);
651 gcc_assert (resolver != NULL_TREE);
654 else
656 push_cfun (DECL_STRUCT_FUNCTION (decl));
658 assign_assembler_name_if_needed (decl);
660 /* Make sure to gimplify bodies only once. While analyzing a
661 function we lower it, which will require gimplified nested
662 functions, so we can end up here with an already gimplified
663 body. */
664 if (!gimple_has_body_p (decl))
665 gimplify_function_tree (decl);
667 /* Lower the function. */
668 if (!lowered)
670 if (nested)
671 lower_nested_functions (decl);
672 gcc_assert (!nested);
674 gimple_register_cfg_hooks ();
675 bitmap_obstack_initialize (NULL);
676 execute_pass_list (cfun, g->get_passes ()->all_lowering_passes);
677 free_dominance_info (CDI_POST_DOMINATORS);
678 free_dominance_info (CDI_DOMINATORS);
679 compact_blocks ();
680 bitmap_obstack_release (NULL);
681 lowered = true;
684 pop_cfun ();
686 analyzed = true;
688 input_location = saved_loc;
691 /* The C++ frontend produces same-body aliases all over the place, even before PCH
692 gets streamed out. It relies on us linking the aliases with their function
693 in order to do the fixups, but ipa-ref is not PCH safe. Consequently we
694 first produce aliases without links, but once the C++ FE is sure it won't stream
695 a PCH we build the links via this function. */
697 void
698 symbol_table::process_same_body_aliases (void)
700 symtab_node *node;
701 FOR_EACH_SYMBOL (node)
702 if (node->cpp_implicit_alias && !node->analyzed)
703 node->resolve_alias
704 (VAR_P (node->alias_target)
705 ? (symtab_node *)varpool_node::get_create (node->alias_target)
706 : (symtab_node *)cgraph_node::get_create (node->alias_target));
707 cpp_implicit_aliases_done = true;
710 /* Process attributes common for vars and functions. */
712 static void
713 process_common_attributes (symtab_node *node, tree decl)
715 tree weakref = lookup_attribute ("weakref", DECL_ATTRIBUTES (decl));
717 if (weakref && !lookup_attribute ("alias", DECL_ATTRIBUTES (decl)))
719 warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
720 "%<weakref%> attribute should be accompanied with"
721 " an %<alias%> attribute");
722 DECL_WEAK (decl) = 0;
723 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
724 DECL_ATTRIBUTES (decl));
727 if (lookup_attribute ("no_reorder", DECL_ATTRIBUTES (decl)))
728 node->no_reorder = 1;
731 /* Look for externally_visible and used attributes and mark cgraph nodes
732 accordingly.
734 We cannot mark the nodes at the point the attributes are processed (in
735 handle_*_attribute) because the copy of the declarations available at that
736 point may not be canonical. For example, in:
738 void f();
739 void f() __attribute__((used));
741 the declaration we see in handle_used_attribute will be the second
742 declaration -- but the front end will subsequently merge that declaration
743 with the original declaration and discard the second declaration.
745 Furthermore, we can't mark these nodes in finalize_function because:
747 void f() {}
748 void f() __attribute__((externally_visible));
750 is valid.
752 So, we walk the nodes at the end of the translation unit, applying the
753 attributes at that point. */
755 static void
756 process_function_and_variable_attributes (cgraph_node *first,
757 varpool_node *first_var)
759 cgraph_node *node;
760 varpool_node *vnode;
762 for (node = symtab->first_function (); node != first;
763 node = symtab->next_function (node))
765 tree decl = node->decl;
766 if (DECL_PRESERVE_P (decl))
767 node->mark_force_output ();
768 else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
770 if (! TREE_PUBLIC (node->decl))
771 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
772 "%<externally_visible%>"
773 " attribute have effect only on public objects");
775 if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
776 && (node->definition && !node->alias))
778 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
779 "%<weakref%> attribute ignored"
780 " because function is defined");
781 DECL_WEAK (decl) = 0;
782 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
783 DECL_ATTRIBUTES (decl));
786 if (lookup_attribute ("always_inline", DECL_ATTRIBUTES (decl))
787 && !DECL_DECLARED_INLINE_P (decl)
788 /* redefining extern inline function makes it DECL_UNINLINABLE. */
789 && !DECL_UNINLINABLE (decl))
790 warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
791 "always_inline function might not be inlinable");
793 process_common_attributes (node, decl);
795 for (vnode = symtab->first_variable (); vnode != first_var;
796 vnode = symtab->next_variable (vnode))
798 tree decl = vnode->decl;
799 if (DECL_EXTERNAL (decl)
800 && DECL_INITIAL (decl))
801 varpool_node::finalize_decl (decl);
802 if (DECL_PRESERVE_P (decl))
803 vnode->force_output = true;
804 else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
806 if (! TREE_PUBLIC (vnode->decl))
807 warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
808 "%<externally_visible%>"
809 " attribute have effect only on public objects");
811 if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
812 && vnode->definition
813 && DECL_INITIAL (decl))
815 warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
816 "%<weakref%> attribute ignored"
817 " because variable is initialized");
818 DECL_WEAK (decl) = 0;
819 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
820 DECL_ATTRIBUTES (decl));
822 process_common_attributes (vnode, decl);
826 /* Mark DECL as finalized. By finalizing the declaration, the frontend instructs the
827 middle end to output the variable to the asm file, if it is needed or externally
828 visible. */
830 void
831 varpool_node::finalize_decl (tree decl)
833 varpool_node *node = varpool_node::get_create (decl);
835 gcc_assert (TREE_STATIC (decl) || DECL_EXTERNAL (decl));
837 if (node->definition)
838 return;
839 /* Set definition first before calling notice_global_symbol so that
840 it is available to notice_global_symbol. */
841 node->definition = true;
842 notice_global_symbol (decl);
843 if (TREE_THIS_VOLATILE (decl) || DECL_PRESERVE_P (decl)
844 /* Traditionally we do not eliminate static variables when not
845 optimizing and when not doing toplevel reorder. */
846 || node->no_reorder
847 || ((!flag_toplevel_reorder
848 && !DECL_COMDAT (node->decl)
849 && !DECL_ARTIFICIAL (node->decl))))
850 node->force_output = true;
852 if (symtab->state == CONSTRUCTION
853 && (node->needed_p () || node->referred_to_p ()))
854 enqueue_node (node);
855 if (symtab->state >= IPA_SSA)
856 node->analyze ();
857 /* Some frontends produce various interface variables after compilation
858 has finished. */
859 if (symtab->state == FINISHED
860 || (!flag_toplevel_reorder
861 && symtab->state == EXPANSION))
862 node->assemble_decl ();
864 if (DECL_INITIAL (decl))
865 chkp_register_var_initializer (decl);
868 /* EDGE is a polymorphic call. Mark all possible targets as reachable
869 and, if there is only one target, perform trivial devirtualization.
870 REACHABLE_CALL_TARGETS collects target lists we already walked to
871 avoid duplicate work. */
873 static void
874 walk_polymorphic_call_targets (hash_set<void *> *reachable_call_targets,
875 cgraph_edge *edge)
877 unsigned int i;
878 void *cache_token;
879 bool final;
880 vec <cgraph_node *>targets
881 = possible_polymorphic_call_targets
882 (edge, &final, &cache_token);
884 if (!reachable_call_targets->add (cache_token))
886 if (symtab->dump_file)
887 dump_possible_polymorphic_call_targets
888 (symtab->dump_file, edge);
890 for (i = 0; i < targets.length (); i++)
892 /* Do not bother to mark virtual methods in an anonymous namespace;
893 either we will find a use of the virtual table defining it, or it is
894 unused. */
895 if (targets[i]->definition
896 && TREE_CODE
897 (TREE_TYPE (targets[i]->decl))
898 == METHOD_TYPE
899 && !type_in_anonymous_namespace_p
900 (TYPE_METHOD_BASETYPE (TREE_TYPE (targets[i]->decl))))
901 enqueue_node (targets[i]);
905 /* Very trivial devirtualization; when the type is
906 final or anonymous (so we know all of its derived types)
907 and there is only one possible virtual call target,
908 make the edge direct. */
909 if (final)
911 if (targets.length () <= 1 && dbg_cnt (devirt))
913 cgraph_node *target;
914 if (targets.length () == 1)
915 target = targets[0];
916 else
917 target = cgraph_node::create
918 (builtin_decl_implicit (BUILT_IN_UNREACHABLE));
920 if (symtab->dump_file)
922 fprintf (symtab->dump_file,
923 "Devirtualizing call: ");
924 print_gimple_stmt (symtab->dump_file,
925 edge->call_stmt, 0,
926 TDF_SLIM);
928 if (dump_enabled_p ())
930 location_t locus = gimple_location_safe (edge->call_stmt);
931 dump_printf_loc (MSG_OPTIMIZED_LOCATIONS, locus,
932 "devirtualizing call in %s to %s\n",
933 edge->caller->name (), target->name ());
936 edge->make_direct (target);
937 edge->redirect_call_stmt_to_callee ();
939 /* Call to __builtin_unreachable shouldn't be instrumented. */
940 if (!targets.length ())
941 gimple_call_set_with_bounds (edge->call_stmt, false);
943 if (symtab->dump_file)
945 fprintf (symtab->dump_file,
946 "Devirtualized as: ");
947 print_gimple_stmt (symtab->dump_file,
948 edge->call_stmt, 0,
949 TDF_SLIM);
955 /* Issue appropriate warnings for the global declaration DECL. */
957 static void
958 check_global_declaration (symtab_node *snode)
960 const char *decl_file;
961 tree decl = snode->decl;
963 /* Warn about any function declared static but not defined. We don't
964 warn about variables, because many programs have static variables
965 that exist only to get some text into the object file. */
966 if (TREE_CODE (decl) == FUNCTION_DECL
967 && DECL_INITIAL (decl) == 0
968 && DECL_EXTERNAL (decl)
969 && ! DECL_ARTIFICIAL (decl)
970 && ! TREE_NO_WARNING (decl)
971 && ! TREE_PUBLIC (decl)
972 && (warn_unused_function
973 || snode->referred_to_p (/*include_self=*/false)))
975 if (snode->referred_to_p (/*include_self=*/false))
976 pedwarn (input_location, 0, "%q+F used but never defined", decl);
977 else
978 warning (OPT_Wunused_function, "%q+F declared %<static%> but never defined", decl);
979 /* This symbol is effectively an "extern" declaration now. */
980 TREE_PUBLIC (decl) = 1;
983 /* Warn about static fns or vars defined but not used. */
984 if (((warn_unused_function && TREE_CODE (decl) == FUNCTION_DECL)
985 || (((warn_unused_variable && ! TREE_READONLY (decl))
986 || (warn_unused_const_variable > 0 && TREE_READONLY (decl)
987 && (warn_unused_const_variable == 2
988 || (main_input_filename != NULL
989 && (decl_file = DECL_SOURCE_FILE (decl)) != NULL
990 && filename_cmp (main_input_filename,
991 decl_file) == 0))))
992 && VAR_P (decl)))
993 && ! DECL_IN_SYSTEM_HEADER (decl)
994 && ! snode->referred_to_p (/*include_self=*/false)
995 /* This TREE_USED check is needed in addition to referred_to_p
996 above, because the `__unused__' attribute is not being
997 considered for referred_to_p. */
998 && ! TREE_USED (decl)
999 /* The TREE_USED bit for file-scope decls is kept in the identifier,
1000 to handle multiple external decls in different scopes. */
1001 && ! (DECL_NAME (decl) && TREE_USED (DECL_NAME (decl)))
1002 && ! DECL_EXTERNAL (decl)
1003 && ! DECL_ARTIFICIAL (decl)
1004 && ! DECL_ABSTRACT_ORIGIN (decl)
1005 && ! TREE_PUBLIC (decl)
1006 /* A volatile variable might be used in some non-obvious way. */
1007 && (! VAR_P (decl) || ! TREE_THIS_VOLATILE (decl))
1008 /* Global register variables must be declared to reserve them. */
1009 && ! (VAR_P (decl) && DECL_REGISTER (decl))
1010 /* Global ctors and dtors are called by the runtime. */
1011 && (TREE_CODE (decl) != FUNCTION_DECL
1012 || (!DECL_STATIC_CONSTRUCTOR (decl)
1013 && !DECL_STATIC_DESTRUCTOR (decl)))
1014 /* Otherwise, ask the language. */
1015 && lang_hooks.decls.warn_unused_global (decl))
1016 warning_at (DECL_SOURCE_LOCATION (decl),
1017 (TREE_CODE (decl) == FUNCTION_DECL)
1018 ? OPT_Wunused_function
1019 : (TREE_READONLY (decl)
1020 ? OPT_Wunused_const_variable_
1021 : OPT_Wunused_variable),
1022 "%qD defined but not used", decl);
1025 /* Discover all functions and variables that are trivially needed; analyze
1026 them as well as all functions and variables referred to by them. */
1027 static cgraph_node *first_analyzed;
1028 static varpool_node *first_analyzed_var;
1030 /* FIRST_TIME is set to TRUE the first time we are called for a
1031 translation unit from finalize_compilation_unit (), and to FALSE
1032 otherwise. */
1034 static void
1035 analyze_functions (bool first_time)
1037 /* Keep track of already processed nodes when called multiple times for
1038 intermodule optimization. */
1039 cgraph_node *first_handled = first_analyzed;
1040 varpool_node *first_handled_var = first_analyzed_var;
1041 hash_set<void *> reachable_call_targets;
1043 symtab_node *node;
1044 symtab_node *next;
1045 int i;
1046 ipa_ref *ref;
1047 bool changed = true;
1048 location_t saved_loc = input_location;
1050 bitmap_obstack_initialize (NULL);
1051 symtab->state = CONSTRUCTION;
1052 input_location = UNKNOWN_LOCATION;
1054 /* Ugly, but the fixup cannot happen at the time the same-body alias is created;
1055 the C++ FE is confused about the COMDAT groups being right. */
1056 if (symtab->cpp_implicit_aliases_done)
1057 FOR_EACH_SYMBOL (node)
1058 if (node->cpp_implicit_alias)
1059 node->fixup_same_cpp_alias_visibility (node->get_alias_target ());
1060 build_type_inheritance_graph ();
1062 /* Analysis adds static variables that in turn add references to new functions.
1063 So we need to iterate the process until it stabilizes. */
1064 while (changed)
1066 changed = false;
1067 process_function_and_variable_attributes (first_analyzed,
1068 first_analyzed_var);
1070 /* First identify the trivially needed symbols. */
1071 for (node = symtab->first_symbol ();
1072 node != first_analyzed
1073 && node != first_analyzed_var; node = node->next)
1075 /* Convert COMDAT group designators to IDENTIFIER_NODEs. */
1076 node->get_comdat_group_id ();
1077 if (node->needed_p ())
1079 enqueue_node (node);
1080 if (!changed && symtab->dump_file)
1081 fprintf (symtab->dump_file, "Trivially needed symbols:");
1082 changed = true;
1083 if (symtab->dump_file)
1084 fprintf (symtab->dump_file, " %s", node->asm_name ());
1085 if (!changed && symtab->dump_file)
1086 fprintf (symtab->dump_file, "\n");
1088 if (node == first_analyzed
1089 || node == first_analyzed_var)
1090 break;
1092 symtab->process_new_functions ();
1093 first_analyzed_var = symtab->first_variable ();
1094 first_analyzed = symtab->first_function ();
1096 if (changed && symtab->dump_file)
1097 fprintf (symtab->dump_file, "\n");
1099 /* Lower the representation, build callgraph edges and references for all trivially
1100 needed symbols and all symbols referred to by them. */
1101 while (queued_nodes != &symtab_terminator)
1103 changed = true;
1104 node = queued_nodes;
1105 queued_nodes = (symtab_node *)queued_nodes->aux;
1106 cgraph_node *cnode = dyn_cast <cgraph_node *> (node);
1107 if (cnode && cnode->definition)
1109 cgraph_edge *edge;
1110 tree decl = cnode->decl;
1112 /* ??? It is possible to create an extern inline function
1113 and later use the weak alias attribute to kill its body.
1114 See gcc.c-torture/compile/20011119-1.c */
1115 if (!DECL_STRUCT_FUNCTION (decl)
1116 && !cnode->alias
1117 && !cnode->thunk.thunk_p
1118 && !cnode->dispatcher_function)
1120 cnode->reset ();
1121 cnode->local.redefined_extern_inline = true;
1122 continue;
1125 if (!cnode->analyzed)
1126 cnode->analyze ();
1128 for (edge = cnode->callees; edge; edge = edge->next_callee)
1129 if (edge->callee->definition
1130 && (!DECL_EXTERNAL (edge->callee->decl)
1131 /* When not optimizing, do not try to analyze extern
1132 inline functions. Doing so is pointless. */
1133 || opt_for_fn (edge->callee->decl, optimize)
1134 /* Weakrefs need to be preserved. */
1135 || edge->callee->alias
1136 /* always_inline functions are inlined even at -O0. */
1137 || lookup_attribute
1138 ("always_inline",
1139 DECL_ATTRIBUTES (edge->callee->decl))
1140 /* Multiversioned functions need the dispatcher to
1141 be produced locally even for extern functions. */
1142 || edge->callee->function_version ()))
1143 enqueue_node (edge->callee);
1144 if (opt_for_fn (cnode->decl, optimize)
1145 && opt_for_fn (cnode->decl, flag_devirtualize))
1147 cgraph_edge *next;
1149 for (edge = cnode->indirect_calls; edge; edge = next)
1151 next = edge->next_callee;
1152 if (edge->indirect_info->polymorphic)
1153 walk_polymorphic_call_targets (&reachable_call_targets,
1154 edge);
1158 /* If decl is a clone of an abstract function,
1159 mark that abstract function so that we don't release its body.
1160 The DECL_INITIAL() of that abstract function declaration
1161 will later be needed to output debug info. */
1162 if (DECL_ABSTRACT_ORIGIN (decl))
1164 cgraph_node *origin_node
1165 = cgraph_node::get_create (DECL_ABSTRACT_ORIGIN (decl));
1166 origin_node->used_as_abstract_origin = true;
1168 /* Preserve a function's function context node. It will
1169 later be needed to output debug info. */
1170 if (tree fn = decl_function_context (decl))
1172 cgraph_node *origin_node = cgraph_node::get_create (fn);
1173 enqueue_node (origin_node);
1176 else
1178 varpool_node *vnode = dyn_cast <varpool_node *> (node);
1179 if (vnode && vnode->definition && !vnode->analyzed)
1180 vnode->analyze ();
1183 if (node->same_comdat_group)
1185 symtab_node *next;
1186 for (next = node->same_comdat_group;
1187 next != node;
1188 next = next->same_comdat_group)
1189 if (!next->comdat_local_p ())
1190 enqueue_node (next);
1192 for (i = 0; node->iterate_reference (i, ref); i++)
1193 if (ref->referred->definition
1194 && (!DECL_EXTERNAL (ref->referred->decl)
1195 || ((TREE_CODE (ref->referred->decl) != FUNCTION_DECL
1196 && optimize)
1197 || (TREE_CODE (ref->referred->decl) == FUNCTION_DECL
1198 && opt_for_fn (ref->referred->decl, optimize))
1199 || node->alias
1200 || ref->referred->alias)))
1201 enqueue_node (ref->referred);
1202 symtab->process_new_functions ();
1205 update_type_inheritance_graph ();
1207 /* Collect entry points to the unit. */
1208 if (symtab->dump_file)
1210 fprintf (symtab->dump_file, "\n\nInitial ");
1211 symtab->dump (symtab->dump_file);
1214 if (first_time)
1216 symtab_node *snode;
1217 FOR_EACH_SYMBOL (snode)
1218 check_global_declaration (snode);
1221 if (symtab->dump_file)
1222 fprintf (symtab->dump_file, "\nRemoving unused symbols:");
1224 for (node = symtab->first_symbol ();
1225 node != first_handled
1226 && node != first_handled_var; node = next)
1228 next = node->next;
1229 if (!node->aux && !node->referred_to_p ())
1231 if (symtab->dump_file)
1232 fprintf (symtab->dump_file, " %s", node->name ());
1234 /* See if the debugger can use anything before the DECL
1235 passes away. Perhaps it can notice a DECL that is now a
1236 constant and can tag the early DIE with an appropriate
1237 attribute.
1239 Otherwise, this is the last chance the debug_hooks have
1240 at looking at optimized away DECLs, since
1241 late_global_decl will subsequently be called from the
1242 contents of the now pruned symbol table. */
1243 if (VAR_P (node->decl)
1244 && !decl_function_context (node->decl))
1246 /* We are reclaiming totally unreachable code and variables
1247 so they effectively appear as readonly. Show that to
1248 the debug machinery. */
1249 TREE_READONLY (node->decl) = 1;
1250 node->definition = false;
1251 (*debug_hooks->late_global_decl) (node->decl);
1254 node->remove ();
1255 continue;
1257 if (cgraph_node *cnode = dyn_cast <cgraph_node *> (node))
1259 tree decl = node->decl;
1261 if (cnode->definition && !gimple_has_body_p (decl)
1262 && !cnode->alias
1263 && !cnode->thunk.thunk_p)
1264 cnode->reset ();
1266 gcc_assert (!cnode->definition || cnode->thunk.thunk_p
1267 || cnode->alias
1268 || gimple_has_body_p (decl)
1269 || cnode->native_rtl_p ());
1270 gcc_assert (cnode->analyzed == cnode->definition);
1272 node->aux = NULL;
1274 for (;node; node = node->next)
1275 node->aux = NULL;
1276 first_analyzed = symtab->first_function ();
1277 first_analyzed_var = symtab->first_variable ();
1278 if (symtab->dump_file)
1280 fprintf (symtab->dump_file, "\n\nReclaimed ");
1281 symtab->dump (symtab->dump_file);
1283 bitmap_obstack_release (NULL);
1284 ggc_collect ();
1285 /* Initialize the assembler name hash; in particular we want to trigger C++
1286 mangling and same-body alias creation before we free the DECL_ARGUMENTS
1287 used by it. */
1288 if (!seen_error ())
1289 symtab->symtab_initialize_asm_name_hash ();
1291 input_location = saved_loc;
1294 /* Translate the ugly representation of aliases as alias pairs into a nice
1295 representation in the callgraph. We don't handle all cases yet,
1296 unfortunately. */
1298 static void
1299 handle_alias_pairs (void)
1301 alias_pair *p;
1302 unsigned i;
1304 for (i = 0; alias_pairs && alias_pairs->iterate (i, &p);)
1306 symtab_node *target_node = symtab_node::get_for_asmname (p->target);
1308 /* Weakrefs with a target not defined in the current unit are easy to handle:
1309 they behave just like external variables except we need to note the
1310 alias flag to later output the weakref pseudo-op into the asm file. */
1311 if (!target_node
1312 && lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)) != NULL)
1314 symtab_node *node = symtab_node::get (p->decl);
1315 if (node)
1317 node->alias_target = p->target;
1318 node->weakref = true;
1319 node->alias = true;
1320 node->transparent_alias = true;
1322 alias_pairs->unordered_remove (i);
1323 continue;
1325 else if (!target_node)
1327 error ("%q+D aliased to undefined symbol %qE", p->decl, p->target);
1328 symtab_node *node = symtab_node::get (p->decl);
1329 if (node)
1330 node->alias = false;
1331 alias_pairs->unordered_remove (i);
1332 continue;
1335 if (DECL_EXTERNAL (target_node->decl)
1336 /* We use local aliases for C++ thunks to force the tailcall
1337 to bind locally. This is a hack - to keep it working do
1338 the following (which is not strictly correct). */
1339 && (TREE_CODE (target_node->decl) != FUNCTION_DECL
1340 || ! DECL_VIRTUAL_P (target_node->decl))
1341 && ! lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)))
1343 error ("%q+D aliased to external symbol %qE",
1344 p->decl, p->target);
1347 if (TREE_CODE (p->decl) == FUNCTION_DECL
1348 && target_node && is_a <cgraph_node *> (target_node))
1350 cgraph_node *src_node = cgraph_node::get (p->decl);
1351 if (src_node && src_node->definition)
1352 src_node->reset ();
1353 cgraph_node::create_alias (p->decl, target_node->decl);
1354 alias_pairs->unordered_remove (i);
1356 else if (VAR_P (p->decl)
1357 && target_node && is_a <varpool_node *> (target_node))
1359 varpool_node::create_alias (p->decl, target_node->decl);
1360 alias_pairs->unordered_remove (i);
1362 else
1364 error ("%q+D alias in between function and variable is not supported",
1365 p->decl);
1366 warning (0, "%q+D aliased declaration",
1367 target_node->decl);
1368 alias_pairs->unordered_remove (i);
1371 vec_free (alias_pairs);
1375 /* Figure out what functions we want to assemble. */
1377 static void
1378 mark_functions_to_output (void)
1380 bool check_same_comdat_groups = false;
1381 cgraph_node *node;
1383 if (flag_checking)
1384 FOR_EACH_FUNCTION (node)
1385 gcc_assert (!node->process);
1387 FOR_EACH_FUNCTION (node)
1389 tree decl = node->decl;
1391 gcc_assert (!node->process || node->same_comdat_group);
1392 if (node->process)
1393 continue;
1395 /* We need to output all local functions that are used and not
1396 always inlined, as well as those that are reachable from
1397 outside the current compilation unit. */
1398 if (node->analyzed
1399 && !node->thunk.thunk_p
1400 && !node->alias
1401 && !node->global.inlined_to
1402 && !TREE_ASM_WRITTEN (decl)
1403 && !DECL_EXTERNAL (decl))
1405 node->process = 1;
1406 if (node->same_comdat_group)
1408 cgraph_node *next;
1409 for (next = dyn_cast<cgraph_node *> (node->same_comdat_group);
1410 next != node;
1411 next = dyn_cast<cgraph_node *> (next->same_comdat_group))
1412 if (!next->thunk.thunk_p && !next->alias
1413 && !next->comdat_local_p ())
1414 next->process = 1;
1417 else if (node->same_comdat_group)
1419 if (flag_checking)
1420 check_same_comdat_groups = true;
1422 else
1424 /* We should've reclaimed all functions that are not needed. */
1425 if (flag_checking
1426 && !node->global.inlined_to
1427 && gimple_has_body_p (decl)
1428 /* FIXME: in an ltrans unit when the offline copy is outside a partition but inline copies
1429 are inside a partition, we can end up not removing the body since we no longer
1430 have an analyzed node pointing to it. */
1431 && !node->in_other_partition
1432 && !node->alias
1433 && !node->clones
1434 && !DECL_EXTERNAL (decl))
1436 node->debug ();
1437 internal_error ("failed to reclaim unneeded function");
1439 gcc_assert (node->global.inlined_to
1440 || !gimple_has_body_p (decl)
1441 || node->in_other_partition
1442 || node->clones
1443 || DECL_ARTIFICIAL (decl)
1444 || DECL_EXTERNAL (decl));
1449 if (flag_checking && check_same_comdat_groups)
1450 FOR_EACH_FUNCTION (node)
1451 if (node->same_comdat_group && !node->process)
1453 tree decl = node->decl;
1454 if (!node->global.inlined_to
1455 && gimple_has_body_p (decl)
1456 /* FIXME: in an ltrans unit when the offline copy is outside a
1457 partition but inline copies are inside a partition, we can
1458 end up not removing the body since we no longer have an
1459 analyzed node pointing to it. */
1460 && !node->in_other_partition
1461 && !node->clones
1462 && !DECL_EXTERNAL (decl))
1464 node->debug ();
1465 internal_error ("failed to reclaim unneeded function in same "
1466 "comdat group");
1471 /* DECL is a FUNCTION_DECL. Initialize data structures so DECL is a function
1472 in lowered GIMPLE form. IN_SSA is true if the GIMPLE is in SSA form.
1474 Set current_function_decl and cfun to the newly constructed empty function body.
1475 Return the basic block in the function body. */
1477 basic_block
1478 init_lowered_empty_function (tree decl, bool in_ssa, gcov_type count)
1480 basic_block bb;
1481 edge e;
1483 current_function_decl = decl;
1484 allocate_struct_function (decl, false);
1485 gimple_register_cfg_hooks ();
1486 init_empty_tree_cfg ();
1487 init_tree_ssa (cfun);
1489 if (in_ssa)
1491 init_ssa_operands (cfun);
1492 cfun->gimple_df->in_ssa_p = true;
1493 cfun->curr_properties |= PROP_ssa;
1496 DECL_INITIAL (decl) = make_node (BLOCK);
1497 BLOCK_SUPERCONTEXT (DECL_INITIAL (decl)) = decl;
1499 DECL_SAVED_TREE (decl) = error_mark_node;
1500 cfun->curr_properties |= (PROP_gimple_lcf | PROP_gimple_leh | PROP_gimple_any
1501 | PROP_cfg | PROP_loops);
1503 set_loops_for_fn (cfun, ggc_cleared_alloc<loops> ());
1504 init_loops_structure (cfun, loops_for_fn (cfun), 1);
1505 loops_for_fn (cfun)->state |= LOOPS_MAY_HAVE_MULTIPLE_LATCHES;
1507 /* Create BB for body of the function and connect it properly. */
1508 ENTRY_BLOCK_PTR_FOR_FN (cfun)->count = count;
1509 ENTRY_BLOCK_PTR_FOR_FN (cfun)->frequency = REG_BR_PROB_BASE;
1510 EXIT_BLOCK_PTR_FOR_FN (cfun)->count = count;
1511 EXIT_BLOCK_PTR_FOR_FN (cfun)->frequency = REG_BR_PROB_BASE;
1512 bb = create_basic_block (NULL, ENTRY_BLOCK_PTR_FOR_FN (cfun));
1513 bb->count = count;
1514 bb->frequency = BB_FREQ_MAX;
1515 e = make_edge (ENTRY_BLOCK_PTR_FOR_FN (cfun), bb, EDGE_FALLTHRU);
1516 e->count = count;
1517 e->probability = REG_BR_PROB_BASE;
1518 e = make_edge (bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
1519 e->count = count;
1520 e->probability = REG_BR_PROB_BASE;
1521 add_bb_to_loop (bb, ENTRY_BLOCK_PTR_FOR_FN (cfun)->loop_father);
1523 return bb;
1526 /* Adjust PTR by the constant FIXED_OFFSET, and by the vtable
1527 offset indicated by VIRTUAL_OFFSET, if that is
1528 non-null. THIS_ADJUSTING is nonzero for a this adjusting thunk and
1529 zero for a result adjusting thunk. */
1531 tree
1532 thunk_adjust (gimple_stmt_iterator * bsi,
1533 tree ptr, bool this_adjusting,
1534 HOST_WIDE_INT fixed_offset, tree virtual_offset)
1536 gassign *stmt;
1537 tree ret;
1539 if (this_adjusting
1540 && fixed_offset != 0)
1542 stmt = gimple_build_assign
1543 (ptr, fold_build_pointer_plus_hwi_loc (input_location,
1544 ptr,
1545 fixed_offset));
1546 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1549 /* If there's a virtual offset, look up that value in the vtable and
1550 adjust the pointer again. */
1551 if (virtual_offset)
1553 tree vtabletmp;
1554 tree vtabletmp2;
1555 tree vtabletmp3;
1557 if (!vtable_entry_type)
1559 tree vfunc_type = make_node (FUNCTION_TYPE);
1560 TREE_TYPE (vfunc_type) = integer_type_node;
1561 TYPE_ARG_TYPES (vfunc_type) = NULL_TREE;
1562 layout_type (vfunc_type);
1564 vtable_entry_type = build_pointer_type (vfunc_type);
1567 vtabletmp =
1568 create_tmp_reg (build_pointer_type
1569 (build_pointer_type (vtable_entry_type)), "vptr");
1571 /* The vptr is always at offset zero in the object. */
1572 stmt = gimple_build_assign (vtabletmp,
1573 build1 (NOP_EXPR, TREE_TYPE (vtabletmp),
1574 ptr));
1575 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1577 /* Form the vtable address. */
1578 vtabletmp2 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp)),
1579 "vtableaddr");
1580 stmt = gimple_build_assign (vtabletmp2,
1581 build_simple_mem_ref (vtabletmp));
1582 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1584 /* Find the entry with the vcall offset. */
1585 stmt = gimple_build_assign (vtabletmp2,
1586 fold_build_pointer_plus_loc (input_location,
1587 vtabletmp2,
1588 virtual_offset));
1589 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1591 /* Get the offset itself. */
1592 vtabletmp3 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp2)),
1593 "vcalloffset");
1594 stmt = gimple_build_assign (vtabletmp3,
1595 build_simple_mem_ref (vtabletmp2));
1596 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1598 /* Adjust the `this' pointer. */
1599 ptr = fold_build_pointer_plus_loc (input_location, ptr, vtabletmp3);
1600 ptr = force_gimple_operand_gsi (bsi, ptr, true, NULL_TREE, false,
1601 GSI_CONTINUE_LINKING);
1604 if (!this_adjusting
1605 && fixed_offset != 0)
1606 /* Adjust the pointer by the constant. */
1608 tree ptrtmp;
1610 if (VAR_P (ptr))
1611 ptrtmp = ptr;
1612 else
1614 ptrtmp = create_tmp_reg (TREE_TYPE (ptr), "ptr");
1615 stmt = gimple_build_assign (ptrtmp, ptr);
1616 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1618 ptr = fold_build_pointer_plus_hwi_loc (input_location,
1619 ptrtmp, fixed_offset);
1622 /* Emit the statement and gimplify the adjustment expression. */
1623 ret = create_tmp_reg (TREE_TYPE (ptr), "adjusted_this");
1624 stmt = gimple_build_assign (ret, ptr);
1625 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1627 return ret;
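/* Illustrative sketch (not part of the original source): for a
   this-adjusting thunk the GIMPLE built by thunk_adjust above corresponds
   roughly to the following C, where FIXED_OFFSET and VIRTUAL_OFFSET stand
   for the thunk's constants:

     this = (char *) this + FIXED_OFFSET;
     if (VIRTUAL_OFFSET)
       this = (char *) this
	      + *(ptrdiff_t *) (*(char **) this + VIRTUAL_OFFSET);

   i.e. the vptr is loaded from offset zero of the adjusted object, the
   vcall offset is fetched from the vtable slot at VIRTUAL_OFFSET, and the
   pointer is bumped by that amount.  */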
1630 /* Expand thunk NODE to gimple if possible.
1631 When FORCE_GIMPLE_THUNK is true, a gimple thunk is created and
1632 no assembler is produced.
1633 When OUTPUT_ASM_THUNK is true, also produce assembler for
1634 thunks that are not lowered. */
1636 bool
1637 cgraph_node::expand_thunk (bool output_asm_thunks, bool force_gimple_thunk)
1639 bool this_adjusting = thunk.this_adjusting;
1640 HOST_WIDE_INT fixed_offset = thunk.fixed_offset;
1641 HOST_WIDE_INT virtual_value = thunk.virtual_value;
1642 tree virtual_offset = NULL;
1643 tree alias = callees->callee->decl;
1644 tree thunk_fndecl = decl;
1645 tree a;
1647 /* An instrumentation thunk is the same function with
1648 a different signature. We never need to expand it. */
1649 if (thunk.add_pointer_bounds_args)
1650 return false;
1652 if (!force_gimple_thunk && this_adjusting
1653 && targetm.asm_out.can_output_mi_thunk (thunk_fndecl, fixed_offset,
1654 virtual_value, alias))
1656 const char *fnname;
1657 tree fn_block;
1658 tree restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1660 if (!output_asm_thunks)
1662 analyzed = true;
1663 return false;
1666 if (in_lto_p)
1667 get_untransformed_body ();
1668 a = DECL_ARGUMENTS (thunk_fndecl);
1670 current_function_decl = thunk_fndecl;
1672 /* Ensure thunks are emitted in their correct sections. */
1673 resolve_unique_section (thunk_fndecl, 0,
1674 flag_function_sections);
1676 DECL_RESULT (thunk_fndecl)
1677 = build_decl (DECL_SOURCE_LOCATION (thunk_fndecl),
1678 RESULT_DECL, 0, restype);
1679 DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
1680 fnname = IDENTIFIER_POINTER (DECL_ASSEMBLER_NAME (thunk_fndecl));
1682 /* The back end expects DECL_INITIAL to contain a BLOCK, so we
1683 create one. */
1684 fn_block = make_node (BLOCK);
1685 BLOCK_VARS (fn_block) = a;
1686 DECL_INITIAL (thunk_fndecl) = fn_block;
1687 BLOCK_SUPERCONTEXT (fn_block) = thunk_fndecl;
1688 allocate_struct_function (thunk_fndecl, false);
1689 init_function_start (thunk_fndecl);
1690 cfun->is_thunk = 1;
1691 insn_locations_init ();
1692 set_curr_insn_location (DECL_SOURCE_LOCATION (thunk_fndecl));
1693 prologue_location = curr_insn_location ();
1694 assemble_start_function (thunk_fndecl, fnname);
1696 targetm.asm_out.output_mi_thunk (asm_out_file, thunk_fndecl,
1697 fixed_offset, virtual_value, alias);
1699 assemble_end_function (thunk_fndecl, fnname);
1700 insn_locations_finalize ();
1701 init_insn_lengths ();
1702 free_after_compilation (cfun);
1703 TREE_ASM_WRITTEN (thunk_fndecl) = 1;
1704 thunk.thunk_p = false;
1705 analyzed = false;
1707 else if (stdarg_p (TREE_TYPE (thunk_fndecl)))
1709 error ("generic thunk code fails for method %qD which uses %<...%>",
1710 thunk_fndecl);
1711 TREE_ASM_WRITTEN (thunk_fndecl) = 1;
1712 analyzed = true;
1713 return false;
1715 else
1717 tree restype;
1718 basic_block bb, then_bb, else_bb, return_bb;
1719 gimple_stmt_iterator bsi;
1720 int nargs = 0;
1721 tree arg;
1722 int i;
1723 tree resdecl;
1724 tree restmp = NULL;
1725 tree resbnd = NULL;
1727 gcall *call;
1728 greturn *ret;
1729 bool alias_is_noreturn = TREE_THIS_VOLATILE (alias);
1731 /* We may be called from expand_thunk, which releases the body except for
1732 DECL_ARGUMENTS. In this case force_gimple_thunk is true. */
1733 if (in_lto_p && !force_gimple_thunk)
1734 get_untransformed_body ();
1735 a = DECL_ARGUMENTS (thunk_fndecl);
1737 current_function_decl = thunk_fndecl;
1739 /* Ensure thunks are emitted in their correct sections. */
1740 resolve_unique_section (thunk_fndecl, 0,
1741 flag_function_sections);
1743 DECL_IGNORED_P (thunk_fndecl) = 1;
1744 bitmap_obstack_initialize (NULL);
1746 if (thunk.virtual_offset_p)
1747 virtual_offset = size_int (virtual_value);
1749 /* Build the return declaration for the function. */
1750 restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1751 if (DECL_RESULT (thunk_fndecl) == NULL_TREE)
1753 resdecl = build_decl (input_location, RESULT_DECL, 0, restype);
1754 DECL_ARTIFICIAL (resdecl) = 1;
1755 DECL_IGNORED_P (resdecl) = 1;
1756 DECL_RESULT (thunk_fndecl) = resdecl;
1757 DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
1759 else
1760 resdecl = DECL_RESULT (thunk_fndecl);
1762 bb = then_bb = else_bb = return_bb
1763 = init_lowered_empty_function (thunk_fndecl, true, count);
1765 bsi = gsi_start_bb (bb);
1767 /* Build call to the function being thunked. */
1768 if (!VOID_TYPE_P (restype)
1769 && (!alias_is_noreturn
1770 || TREE_ADDRESSABLE (restype)
1771 || TREE_CODE (TYPE_SIZE_UNIT (restype)) != INTEGER_CST))
1773 if (DECL_BY_REFERENCE (resdecl))
1775 restmp = gimple_fold_indirect_ref (resdecl);
1776 if (!restmp)
1777 restmp = build2 (MEM_REF,
1778 TREE_TYPE (TREE_TYPE (DECL_RESULT (alias))),
1779 resdecl,
1780 build_int_cst (TREE_TYPE
1781 (DECL_RESULT (alias)), 0));
1783 else if (!is_gimple_reg_type (restype))
1785 if (aggregate_value_p (resdecl, TREE_TYPE (thunk_fndecl)))
1787 restmp = resdecl;
1789 if (VAR_P (restmp))
1790 add_local_decl (cfun, restmp);
1791 BLOCK_VARS (DECL_INITIAL (current_function_decl)) = restmp;
1793 else
1794 restmp = create_tmp_var (restype, "retval");
1796 else
1797 restmp = create_tmp_reg (restype, "retval");
1800 for (arg = a; arg; arg = DECL_CHAIN (arg))
1801 nargs++;
1802 auto_vec<tree> vargs (nargs);
1803 i = 0;
1804 arg = a;
1805 if (this_adjusting)
1807 vargs.quick_push (thunk_adjust (&bsi, a, 1, fixed_offset,
1808 virtual_offset));
1809 arg = DECL_CHAIN (a);
1810 i = 1;
1813 if (nargs)
1814 for (; i < nargs; i++, arg = DECL_CHAIN (arg))
1816 tree tmp = arg;
1817 if (VECTOR_TYPE_P (TREE_TYPE (arg))
1818 || TREE_CODE (TREE_TYPE (arg)) == COMPLEX_TYPE)
1819 DECL_GIMPLE_REG_P (arg) = 1;
1821 if (!is_gimple_val (arg))
1823 tmp = create_tmp_reg (TYPE_MAIN_VARIANT
1824 (TREE_TYPE (arg)), "arg");
1825 gimple *stmt = gimple_build_assign (tmp, arg);
1826 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1828 vargs.quick_push (tmp);
1830 call = gimple_build_call_vec (build_fold_addr_expr_loc (0, alias), vargs);
1831 callees->call_stmt = call;
1832 gimple_call_set_from_thunk (call, true);
1833 gimple_call_set_with_bounds (call, instrumentation_clone);
1835 /* Return slot optimization is always possible and in fact required to
1836 return values with DECL_BY_REFERENCE. */
1837 if (aggregate_value_p (resdecl, TREE_TYPE (thunk_fndecl))
1838 && (!is_gimple_reg_type (TREE_TYPE (resdecl))
1839 || DECL_BY_REFERENCE (resdecl)))
1840 gimple_call_set_return_slot_opt (call, true);
1842 if (restmp)
1844 gimple_call_set_lhs (call, restmp);
1845 gcc_assert (useless_type_conversion_p (TREE_TYPE (restmp),
1846 TREE_TYPE (TREE_TYPE (alias))));
1848 gsi_insert_after (&bsi, call, GSI_NEW_STMT);
1849 if (!alias_is_noreturn)
1851 if (instrumentation_clone
1852 && !DECL_BY_REFERENCE (resdecl)
1853 && restmp
1854 && BOUNDED_P (restmp))
1856 resbnd = chkp_insert_retbnd_call (NULL, restmp, &bsi);
1857 create_edge (get_create (gimple_call_fndecl (gsi_stmt (bsi))),
1858 as_a <gcall *> (gsi_stmt (bsi)),
1859 callees->count, callees->frequency);
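/* For a return-value-adjusting (non-this-adjusting) thunk, the value
   returned by the call must itself be adjusted by FIXED_OFFSET and
   VIRTUAL_OFFSET; when it is a pointer we also guard against NULL.  */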
1862 if (restmp && !this_adjusting
1863 && (fixed_offset || virtual_offset))
1865 tree true_label = NULL_TREE;
1867 if (TREE_CODE (TREE_TYPE (restmp)) == POINTER_TYPE)
1869 gimple *stmt;
1870 edge e;
1871 /* If the return type is a pointer, we need to
1872 protect against NULL. We know there will be an
1873 adjustment, because that's why we're emitting a
1874 thunk. */
1875 then_bb = create_basic_block (NULL, bb);
1876 then_bb->count = count - count / 16;
1877 then_bb->frequency = BB_FREQ_MAX - BB_FREQ_MAX / 16;
1878 return_bb = create_basic_block (NULL, then_bb);
1879 return_bb->count = count;
1880 return_bb->frequency = BB_FREQ_MAX;
1881 else_bb = create_basic_block (NULL, else_bb);
1882 else_bb->count = count / 16;
1883 else_bb->frequency = BB_FREQ_MAX / 16;
1884 add_bb_to_loop (then_bb, bb->loop_father);
1885 add_bb_to_loop (return_bb, bb->loop_father);
1886 add_bb_to_loop (else_bb, bb->loop_father);
1887 remove_edge (single_succ_edge (bb));
1888 true_label = gimple_block_label (then_bb);
1889 stmt = gimple_build_cond (NE_EXPR, restmp,
1890 build_zero_cst (TREE_TYPE (restmp)),
1891 NULL_TREE, NULL_TREE);
1892 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1893 e = make_edge (bb, then_bb, EDGE_TRUE_VALUE);
1894 e->probability = REG_BR_PROB_BASE - REG_BR_PROB_BASE / 16;
1895 e->count = count - count / 16;
1896 e = make_edge (bb, else_bb, EDGE_FALSE_VALUE);
1897 e->probability = REG_BR_PROB_BASE / 16;
1898 e->count = count / 16;
1899 e = make_edge (return_bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
1900 e->probability = REG_BR_PROB_BASE;
1901 e->count = count;
1902 e = make_edge (then_bb, return_bb, EDGE_FALLTHRU);
1903 e->probability = REG_BR_PROB_BASE;
1904 e->count = count - count / 16;
1905 e = make_edge (else_bb, return_bb, EDGE_FALLTHRU);
1906 e->probability = REG_BR_PROB_BASE;
1907 e->count = count / 16;
1908 bsi = gsi_last_bb (then_bb);
1911 restmp = thunk_adjust (&bsi, restmp, /*this_adjusting=*/0,
1912 fixed_offset, virtual_offset);
1913 if (true_label)
1915 gimple *stmt;
1916 bsi = gsi_last_bb (else_bb);
1917 stmt = gimple_build_assign (restmp,
1918 build_zero_cst (TREE_TYPE (restmp)));
1919 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1920 bsi = gsi_last_bb (return_bb);
1923 else
1924 gimple_call_set_tail (call, true);
1926 /* Build return value. */
1927 if (!DECL_BY_REFERENCE (resdecl))
1928 ret = gimple_build_return (restmp);
1929 else
1930 ret = gimple_build_return (resdecl);
1931 gimple_return_set_retbnd (ret, resbnd);
1933 gsi_insert_after (&bsi, ret, GSI_NEW_STMT);
1935 else
1937 gimple_call_set_tail (call, true);
1938 remove_edge (single_succ_edge (bb));
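/* Finalize the freshly built thunk body: mark it as being in SSA form,
   set the profile status, remove unreachable blocks and update and
   verify the SSA form and CFG.  */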
1941 cfun->gimple_df->in_ssa_p = true;
1942 profile_status_for_fn (cfun)
1943 = count ? PROFILE_READ : PROFILE_GUESSED;
1944 /* FIXME: C++ FE should stop setting TREE_ASM_WRITTEN on thunks. */
1945 TREE_ASM_WRITTEN (thunk_fndecl) = false;
1946 delete_unreachable_blocks ();
1947 update_ssa (TODO_update_ssa);
1948 checking_verify_flow_info ();
1949 free_dominance_info (CDI_DOMINATORS);
1951 /* Since we want to emit the thunk, we explicitly mark its name as
1952 referenced. */
1953 thunk.thunk_p = false;
1954 lowered = true;
1955 bitmap_obstack_release (NULL);
1957 current_function_decl = NULL;
1958 set_cfun (NULL);
1959 return true;
1962 /* Assemble thunks and aliases associated with the node. */
1964 void
1965 cgraph_node::assemble_thunks_and_aliases (void)
1967 cgraph_edge *e;
1968 ipa_ref *ref;
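/* Thunks show up as callers of the node with thunk.thunk_p set; expand
   and assemble each of them, together with their own thunks and aliases.  */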
1970 for (e = callers; e;)
1971 if (e->caller->thunk.thunk_p
1972 && !e->caller->global.inlined_to
1973 && !e->caller->thunk.add_pointer_bounds_args)
1975 cgraph_node *thunk = e->caller;
1977 e = e->next_caller;
1978 thunk->expand_thunk (true, false);
1979 thunk->assemble_thunks_and_aliases ();
1981 else
1982 e = e->next_caller;
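/* Now emit the aliases that reference this node, recursing so that their
   thunks and aliases are emitted as well.  */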
1984 FOR_EACH_ALIAS (this, ref)
1986 cgraph_node *alias = dyn_cast <cgraph_node *> (ref->referring);
1987 if (!alias->transparent_alias)
1989 bool saved_written = TREE_ASM_WRITTEN (decl);
1991 /* Force assemble_alias to really output the alias this time instead
1992 of buffering it in the alias pairs list. */
1993 TREE_ASM_WRITTEN (decl) = 1;
1994 do_assemble_alias (alias->decl,
1995 DECL_ASSEMBLER_NAME (decl));
1996 alias->assemble_thunks_and_aliases ();
1997 TREE_ASM_WRITTEN (decl) = saved_written;
2002 /* Expand function specified by node. */
2004 void
2005 cgraph_node::expand (void)
2007 location_t saved_loc;
2009 /* We ought to not compile any inline clones. */
2010 gcc_assert (!global.inlined_to);
2012 /* __RTL functions are compiled as soon as they are parsed, so don't
2013 do it again. */
2014 if (native_rtl_p ())
2015 return;
2017 announce_function (decl);
2018 process = 0;
2019 gcc_assert (lowered);
2020 get_untransformed_body ();
2022 /* Generate RTL for the body of DECL. */
2024 timevar_push (TV_REST_OF_COMPILATION);
2026 gcc_assert (symtab->global_info_ready);
2028 /* Initialize the default bitmap obstack. */
2029 bitmap_obstack_initialize (NULL);
2031 /* Initialize the RTL code for the function. */
2032 saved_loc = input_location;
2033 input_location = DECL_SOURCE_LOCATION (decl);
2035 gcc_assert (DECL_STRUCT_FUNCTION (decl));
2036 push_cfun (DECL_STRUCT_FUNCTION (decl));
2037 init_function_start (decl);
2039 gimple_register_cfg_hooks ();
2041 bitmap_obstack_initialize (&reg_obstack); /* FIXME, only at RTL generation. */
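/* Apply any transformations queued for this function by the IPA passes
   before running the regular optimization pipeline.  */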
2043 execute_all_ipa_transforms ();
2045 /* Perform all tree transforms and optimizations. */
2047 /* Signal the start of passes. */
2048 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_START, NULL);
2050 execute_pass_list (cfun, g->get_passes ()->all_passes);
2052 /* Signal the end of passes. */
2053 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_END, NULL);
2055 bitmap_obstack_release (&reg_obstack);
2057 /* Release the default bitmap obstack. */
2058 bitmap_obstack_release (NULL);
2060 /* If requested, warn about function definitions where the function will
2061 return a value (usually of some struct or union type) which itself will
2062 take up a lot of stack space. */
2063 if (warn_larger_than && !DECL_EXTERNAL (decl) && TREE_TYPE (decl))
2065 tree ret_type = TREE_TYPE (TREE_TYPE (decl));
2067 if (ret_type && TYPE_SIZE_UNIT (ret_type)
2068 && TREE_CODE (TYPE_SIZE_UNIT (ret_type)) == INTEGER_CST
2069 && 0 < compare_tree_int (TYPE_SIZE_UNIT (ret_type),
2070 larger_than_size))
2072 unsigned int size_as_int
2073 = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (ret_type));
2075 if (compare_tree_int (TYPE_SIZE_UNIT (ret_type), size_as_int) == 0)
2076 warning (OPT_Wlarger_than_, "size of return value of %q+D is %u bytes",
2077 decl, size_as_int);
2078 else
2079 warning (OPT_Wlarger_than_, "size of return value of %q+D is larger than %wd bytes",
2080 decl, larger_than_size);
2084 gimple_set_body (decl, NULL);
2085 if (DECL_STRUCT_FUNCTION (decl) == 0
2086 && !cgraph_node::get (decl)->origin)
2088 /* Stop pointing to the local nodes about to be freed.
2089 But DECL_INITIAL must remain nonzero so we know this
2090 was an actual function definition.
2091 For a nested function, this is done in c_pop_function_context.
2092 If rest_of_compilation set this to 0, leave it 0. */
2093 if (DECL_INITIAL (decl) != 0)
2094 DECL_INITIAL (decl) = error_mark_node;
2097 input_location = saved_loc;
2099 ggc_collect ();
2100 timevar_pop (TV_REST_OF_COMPILATION);
2102 /* Make sure that the back end didn't give up on compiling. */
2103 gcc_assert (TREE_ASM_WRITTEN (decl));
2104 if (cfun)
2105 pop_cfun ();
2107 /* It would make a lot more sense to output thunks before the function body to get
2108 more forward and fewer backward jumps. This however would require solving the
2109 problem with comdats. See PR48668. Also aliases must come after the function
2110 itself to make one-pass assemblers, like the one on AIX, happy. See PR 50689.
2111 FIXME: Perhaps thunks should be moved before the function IFF they are not in
2112 comdat groups. */
2113 assemble_thunks_and_aliases ();
2114 release_body ();
2115 /* Eliminate all call edges. This is important so the GIMPLE_CALL no longer
2116 points to the dead function body. */
2117 remove_callees ();
2118 remove_all_references ();
2121 /* Node comparer that is responsible for the order that corresponds
2122 to the time when a function was launched for the first time. */
2124 static int
2125 node_cmp (const void *pa, const void *pb)
2127 const cgraph_node *a = *(const cgraph_node * const *) pa;
2128 const cgraph_node *b = *(const cgraph_node * const *) pb;
2130 /* Functions with a time profile must come before those without a profile. */
2131 if (!a->tp_first_run || !b->tp_first_run)
2132 return a->tp_first_run - b->tp_first_run;
2134 return a->tp_first_run != b->tp_first_run
2135 ? b->tp_first_run - a->tp_first_run
2136 : b->order - a->order;
2139 /* Expand all functions that must be output.
2141 Attempt to topologically sort the nodes so that a function is output
2142 when all called functions are already assembled, allowing data to be
2143 propagated across the callgraph. Use a stack to get smaller distance
2144 between a function and its callees (later we may choose to use a more
2145 sophisticated algorithm for function reordering; we will likely want
2146 to use subsections to make the output functions appear in top-down
2147 order). */
2149 static void
2150 expand_all_functions (void)
2152 cgraph_node *node;
2153 cgraph_node **order = XCNEWVEC (cgraph_node *,
2154 symtab->cgraph_count);
2155 unsigned int expanded_func_count = 0, profiled_func_count = 0;
2156 int order_pos, new_order_pos = 0;
2157 int i;
2159 order_pos = ipa_reverse_postorder (order);
2160 gcc_assert (order_pos == symtab->cgraph_count);
2162 /* The garbage collector may remove inline clones we eliminate during
2163 optimization, so we must be sure not to reference them. */
2164 for (i = 0; i < order_pos; i++)
2165 if (order[i]->process)
2166 order[new_order_pos++] = order[i];
2168 if (flag_profile_reorder_functions)
2169 qsort (order, new_order_pos, sizeof (cgraph_node *), node_cmp);
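/* Walk the array backwards so that, as described above, a function is
   normally expanded only after the functions it calls.  */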
2171 for (i = new_order_pos - 1; i >= 0; i--)
2173 node = order[i];
2175 if (node->process)
2177 expanded_func_count++;
2178 if (node->tp_first_run)
2179 profiled_func_count++;
2181 if (symtab->dump_file)
2182 fprintf (symtab->dump_file,
2183 "Time profile order in expand_all_functions:%s:%d\n",
2184 node->asm_name (), node->tp_first_run);
2185 node->process = 0;
2186 node->expand ();
2190 if (dump_file)
2191 fprintf (dump_file, "Expanded functions with time profile (%s):%u/%u\n",
2192 main_input_filename, profiled_func_count, expanded_func_count);
2194 if (symtab->dump_file && flag_profile_reorder_functions)
2195 fprintf (symtab->dump_file, "Expanded functions with time profile:%u/%u\n",
2196 profiled_func_count, expanded_func_count);
2198 symtab->process_new_functions ();
2199 free_gimplify_stack ();
2201 free (order);
2204 /* This is used to sort the node types by the cgraph order number. */
2206 enum cgraph_order_sort_kind
2208 ORDER_UNDEFINED = 0,
2209 ORDER_FUNCTION,
2210 ORDER_VAR,
2211 ORDER_VAR_UNDEF,
2212 ORDER_ASM
2215 struct cgraph_order_sort
2217 enum cgraph_order_sort_kind kind;
2218 union
2220 cgraph_node *f;
2221 varpool_node *v;
2222 asm_node *a;
2223 } u;
2226 /* Output all functions, variables, and asm statements in the order
2227 according to their order fields, which is the order in which they
2228 appeared in the file. This implements -fno-toplevel-reorder. In
2229 this mode we may output functions and variables which don't really
2230 need to be output.
2231 When NO_REORDER is true only do this for symbols marked no reorder. */
2233 static void
2234 output_in_order (bool no_reorder)
2236 int max;
2237 cgraph_order_sort *nodes;
2238 int i;
2239 cgraph_node *pf;
2240 varpool_node *pv;
2241 asm_node *pa;
2242 max = symtab->order;
2243 nodes = XCNEWVEC (cgraph_order_sort, max);
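/* Collect the functions, variables and toplevel asm statements to be
   output into NODES, indexed by their symbol table order number.  */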
2245 FOR_EACH_DEFINED_FUNCTION (pf)
2247 if (pf->process && !pf->thunk.thunk_p && !pf->alias)
2249 if (no_reorder && !pf->no_reorder)
2250 continue;
2251 i = pf->order;
2252 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
2253 nodes[i].kind = ORDER_FUNCTION;
2254 nodes[i].u.f = pf;
2258 /* There is a similar loop in symbol_table::output_variables.
2259 Please keep them in sync. */
2260 FOR_EACH_VARIABLE (pv)
2262 if (no_reorder && !pv->no_reorder)
2263 continue;
2264 if (DECL_HARD_REGISTER (pv->decl)
2265 || DECL_HAS_VALUE_EXPR_P (pv->decl))
2266 continue;
2267 i = pv->order;
2268 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
2269 nodes[i].kind = pv->definition ? ORDER_VAR : ORDER_VAR_UNDEF;
2270 nodes[i].u.v = pv;
2273 for (pa = symtab->first_asm_symbol (); pa; pa = pa->next)
2275 i = pa->order;
2276 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
2277 nodes[i].kind = ORDER_ASM;
2278 nodes[i].u.a = pa;
2281 /* In toplevel reorder mode we output all statics; mark them as needed. */
2283 for (i = 0; i < max; ++i)
2284 if (nodes[i].kind == ORDER_VAR)
2285 nodes[i].u.v->finalize_named_section_flags ();
2287 for (i = 0; i < max; ++i)
2289 switch (nodes[i].kind)
2291 case ORDER_FUNCTION:
2292 nodes[i].u.f->process = 0;
2293 nodes[i].u.f->expand ();
2294 break;
2296 case ORDER_VAR:
2297 nodes[i].u.v->assemble_decl ();
2298 break;
2300 case ORDER_VAR_UNDEF:
2301 assemble_undefined_decl (nodes[i].u.v->decl);
2302 break;
2304 case ORDER_ASM:
2305 assemble_asm (nodes[i].u.a->asm_str);
2306 break;
2308 case ORDER_UNDEFINED:
2309 break;
2311 default:
2312 gcc_unreachable ();
2316 symtab->clear_asm_symbols ();
2318 free (nodes);
2321 static void
2322 ipa_passes (void)
2324 gcc::pass_manager *passes = g->get_passes ();
2326 set_cfun (NULL);
2327 current_function_decl = NULL;
2328 gimple_register_cfg_hooks ();
2329 bitmap_obstack_initialize (NULL);
2331 invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_START, NULL);
2333 if (!in_lto_p)
2335 execute_ipa_pass_list (passes->all_small_ipa_passes);
2336 if (seen_error ())
2337 return;
2340 /* This extra symtab_remove_unreachable_nodes pass tends to catch some
2341 devirtualization and other changes where removal iterates. */
2342 symtab->remove_unreachable_nodes (symtab->dump_file);
2344 /* If pass_all_early_optimizations was not scheduled, the state of
2345 the cgraph will not be properly updated. Update it now. */
2346 if (symtab->state < IPA_SSA)
2347 symtab->state = IPA_SSA;
2349 if (!in_lto_p)
2351 /* Generate coverage variables and constructors. */
2352 coverage_finish ();
2354 /* Process new functions added. */
2355 set_cfun (NULL);
2356 current_function_decl = NULL;
2357 symtab->process_new_functions ();
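/* Run the analysis (summary generation) stage of the regular IPA passes
   so their summaries are available for streaming or for the execution
   stage below.  */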
2359 execute_ipa_summary_passes
2360 ((ipa_opt_pass_d *) passes->all_regular_ipa_passes);
2363 /* Some targets need to handle LTO assembler output specially. */
2364 if (flag_generate_lto || flag_generate_offload)
2365 targetm.asm_out.lto_start ();
2367 if (!in_lto_p)
2369 if (g->have_offload)
2371 section_name_prefix = OFFLOAD_SECTION_NAME_PREFIX;
2372 lto_stream_offload_p = true;
2373 ipa_write_summaries ();
2374 lto_stream_offload_p = false;
2376 if (flag_lto)
2378 section_name_prefix = LTO_SECTION_NAME_PREFIX;
2379 lto_stream_offload_p = false;
2380 ipa_write_summaries ();
2384 if (flag_generate_lto || flag_generate_offload)
2385 targetm.asm_out.lto_end ();
2387 if (!flag_ltrans && (in_lto_p || !flag_lto || flag_fat_lto_objects))
2388 execute_ipa_pass_list (passes->all_regular_ipa_passes);
2389 invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_END, NULL);
2391 bitmap_obstack_release (NULL);
2395 /* Return the string the alias is an alias of. */
2397 static tree
2398 get_alias_symbol (tree decl)
2400 tree alias = lookup_attribute ("alias", DECL_ATTRIBUTES (decl));
2401 return get_identifier (TREE_STRING_POINTER
2402 (TREE_VALUE (TREE_VALUE (alias))));
2406 /* Weakrefs may be associated with external decls and thus not output
2407 at expansion time. Emit all necessary aliases. */
2409 void
2410 symbol_table::output_weakrefs (void)
2412 symtab_node *node;
2413 cgraph_node *cnode;
2414 FOR_EACH_SYMBOL (node)
2415 if (node->alias
2416 && !TREE_ASM_WRITTEN (node->decl)
2417 && (!(cnode = dyn_cast <cgraph_node *> (node))
2418 || !cnode->instrumented_version
2419 || !TREE_ASM_WRITTEN (cnode->instrumented_version->decl))
2420 && node->weakref)
2422 tree target;
2424 /* Weakrefs are special in not requiring a target definition in the current
2425 compilation unit. It is thus a bit hard to work out what we want to
2426 alias.
2427 When the alias target is defined, we need to fetch it from the symtab
2428 reference; otherwise it is pointed to by alias_target. */
2429 if (node->alias_target)
2430 target = (DECL_P (node->alias_target)
2431 ? DECL_ASSEMBLER_NAME (node->alias_target)
2432 : node->alias_target);
2433 else if (node->analyzed)
2434 target = DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl);
2435 else
2437 gcc_unreachable ();
2438 target = get_alias_symbol (node->decl);
2440 do_assemble_alias (node->decl, target);
2444 /* Perform simple optimizations based on callgraph. */
2446 void
2447 symbol_table::compile (void)
2449 if (seen_error ())
2450 return;
2452 symtab_node::checking_verify_symtab_nodes ();
2454 timevar_push (TV_CGRAPHOPT);
2455 if (pre_ipa_mem_report)
2457 fprintf (stderr, "Memory consumption before IPA\n");
2458 dump_memory_report (false);
2460 if (!quiet_flag)
2461 fprintf (stderr, "Performing interprocedural optimizations\n");
2462 state = IPA;
2464 /* Offloading requires LTO infrastructure. */
2465 if (!in_lto_p && g->have_offload)
2466 flag_generate_offload = 1;
2468 /* If LTO is enabled, initialize the streamer hooks needed by GIMPLE. */
2469 if (flag_generate_lto || flag_generate_offload)
2470 lto_streamer_hooks_init ();
2472 /* Don't run the IPA passes if there were any errors or sorry messages. */
2473 if (!seen_error ())
2474 ipa_passes ();
2476 /* Do nothing else if any IPA pass found errors or if we are just streaming LTO. */
2477 if (seen_error ()
2478 || (!in_lto_p && flag_lto && !flag_fat_lto_objects))
2480 timevar_pop (TV_CGRAPHOPT);
2481 return;
2484 global_info_ready = true;
2485 if (dump_file)
2487 fprintf (dump_file, "Optimized ");
2488 symtab->dump (dump_file);
2490 if (post_ipa_mem_report)
2492 fprintf (stderr, "Memory consumption after IPA\n");
2493 dump_memory_report (false);
2495 timevar_pop (TV_CGRAPHOPT);
2497 /* Output everything. */
2498 (*debug_hooks->assembly_start) ();
2499 if (!quiet_flag)
2500 fprintf (stderr, "Assembling functions:\n");
2501 symtab_node::checking_verify_symtab_nodes ();
2503 bitmap_obstack_initialize (NULL);
2504 execute_ipa_pass_list (g->get_passes ()->all_late_ipa_passes);
2505 bitmap_obstack_release (NULL);
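/* Figure out which functions actually have to be output and mark them.  */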
2506 mark_functions_to_output ();
2508 /* When weakref support is missing, we automatically translate all
2509 references to NODE to references to its ultimate alias target.
2510 The renaming mechanism uses the flag IDENTIFIER_TRANSPARENT_ALIAS and
2511 TREE_CHAIN.
2513 Set up this mapping before we output any assembler but once we are sure
2514 that all symbol renaming is done.
2516 FIXME: All this ugliness can go away if we just do renaming at the GIMPLE
2517 level by physically rewriting the IL. At the moment we can only redirect
2518 calls, so we need infrastructure for renaming references as well. */
2519 #ifndef ASM_OUTPUT_WEAKREF
2520 symtab_node *node;
2522 FOR_EACH_SYMBOL (node)
2523 if (node->alias
2524 && lookup_attribute ("weakref", DECL_ATTRIBUTES (node->decl)))
2526 IDENTIFIER_TRANSPARENT_ALIAS
2527 (DECL_ASSEMBLER_NAME (node->decl)) = 1;
2528 TREE_CHAIN (DECL_ASSEMBLER_NAME (node->decl))
2529 = (node->alias_target ? node->alias_target
2530 : DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl));
2532 #endif
2534 state = EXPANSION;
2536 if (!flag_toplevel_reorder)
2537 output_in_order (false);
2538 else
2540 /* Output the asm statements and anything ordered first. The process
2541 flag is cleared for these nodes, so we skip them later. */
2542 output_in_order (true);
2543 expand_all_functions ();
2544 output_variables ();
2547 process_new_functions ();
2548 state = FINISHED;
2549 output_weakrefs ();
2551 if (dump_file)
2553 fprintf (dump_file, "\nFinal ");
2554 symtab->dump (dump_file);
2556 if (!flag_checking)
2557 return;
2558 symtab_node::verify_symtab_nodes ();
2559 /* Double check that all inline clones are gone and that all
2560 function bodies have been released from memory. */
2561 if (!seen_error ())
2563 cgraph_node *node;
2564 bool error_found = false;
2566 FOR_EACH_DEFINED_FUNCTION (node)
2567 if (node->global.inlined_to
2568 || gimple_has_body_p (node->decl))
2570 error_found = true;
2571 node->debug ();
2573 if (error_found)
2574 internal_error ("nodes with unreleased memory found");
2579 /* Analyze the whole compilation unit once it is parsed completely. */
2581 void
2582 symbol_table::finalize_compilation_unit (void)
2584 timevar_push (TV_CGRAPH);
2586 /* If we're here there's no current function anymore. Some frontends
2587 are lazy in clearing these. */
2588 current_function_decl = NULL;
2589 set_cfun (NULL);
2591 /* Do not skip analyzing the functions if there were errors; we would
2592 otherwise miss diagnostics for the following functions. */
2594 /* Emit size functions we didn't inline. */
2595 finalize_size_functions ();
2597 /* Mark alias targets necessary and emit diagnostics. */
2598 handle_alias_pairs ();
2600 if (!quiet_flag)
2602 fprintf (stderr, "\nAnalyzing compilation unit\n");
2603 fflush (stderr);
2606 if (flag_dump_passes)
2607 dump_passes ();
2609 /* Gimplify and lower all functions, compute reachability and
2610 remove unreachable nodes. */
2611 analyze_functions (/*first_time=*/true);
2613 /* Mark alias targets necessary and emit diagnostics. */
2614 handle_alias_pairs ();
2616 /* Gimplify and lower thunks. */
2617 analyze_functions (/*first_time=*/false);
2619 if (!seen_error ())
2621 /* Emit early debug for reachable functions, and by consequence,
2622 locally scoped symbols. */
2623 struct cgraph_node *cnode;
2624 FOR_EACH_FUNCTION_WITH_GIMPLE_BODY (cnode)
2625 (*debug_hooks->early_global_decl) (cnode->decl);
2627 /* Clean up anything that needs cleaning up after initial debug
2628 generation. */
2629 (*debug_hooks->early_finish) (main_input_filename);
2632 /* Finally drive the pass manager. */
2633 compile ();
2635 timevar_pop (TV_CGRAPH);
2638 /* Reset all state within cgraphunit.c so that we can rerun the compiler
2639 within the same process. For use by toplev::finalize. */
2641 void
2642 cgraphunit_c_finalize (void)
2644 gcc_assert (cgraph_new_nodes.length () == 0);
2645 cgraph_new_nodes.truncate (0);
2647 vtable_entry_type = NULL;
2648 queued_nodes = &symtab_terminator;
2650 first_analyzed = NULL;
2651 first_analyzed_var = NULL;
2654 /* Create a wrapper from the cgraph_node to the TARGET node. A thunk is used
2655 for this kind of wrapper method. */
2657 void
2658 cgraph_node::create_wrapper (cgraph_node *target)
2660 /* Preserve DECL_RESULT so we get the right by-reference flag. */
2661 tree decl_result = DECL_RESULT (decl);
2663 /* Remove the function's body but keep the arguments to be reused
2664 for the thunk. */
2665 release_body (true);
2666 reset ();
2668 DECL_UNINLINABLE (decl) = false;
2669 DECL_RESULT (decl) = decl_result;
2670 DECL_INITIAL (decl) = NULL;
2671 allocate_struct_function (decl, false);
2672 set_cfun (NULL);
2674 /* Turn the alias into a thunk and expand it into GIMPLE representation. */
2675 definition = true;
2677 memset (&thunk, 0, sizeof (cgraph_thunk_info));
2678 thunk.thunk_p = true;
2679 create_edge (target, NULL, count, CGRAPH_FREQ_BASE);
2680 callees->can_throw_external = !TREE_NOTHROW (target->decl);
2682 tree arguments = DECL_ARGUMENTS (decl);
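/* Clear TREE_ADDRESSABLE on the arguments; the thunk built below passes
   them straight through to TARGET.  */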
2684 while (arguments)
2686 TREE_ADDRESSABLE (arguments) = false;
2687 arguments = TREE_CHAIN (arguments);
2690 expand_thunk (false, true);
2692 /* Inline summary set-up. */
2693 analyze ();
2694 inline_analyze_function (this);
2697 #include "gt-cgraphunit.h"