gcc/cgraphunit.c
1 /* Driver of optimization process
2 Copyright (C) 2003-2018 Free Software Foundation, Inc.
3 Contributed by Jan Hubicka
5 This file is part of GCC.
7 GCC is free software; you can redistribute it and/or modify it under
8 the terms of the GNU General Public License as published by the Free
9 Software Foundation; either version 3, or (at your option) any later
10 version.
12 GCC is distributed in the hope that it will be useful, but WITHOUT ANY
13 WARRANTY; without even the implied warranty of MERCHANTABILITY or
14 FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
15 for more details.
17 You should have received a copy of the GNU General Public License
18 along with GCC; see the file COPYING3. If not see
19 <http://www.gnu.org/licenses/>. */
21 /* This module implements main driver of compilation process.
23 The main scope of this file is to act as an interface between
24 tree-based frontends and the backend.
26 The front-end is supposed to use the following functionality:
28 - finalize_function
30 This function is called once the front-end has parsed the whole body of a function
31 and it is certain that neither the function body nor the declaration will change.
33 (There is one exception needed for implementing the GCC extern inline
34 function extension.)
36 - varpool_finalize_decl
38 This function has the same behavior as the above but is used for static
39 variables.
41 - add_asm_node
43 Inserts a new toplevel ASM statement.
45 - finalize_compilation_unit
47 This function is called once the (source level) compilation unit is finalized
48 and it will no longer change.
50 The symbol table is constructed starting from the trivially needed
51 symbols finalized by the frontend. Functions are lowered into
52 GIMPLE representation and callgraph/reference lists are constructed.
53 Those are used to discover other necessary functions and variables.
55 At the end the bodies of unreachable functions are removed.
57 The function can be called multiple times when multiple source level
58 compilation units are combined.
60 - compile
62 This passes control to the back-end. Optimizations are performed and
63 final assembler is generated. This is done in the following way. Note
64 that with link time optimization the process is split into three
65 stages (compile time, linktime analysis and parallel linktime as
66 indicated below).
68 Compile time:
70 1) Inter-procedural optimization.
71 (ipa_passes)
73 This part is further split into:
75 a) early optimizations. These are local passes executed in
76 the topological order on the callgraph.
78 The purpose of early optimizations is to optimize away simple
79 things that may otherwise confuse IP analysis. Very simple
80 propagation across the callgraph is done, e.g. to discover
81 functions without side effects, and simple inlining is performed.
83 b) early small interprocedural passes.
85 Those are interprocedural passes executed only at compilation
86 time. These include, for example, transactional memory lowering,
87 unreachable code removal and other simple transformations.
89 c) IP analysis stage. All interprocedural passes do their
90 analysis.
92 Interprocedural passes differ from small interprocedural
93 passes by their ability to operate across the whole program
94 at linktime. Their analysis stage is performed early to
95 reduce both linking times and linktime memory usage by
96 not having to represent the whole program in memory.
98 d) LTO streaming. When doing LTO, everything important gets
99 streamed into the object file.
101 Compile time and/or linktime analysis stage (WPA):
103 At linktime units get streamed back and the symbol table is
104 merged. Function bodies are not streamed in and are not
105 available.
106 e) IP propagation stage. All IP passes execute their
107 IP propagation. This is done based on the earlier analysis
108 without having function bodies at hand.
109 f) Ltrans streaming. When doing WHOPR LTO, the program
110 is partitioned and streamed into multiple object files.
112 Compile time and/or parallel linktime stage (ltrans)
114 Each of the object files is streamed back and compiled
115 separately. Now the function bodies become available
116 again.
118 2) Virtual clone materialization
119 (cgraph_materialize_clone)
121 IP passes can produce copies of existing functions (such
122 as versioned clones or inline clones) without actually
123 manipulating their bodies by creating virtual clones in
124 the callgraph. At this time the virtual clones are
125 turned into real functions.
126 3) IP transformation
128 All IP passes transform function bodies based on earlier
129 decisions of the IP propagation.
131 4) late small IP passes
133 Simple IP passes working within single program partition.
135 5) Expansion
136 (expand_all_functions)
138 At this stage functions that need to be output into
139 assembler are identified and compiled in topological order.
140 6) Output of variables and aliases
141 Now it is known which variable references were not optimized
142 out and thus all needed variables are output to the file.
144 Note that with -fno-toplevel-reorder passes 5 and 6
145 are combined together in cgraph_output_in_order.
147 Finally there are functions to manipulate the callgraph from
148 the backend.
149 - cgraph_add_new_function is used to add backend produced
150 functions introduced after the unit is finalized.
151 The functions are enqueued for later processing and inserted
152 into callgraph with cgraph_process_new_functions.
154 - cgraph_function_versioning
156 produces a copy of a function into a new one (a version)
157 and applies simple transformations. */
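/* Illustrative sketch (editorial addition, not actual GCC frontend code):
   a frontend drives the entry points listed above roughly as follows,
   where FN_DECL and VAR_DECL stand for decls produced by the parser:

     cgraph_node::finalize_function (fn_decl, false);   // after each function body is parsed
     varpool_node::finalize_decl (var_decl);            // after each static variable is finalized
     symtab->finalize_compilation_unit ();              // once the whole unit has been parsed

   Control then passes to the backend via the "compile" stage described
   above.  */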
160 #include "config.h"
161 #include "system.h"
162 #include "coretypes.h"
163 #include "backend.h"
164 #include "target.h"
165 #include "rtl.h"
166 #include "tree.h"
167 #include "gimple.h"
168 #include "cfghooks.h"
169 #include "regset.h" /* FIXME: For reg_obstack. */
170 #include "alloc-pool.h"
171 #include "tree-pass.h"
172 #include "stringpool.h"
173 #include "gimple-ssa.h"
174 #include "cgraph.h"
175 #include "coverage.h"
176 #include "lto-streamer.h"
177 #include "fold-const.h"
178 #include "varasm.h"
179 #include "stor-layout.h"
180 #include "output.h"
181 #include "cfgcleanup.h"
182 #include "gimple-fold.h"
183 #include "gimplify.h"
184 #include "gimple-iterator.h"
185 #include "gimplify-me.h"
186 #include "tree-cfg.h"
187 #include "tree-into-ssa.h"
188 #include "tree-ssa.h"
189 #include "langhooks.h"
190 #include "toplev.h"
191 #include "debug.h"
192 #include "symbol-summary.h"
193 #include "tree-vrp.h"
194 #include "ipa-prop.h"
195 #include "gimple-pretty-print.h"
196 #include "plugin.h"
197 #include "ipa-fnsummary.h"
198 #include "ipa-utils.h"
199 #include "except.h"
200 #include "cfgloop.h"
201 #include "context.h"
202 #include "pass_manager.h"
203 #include "tree-nested.h"
204 #include "dbgcnt.h"
205 #include "lto-section-names.h"
206 #include "stringpool.h"
207 #include "attribs.h"
209 /* Queue of cgraph nodes scheduled to be added into cgraph. This is a
210 secondary queue used during optimization to accommodate passes that
211 may generate new functions that need to be optimized and expanded. */
212 vec<cgraph_node *> cgraph_new_nodes;
214 static void expand_all_functions (void);
215 static void mark_functions_to_output (void);
216 static void handle_alias_pairs (void);
218 /* Used for vtable lookup in thunk adjusting. */
219 static GTY (()) tree vtable_entry_type;
221 /* Return true if this symbol is a function from the C frontend specified
222 directly in RTL form (with "__RTL"). */
224 bool
225 symtab_node::native_rtl_p () const
227 if (TREE_CODE (decl) != FUNCTION_DECL)
228 return false;
229 if (!DECL_STRUCT_FUNCTION (decl))
230 return false;
231 return DECL_STRUCT_FUNCTION (decl)->curr_properties & PROP_rtl;
234 /* Determine if the symbol declaration is needed. That is, it is visible to
235 something either outside this translation unit or to something magic in
236 the system configury. */
237 bool
238 symtab_node::needed_p (void)
240 /* Double check that no one has output the function into the assembly file
241 early. */
242 if (!native_rtl_p ())
243 gcc_checking_assert
244 (!DECL_ASSEMBLER_NAME_SET_P (decl)
245 || !TREE_SYMBOL_REFERENCED (DECL_ASSEMBLER_NAME (decl)));
247 if (!definition)
248 return false;
250 if (DECL_EXTERNAL (decl))
251 return false;
253 /* If the user told us it is used, then it must be so. */
254 if (force_output)
255 return true;
257 /* ABI forced symbols are needed when they are external. */
258 if (forced_by_abi && TREE_PUBLIC (decl))
259 return true;
261 /* Keep constructors, destructors and virtual functions. */
262 if (TREE_CODE (decl) == FUNCTION_DECL
263 && (DECL_STATIC_CONSTRUCTOR (decl) || DECL_STATIC_DESTRUCTOR (decl)))
264 return true;
266 /* Externally visible variables must be output. The exception is
267 COMDAT variables that must be output only when they are needed. */
268 if (TREE_PUBLIC (decl) && !DECL_COMDAT (decl))
269 return true;
271 return false;
274 /* Head and terminator of the queue of nodes to be processed while building
275 callgraph. */
277 static symtab_node symtab_terminator;
278 static symtab_node *queued_nodes = &symtab_terminator;
280 /* Add NODE to the queue starting at QUEUED_NODES.
281 The queue is linked via AUX pointers and terminated by &symtab_terminator. */
283 static void
284 enqueue_node (symtab_node *node)
286 if (node->aux)
287 return;
288 gcc_checking_assert (queued_nodes);
289 node->aux = queued_nodes;
290 queued_nodes = node;
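/* The matching drain idiom (a sketch mirroring the loop in
   analyze_functions below, shown here only to document the queue
   protocol):

     while (queued_nodes != &symtab_terminator)
       {
         symtab_node *node = queued_nodes;
         queued_nodes = (symtab_node *) node->aux;
         ... process NODE; AUX stays set to mark the node as visited
             and is cleared once the whole walk has finished ...
       }
   */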
293 /* Process CGRAPH_NEW_FUNCTIONS and perform the actions necessary to add these
294 functions into the callgraph so that they look like ordinary reachable
295 functions inserted into the callgraph already at construction time. */
297 void
298 symbol_table::process_new_functions (void)
300 tree fndecl;
302 if (!cgraph_new_nodes.exists ())
303 return;
305 handle_alias_pairs ();
306 /* Note that this queue may grow as it is being processed, as the new
307 functions may generate new ones. */
308 for (unsigned i = 0; i < cgraph_new_nodes.length (); i++)
310 cgraph_node *node = cgraph_new_nodes[i];
311 fndecl = node->decl;
312 switch (state)
314 case CONSTRUCTION:
315 /* At construction time we just need to finalize the function and move
316 it into the reachable functions list. */
318 cgraph_node::finalize_function (fndecl, false);
319 call_cgraph_insertion_hooks (node);
320 enqueue_node (node);
321 break;
323 case IPA:
324 case IPA_SSA:
325 case IPA_SSA_AFTER_INLINING:
326 /* When IPA optimization has already started, do all essential
327 transformations that have already been performed on the whole
328 cgraph but not on this function. */
330 gimple_register_cfg_hooks ();
331 if (!node->analyzed)
332 node->analyze ();
333 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
334 if ((state == IPA_SSA || state == IPA_SSA_AFTER_INLINING)
335 && !gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
337 bool summaried_computed = ipa_fn_summaries != NULL;
338 g->get_passes ()->execute_early_local_passes ();
339 /* Early passes compute inline parameters to do inlining
340 and splitting. This is redundant for functions added late.
341 Just throw away whatever they did. */
342 if (!summaried_computed)
343 ipa_free_fn_summary ();
345 else if (ipa_fn_summaries != NULL)
346 compute_fn_summary (node, true);
347 free_dominance_info (CDI_POST_DOMINATORS);
348 free_dominance_info (CDI_DOMINATORS);
349 pop_cfun ();
350 call_cgraph_insertion_hooks (node);
351 break;
353 case EXPANSION:
354 /* Functions created during expansion shall be compiled
355 directly. */
356 node->process = 0;
357 call_cgraph_insertion_hooks (node);
358 node->expand ();
359 break;
361 default:
362 gcc_unreachable ();
363 break;
367 cgraph_new_nodes.release ();
370 /* As a GCC extension we allow redefinition of the function. The
371 semantics when the two bodies differ are not well defined.
372 We replace the old body with the new body, so in unit-at-a-time mode
373 we always use the new body, while in normal mode we may end up with
374 the old body inlined into some functions and the new body expanded and
375 inlined in others.
377 ??? It may make more sense to use one body for inlining and the other
378 body for expanding the function, but this is difficult to do. */
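/* A minimal example of the redefinition case, under the GNU89
   "extern inline" semantics (editorial sketch, not a testcase from the
   GCC testsuite):

     extern inline int f (void) { return 1; }   // inline-only body
     int f (void) { return 2; }                  // redefinition; triggers reset ()

   After the redefinition the new body is used for the offline copy,
   while already-inlined copies of the first body may survive.  */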
380 void
381 cgraph_node::reset (void)
383 /* If process is set, then we have already begun whole-unit analysis.
384 This is *not* testing for whether we've already emitted the function.
385 That case can be sort-of legitimately seen with real function redefinition
386 errors. I would argue that the front end should never present us with
387 such a case, but don't enforce that for now. */
388 gcc_assert (!process);
390 /* Reset our data structures so we can analyze the function again. */
391 memset (&local, 0, sizeof (local));
392 memset (&global, 0, sizeof (global));
393 memset (&rtl, 0, sizeof (rtl));
394 analyzed = false;
395 definition = false;
396 alias = false;
397 transparent_alias = false;
398 weakref = false;
399 cpp_implicit_alias = false;
401 remove_callees ();
402 remove_all_references ();
405 /* Return true when there are references to the node. INCLUDE_SELF is
406 true if a self reference counts as a reference. */
408 bool
409 symtab_node::referred_to_p (bool include_self)
411 ipa_ref *ref = NULL;
413 /* See if there are any references at all. */
414 if (iterate_referring (0, ref))
415 return true;
416 /* For functions check also calls. */
417 cgraph_node *cn = dyn_cast <cgraph_node *> (this);
418 if (cn && cn->callers)
420 if (include_self)
421 return true;
422 for (cgraph_edge *e = cn->callers; e; e = e->next_caller)
423 if (e->caller != this)
424 return true;
426 return false;
429 /* DECL has been parsed. Take it, queue it, compile it at the whim of the
430 logic in effect. If NO_COLLECT is true, then our caller cannot stand to have
431 the garbage collector run at the moment. We would need to either create
432 a new GC context, or just not compile right now. */
434 void
435 cgraph_node::finalize_function (tree decl, bool no_collect)
437 cgraph_node *node = cgraph_node::get_create (decl);
439 if (node->definition)
441 /* Nested functions should only be defined once. */
442 gcc_assert (!DECL_CONTEXT (decl)
443 || TREE_CODE (DECL_CONTEXT (decl)) != FUNCTION_DECL);
444 node->reset ();
445 node->local.redefined_extern_inline = true;
448 /* Set definition first before calling notice_global_symbol so that
449 it is available to notice_global_symbol. */
450 node->definition = true;
451 notice_global_symbol (decl);
452 node->lowered = DECL_STRUCT_FUNCTION (decl)->cfg != NULL;
453 if (!flag_toplevel_reorder)
454 node->no_reorder = true;
456 /* With -fkeep-inline-functions we are keeping all inline functions except
457 for extern inline ones. */
458 if (flag_keep_inline_functions
459 && DECL_DECLARED_INLINE_P (decl)
460 && !DECL_EXTERNAL (decl)
461 && !DECL_DISREGARD_INLINE_LIMITS (decl))
462 node->force_output = 1;
464 /* __RTL functions were already output as soon as they were parsed (due
465 to the large amount of global state in the backend).
466 Mark such functions as "force_output" to reflect the fact that they
467 will be in the asm file when considering the symbols they reference.
468 The attempt to output them later on will bail out immediately. */
469 if (node->native_rtl_p ())
470 node->force_output = 1;
472 /* When not optimizing, also output the static functions (see
473 PR24561), but don't do so for always_inline functions, functions
474 declared inline, and nested functions. These were optimized out
475 in the original implementation and it is unclear whether we want
476 to change the behavior here. */
477 if (((!opt_for_fn (decl, optimize) || flag_keep_static_functions
478 || node->no_reorder)
479 && !node->cpp_implicit_alias
480 && !DECL_DISREGARD_INLINE_LIMITS (decl)
481 && !DECL_DECLARED_INLINE_P (decl)
482 && !(DECL_CONTEXT (decl)
483 && TREE_CODE (DECL_CONTEXT (decl)) == FUNCTION_DECL))
484 && !DECL_COMDAT (decl) && !DECL_EXTERNAL (decl))
485 node->force_output = 1;
487 /* If we've not yet emitted decl, tell the debug info about it. */
488 if (!TREE_ASM_WRITTEN (decl))
489 (*debug_hooks->deferred_inline_function) (decl);
491 if (!no_collect)
492 ggc_collect ();
494 if (symtab->state == CONSTRUCTION
495 && (node->needed_p () || node->referred_to_p ()))
496 enqueue_node (node);
499 /* Add the function FNDECL to the call graph.
500 Unlike finalize_function, this function is intended to be used
501 by the middle end and allows insertion of a new function at an arbitrary
502 point of compilation. The function can be either in high, low or SSA form
503 GIMPLE.
505 The function is assumed to be reachable and to have its address taken (so no
506 API-breaking optimizations are performed on it).
508 The main work done by this function is to enqueue the function for later
509 processing, avoiding the need for the passes to be re-entrant. */
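/* Hedged usage sketch (not taken from a real pass): a middle-end pass
   that synthesizes a helper function might register it roughly like

     tree fndecl = build_my_helper_fndecl ();   // hypothetical builder
     gimplify_function_tree (fndecl);
     cgraph_node::add_new_function (fndecl, false);

   after which process_new_functions brings the new node to the same
   state as the rest of the unit.  */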
511 void
512 cgraph_node::add_new_function (tree fndecl, bool lowered)
514 gcc::pass_manager *passes = g->get_passes ();
515 cgraph_node *node;
517 if (dump_file)
519 struct function *fn = DECL_STRUCT_FUNCTION (fndecl);
520 const char *function_type = ((gimple_has_body_p (fndecl))
521 ? (lowered
522 ? (gimple_in_ssa_p (fn)
523 ? "ssa gimple"
524 : "low gimple")
525 : "high gimple")
526 : "to-be-gimplified");
527 fprintf (dump_file,
528 "Added new %s function %s to callgraph\n",
529 function_type,
530 fndecl_name (fndecl));
533 switch (symtab->state)
535 case PARSING:
536 cgraph_node::finalize_function (fndecl, false);
537 break;
538 case CONSTRUCTION:
539 /* Just enqueue function to be processed at nearest occurrence. */
540 node = cgraph_node::get_create (fndecl);
541 if (lowered)
542 node->lowered = true;
543 cgraph_new_nodes.safe_push (node);
544 break;
546 case IPA:
547 case IPA_SSA:
548 case IPA_SSA_AFTER_INLINING:
549 case EXPANSION:
550 /* Bring the function into the finalized state and enqueue it for later
551 analysis and compilation. */
552 node = cgraph_node::get_create (fndecl);
553 node->local.local = false;
554 node->definition = true;
555 node->force_output = true;
556 if (TREE_PUBLIC (fndecl))
557 node->externally_visible = true;
558 if (!lowered && symtab->state == EXPANSION)
560 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
561 gimple_register_cfg_hooks ();
562 bitmap_obstack_initialize (NULL);
563 execute_pass_list (cfun, passes->all_lowering_passes);
564 passes->execute_early_local_passes ();
565 bitmap_obstack_release (NULL);
566 pop_cfun ();
568 lowered = true;
570 if (lowered)
571 node->lowered = true;
572 cgraph_new_nodes.safe_push (node);
573 break;
575 case FINISHED:
576 /* At the very end of compilation we have to do all the work up
577 to expansion. */
578 node = cgraph_node::create (fndecl);
579 if (lowered)
580 node->lowered = true;
581 node->definition = true;
582 node->analyze ();
583 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
584 gimple_register_cfg_hooks ();
585 bitmap_obstack_initialize (NULL);
586 if (!gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
587 g->get_passes ()->execute_early_local_passes ();
588 bitmap_obstack_release (NULL);
589 pop_cfun ();
590 node->expand ();
591 break;
593 default:
594 gcc_unreachable ();
597 /* Set a personality if required and we already passed EH lowering. */
598 if (lowered
599 && (function_needs_eh_personality (DECL_STRUCT_FUNCTION (fndecl))
600 == eh_personality_lang))
601 DECL_FUNCTION_PERSONALITY (fndecl) = lang_hooks.eh_personality ();
604 /* Analyze the function scheduled to be output. */
605 void
606 cgraph_node::analyze (void)
608 if (native_rtl_p ())
610 analyzed = true;
611 return;
614 tree decl = this->decl;
615 location_t saved_loc = input_location;
616 input_location = DECL_SOURCE_LOCATION (decl);
618 if (thunk.thunk_p)
620 cgraph_node *t = cgraph_node::get (thunk.alias);
622 create_edge (t, NULL, t->count);
623 callees->can_throw_external = !TREE_NOTHROW (t->decl);
624 /* Target code in expand_thunk may need the thunk's target
625 to be analyzed, so recurse here. */
626 if (!t->analyzed && t->definition)
627 t->analyze ();
628 if (t->alias)
630 t = t->get_alias_target ();
631 if (!t->analyzed && t->definition)
632 t->analyze ();
634 bool ret = expand_thunk (false, false);
635 thunk.alias = NULL;
636 if (!ret)
637 return;
639 if (alias)
640 resolve_alias (cgraph_node::get (alias_target), transparent_alias);
641 else if (dispatcher_function)
643 /* Generate the dispatcher body of multi-versioned functions. */
644 cgraph_function_version_info *dispatcher_version_info
645 = function_version ();
646 if (dispatcher_version_info != NULL
647 && (dispatcher_version_info->dispatcher_resolver
648 == NULL_TREE))
650 tree resolver = NULL_TREE;
651 gcc_assert (targetm.generate_version_dispatcher_body);
652 resolver = targetm.generate_version_dispatcher_body (this);
653 gcc_assert (resolver != NULL_TREE);
656 else
658 push_cfun (DECL_STRUCT_FUNCTION (decl));
660 assign_assembler_name_if_needed (decl);
662 /* Make sure to gimplify bodies only once. While analyzing a
663 function we lower it, which will require gimplified nested
664 functions, so we can end up here with an already gimplified
665 body. */
666 if (!gimple_has_body_p (decl))
667 gimplify_function_tree (decl);
669 /* Lower the function. */
670 if (!lowered)
672 if (nested)
673 lower_nested_functions (decl);
674 gcc_assert (!nested);
676 gimple_register_cfg_hooks ();
677 bitmap_obstack_initialize (NULL);
678 execute_pass_list (cfun, g->get_passes ()->all_lowering_passes);
679 free_dominance_info (CDI_POST_DOMINATORS);
680 free_dominance_info (CDI_DOMINATORS);
681 compact_blocks ();
682 bitmap_obstack_release (NULL);
683 lowered = true;
686 pop_cfun ();
688 analyzed = true;
690 input_location = saved_loc;
693 /* The C++ frontend produces same-body aliases all over the place, even before
694 PCH gets streamed out. It relies on us linking the aliases with their
695 functions in order to do the fixups, but ipa-ref is not PCH safe.
696 Consequently we first produce aliases without links, but once the C++ FE is
697 sure it won't stream PCH we build the links via this function. */
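/* A typical source of such aliases (hedged example): for

     struct S { S () {} };

   the C++ FE emits the complete-object and base-object constructors as
   same-body aliases of one another instead of two identical bodies.  */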
699 void
700 symbol_table::process_same_body_aliases (void)
702 symtab_node *node;
703 FOR_EACH_SYMBOL (node)
704 if (node->cpp_implicit_alias && !node->analyzed)
705 node->resolve_alias
706 (VAR_P (node->alias_target)
707 ? (symtab_node *)varpool_node::get_create (node->alias_target)
708 : (symtab_node *)cgraph_node::get_create (node->alias_target));
709 cpp_implicit_aliases_done = true;
712 /* Process attributes common for vars and functions. */
714 static void
715 process_common_attributes (symtab_node *node, tree decl)
717 tree weakref = lookup_attribute ("weakref", DECL_ATTRIBUTES (decl));
719 if (weakref && !lookup_attribute ("alias", DECL_ATTRIBUTES (decl)))
721 warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
722 "%<weakref%> attribute should be accompanied with"
723 " an %<alias%> attribute");
724 DECL_WEAK (decl) = 0;
725 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
726 DECL_ATTRIBUTES (decl));
729 if (lookup_attribute ("no_reorder", DECL_ATTRIBUTES (decl)))
730 node->no_reorder = 1;
733 /* Look for externally_visible and used attributes and mark cgraph nodes
734 accordingly.
736 We cannot mark the nodes at the point the attributes are processed (in
737 handle_*_attribute) because the copy of the declarations available at that
738 point may not be canonical. For example, in:
740 void f();
741 void f() __attribute__((used));
743 the declaration we see in handle_used_attribute will be the second
744 declaration -- but the front end will subsequently merge that declaration
745 with the original declaration and discard the second declaration.
747 Furthermore, we can't mark these nodes in finalize_function because:
749 void f() {}
750 void f() __attribute__((externally_visible));
752 is valid.
754 So, we walk the nodes at the end of the translation unit, applying the
755 attributes at that point. */
757 static void
758 process_function_and_variable_attributes (cgraph_node *first,
759 varpool_node *first_var)
761 cgraph_node *node;
762 varpool_node *vnode;
764 for (node = symtab->first_function (); node != first;
765 node = symtab->next_function (node))
767 tree decl = node->decl;
768 if (DECL_PRESERVE_P (decl))
769 node->mark_force_output ();
770 else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
772 if (! TREE_PUBLIC (node->decl))
773 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
774 "%<externally_visible%>"
775 " attribute have effect only on public objects");
777 if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
778 && (node->definition && !node->alias))
780 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
781 "%<weakref%> attribute ignored"
782 " because function is defined");
783 DECL_WEAK (decl) = 0;
784 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
785 DECL_ATTRIBUTES (decl));
787 else if (lookup_attribute ("alias", DECL_ATTRIBUTES (decl))
788 && node->definition
789 && !node->alias)
790 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
791 "%<alias%> attribute ignored"
792 " because function is defined");
794 if (lookup_attribute ("always_inline", DECL_ATTRIBUTES (decl))
795 && !DECL_DECLARED_INLINE_P (decl)
796 /* redefining extern inline function makes it DECL_UNINLINABLE. */
797 && !DECL_UNINLINABLE (decl))
798 warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
799 "always_inline function might not be inlinable");
801 process_common_attributes (node, decl);
803 for (vnode = symtab->first_variable (); vnode != first_var;
804 vnode = symtab->next_variable (vnode))
806 tree decl = vnode->decl;
807 if (DECL_EXTERNAL (decl)
808 && DECL_INITIAL (decl))
809 varpool_node::finalize_decl (decl);
810 if (DECL_PRESERVE_P (decl))
811 vnode->force_output = true;
812 else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
814 if (! TREE_PUBLIC (vnode->decl))
815 warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
816 "%<externally_visible%>"
817 " attribute have effect only on public objects");
819 if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
820 && vnode->definition
821 && DECL_INITIAL (decl))
823 warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
824 "%<weakref%> attribute ignored"
825 " because variable is initialized");
826 DECL_WEAK (decl) = 0;
827 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
828 DECL_ATTRIBUTES (decl));
830 process_common_attributes (vnode, decl);
834 /* Mark DECL as finalized. By finalizing the declaration, the frontend instructs
835 the middle end to output the variable to the asm file if it is needed or
836 externally visible. */
838 void
839 varpool_node::finalize_decl (tree decl)
841 varpool_node *node = varpool_node::get_create (decl);
843 gcc_assert (TREE_STATIC (decl) || DECL_EXTERNAL (decl));
845 if (node->definition)
846 return;
847 /* Set definition first before calling notice_global_symbol so that
848 it is available to notice_global_symbol. */
849 node->definition = true;
850 notice_global_symbol (decl);
851 if (!flag_toplevel_reorder)
852 node->no_reorder = true;
853 if (TREE_THIS_VOLATILE (decl) || DECL_PRESERVE_P (decl)
854 /* Traditionally we do not eliminate static variables when not
855 optimizing and when not doing toplevel reorder. */
856 || (node->no_reorder && !DECL_COMDAT (node->decl)
857 && !DECL_ARTIFICIAL (node->decl)))
858 node->force_output = true;
860 if (symtab->state == CONSTRUCTION
861 && (node->needed_p () || node->referred_to_p ()))
862 enqueue_node (node);
863 if (symtab->state >= IPA_SSA)
864 node->analyze ();
865 /* Some frontends produce various interface variables after compilation
866 has finished. */
867 if (symtab->state == FINISHED
868 || (node->no_reorder
869 && symtab->state == EXPANSION))
870 node->assemble_decl ();
873 /* EDGE is a polymorphic call. Mark all possible targets as reachable
874 and, if there is only one target, perform trivial devirtualization.
875 REACHABLE_CALL_TARGETS collects target lists we already walked to
876 avoid duplicate work. */
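/* Hedged example of the trivial devirtualization done here: for

     struct S final { virtual void f (); };
     void call (S *p) { p->f (); }

   the class is final, so (assuming S::f is defined) it is the only
   possible target and the indirect edge can be turned into a direct
   call; with no possible target a call to __builtin_unreachable is
   used instead.  */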
878 static void
879 walk_polymorphic_call_targets (hash_set<void *> *reachable_call_targets,
880 cgraph_edge *edge)
882 unsigned int i;
883 void *cache_token;
884 bool final;
885 vec <cgraph_node *>targets
886 = possible_polymorphic_call_targets
887 (edge, &final, &cache_token);
889 if (!reachable_call_targets->add (cache_token))
891 if (symtab->dump_file)
892 dump_possible_polymorphic_call_targets
893 (symtab->dump_file, edge);
895 for (i = 0; i < targets.length (); i++)
897 /* Do not bother to mark virtual methods in an anonymous namespace;
898 either we will find a use of the virtual table defining it, or it is
899 unused. */
900 if (targets[i]->definition
901 && TREE_CODE
902 (TREE_TYPE (targets[i]->decl))
903 == METHOD_TYPE
904 && !type_in_anonymous_namespace_p
905 (TYPE_METHOD_BASETYPE (TREE_TYPE (targets[i]->decl))))
906 enqueue_node (targets[i]);
910 /* Very trivial devirtualization; when the type is
911 final or anonymous (so we know all its derived types)
912 and there is only one possible virtual call target,
913 make the edge direct. */
914 if (final)
916 if (targets.length () <= 1 && dbg_cnt (devirt))
918 cgraph_node *target;
919 if (targets.length () == 1)
920 target = targets[0];
921 else
922 target = cgraph_node::create
923 (builtin_decl_implicit (BUILT_IN_UNREACHABLE));
925 if (symtab->dump_file)
927 fprintf (symtab->dump_file,
928 "Devirtualizing call: ");
929 print_gimple_stmt (symtab->dump_file,
930 edge->call_stmt, 0,
931 TDF_SLIM);
933 if (dump_enabled_p ())
935 dump_printf_loc (MSG_OPTIMIZED_LOCATIONS, edge->call_stmt,
936 "devirtualizing call in %s to %s\n",
937 edge->caller->name (), target->name ());
940 edge->make_direct (target);
941 edge->redirect_call_stmt_to_callee ();
943 if (symtab->dump_file)
945 fprintf (symtab->dump_file,
946 "Devirtualized as: ");
947 print_gimple_stmt (symtab->dump_file,
948 edge->call_stmt, 0,
949 TDF_SLIM);
955 /* Issue appropriate warnings for the global declaration SNODE. */
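/* Hedged examples of the diagnostics issued below:

     static void f (void);      // if called somewhere: "used but never defined"
     static void g (void) {}    // never referenced: "defined but not used"
   */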
957 static void
958 check_global_declaration (symtab_node *snode)
960 const char *decl_file;
961 tree decl = snode->decl;
963 /* Warn about any function declared static but not defined. We don't
964 warn about variables, because many programs have static variables
965 that exist only to get some text into the object file. */
966 if (TREE_CODE (decl) == FUNCTION_DECL
967 && DECL_INITIAL (decl) == 0
968 && DECL_EXTERNAL (decl)
969 && ! DECL_ARTIFICIAL (decl)
970 && ! TREE_NO_WARNING (decl)
971 && ! TREE_PUBLIC (decl)
972 && (warn_unused_function
973 || snode->referred_to_p (/*include_self=*/false)))
975 if (snode->referred_to_p (/*include_self=*/false))
976 pedwarn (input_location, 0, "%q+F used but never defined", decl);
977 else
978 warning (OPT_Wunused_function, "%q+F declared %<static%> but never defined", decl);
979 /* This symbol is effectively an "extern" declaration now. */
980 TREE_PUBLIC (decl) = 1;
983 /* Warn about static fns or vars defined but not used. */
984 if (((warn_unused_function && TREE_CODE (decl) == FUNCTION_DECL)
985 || (((warn_unused_variable && ! TREE_READONLY (decl))
986 || (warn_unused_const_variable > 0 && TREE_READONLY (decl)
987 && (warn_unused_const_variable == 2
988 || (main_input_filename != NULL
989 && (decl_file = DECL_SOURCE_FILE (decl)) != NULL
990 && filename_cmp (main_input_filename,
991 decl_file) == 0))))
992 && VAR_P (decl)))
993 && ! DECL_IN_SYSTEM_HEADER (decl)
994 && ! snode->referred_to_p (/*include_self=*/false)
995 /* This TREE_USED check is needed in addition to referred_to_p
996 above, because the `__unused__' attribute is not being
997 considered for referred_to_p. */
998 && ! TREE_USED (decl)
999 /* The TREE_USED bit for file-scope decls is kept in the identifier,
1000 to handle multiple external decls in different scopes. */
1001 && ! (DECL_NAME (decl) && TREE_USED (DECL_NAME (decl)))
1002 && ! DECL_EXTERNAL (decl)
1003 && ! DECL_ARTIFICIAL (decl)
1004 && ! DECL_ABSTRACT_ORIGIN (decl)
1005 && ! TREE_PUBLIC (decl)
1006 /* A volatile variable might be used in some non-obvious way. */
1007 && (! VAR_P (decl) || ! TREE_THIS_VOLATILE (decl))
1008 /* Global register variables must be declared to reserve them. */
1009 && ! (VAR_P (decl) && DECL_REGISTER (decl))
1010 /* Global ctors and dtors are called by the runtime. */
1011 && (TREE_CODE (decl) != FUNCTION_DECL
1012 || (!DECL_STATIC_CONSTRUCTOR (decl)
1013 && !DECL_STATIC_DESTRUCTOR (decl)))
1014 /* Otherwise, ask the language. */
1015 && lang_hooks.decls.warn_unused_global (decl))
1016 warning_at (DECL_SOURCE_LOCATION (decl),
1017 (TREE_CODE (decl) == FUNCTION_DECL)
1018 ? OPT_Wunused_function
1019 : (TREE_READONLY (decl)
1020 ? OPT_Wunused_const_variable_
1021 : OPT_Wunused_variable),
1022 "%qD defined but not used", decl);
1025 /* Discover all functions and variables that are trivially needed, analyze
1026 them as well as all functions and variables referred to by them. */
1027 static cgraph_node *first_analyzed;
1028 static varpool_node *first_analyzed_var;
1030 /* FIRST_TIME is set to TRUE the first time we are called for a
1031 translation unit from finalize_compilation_unit(), and to FALSE
1032 otherwise. */
1034 static void
1035 analyze_functions (bool first_time)
1037 /* Keep track of already processed nodes when called multiple times for
1038 intermodule optimization. */
1039 cgraph_node *first_handled = first_analyzed;
1040 varpool_node *first_handled_var = first_analyzed_var;
1041 hash_set<void *> reachable_call_targets;
1043 symtab_node *node;
1044 symtab_node *next;
1045 int i;
1046 ipa_ref *ref;
1047 bool changed = true;
1048 location_t saved_loc = input_location;
1050 bitmap_obstack_initialize (NULL);
1051 symtab->state = CONSTRUCTION;
1052 input_location = UNKNOWN_LOCATION;
1054 /* Ugly, but the fixup cannot happen at the time the same-body alias is created;
1055 the C++ FE is confused about the COMDAT groups being right. */
1056 if (symtab->cpp_implicit_aliases_done)
1057 FOR_EACH_SYMBOL (node)
1058 if (node->cpp_implicit_alias)
1059 node->fixup_same_cpp_alias_visibility (node->get_alias_target ());
1060 build_type_inheritance_graph ();
1062 /* Analysis adds static variables that in turn add references to new functions.
1063 So we need to iterate the process until it stabilizes. */
1064 while (changed)
1066 changed = false;
1067 process_function_and_variable_attributes (first_analyzed,
1068 first_analyzed_var);
1070 /* First identify the trivially needed symbols. */
1071 for (node = symtab->first_symbol ();
1072 node != first_analyzed
1073 && node != first_analyzed_var; node = node->next)
1075 /* Convert COMDAT group designators to IDENTIFIER_NODEs. */
1076 node->get_comdat_group_id ();
1077 if (node->needed_p ())
1079 enqueue_node (node);
1080 if (!changed && symtab->dump_file)
1081 fprintf (symtab->dump_file, "Trivially needed symbols:");
1082 changed = true;
1083 if (symtab->dump_file)
1084 fprintf (symtab->dump_file, " %s", node->asm_name ());
1085 if (!changed && symtab->dump_file)
1086 fprintf (symtab->dump_file, "\n");
1088 if (node == first_analyzed
1089 || node == first_analyzed_var)
1090 break;
1092 symtab->process_new_functions ();
1093 first_analyzed_var = symtab->first_variable ();
1094 first_analyzed = symtab->first_function ();
1096 if (changed && symtab->dump_file)
1097 fprintf (symtab->dump_file, "\n");
1099 /* Lower representation, build callgraph edges and references for all trivially
1100 needed symbols and all symbols referred to by them. */
1101 while (queued_nodes != &symtab_terminator)
1103 changed = true;
1104 node = queued_nodes;
1105 queued_nodes = (symtab_node *)queued_nodes->aux;
1106 cgraph_node *cnode = dyn_cast <cgraph_node *> (node);
1107 if (cnode && cnode->definition)
1109 cgraph_edge *edge;
1110 tree decl = cnode->decl;
1112 /* ??? It is possible to create an extern inline function
1113 and later use the weak alias attribute to kill its body.
1114 See gcc.c-torture/compile/20011119-1.c. */
1115 if (!DECL_STRUCT_FUNCTION (decl)
1116 && !cnode->alias
1117 && !cnode->thunk.thunk_p
1118 && !cnode->dispatcher_function)
1120 cnode->reset ();
1121 cnode->local.redefined_extern_inline = true;
1122 continue;
1125 if (!cnode->analyzed)
1126 cnode->analyze ();
1128 for (edge = cnode->callees; edge; edge = edge->next_callee)
1129 if (edge->callee->definition
1130 && (!DECL_EXTERNAL (edge->callee->decl)
1131 /* When not optimizing, do not try to analyze extern
1132 inline functions. Doing so is pointless. */
1133 || opt_for_fn (edge->callee->decl, optimize)
1135 /* Weakrefs need to be preserved. */
1135 || edge->callee->alias
1137 /* always_inline functions are inlined even at -O0. */
1137 || lookup_attribute
1138 ("always_inline",
1139 DECL_ATTRIBUTES (edge->callee->decl))
1140 /* Multiversioned functions need the dispatcher to
1141 be produced locally even for extern functions. */
1142 || edge->callee->function_version ()))
1143 enqueue_node (edge->callee);
1144 if (opt_for_fn (cnode->decl, optimize)
1145 && opt_for_fn (cnode->decl, flag_devirtualize))
1147 cgraph_edge *next;
1149 for (edge = cnode->indirect_calls; edge; edge = next)
1151 next = edge->next_callee;
1152 if (edge->indirect_info->polymorphic)
1153 walk_polymorphic_call_targets (&reachable_call_targets,
1154 edge);
1158 /* If decl is a clone of an abstract function,
1159 mark that abstract function so that we don't release its body.
1160 The DECL_INITIAL() of that abstract function declaration
1161 will be later needed to output debug info. */
1162 if (DECL_ABSTRACT_ORIGIN (decl))
1164 cgraph_node *origin_node
1165 = cgraph_node::get_create (DECL_ABSTRACT_ORIGIN (decl));
1166 origin_node->used_as_abstract_origin = true;
1168 /* Preserve a function's enclosing function context node. It will
1169 later be needed to output debug info. */
1170 if (tree fn = decl_function_context (decl))
1172 cgraph_node *origin_node = cgraph_node::get_create (fn);
1173 enqueue_node (origin_node);
1176 else
1178 varpool_node *vnode = dyn_cast <varpool_node *> (node);
1179 if (vnode && vnode->definition && !vnode->analyzed)
1180 vnode->analyze ();
1183 if (node->same_comdat_group)
1185 symtab_node *next;
1186 for (next = node->same_comdat_group;
1187 next != node;
1188 next = next->same_comdat_group)
1189 if (!next->comdat_local_p ())
1190 enqueue_node (next);
1192 for (i = 0; node->iterate_reference (i, ref); i++)
1193 if (ref->referred->definition
1194 && (!DECL_EXTERNAL (ref->referred->decl)
1195 || ((TREE_CODE (ref->referred->decl) != FUNCTION_DECL
1196 && optimize)
1197 || (TREE_CODE (ref->referred->decl) == FUNCTION_DECL
1198 && opt_for_fn (ref->referred->decl, optimize))
1199 || node->alias
1200 || ref->referred->alias)))
1201 enqueue_node (ref->referred);
1202 symtab->process_new_functions ();
1205 update_type_inheritance_graph ();
1207 /* Collect entry points to the unit. */
1208 if (symtab->dump_file)
1210 fprintf (symtab->dump_file, "\n\nInitial ");
1211 symtab->dump (symtab->dump_file);
1214 if (first_time)
1216 symtab_node *snode;
1217 FOR_EACH_SYMBOL (snode)
1218 check_global_declaration (snode);
1221 if (symtab->dump_file)
1222 fprintf (symtab->dump_file, "\nRemoving unused symbols:");
1224 for (node = symtab->first_symbol ();
1225 node != first_handled
1226 && node != first_handled_var; node = next)
1228 next = node->next;
1229 if (!node->aux && !node->referred_to_p ())
1231 if (symtab->dump_file)
1232 fprintf (symtab->dump_file, " %s", node->name ());
1234 /* See if the debugger can use anything before the DECL
1235 passes away. Perhaps it can notice a DECL that is now a
1236 constant and can tag the early DIE with an appropriate
1237 attribute.
1239 Otherwise, this is the last chance the debug_hooks have
1240 at looking at optimized away DECLs, since
1241 late_global_decl will subsequently be called from the
1242 contents of the now pruned symbol table. */
1243 if (VAR_P (node->decl)
1244 && !decl_function_context (node->decl))
1246 /* We are reclaiming totally unreachable code and variables
1247 so they effectively appear as readonly. Show that to
1248 the debug machinery. */
1249 TREE_READONLY (node->decl) = 1;
1250 node->definition = false;
1251 (*debug_hooks->late_global_decl) (node->decl);
1254 node->remove ();
1255 continue;
1257 if (cgraph_node *cnode = dyn_cast <cgraph_node *> (node))
1259 tree decl = node->decl;
1261 if (cnode->definition && !gimple_has_body_p (decl)
1262 && !cnode->alias
1263 && !cnode->thunk.thunk_p)
1264 cnode->reset ();
1266 gcc_assert (!cnode->definition || cnode->thunk.thunk_p
1267 || cnode->alias
1268 || gimple_has_body_p (decl)
1269 || cnode->native_rtl_p ());
1270 gcc_assert (cnode->analyzed == cnode->definition);
1272 node->aux = NULL;
1274 for (;node; node = node->next)
1275 node->aux = NULL;
1276 first_analyzed = symtab->first_function ();
1277 first_analyzed_var = symtab->first_variable ();
1278 if (symtab->dump_file)
1280 fprintf (symtab->dump_file, "\n\nReclaimed ");
1281 symtab->dump (symtab->dump_file);
1283 bitmap_obstack_release (NULL);
1284 ggc_collect ();
1285 /* Initialize the assembler name hash; in particular we want to trigger C++
1286 mangling and same-body alias creation before we free the DECL_ARGUMENTS
1287 used by it. */
1288 if (!seen_error ())
1289 symtab->symtab_initialize_asm_name_hash ();
1291 input_location = saved_loc;
1294 /* Check declaration of the type of ALIAS for compatibility with its TARGET
1295 (which may be an ifunc resolver) and issue a diagnostic when they are
1296 not compatible according to language rules (plus a C++ extension for
1297 non-static member functions). */
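/* Hedged example of the ifunc case handled below:

     extern void impl_a (void);
     static void (*resolve_f (void)) (void) { return impl_a; }
     void f (void) __attribute__ ((ifunc ("resolve_f")));

   The resolver must return a pointer to a function compatible with F;
   a resolver returning plain void * is only diagnosed with -Wextra.  */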
1299 static void
1300 maybe_diag_incompatible_alias (tree alias, tree target)
1302 tree altype = TREE_TYPE (alias);
1303 tree targtype = TREE_TYPE (target);
1305 bool ifunc = cgraph_node::get (alias)->ifunc_resolver;
1306 tree funcptr = altype;
1308 if (ifunc)
1310 /* Handle attribute ifunc first. */
1311 if (TREE_CODE (altype) == METHOD_TYPE)
1313 /* Set FUNCPTR to the type of the alias target. If the type
1314 is a non-static member function of class C, construct a type
1315 of an ordinary function taking C* as the first argument,
1316 followed by the member function argument list, and use it
1317 instead to check for incompatibility. This conversion is
1318 not defined by the language but an extension provided by
1319 G++. */
1321 tree rettype = TREE_TYPE (altype);
1322 tree args = TYPE_ARG_TYPES (altype);
1323 altype = build_function_type (rettype, args);
1324 funcptr = altype;
1327 targtype = TREE_TYPE (targtype);
1329 if (POINTER_TYPE_P (targtype))
1331 targtype = TREE_TYPE (targtype);
1333 /* Only issue Wattribute-alias for conversions to void* with
1334 -Wextra. */
1335 if (VOID_TYPE_P (targtype) && !extra_warnings)
1336 return;
1338 /* Proceed to handle incompatible ifunc resolvers below. */
1340 else
1342 funcptr = build_pointer_type (funcptr);
1344 error_at (DECL_SOURCE_LOCATION (target),
1345 "%<ifunc%> resolver for %qD must return %qT",
1346 alias, funcptr);
1347 inform (DECL_SOURCE_LOCATION (alias),
1348 "resolver indirect function declared here");
1349 return;
1353 if ((!FUNC_OR_METHOD_TYPE_P (targtype)
1354 || (prototype_p (altype)
1355 && prototype_p (targtype)
1356 && !types_compatible_p (altype, targtype))))
1358 /* Warn for incompatibilities. Avoid warning for functions
1359 without a prototype to make it possible to declare aliases
1360 without knowing the exact type, as libstdc++ does. */
1361 if (ifunc)
1363 funcptr = build_pointer_type (funcptr);
1365 auto_diagnostic_group d;
1366 if (warning_at (DECL_SOURCE_LOCATION (target),
1367 OPT_Wattribute_alias_,
1368 "%<ifunc%> resolver for %qD should return %qT",
1369 alias, funcptr))
1370 inform (DECL_SOURCE_LOCATION (alias),
1371 "resolver indirect function declared here");
1373 else
1375 auto_diagnostic_group d;
1376 if (warning_at (DECL_SOURCE_LOCATION (alias),
1377 OPT_Wattribute_alias_,
1378 "%qD alias between functions of incompatible "
1379 "types %qT and %qT", alias, altype, targtype))
1380 inform (DECL_SOURCE_LOCATION (target),
1381 "aliased declaration here");
1386 /* Translate the ugly representation of aliases as alias pairs into a nice
1387 representation in the callgraph. We don't handle all cases yet,
1388 unfortunately. */
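/* Hedged example of the alias pairs handled below:

     void impl (void) {}
     void f (void) __attribute__ ((alias ("impl")));           // ordinary alias
     static void g (void) __attribute__ ((weakref ("impl")));  // weakref
   */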
1390 static void
1391 handle_alias_pairs (void)
1393 alias_pair *p;
1394 unsigned i;
1396 for (i = 0; alias_pairs && alias_pairs->iterate (i, &p);)
1398 symtab_node *target_node = symtab_node::get_for_asmname (p->target);
1400 /* Weakrefs with a target not defined in the current unit are easy to handle:
1401 they behave just like external variables except that we need to note the
1402 alias flag to later output the weakref pseudo-op into the asm file. */
1403 if (!target_node
1404 && lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)) != NULL)
1406 symtab_node *node = symtab_node::get (p->decl);
1407 if (node)
1409 node->alias_target = p->target;
1410 node->weakref = true;
1411 node->alias = true;
1412 node->transparent_alias = true;
1414 alias_pairs->unordered_remove (i);
1415 continue;
1417 else if (!target_node)
1419 error ("%q+D aliased to undefined symbol %qE", p->decl, p->target);
1420 symtab_node *node = symtab_node::get (p->decl);
1421 if (node)
1422 node->alias = false;
1423 alias_pairs->unordered_remove (i);
1424 continue;
1427 if (DECL_EXTERNAL (target_node->decl)
1428 /* We use local aliases for C++ thunks to force the tailcall
1429 to bind locally. This is a hack - to keep it working do
1430 the following (which is not strictly correct). */
1431 && (TREE_CODE (target_node->decl) != FUNCTION_DECL
1432 || ! DECL_VIRTUAL_P (target_node->decl))
1433 && ! lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)))
1435 error ("%q+D aliased to external symbol %qE",
1436 p->decl, p->target);
1439 if (TREE_CODE (p->decl) == FUNCTION_DECL
1440 && target_node && is_a <cgraph_node *> (target_node))
1442 maybe_diag_incompatible_alias (p->decl, target_node->decl);
1444 maybe_diag_alias_attributes (p->decl, target_node->decl);
1446 cgraph_node *src_node = cgraph_node::get (p->decl);
1447 if (src_node && src_node->definition)
1448 src_node->reset ();
1449 cgraph_node::create_alias (p->decl, target_node->decl);
1450 alias_pairs->unordered_remove (i);
1452 else if (VAR_P (p->decl)
1453 && target_node && is_a <varpool_node *> (target_node))
1455 varpool_node::create_alias (p->decl, target_node->decl);
1456 alias_pairs->unordered_remove (i);
1458 else
1460 error ("%q+D alias between function and variable is not supported",
1461 p->decl);
1462 inform (DECL_SOURCE_LOCATION (target_node->decl),
1463 "aliased declaration here");
1465 alias_pairs->unordered_remove (i);
1468 vec_free (alias_pairs);
1472 /* Figure out what functions we want to assemble. */
1474 static void
1475 mark_functions_to_output (void)
1477 bool check_same_comdat_groups = false;
1478 cgraph_node *node;
1480 if (flag_checking)
1481 FOR_EACH_FUNCTION (node)
1482 gcc_assert (!node->process);
1484 FOR_EACH_FUNCTION (node)
1486 tree decl = node->decl;
1488 gcc_assert (!node->process || node->same_comdat_group);
1489 if (node->process)
1490 continue;
1492 /* We need to output all local functions that are used and not
1493 always inlined, as well as those that are reachable from
1494 outside the current compilation unit. */
1495 if (node->analyzed
1496 && !node->thunk.thunk_p
1497 && !node->alias
1498 && !node->global.inlined_to
1499 && !TREE_ASM_WRITTEN (decl)
1500 && !DECL_EXTERNAL (decl))
1502 node->process = 1;
1503 if (node->same_comdat_group)
1505 cgraph_node *next;
1506 for (next = dyn_cast<cgraph_node *> (node->same_comdat_group);
1507 next != node;
1508 next = dyn_cast<cgraph_node *> (next->same_comdat_group))
1509 if (!next->thunk.thunk_p && !next->alias
1510 && !next->comdat_local_p ())
1511 next->process = 1;
1514 else if (node->same_comdat_group)
1516 if (flag_checking)
1517 check_same_comdat_groups = true;
1519 else
1521 /* We should've reclaimed all functions that are not needed. */
1522 if (flag_checking
1523 && !node->global.inlined_to
1524 && gimple_has_body_p (decl)
1525 /* FIXME: in an ltrans unit when the offline copy is outside a partition but
1526 inline copies are inside a partition, we can end up not removing the body
1527 since we no longer have an analyzed node pointing to it. */
1528 && !node->in_other_partition
1529 && !node->alias
1530 && !node->clones
1531 && !DECL_EXTERNAL (decl))
1533 node->debug ();
1534 internal_error ("failed to reclaim unneeded function");
1536 gcc_assert (node->global.inlined_to
1537 || !gimple_has_body_p (decl)
1538 || node->in_other_partition
1539 || node->clones
1540 || DECL_ARTIFICIAL (decl)
1541 || DECL_EXTERNAL (decl));
1546 if (flag_checking && check_same_comdat_groups)
1547 FOR_EACH_FUNCTION (node)
1548 if (node->same_comdat_group && !node->process)
1550 tree decl = node->decl;
1551 if (!node->global.inlined_to
1552 && gimple_has_body_p (decl)
1553 /* FIXME: in an ltrans unit when the offline copy is outside a
1554 partition but inline copies are inside a partition, we can
1555 end up not removing the body since we no longer have an
1556 analyzed node pointing to it. */
1557 && !node->in_other_partition
1558 && !node->clones
1559 && !DECL_EXTERNAL (decl))
1561 node->debug ();
1562 internal_error ("failed to reclaim unneeded function in same "
1563 "comdat group");
1568 /* DECL is a FUNCTION_DECL. Initialize datastructures so DECL is a function
1569 in lowered gimple form. IN_SSA is true if the gimple is in SSA form.
1571 Set current_function_decl and cfun to the newly constructed empty function
1572 body and return the basic block in the function body. */
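/* Typical use (a sketch mirroring expand_thunk below): the caller
   builds the body by inserting statements into the returned block:

     basic_block bb = init_lowered_empty_function (decl, true, count);
     gimple_stmt_iterator bsi = gsi_start_bb (bb);
     ... build statements and gsi_insert_after them ...
   */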
1574 basic_block
1575 init_lowered_empty_function (tree decl, bool in_ssa, profile_count count)
1577 basic_block bb;
1578 edge e;
1580 current_function_decl = decl;
1581 allocate_struct_function (decl, false);
1582 gimple_register_cfg_hooks ();
1583 init_empty_tree_cfg ();
1584 init_tree_ssa (cfun);
1586 if (in_ssa)
1588 init_ssa_operands (cfun);
1589 cfun->gimple_df->in_ssa_p = true;
1590 cfun->curr_properties |= PROP_ssa;
1593 DECL_INITIAL (decl) = make_node (BLOCK);
1594 BLOCK_SUPERCONTEXT (DECL_INITIAL (decl)) = decl;
1596 DECL_SAVED_TREE (decl) = error_mark_node;
1597 cfun->curr_properties |= (PROP_gimple_lcf | PROP_gimple_leh | PROP_gimple_any
1598 | PROP_cfg | PROP_loops);
1600 set_loops_for_fn (cfun, ggc_cleared_alloc<loops> ());
1601 init_loops_structure (cfun, loops_for_fn (cfun), 1);
1602 loops_for_fn (cfun)->state |= LOOPS_MAY_HAVE_MULTIPLE_LATCHES;
1604 /* Create BB for body of the function and connect it properly. */
1605 ENTRY_BLOCK_PTR_FOR_FN (cfun)->count = count;
1606 EXIT_BLOCK_PTR_FOR_FN (cfun)->count = count;
1607 bb = create_basic_block (NULL, ENTRY_BLOCK_PTR_FOR_FN (cfun));
1608 bb->count = count;
1609 e = make_edge (ENTRY_BLOCK_PTR_FOR_FN (cfun), bb, EDGE_FALLTHRU);
1610 e->probability = profile_probability::always ();
1611 e = make_edge (bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
1612 e->probability = profile_probability::always ();
1613 add_bb_to_loop (bb, ENTRY_BLOCK_PTR_FOR_FN (cfun)->loop_father);
1615 return bb;
1618 /* Adjust PTR by the constant FIXED_OFFSET, by the vtable offset indicated by
1619 VIRTUAL_OFFSET, and by the indirect offset indicated by INDIRECT_OFFSET, if
1620 it is non-null. THIS_ADJUSTING is nonzero for a this-adjusting thunk and zero
1621 for a result-adjusting thunk. */
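/* Hedged C++ example: with

     struct A { virtual void f (); };
     struct B { virtual void g (); };
     struct C : A, B { void g (); };

   the entry for C::g in the B-in-C vtable is a this-adjusting thunk
   that adjusts `this' by the constant offset of the B subobject within
   C (the FIXED_OFFSET here) before transferring control to the real
   C::g.  */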
1623 tree
1624 thunk_adjust (gimple_stmt_iterator * bsi,
1625 tree ptr, bool this_adjusting,
1626 HOST_WIDE_INT fixed_offset, tree virtual_offset,
1627 HOST_WIDE_INT indirect_offset)
1629 gassign *stmt;
1630 tree ret;
1632 if (this_adjusting
1633 && fixed_offset != 0)
1635 stmt = gimple_build_assign
1636 (ptr, fold_build_pointer_plus_hwi_loc (input_location,
1637 ptr,
1638 fixed_offset));
1639 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1642 if (!vtable_entry_type && (virtual_offset || indirect_offset != 0))
1644 tree vfunc_type = make_node (FUNCTION_TYPE);
1645 TREE_TYPE (vfunc_type) = integer_type_node;
1646 TYPE_ARG_TYPES (vfunc_type) = NULL_TREE;
1647 layout_type (vfunc_type);
1649 vtable_entry_type = build_pointer_type (vfunc_type);
1652 /* If there's a virtual offset, look up that value in the vtable and
1653 adjust the pointer again. */
1654 if (virtual_offset)
1656 tree vtabletmp;
1657 tree vtabletmp2;
1658 tree vtabletmp3;
1660 vtabletmp =
1661 create_tmp_reg (build_pointer_type
1662 (build_pointer_type (vtable_entry_type)), "vptr");
1664 /* The vptr is always at offset zero in the object. */
1665 stmt = gimple_build_assign (vtabletmp,
1666 build1 (NOP_EXPR, TREE_TYPE (vtabletmp),
1667 ptr));
1668 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1670 /* Form the vtable address. */
1671 vtabletmp2 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp)),
1672 "vtableaddr");
1673 stmt = gimple_build_assign (vtabletmp2,
1674 build_simple_mem_ref (vtabletmp));
1675 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1677 /* Find the entry with the vcall offset. */
1678 stmt = gimple_build_assign (vtabletmp2,
1679 fold_build_pointer_plus_loc (input_location,
1680 vtabletmp2,
1681 virtual_offset));
1682 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1684 /* Get the offset itself. */
1685 vtabletmp3 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp2)),
1686 "vcalloffset");
1687 stmt = gimple_build_assign (vtabletmp3,
1688 build_simple_mem_ref (vtabletmp2));
1689 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1691 /* Adjust the `this' pointer. */
1692 ptr = fold_build_pointer_plus_loc (input_location, ptr, vtabletmp3);
1693 ptr = force_gimple_operand_gsi (bsi, ptr, true, NULL_TREE, false,
1694 GSI_CONTINUE_LINKING);
1697 /* Likewise for an offset that is stored in the object that contains the
1698 vtable. */
1699 if (indirect_offset != 0)
1701 tree offset_ptr, offset_tree;
1703 /* Get the address of the offset. */
1704 offset_ptr
1705 = create_tmp_reg (build_pointer_type
1706 (build_pointer_type (vtable_entry_type)),
1707 "offset_ptr");
1708 stmt = gimple_build_assign (offset_ptr,
1709 build1 (NOP_EXPR, TREE_TYPE (offset_ptr),
1710 ptr));
1711 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1713 stmt = gimple_build_assign
1714 (offset_ptr,
1715 fold_build_pointer_plus_hwi_loc (input_location, offset_ptr,
1716 indirect_offset));
1717 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1719 /* Get the offset itself. */
1720 offset_tree = create_tmp_reg (TREE_TYPE (TREE_TYPE (offset_ptr)),
1721 "offset");
1722 stmt = gimple_build_assign (offset_tree,
1723 build_simple_mem_ref (offset_ptr));
1724 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1726 /* Adjust the `this' pointer. */
1727 ptr = fold_build_pointer_plus_loc (input_location, ptr, offset_tree);
1728 ptr = force_gimple_operand_gsi (bsi, ptr, true, NULL_TREE, false,
1729 GSI_CONTINUE_LINKING);
1732 if (!this_adjusting
1733 && fixed_offset != 0)
1734 /* Adjust the pointer by the constant. */
1736 tree ptrtmp;
1738 if (VAR_P (ptr))
1739 ptrtmp = ptr;
1740 else
1742 ptrtmp = create_tmp_reg (TREE_TYPE (ptr), "ptr");
1743 stmt = gimple_build_assign (ptrtmp, ptr);
1744 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1746 ptr = fold_build_pointer_plus_hwi_loc (input_location,
1747 ptrtmp, fixed_offset);
1750 /* Emit the statement and gimplify the adjustment expression. */
1751 ret = create_tmp_reg (TREE_TYPE (ptr), "adjusted_this");
1752 stmt = gimple_build_assign (ret, ptr);
1753 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1755 return ret;
1758 /* Expand thunk NODE to gimple if possible.
1759 When FORCE_GIMPLE_THUNK is true, a gimple thunk is created and
1760 no assembler is produced.
1761 When OUTPUT_ASM_THUNKS is true, also produce assembler for
1762 thunks that are not lowered. */
1764 bool
1765 cgraph_node::expand_thunk (bool output_asm_thunks, bool force_gimple_thunk)
1767 bool this_adjusting = thunk.this_adjusting;
1768 HOST_WIDE_INT fixed_offset = thunk.fixed_offset;
1769 HOST_WIDE_INT virtual_value = thunk.virtual_value;
1770 HOST_WIDE_INT indirect_offset = thunk.indirect_offset;
1771 tree virtual_offset = NULL;
1772 tree alias = callees->callee->decl;
1773 tree thunk_fndecl = decl;
1774 tree a;
1776 /* An instrumentation thunk is the same function with
1777 a different signature. We never need to expand it. */
1778 if (thunk.add_pointer_bounds_args)
1779 return false;
1781 if (!force_gimple_thunk
1782 && this_adjusting
1783 && indirect_offset == 0
1784 && !DECL_EXTERNAL (alias)
1785 && !DECL_STATIC_CHAIN (alias)
1786 && targetm.asm_out.can_output_mi_thunk (thunk_fndecl, fixed_offset,
1787 virtual_value, alias))
1789 const char *fnname;
1790 tree fn_block;
1791 tree restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1793 if (!output_asm_thunks)
1795 analyzed = true;
1796 return false;
1799 if (in_lto_p)
1800 get_untransformed_body ();
1801 a = DECL_ARGUMENTS (thunk_fndecl);
1803 current_function_decl = thunk_fndecl;
1805 /* Ensure thunks are emitted in their correct sections. */
1806 resolve_unique_section (thunk_fndecl, 0,
1807 flag_function_sections);
1809 DECL_RESULT (thunk_fndecl)
1810 = build_decl (DECL_SOURCE_LOCATION (thunk_fndecl),
1811 RESULT_DECL, 0, restype);
1812 DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
1813 fnname = IDENTIFIER_POINTER (DECL_ASSEMBLER_NAME (thunk_fndecl));
1815 /* The back end expects DECL_INITIAL to contain a BLOCK, so we
1816 create one. */
1817 fn_block = make_node (BLOCK);
1818 BLOCK_VARS (fn_block) = a;
1819 DECL_INITIAL (thunk_fndecl) = fn_block;
1820 BLOCK_SUPERCONTEXT (fn_block) = thunk_fndecl;
1821 allocate_struct_function (thunk_fndecl, false);
1822 init_function_start (thunk_fndecl);
1823 cfun->is_thunk = 1;
1824 insn_locations_init ();
1825 set_curr_insn_location (DECL_SOURCE_LOCATION (thunk_fndecl));
1826 prologue_location = curr_insn_location ();
1827 assemble_start_function (thunk_fndecl, fnname);
1829 targetm.asm_out.output_mi_thunk (asm_out_file, thunk_fndecl,
1830 fixed_offset, virtual_value, alias);
1832 assemble_end_function (thunk_fndecl, fnname);
1833 insn_locations_finalize ();
1834 init_insn_lengths ();
1835 free_after_compilation (cfun);
1836 TREE_ASM_WRITTEN (thunk_fndecl) = 1;
1837 thunk.thunk_p = false;
1838 analyzed = false;
1840 else if (stdarg_p (TREE_TYPE (thunk_fndecl)))
1842 error ("generic thunk code fails for method %qD which uses %<...%>",
1843 thunk_fndecl);
1844 TREE_ASM_WRITTEN (thunk_fndecl) = 1;
1845 analyzed = true;
1846 return false;
1848 else
1850 tree restype;
1851 basic_block bb, then_bb, else_bb, return_bb;
1852 gimple_stmt_iterator bsi;
1853 int nargs = 0;
1854 tree arg;
1855 int i;
1856 tree resdecl;
1857 tree restmp = NULL;
1859 gcall *call;
1860 greturn *ret;
1861 bool alias_is_noreturn = TREE_THIS_VOLATILE (alias);
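/* Note: on a FUNCTION_DECL, TREE_THIS_VOLATILE means that the function
   does not return (e.g. it is declared with attribute noreturn), hence
   the name of the local above.  */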
1863 /* We may be called with the body already released except for
1864 DECL_ARGUMENTS (see create_wrapper); in this case force_gimple_thunk is true. */
1865 if (in_lto_p && !force_gimple_thunk)
1866 get_untransformed_body ();
1868 /* We need to force DECL_IGNORED_P when the thunk is created
1869 after early debug was run. */
1870 if (force_gimple_thunk)
1871 DECL_IGNORED_P (thunk_fndecl) = 1;
1873 a = DECL_ARGUMENTS (thunk_fndecl);
1875 current_function_decl = thunk_fndecl;
1877 /* Ensure thunks are emitted in their correct sections. */
1878 resolve_unique_section (thunk_fndecl, 0,
1879 flag_function_sections);
1881 bitmap_obstack_initialize (NULL);
1883 if (thunk.virtual_offset_p)
1884 virtual_offset = size_int (virtual_value);
1886 /* Build the return declaration for the function. */
1887 restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1888 if (DECL_RESULT (thunk_fndecl) == NULL_TREE)
1890 resdecl = build_decl (input_location, RESULT_DECL, 0, restype);
1891 DECL_ARTIFICIAL (resdecl) = 1;
1892 DECL_IGNORED_P (resdecl) = 1;
1893 DECL_CONTEXT (resdecl) = thunk_fndecl;
1894 DECL_RESULT (thunk_fndecl) = resdecl;
1896 else
1897 resdecl = DECL_RESULT (thunk_fndecl);
1899 profile_count cfg_count = count;
1900 if (!cfg_count.initialized_p ())
1901 cfg_count = profile_count::from_gcov_type (BB_FREQ_MAX).guessed_local ();
1903 bb = then_bb = else_bb = return_bb
1904 = init_lowered_empty_function (thunk_fndecl, true, cfg_count);
1906 bsi = gsi_start_bb (bb);
1908 /* Build call to the function being thunked. */
1909 if (!VOID_TYPE_P (restype)
1910 && (!alias_is_noreturn
1911 || TREE_ADDRESSABLE (restype)
1912 || TREE_CODE (TYPE_SIZE_UNIT (restype)) != INTEGER_CST))
1914 if (DECL_BY_REFERENCE (resdecl))
1916 restmp = gimple_fold_indirect_ref (resdecl);
1917 if (!restmp)
1918 restmp = build2 (MEM_REF,
1919 TREE_TYPE (TREE_TYPE (DECL_RESULT (alias))),
1920 resdecl,
1921 build_int_cst (TREE_TYPE
1922 (DECL_RESULT (alias)), 0));
1924 else if (!is_gimple_reg_type (restype))
1926 if (aggregate_value_p (resdecl, TREE_TYPE (thunk_fndecl)))
1928 restmp = resdecl;
1930 if (VAR_P (restmp))
1932 add_local_decl (cfun, restmp);
1933 BLOCK_VARS (DECL_INITIAL (current_function_decl))
1934 = restmp;
1937 else
1938 restmp = create_tmp_var (restype, "retval");
1940 else
1941 restmp = create_tmp_reg (restype, "retval");
1944 for (arg = a; arg; arg = DECL_CHAIN (arg))
1945 nargs++;
1946 auto_vec<tree> vargs (nargs);
1947 i = 0;
1948 arg = a;
1949 if (this_adjusting)
1951 vargs.quick_push (thunk_adjust (&bsi, a, 1, fixed_offset,
1952 virtual_offset, indirect_offset));
1953 arg = DECL_CHAIN (a);
1954 i = 1;
1957 if (nargs)
1958 for (; i < nargs; i++, arg = DECL_CHAIN (arg))
1960 tree tmp = arg;
1961 if (VECTOR_TYPE_P (TREE_TYPE (arg))
1962 || TREE_CODE (TREE_TYPE (arg)) == COMPLEX_TYPE)
1963 DECL_GIMPLE_REG_P (arg) = 1;
1965 if (!is_gimple_val (arg))
1967 tmp = create_tmp_reg (TYPE_MAIN_VARIANT
1968 (TREE_TYPE (arg)), "arg");
1969 gimple *stmt = gimple_build_assign (tmp, arg);
1970 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1972 vargs.quick_push (tmp);
1974 call = gimple_build_call_vec (build_fold_addr_expr_loc (0, alias), vargs);
1975 callees->call_stmt = call;
1976 gimple_call_set_from_thunk (call, true);
1977 if (DECL_STATIC_CHAIN (alias))
1979 tree p = DECL_STRUCT_FUNCTION (alias)->static_chain_decl;
1980 tree type = TREE_TYPE (p);
1981 tree decl = build_decl (DECL_SOURCE_LOCATION (thunk_fndecl),
1982 PARM_DECL, create_tmp_var_name ("CHAIN"),
1983 type);
1984 DECL_ARTIFICIAL (decl) = 1;
1985 DECL_IGNORED_P (decl) = 1;
1986 TREE_USED (decl) = 1;
1987 DECL_CONTEXT (decl) = thunk_fndecl;
1988 DECL_ARG_TYPE (decl) = type;
1989 TREE_READONLY (decl) = 1;
1991 struct function *sf = DECL_STRUCT_FUNCTION (thunk_fndecl);
1992 sf->static_chain_decl = decl;
1994 gimple_call_set_chain (call, decl);
1997 /* Return slot optimization is always possible and in fact required to
1998 return values with DECL_BY_REFERENCE. */
1999 if (aggregate_value_p (resdecl, TREE_TYPE (thunk_fndecl))
2000 && (!is_gimple_reg_type (TREE_TYPE (resdecl))
2001 || DECL_BY_REFERENCE (resdecl)))
2002 gimple_call_set_return_slot_opt (call, true);
2004 if (restmp)
2006 gimple_call_set_lhs (call, restmp);
2007 gcc_assert (useless_type_conversion_p (TREE_TYPE (restmp),
2008 TREE_TYPE (TREE_TYPE (alias))));
2010 gsi_insert_after (&bsi, call, GSI_NEW_STMT);
2011 if (!alias_is_noreturn)
2013 if (restmp && !this_adjusting
2014 && (fixed_offset || virtual_offset))
2016 tree true_label = NULL_TREE;
2018 if (TREE_CODE (TREE_TYPE (restmp)) == POINTER_TYPE)
2020 gimple *stmt;
2021 edge e;
2022 /* If the return type is a pointer, we need to
2023 protect against NULL. We know there will be an
2024 adjustment, because that's why we're emitting a
2025 thunk. */
2026 then_bb = create_basic_block (NULL, bb);
2027 then_bb->count = cfg_count - cfg_count.apply_scale (1, 16);
2028 return_bb = create_basic_block (NULL, then_bb);
2029 return_bb->count = cfg_count;
2030 else_bb = create_basic_block (NULL, else_bb);
2031 else_bb->count = cfg_count.apply_scale (1, 16);
2032 add_bb_to_loop (then_bb, bb->loop_father);
2033 add_bb_to_loop (return_bb, bb->loop_father);
2034 add_bb_to_loop (else_bb, bb->loop_father);
2035 remove_edge (single_succ_edge (bb));
2036 true_label = gimple_block_label (then_bb);
2037 stmt = gimple_build_cond (NE_EXPR, restmp,
2038 build_zero_cst (TREE_TYPE (restmp)),
2039 NULL_TREE, NULL_TREE);
2040 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
2041 e = make_edge (bb, then_bb, EDGE_TRUE_VALUE);
2042 e->probability = profile_probability::guessed_always ()
2043 .apply_scale (1, 16);
2044 e = make_edge (bb, else_bb, EDGE_FALSE_VALUE);
2045 e->probability = profile_probability::guessed_always ()
2046 .apply_scale (1, 16);
2047 make_single_succ_edge (return_bb,
2048 EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
2049 make_single_succ_edge (then_bb, return_bb, EDGE_FALLTHRU);
2050 e = make_edge (else_bb, return_bb, EDGE_FALLTHRU);
2051 e->probability = profile_probability::always ();
2052 bsi = gsi_last_bb (then_bb);
2055 restmp = thunk_adjust (&bsi, restmp, /*this_adjusting=*/0,
2056 fixed_offset, virtual_offset,
2057 indirect_offset);
2058 if (true_label)
2060 gimple *stmt;
2061 bsi = gsi_last_bb (else_bb);
2062 stmt = gimple_build_assign (restmp,
2063 build_zero_cst (TREE_TYPE (restmp)));
2064 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
2065 bsi = gsi_last_bb (return_bb);
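/* Schematically, the CFG built above for the NULL-protected case is:

     bb:        retval = call; if (retval != 0) goto then_bb; else goto else_bb;
     then_bb:   retval = retval + <adjustment computed by thunk_adjust>;
     else_bb:   retval = 0;
     return_bb: the return statement emitted below.  */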
2068 else
2069 gimple_call_set_tail (call, true);
2071 /* Build return value. */
2072 if (!DECL_BY_REFERENCE (resdecl))
2073 ret = gimple_build_return (restmp);
2074 else
2075 ret = gimple_build_return (resdecl);
2077 gsi_insert_after (&bsi, ret, GSI_NEW_STMT);
2079 else
2081 gimple_call_set_tail (call, true);
2082 remove_edge (single_succ_edge (bb));
2085 cfun->gimple_df->in_ssa_p = true;
2086 update_max_bb_count ();
2087 profile_status_for_fn (cfun)
2088 = cfg_count.initialized_p () && cfg_count.ipa_p ()
2089 ? PROFILE_READ : PROFILE_GUESSED;
2090 /* FIXME: C++ FE should stop setting TREE_ASM_WRITTEN on thunks. */
2091 TREE_ASM_WRITTEN (thunk_fndecl) = false;
2092 delete_unreachable_blocks ();
2093 update_ssa (TODO_update_ssa);
2094 checking_verify_flow_info ();
2095 free_dominance_info (CDI_DOMINATORS);
2097 /* Since we want to emit the thunk, we explicitly mark its name as
2098 referenced. */
2099 thunk.thunk_p = false;
2100 lowered = true;
2101 bitmap_obstack_release (NULL);
2103 current_function_decl = NULL;
2104 set_cfun (NULL);
2105 return true;
2108 /* Assemble the thunks and aliases associated with this node. */
2110 void
2111 cgraph_node::assemble_thunks_and_aliases (void)
2113 cgraph_edge *e;
2114 ipa_ref *ref;
2116 for (e = callers; e;)
2117 if (e->caller->thunk.thunk_p
2118 && !e->caller->global.inlined_to
2119 && !e->caller->thunk.add_pointer_bounds_args)
2121 cgraph_node *thunk = e->caller;
2123 e = e->next_caller;
2124 thunk->expand_thunk (true, false);
2125 thunk->assemble_thunks_and_aliases ();
2127 else
2128 e = e->next_caller;
2130 FOR_EACH_ALIAS (this, ref)
2132 cgraph_node *alias = dyn_cast <cgraph_node *> (ref->referring);
2133 if (!alias->transparent_alias)
2135 bool saved_written = TREE_ASM_WRITTEN (decl);
2137 /* Force assemble_alias to really output the alias this time instead
2138 of buffering it in the alias pairs list. */
2139 TREE_ASM_WRITTEN (decl) = 1;
2140 do_assemble_alias (alias->decl,
2141 DECL_ASSEMBLER_NAME (decl));
2142 alias->assemble_thunks_and_aliases ();
2143 TREE_ASM_WRITTEN (decl) = saved_written;
2148 /* Expand function specified by node. */
2150 void
2151 cgraph_node::expand (void)
2153 location_t saved_loc;
2155 /* We ought not to compile any inline clones. */
2156 gcc_assert (!global.inlined_to);
2158 /* __RTL functions are compiled as soon as they are parsed, so don't
2159 do it again. */
2160 if (native_rtl_p ())
2161 return;
2163 announce_function (decl);
2164 process = 0;
2165 gcc_assert (lowered);
2166 get_untransformed_body ();
2168 /* Generate RTL for the body of DECL. */
2170 timevar_push (TV_REST_OF_COMPILATION);
2172 gcc_assert (symtab->global_info_ready);
2174 /* Initialize the default bitmap obstack. */
2175 bitmap_obstack_initialize (NULL);
2177 /* Initialize the RTL code for the function. */
2178 saved_loc = input_location;
2179 input_location = DECL_SOURCE_LOCATION (decl);
2181 gcc_assert (DECL_STRUCT_FUNCTION (decl));
2182 push_cfun (DECL_STRUCT_FUNCTION (decl));
2183 init_function_start (decl);
2185 gimple_register_cfg_hooks ();
2187 bitmap_obstack_initialize (&reg_obstack); /* FIXME, only at RTL generation. */
2189 execute_all_ipa_transforms ();
2191 /* Perform all tree transforms and optimizations. */
2193 /* Signal the start of passes. */
2194 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_START, NULL);
2196 execute_pass_list (cfun, g->get_passes ()->all_passes);
2198 /* Signal the end of passes. */
2199 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_END, NULL);
2201 bitmap_obstack_release (&reg_obstack);
2203 /* Release the default bitmap obstack. */
2204 bitmap_obstack_release (NULL);
2206 /* If requested, warn about function definitions where the function will
2207 return a value (usually of some struct or union type) which itself will
2208 take up a lot of stack space. */
2209 if (!DECL_EXTERNAL (decl) && TREE_TYPE (decl))
2211 tree ret_type = TREE_TYPE (TREE_TYPE (decl));
2213 if (ret_type && TYPE_SIZE_UNIT (ret_type)
2214 && TREE_CODE (TYPE_SIZE_UNIT (ret_type)) == INTEGER_CST
2215 && compare_tree_int (TYPE_SIZE_UNIT (ret_type),
2216 warn_larger_than_size) > 0)
2218 unsigned int size_as_int
2219 = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (ret_type));
2221 if (compare_tree_int (TYPE_SIZE_UNIT (ret_type), size_as_int) == 0)
2222 warning (OPT_Wlarger_than_,
2223 "size of return value of %q+D is %u bytes",
2224 decl, size_as_int);
2225 else
2226 warning (OPT_Wlarger_than_,
2227 "size of return value of %q+D is larger than %wu bytes",
2228 decl, warn_larger_than_size);
2232 gimple_set_body (decl, NULL);
2233 if (DECL_STRUCT_FUNCTION (decl) == 0
2234 && !cgraph_node::get (decl)->origin)
2236 /* Stop pointing to the local nodes about to be freed.
2237 But DECL_INITIAL must remain nonzero so we know this
2238 was an actual function definition.
2239 For a nested function, this is done in c_pop_function_context.
2240 If rest_of_compilation set this to 0, leave it 0. */
2241 if (DECL_INITIAL (decl) != 0)
2242 DECL_INITIAL (decl) = error_mark_node;
2245 input_location = saved_loc;
2247 ggc_collect ();
2248 timevar_pop (TV_REST_OF_COMPILATION);
2250 /* Make sure that the back end didn't give up on compiling. */
2251 gcc_assert (TREE_ASM_WRITTEN (decl));
2252 if (cfun)
2253 pop_cfun ();
2255 /* It would make a lot more sense to output thunks before the function body
2256 to get more forward and fewer backward jumps. This however would require
2257 solving a problem with comdats. See PR48668. Also aliases must come after
2258 the function itself to keep one-pass assemblers, like the one on AIX, happy.
2259 See PR 50689. FIXME: Perhaps thunks should be moved before the function iff
2260 they are not in comdat groups. */
2261 assemble_thunks_and_aliases ();
2262 release_body ();
2263 /* Eliminate all call edges. This is important so the GIMPLE_CALL no longer
2264 points to the dead function body. */
2265 remove_callees ();
2266 remove_all_references ();
2269 /* Node comparator responsible for producing the order that corresponds
2270 to the time when a function was first executed. */
2272 static int
2273 node_cmp (const void *pa, const void *pb)
2275 const cgraph_node *a = *(const cgraph_node * const *) pa;
2276 const cgraph_node *b = *(const cgraph_node * const *) pb;
2278 /* Functions with a time profile must come before those without one. */
2279 if (!a->tp_first_run || !b->tp_first_run)
2280 return a->tp_first_run - b->tp_first_run;
2282 return a->tp_first_run != b->tp_first_run
2283 ? b->tp_first_run - a->tp_first_run
2284 : b->order - a->order;
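/* node_cmp is used via qsort from expand_all_functions below when
   -fprofile-reorder-functions is in effect.  */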
2287 /* Expand all functions that must be output.
2289 Attempt to topologically sort the nodes so that a function is output when
2290 all called functions are already assembled to allow data to be
2291 propagated across the callgraph. Use a stack to get smaller distance
2292 between a function and its callees (later we may choose to use a more
2293 sophisticated algorithm for function reordering; we will likely want
2294 to use subsections to make the output functions appear in top-down
2295 order). */
2297 static void
2298 expand_all_functions (void)
2300 cgraph_node *node;
2301 cgraph_node **order = XCNEWVEC (cgraph_node *,
2302 symtab->cgraph_count);
2303 unsigned int expanded_func_count = 0, profiled_func_count = 0;
2304 int order_pos, new_order_pos = 0;
2305 int i;
2307 order_pos = ipa_reverse_postorder (order);
2308 gcc_assert (order_pos == symtab->cgraph_count);
2310 /* The garbage collector may remove inline clones we eliminated during
2311 optimization, so we must be sure not to reference them. */
2312 for (i = 0; i < order_pos; i++)
2313 if (order[i]->process)
2314 order[new_order_pos++] = order[i];
2316 if (flag_profile_reorder_functions)
2317 qsort (order, new_order_pos, sizeof (cgraph_node *), node_cmp);
2319 for (i = new_order_pos - 1; i >= 0; i--)
2321 node = order[i];
2323 if (node->process)
2325 expanded_func_count++;
2326 if (node->tp_first_run)
2327 profiled_func_count++;
2329 if (symtab->dump_file)
2330 fprintf (symtab->dump_file,
2331 "Time profile order in expand_all_functions:%s:%d\n",
2332 node->asm_name (), node->tp_first_run);
2333 node->process = 0;
2334 node->expand ();
2338 if (dump_file)
2339 fprintf (dump_file, "Expanded functions with time profile (%s):%u/%u\n",
2340 main_input_filename, profiled_func_count, expanded_func_count);
2342 if (symtab->dump_file && flag_profile_reorder_functions)
2343 fprintf (symtab->dump_file, "Expanded functions with time profile:%u/%u\n",
2344 profiled_func_count, expanded_func_count);
2346 symtab->process_new_functions ();
2347 free_gimplify_stack ();
2349 free (order);
2352 /* This is used to sort the node types by the cgraph order number. */
2354 enum cgraph_order_sort_kind
2356 ORDER_UNDEFINED = 0,
2357 ORDER_FUNCTION,
2358 ORDER_VAR,
2359 ORDER_VAR_UNDEF,
2360 ORDER_ASM
2363 struct cgraph_order_sort
2365 enum cgraph_order_sort_kind kind;
2366 union
2368 cgraph_node *f;
2369 varpool_node *v;
2370 asm_node *a;
2371 } u;
2374 /* Output all functions, variables, and asm statements in the order
2375 according to their order fields, which is the order in which they
2376 appeared in the file. This implements -fno-toplevel-reorder. In
2377 this mode we may output functions and variables which don't really
2378 need to be output. */
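/* For example (illustrative only), with -fno-toplevel-reorder a translation
   unit containing

     int x;
     asm ("# marker");
     void f (void) {}

   is emitted as x, then the toplevel asm, then f, following the symtab
   order fields rather than any optimized layout.  */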
2380 static void
2381 output_in_order (void)
2383 int max;
2384 cgraph_order_sort *nodes;
2385 int i;
2386 cgraph_node *pf;
2387 varpool_node *pv;
2388 asm_node *pa;
2389 max = symtab->order;
2390 nodes = XCNEWVEC (cgraph_order_sort, max);
2392 FOR_EACH_DEFINED_FUNCTION (pf)
2394 if (pf->process && !pf->thunk.thunk_p && !pf->alias)
2396 if (!pf->no_reorder)
2397 continue;
2398 i = pf->order;
2399 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
2400 nodes[i].kind = ORDER_FUNCTION;
2401 nodes[i].u.f = pf;
2405 /* There is a similar loop in symbol_table::output_variables.
2406 Please keep them in sync. */
2407 FOR_EACH_VARIABLE (pv)
2409 if (!pv->no_reorder)
2410 continue;
2411 if (DECL_HARD_REGISTER (pv->decl)
2412 || DECL_HAS_VALUE_EXPR_P (pv->decl))
2413 continue;
2414 i = pv->order;
2415 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
2416 nodes[i].kind = pv->definition ? ORDER_VAR : ORDER_VAR_UNDEF;
2417 nodes[i].u.v = pv;
2420 for (pa = symtab->first_asm_symbol (); pa; pa = pa->next)
2422 i = pa->order;
2423 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
2424 nodes[i].kind = ORDER_ASM;
2425 nodes[i].u.a = pa;
2428 /* In toplevel reorder mode we output all statics; mark them as needed. */
2430 for (i = 0; i < max; ++i)
2431 if (nodes[i].kind == ORDER_VAR)
2432 nodes[i].u.v->finalize_named_section_flags ();
2434 for (i = 0; i < max; ++i)
2436 switch (nodes[i].kind)
2438 case ORDER_FUNCTION:
2439 nodes[i].u.f->process = 0;
2440 nodes[i].u.f->expand ();
2441 break;
2443 case ORDER_VAR:
2444 nodes[i].u.v->assemble_decl ();
2445 break;
2447 case ORDER_VAR_UNDEF:
2448 assemble_undefined_decl (nodes[i].u.v->decl);
2449 break;
2451 case ORDER_ASM:
2452 assemble_asm (nodes[i].u.a->asm_str);
2453 break;
2455 case ORDER_UNDEFINED:
2456 break;
2458 default:
2459 gcc_unreachable ();
2463 symtab->clear_asm_symbols ();
2465 free (nodes);
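/* Run the inter-procedural passes: the small IPA passes (unless we are
   reading LTO), the summary generation of the regular IPA passes, LTO
   summary streaming when requested, and finally the regular IPA passes
   themselves unless their execution is deferred to link time.  */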
2468 static void
2469 ipa_passes (void)
2471 gcc::pass_manager *passes = g->get_passes ();
2473 set_cfun (NULL);
2474 current_function_decl = NULL;
2475 gimple_register_cfg_hooks ();
2476 bitmap_obstack_initialize (NULL);
2478 invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_START, NULL);
2480 if (!in_lto_p)
2482 execute_ipa_pass_list (passes->all_small_ipa_passes);
2483 if (seen_error ())
2484 return;
2487 /* This extra symtab_remove_unreachable_nodes pass tends to catch some
2488 devirtualization and other changes for which the removal needs to iterate. */
2489 symtab->remove_unreachable_nodes (symtab->dump_file);
2491 /* If pass_all_early_optimizations was not scheduled, the state of
2492 the cgraph will not be properly updated. Update it now. */
2493 if (symtab->state < IPA_SSA)
2494 symtab->state = IPA_SSA;
2496 if (!in_lto_p)
2498 /* Generate coverage variables and constructors. */
2499 coverage_finish ();
2501 /* Process new functions added. */
2502 set_cfun (NULL);
2503 current_function_decl = NULL;
2504 symtab->process_new_functions ();
2506 execute_ipa_summary_passes
2507 ((ipa_opt_pass_d *) passes->all_regular_ipa_passes);
2510 /* Some targets need to handle LTO assembler output specially. */
2511 if (flag_generate_lto || flag_generate_offload)
2512 targetm.asm_out.lto_start ();
2514 if (!in_lto_p
2515 || flag_incremental_link == INCREMENTAL_LINK_LTO)
2517 if (!quiet_flag)
2518 fprintf (stderr, "Streaming LTO\n");
2519 if (g->have_offload)
2521 section_name_prefix = OFFLOAD_SECTION_NAME_PREFIX;
2522 lto_stream_offload_p = true;
2523 ipa_write_summaries ();
2524 lto_stream_offload_p = false;
2526 if (flag_lto)
2528 section_name_prefix = LTO_SECTION_NAME_PREFIX;
2529 lto_stream_offload_p = false;
2530 ipa_write_summaries ();
2534 if (flag_generate_lto || flag_generate_offload)
2535 targetm.asm_out.lto_end ();
2537 if (!flag_ltrans
2538 && ((in_lto_p && flag_incremental_link != INCREMENTAL_LINK_LTO)
2539 || !flag_lto || flag_fat_lto_objects))
2540 execute_ipa_pass_list (passes->all_regular_ipa_passes);
2541 invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_END, NULL);
2543 bitmap_obstack_release (NULL);
2547 /* Return, as an identifier, the name of the symbol DECL is an alias of. */
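/* For instance (illustrative), for a declaration carrying
   __attribute__ ((alias ("real_fn"))) the lookup below yields the
   identifier "real_fn".  */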
2549 static tree
2550 get_alias_symbol (tree decl)
2552 tree alias = lookup_attribute ("alias", DECL_ATTRIBUTES (decl));
2553 return get_identifier (TREE_STRING_POINTER
2554 (TREE_VALUE (TREE_VALUE (alias))));
2558 /* Weakrefs may be associated with external decls and thus not output
2559 at expansion time. Emit all necessary aliases. */
2561 void
2562 symbol_table::output_weakrefs (void)
2564 symtab_node *node;
2565 FOR_EACH_SYMBOL (node)
2566 if (node->alias
2567 && !TREE_ASM_WRITTEN (node->decl)
2568 && node->weakref)
2570 tree target;
2572 /* Weakrefs are special in that they do not require the target to be defined
2573 in the current compilation unit, so it is a bit harder to work out what we
2574 want to alias.
2575 When the alias target is defined, we need to fetch it from the symtab
2576 reference; otherwise it is pointed to by alias_target. */
2577 if (node->alias_target)
2578 target = (DECL_P (node->alias_target)
2579 ? DECL_ASSEMBLER_NAME (node->alias_target)
2580 : node->alias_target);
2581 else if (node->analyzed)
2582 target = DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl);
2583 else
2585 gcc_unreachable ();
2586 target = get_alias_symbol (node->decl);
2588 do_assemble_alias (node->decl, target);
2592 /* Perform simple optimizations based on callgraph. */
2594 void
2595 symbol_table::compile (void)
2597 if (seen_error ())
2598 return;
2600 symtab_node::checking_verify_symtab_nodes ();
2602 timevar_push (TV_CGRAPHOPT);
2603 if (pre_ipa_mem_report)
2605 fprintf (stderr, "Memory consumption before IPA\n");
2606 dump_memory_report (false);
2608 if (!quiet_flag)
2609 fprintf (stderr, "Performing interprocedural optimizations\n");
2610 state = IPA;
2612 /* If LTO is enabled, initialize the streamer hooks needed by GIMPLE. */
2613 if (flag_generate_lto || flag_generate_offload)
2614 lto_streamer_hooks_init ();
2616 /* Don't run the IPA passes if there were any errors or sorry messages. */
2617 if (!seen_error ())
2618 ipa_passes ();
2620 /* Do nothing else if any IPA pass found errors or if we are just streaming LTO. */
2621 if (seen_error ()
2622 || ((!in_lto_p || flag_incremental_link == INCREMENTAL_LINK_LTO)
2623 && flag_lto && !flag_fat_lto_objects))
2625 timevar_pop (TV_CGRAPHOPT);
2626 return;
2629 global_info_ready = true;
2630 if (dump_file)
2632 fprintf (dump_file, "Optimized ");
2633 symtab->dump (dump_file);
2635 if (post_ipa_mem_report)
2637 fprintf (stderr, "Memory consumption after IPA\n");
2638 dump_memory_report (false);
2640 timevar_pop (TV_CGRAPHOPT);
2642 /* Output everything. */
2643 switch_to_section (text_section);
2644 (*debug_hooks->assembly_start) ();
2645 if (!quiet_flag)
2646 fprintf (stderr, "Assembling functions:\n");
2647 symtab_node::checking_verify_symtab_nodes ();
2649 bitmap_obstack_initialize (NULL);
2650 execute_ipa_pass_list (g->get_passes ()->all_late_ipa_passes);
2651 bitmap_obstack_release (NULL);
2652 mark_functions_to_output ();
2654 /* When weakref support is missing, we automatically translate all
2655 references to NODE into references to its ultimate alias target.
2656 The renaming mechanism uses the flag IDENTIFIER_TRANSPARENT_ALIAS and
2657 TREE_CHAIN.
2659 Set up this mapping before we output any assembler, but only once we are
2660 sure that all symbol renaming is done.
2662 FIXME: All this ugliness can go away if we just do the renaming at the gimple
2663 level by physically rewriting the IL. At the moment we can only redirect
2664 calls, so we need infrastructure for renaming references as well. */
2665 #ifndef ASM_OUTPUT_WEAKREF
2666 symtab_node *node;
2668 FOR_EACH_SYMBOL (node)
2669 if (node->alias
2670 && lookup_attribute ("weakref", DECL_ATTRIBUTES (node->decl)))
2672 IDENTIFIER_TRANSPARENT_ALIAS
2673 (DECL_ASSEMBLER_NAME (node->decl)) = 1;
2674 TREE_CHAIN (DECL_ASSEMBLER_NAME (node->decl))
2675 = (node->alias_target ? node->alias_target
2676 : DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl));
2678 #endif
2680 state = EXPANSION;
2682 /* First output asm statements and anything else whose output order is fixed.
2683 The process flag is cleared for these nodes, so we skip them later. */
2684 output_in_order ();
2685 expand_all_functions ();
2686 output_variables ();
2688 process_new_functions ();
2689 state = FINISHED;
2690 output_weakrefs ();
2692 if (dump_file)
2694 fprintf (dump_file, "\nFinal ");
2695 symtab->dump (dump_file);
2697 if (!flag_checking)
2698 return;
2699 symtab_node::verify_symtab_nodes ();
2700 /* Double check that all inline clones are gone and that all
2701 function bodies have been released from memory. */
2702 if (!seen_error ())
2704 cgraph_node *node;
2705 bool error_found = false;
2707 FOR_EACH_DEFINED_FUNCTION (node)
2708 if (node->global.inlined_to
2709 || gimple_has_body_p (node->decl))
2711 error_found = true;
2712 node->debug ();
2714 if (error_found)
2715 internal_error ("nodes with unreleased memory found");
2719 /* Earlydebug dump file, flags, and number. */
2721 static int debuginfo_early_dump_nr;
2722 static FILE *debuginfo_early_dump_file;
2723 static dump_flags_t debuginfo_early_dump_flags;
2725 /* Debug dump file, flags, and number. */
2727 static int debuginfo_dump_nr;
2728 static FILE *debuginfo_dump_file;
2729 static dump_flags_t debuginfo_dump_flags;
2731 /* Register the debug and earlydebug dump files. */
2733 void
2734 debuginfo_early_init (void)
2736 gcc::dump_manager *dumps = g->get_dumps ();
2737 debuginfo_early_dump_nr = dumps->dump_register (".earlydebug", "earlydebug",
2738 "earlydebug", DK_tree,
2739 OPTGROUP_NONE,
2740 false);
2741 debuginfo_dump_nr = dumps->dump_register (".debug", "debug",
2742 "debug", DK_tree,
2743 OPTGROUP_NONE,
2744 false);
2747 /* Initialize the debug and earlydebug dump files. */
2749 void
2750 debuginfo_init (void)
2752 gcc::dump_manager *dumps = g->get_dumps ();
2753 debuginfo_dump_file = dump_begin (debuginfo_dump_nr, NULL);
2754 debuginfo_dump_flags = dumps->get_dump_file_info (debuginfo_dump_nr)->pflags;
2755 debuginfo_early_dump_file = dump_begin (debuginfo_early_dump_nr, NULL);
2756 debuginfo_early_dump_flags
2757 = dumps->get_dump_file_info (debuginfo_early_dump_nr)->pflags;
2760 /* Finalize the debug and earlydebug dump files. */
2762 void
2763 debuginfo_fini (void)
2765 if (debuginfo_dump_file)
2766 dump_end (debuginfo_dump_nr, debuginfo_dump_file);
2767 if (debuginfo_early_dump_file)
2768 dump_end (debuginfo_early_dump_nr, debuginfo_early_dump_file);
2771 /* Set dump_file to the debug dump file. */
2773 void
2774 debuginfo_start (void)
2776 set_dump_file (debuginfo_dump_file);
2779 /* Undo setting dump_file to the debug dump file. */
2781 void
2782 debuginfo_stop (void)
2784 set_dump_file (NULL);
2787 /* Set dump_file to the earlydebug dump file. */
2789 void
2790 debuginfo_early_start (void)
2792 set_dump_file (debuginfo_early_dump_file);
2795 /* Undo setting dump_file to the earlydebug dump file. */
2797 void
2798 debuginfo_early_stop (void)
2800 set_dump_file (NULL);
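/* Example of the intended use of these helpers: finalize_compilation_unit
   below brackets the call to debug_hooks->early_finish with
   debuginfo_early_start () / debuginfo_early_stop () so that any dump
   output produced while finishing early debug info goes to the
   .earlydebug dump file.  */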
2803 /* Analyze the whole compilation unit once it is parsed completely. */
2805 void
2806 symbol_table::finalize_compilation_unit (void)
2808 timevar_push (TV_CGRAPH);
2810 /* If we're here, there's no current function anymore. Some frontends
2811 are lazy about clearing these. */
2812 current_function_decl = NULL;
2813 set_cfun (NULL);
2815 /* Do not skip analyzing the functions if there were errors; otherwise
2816 we would miss diagnostics for the following functions. */
2818 /* Emit size functions we didn't inline. */
2819 finalize_size_functions ();
2821 /* Mark alias targets necessary and emit diagnostics. */
2822 handle_alias_pairs ();
2824 if (!quiet_flag)
2826 fprintf (stderr, "\nAnalyzing compilation unit\n");
2827 fflush (stderr);
2830 if (flag_dump_passes)
2831 dump_passes ();
2833 /* Gimplify and lower all functions, compute reachability and
2834 remove unreachable nodes. */
2835 analyze_functions (/*first_time=*/true);
2837 /* Mark alias targets necessary and emit diagnostics. */
2838 handle_alias_pairs ();
2840 /* Gimplify and lower thunks. */
2841 analyze_functions (/*first_time=*/false);
2843 /* Offloading requires LTO infrastructure. */
2844 if (!in_lto_p && g->have_offload)
2845 flag_generate_offload = 1;
2847 if (!seen_error ())
2849 /* Emit early debug for reachable functions and, as a consequence,
2850 locally scoped symbols. */
2851 struct cgraph_node *cnode;
2852 FOR_EACH_FUNCTION_WITH_GIMPLE_BODY (cnode)
2853 (*debug_hooks->early_global_decl) (cnode->decl);
2855 /* Clean up anything that needs cleaning up after initial debug
2856 generation. */
2857 debuginfo_early_start ();
2858 (*debug_hooks->early_finish) (main_input_filename);
2859 debuginfo_early_stop ();
2862 /* Finally drive the pass manager. */
2863 compile ();
2865 timevar_pop (TV_CGRAPH);
2868 /* Reset all state within cgraphunit.c so that we can rerun the compiler
2869 within the same process. For use by toplev::finalize. */
2871 void
2872 cgraphunit_c_finalize (void)
2874 gcc_assert (cgraph_new_nodes.length () == 0);
2875 cgraph_new_nodes.truncate (0);
2877 vtable_entry_type = NULL;
2878 queued_nodes = &symtab_terminator;
2880 first_analyzed = NULL;
2881 first_analyzed_var = NULL;
2884 /* Create a wrapper from this cgraph_node to the TARGET node. A thunk is
2885 used for this kind of wrapper method. */
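/* Conceptually (illustrative sketch), after create_wrapper (target) the
   node's body behaves as if it were

     rettype this_function (args...) { return target_function (args...); }

   built below as a gimple thunk with no `this' adjustment.  */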
2887 void
2888 cgraph_node::create_wrapper (cgraph_node *target)
2890 /* Preserve DECL_RESULT so we get the right by-reference flag. */
2891 tree decl_result = DECL_RESULT (decl);
2893 /* Remove the function's body but keep the arguments to be reused
2894 for the thunk. */
2895 release_body (true);
2896 reset ();
2898 DECL_UNINLINABLE (decl) = false;
2899 DECL_RESULT (decl) = decl_result;
2900 DECL_INITIAL (decl) = NULL;
2901 allocate_struct_function (decl, false);
2902 set_cfun (NULL);
2904 /* Turn the alias into a thunk and expand it into GIMPLE representation. */
2905 definition = true;
2907 memset (&thunk, 0, sizeof (cgraph_thunk_info));
2908 thunk.thunk_p = true;
2909 create_edge (target, NULL, count);
2910 callees->can_throw_external = !TREE_NOTHROW (target->decl);
2912 tree arguments = DECL_ARGUMENTS (decl);
2914 while (arguments)
2916 TREE_ADDRESSABLE (arguments) = false;
2917 arguments = TREE_CHAIN (arguments);
2920 expand_thunk (false, true);
2922 /* Inline summary set-up. */
2923 analyze ();
2924 inline_analyze_function (this);
2927 #include "gt-cgraphunit.h"