[official-gcc.git] / gcc / cgraphunit.c
/* Driver of optimization process
   Copyright (C) 2003-2018 Free Software Foundation, Inc.
   Contributed by Jan Hubicka

This file is part of GCC.

GCC is free software; you can redistribute it and/or modify it under
the terms of the GNU General Public License as published by the Free
Software Foundation; either version 3, or (at your option) any later
version.

GCC is distributed in the hope that it will be useful, but WITHOUT ANY
WARRANTY; without even the implied warranty of MERCHANTABILITY or
FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License
for more details.

You should have received a copy of the GNU General Public License
along with GCC; see the file COPYING3.  If not see
<http://www.gnu.org/licenses/>.  */
/* This module implements the main driver of the compilation process.

   The main scope of this file is to act as an interface between
   tree based frontends and the backend.

   The front-end is supposed to use the following functionality:

    - finalize_function

      This function is called once the front-end has parsed the whole body of
      a function and it is certain that neither the function body nor the
      declaration will change.

      (There is one exception needed for implementing the GCC extern inline
      function extension.)

    - varpool_finalize_decl

      This function has the same behavior as the above but is used for static
      variables.

    - add_asm_node

      Insert a new toplevel ASM statement.

    - finalize_compilation_unit

      This function is called once a (source level) compilation unit is
      finalized and it will no longer change.

      The symbol table is constructed starting from the trivially needed
      symbols finalized by the frontend.  Functions are lowered into
      GIMPLE representation and callgraph/reference lists are constructed.
      Those are used to discover other necessary functions and variables.

      At the end the bodies of unreachable functions are removed.

      The function can be called multiple times when multiple source level
      compilation units are combined.

    - compile

      This passes control to the back-end.  Optimizations are performed and
      final assembler is generated.  This is done in the following way.  Note
      that with link time optimization the process is split into three
      stages (compile time, linktime analysis and parallel linktime as
      indicated below).

      Compile time:

	1) Inter-procedural optimization.
	   (ipa_passes)

	   This part is further split into:

	   a) early optimizations.  These are local passes executed in
	      topological order on the callgraph.

	      The purpose of early optimizations is to optimize away simple
	      things that may otherwise confuse IP analysis.  Very simple
	      propagation across the callgraph is done, i.e. to discover
	      functions without side effects, and simple inlining is performed.

	   b) early small interprocedural passes.

	      Those are interprocedural passes executed only at compilation
	      time.  These include, for example, transactional memory lowering,
	      unreachable code removal and other simple transformations.

	   c) IP analysis stage.  All interprocedural passes do their
	      analysis.

	      Interprocedural passes differ from small interprocedural
	      passes by their ability to operate across the whole program
	      at linktime.  Their analysis stage is performed early to
	      both reduce linking times and linktime memory usage by
	      not having to represent the whole program in memory.

	   d) LTO streaming.  When doing LTO, everything important gets
	      streamed into the object file.

      Compile time and/or linktime analysis stage (WPA):

	   At linktime units get streamed back and the symbol table is
	   merged.  Function bodies are not streamed in and not
	   available.

	   e) IP propagation stage.  All IP passes execute their
	      IP propagation.  This is done based on the earlier analysis
	      without having function bodies at hand.

	   f) Ltrans streaming.  When doing WHOPR LTO, the program
	      is partitioned and streamed into multiple object files.

      Compile time and/or parallel linktime stage (ltrans):

	   Each of the object files is streamed back and compiled
	   separately.  Now the function bodies become available
	   again.

	2) Virtual clone materialization
	   (cgraph_materialize_clone)

	   IP passes can produce copies of existing functions (such
	   as versioned clones or inline clones) without actually
	   manipulating their bodies by creating virtual clones in
	   the callgraph.  At this time the virtual clones are
	   turned into real functions.

	3) IP transformation

	   All IP passes transform function bodies based on the earlier
	   decisions of the IP propagation.

	4) late small IP passes

	   Simple IP passes working within a single program partition.

	5) Expansion
	   (expand_all_functions)

	   At this stage functions that need to be output into
	   assembler are identified and compiled in topological order.

	6) Output of variables and aliases

	   Now it is known which variable references were not optimized
	   out and thus all variables are output to the file.

	   Note that with -fno-toplevel-reorder passes 5 and 6
	   are combined together in cgraph_output_in_order.

   Finally there are functions to manipulate the callgraph from the
   backend.

    - cgraph_add_new_function is used to add backend produced
      functions introduced after the unit is finalized.
      The functions are enqueued for later processing and inserted
      into the callgraph with cgraph_process_new_functions.

    - cgraph_function_versioning

      produces a copy of a function into a new one (a version)
      and applies simple transformations.  */
#include "config.h"
#include "system.h"
#include "coretypes.h"
#include "backend.h"
#include "target.h"
#include "rtl.h"
#include "tree.h"
#include "gimple.h"
#include "cfghooks.h"
#include "regset.h"     /* FIXME: For reg_obstack.  */
#include "alloc-pool.h"
#include "tree-pass.h"
#include "stringpool.h"
#include "gimple-ssa.h"
#include "cgraph.h"
#include "coverage.h"
#include "lto-streamer.h"
#include "fold-const.h"
#include "varasm.h"
#include "stor-layout.h"
#include "output.h"
#include "cfgcleanup.h"
#include "gimple-fold.h"
#include "gimplify.h"
#include "gimple-iterator.h"
#include "gimplify-me.h"
#include "tree-cfg.h"
#include "tree-into-ssa.h"
#include "tree-ssa.h"
#include "langhooks.h"
#include "toplev.h"
#include "debug.h"
#include "symbol-summary.h"
#include "tree-vrp.h"
#include "ipa-prop.h"
#include "gimple-pretty-print.h"
#include "plugin.h"
#include "ipa-fnsummary.h"
#include "ipa-utils.h"
#include "except.h"
#include "cfgloop.h"
#include "context.h"
#include "pass_manager.h"
#include "tree-nested.h"
#include "dbgcnt.h"
#include "lto-section-names.h"
#include "stringpool.h"
#include "attribs.h"
/* Queue of cgraph nodes scheduled to be added into cgraph.  This is a
   secondary queue used during optimization to accommodate passes that
   may generate new functions that need to be optimized and expanded.  */

vec<cgraph_node *> cgraph_new_nodes;

static void expand_all_functions (void);
static void mark_functions_to_output (void);
static void handle_alias_pairs (void);

/* Used for vtable lookup in thunk adjusting.  */
static GTY (()) tree vtable_entry_type;
/* Return true if this symbol is a function from the C frontend specified
   directly in RTL form (with "__RTL").  */

bool
symtab_node::native_rtl_p () const
{
  if (TREE_CODE (decl) != FUNCTION_DECL)
    return false;
  if (!DECL_STRUCT_FUNCTION (decl))
    return false;
  return DECL_STRUCT_FUNCTION (decl)->curr_properties & PROP_rtl;
}
/* Determine if the symbol declaration is needed.  That is, it is visible to
   something either outside this translation unit or something magic in the
   system configury.  */

bool
symtab_node::needed_p (void)
{
  /* Double check that no one output the function into the assembly file
     early.  */
  if (!native_rtl_p ())
    gcc_checking_assert
      (!DECL_ASSEMBLER_NAME_SET_P (decl)
       || !TREE_SYMBOL_REFERENCED (DECL_ASSEMBLER_NAME (decl)));

  if (!definition)
    return false;

  if (DECL_EXTERNAL (decl))
    return false;

  /* If the user told us it is used, then it must be so.  */
  if (force_output)
    return true;

  /* ABI forced symbols are needed when they are external.  */
  if (forced_by_abi && TREE_PUBLIC (decl))
    return true;

  /* Keep constructors, destructors and virtual functions.  */
  if (TREE_CODE (decl) == FUNCTION_DECL
      && (DECL_STATIC_CONSTRUCTOR (decl) || DECL_STATIC_DESTRUCTOR (decl)))
    return true;

  /* Externally visible variables must be output.  The exception is
     COMDAT variables that must be output only when they are needed.  */
  if (TREE_PUBLIC (decl) && !DECL_COMDAT (decl))
    return true;

  return false;
}
/* Head and terminator of the queue of nodes to be processed while building
   the callgraph.  */

static symtab_node symtab_terminator;
static symtab_node *queued_nodes = &symtab_terminator;

/* Add NODE to the queue starting at QUEUED_NODES.
   The queue is linked via AUX pointers and terminated by the
   symtab_terminator sentinel.  */

static void
enqueue_node (symtab_node *node)
{
  if (node->aux)
    return;
  gcc_checking_assert (queued_nodes);
  node->aux = queued_nodes;
  queued_nodes = node;
}
/* Process CGRAPH_NEW_FUNCTIONS and perform the actions necessary to add these
   functions into the callgraph in a way so they look like ordinary reachable
   functions inserted into the callgraph already at construction time.  */

void
symbol_table::process_new_functions (void)
{
  tree fndecl;

  if (!cgraph_new_nodes.exists ())
    return;

  handle_alias_pairs ();
  /* Note that this queue may grow as it is being processed, as the new
     functions may generate new ones.  */
  for (unsigned i = 0; i < cgraph_new_nodes.length (); i++)
    {
      cgraph_node *node = cgraph_new_nodes[i];
      fndecl = node->decl;
      switch (state)
	{
	case CONSTRUCTION:
	  /* At construction time we just need to finalize the function and
	     move it into the reachable functions list.  */

	  cgraph_node::finalize_function (fndecl, false);
	  call_cgraph_insertion_hooks (node);
	  enqueue_node (node);
	  break;

	case IPA:
	case IPA_SSA:
	case IPA_SSA_AFTER_INLINING:
	  /* When IPA optimization has already started, do all essential
	     transformations that have been already performed on the whole
	     cgraph but not on this function.  */

	  gimple_register_cfg_hooks ();
	  if (!node->analyzed)
	    node->analyze ();
	  push_cfun (DECL_STRUCT_FUNCTION (fndecl));
	  if ((state == IPA_SSA || state == IPA_SSA_AFTER_INLINING)
	      && !gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
	    {
	      bool summaried_computed = ipa_fn_summaries != NULL;
	      g->get_passes ()->execute_early_local_passes ();
	      /* Early passes compute inline parameters to do inlining
		 and splitting.  This is redundant for functions added late.
		 Just throw away whatever it did.  */
	      if (!summaried_computed)
		ipa_free_fn_summary ();
	    }
	  else if (ipa_fn_summaries != NULL)
	    compute_fn_summary (node, true);
	  free_dominance_info (CDI_POST_DOMINATORS);
	  free_dominance_info (CDI_DOMINATORS);
	  pop_cfun ();
	  call_cgraph_insertion_hooks (node);
	  break;

	case EXPANSION:
	  /* Functions created during expansion shall be compiled
	     directly.  */
	  node->process = 0;
	  call_cgraph_insertion_hooks (node);
	  node->expand ();
	  break;

	default:
	  gcc_unreachable ();
	  break;
	}
    }

  cgraph_new_nodes.release ();
}
/* As a GCC extension we allow redefinition of the function.  The
   semantics when both copies of bodies differ is not well defined.
   We replace the old body with the new body so in unit-at-a-time mode
   we always use the new body, while in normal mode we may end up with
   the old body inlined into some functions and the new body expanded and
   inlined in others.

   ??? It may make more sense to use one body for inlining and the other
   body for expanding the function but this is difficult to do.  */

void
cgraph_node::reset (void)
{
  /* If process is set, then we have already begun whole-unit analysis.
     This is *not* testing for whether we've already emitted the function.
     That case can be sort-of legitimately seen with real function redefinition
     errors.  I would argue that the front end should never present us with
     such a case, but don't enforce that for now.  */
  gcc_assert (!process);

  /* Reset our data structures so we can analyze the function again.  */
  memset (&local, 0, sizeof (local));
  memset (&global, 0, sizeof (global));
  memset (&rtl, 0, sizeof (rtl));
  analyzed = false;
  definition = false;
  alias = false;
  transparent_alias = false;
  weakref = false;
  cpp_implicit_alias = false;

  remove_callees ();
  remove_all_references ();
}
/* Return true when there are references to the node.  INCLUDE_SELF is
   true if a self reference counts as a reference.  */

bool
symtab_node::referred_to_p (bool include_self)
{
  ipa_ref *ref = NULL;

  /* See if there are any references at all.  */
  if (iterate_referring (0, ref))
    return true;
  /* For functions check also calls.  */
  cgraph_node *cn = dyn_cast <cgraph_node *> (this);
  if (cn && cn->callers)
    {
      if (include_self)
	return true;
      for (cgraph_edge *e = cn->callers; e; e = e->next_caller)
	if (e->caller != this)
	  return true;
    }
  return false;
}
/* DECL has been parsed.  Take it, queue it, compile it at the whim of the
   logic in effect.  If NO_COLLECT is true, then our caller cannot stand to
   have the garbage collector run at the moment.  We would need to either
   create a new GC context, or just not compile right now.  */

void
cgraph_node::finalize_function (tree decl, bool no_collect)
{
  cgraph_node *node = cgraph_node::get_create (decl);

  if (node->definition)
    {
      /* Nested functions should only be defined once.  */
      gcc_assert (!DECL_CONTEXT (decl)
		  || TREE_CODE (DECL_CONTEXT (decl)) != FUNCTION_DECL);
      node->reset ();
      node->local.redefined_extern_inline = true;
    }

  /* Set definition first before calling notice_global_symbol so that
     it is available to notice_global_symbol.  */
  node->definition = true;
  notice_global_symbol (decl);
  node->lowered = DECL_STRUCT_FUNCTION (decl)->cfg != NULL;
  if (!flag_toplevel_reorder)
    node->no_reorder = true;

  /* With -fkeep-inline-functions we are keeping all inline functions except
     for extern inline ones.  */
  if (flag_keep_inline_functions
      && DECL_DECLARED_INLINE_P (decl)
      && !DECL_EXTERNAL (decl)
      && !DECL_DISREGARD_INLINE_LIMITS (decl))
    node->force_output = 1;

  /* __RTL functions were already output as soon as they were parsed (due
     to the large amount of global state in the backend).
     Mark such functions as "force_output" to reflect the fact that they
     will be in the asm file when considering the symbols they reference.
     The attempt to output them later on will bail out immediately.  */
  if (node->native_rtl_p ())
    node->force_output = 1;

  /* When not optimizing, also output the static functions.  (See
     PR24561), but don't do so for always_inline functions, functions
     declared inline and nested functions.  These were optimized out
     in the original implementation and it is unclear whether we want
     to change the behavior here.  */
  if (((!opt_for_fn (decl, optimize) || flag_keep_static_functions
	|| node->no_reorder)
       && !node->cpp_implicit_alias
       && !DECL_DISREGARD_INLINE_LIMITS (decl)
       && !DECL_DECLARED_INLINE_P (decl)
       && !(DECL_CONTEXT (decl)
	    && TREE_CODE (DECL_CONTEXT (decl)) == FUNCTION_DECL))
      && !DECL_COMDAT (decl) && !DECL_EXTERNAL (decl))
    node->force_output = 1;

  /* If we've not yet emitted decl, tell the debug info about it.  */
  if (!TREE_ASM_WRITTEN (decl))
    (*debug_hooks->deferred_inline_function) (decl);

  if (!no_collect)
    ggc_collect ();

  if (symtab->state == CONSTRUCTION
      && (node->needed_p () || node->referred_to_p ()))
    enqueue_node (node);
}
/* Add the function FNDECL to the call graph.
   Unlike finalize_function, this function is intended to be used
   by the middle end and allows insertion of a new function at an arbitrary
   point of compilation.  The function can be either in high, low or SSA form
   GIMPLE.

   The function is assumed to be reachable and have its address taken (so no
   API breaking optimizations are performed on it).

   The main work done by this function is to enqueue the function for later
   processing to avoid the need for passes to be re-entrant.  */

void
cgraph_node::add_new_function (tree fndecl, bool lowered)
{
  gcc::pass_manager *passes = g->get_passes ();
  cgraph_node *node;

  if (dump_file)
    {
      struct function *fn = DECL_STRUCT_FUNCTION (fndecl);
      const char *function_type = ((gimple_has_body_p (fndecl))
				   ? (lowered
				      ? (gimple_in_ssa_p (fn)
					 ? "ssa gimple"
					 : "low gimple")
				      : "high gimple")
				   : "to-be-gimplified");
      fprintf (dump_file,
	       "Added new %s function %s to callgraph\n",
	       function_type,
	       fndecl_name (fndecl));
    }

  switch (symtab->state)
    {
    case PARSING:
      cgraph_node::finalize_function (fndecl, false);
      break;
    case CONSTRUCTION:
      /* Just enqueue the function to be processed at the nearest
	 occurrence.  */
      node = cgraph_node::get_create (fndecl);
      if (lowered)
	node->lowered = true;
      cgraph_new_nodes.safe_push (node);
      break;

    case IPA:
    case IPA_SSA:
    case IPA_SSA_AFTER_INLINING:
    case EXPANSION:
      /* Bring the function into finalized state and enqueue for later
	 analyzing and compilation.  */
      node = cgraph_node::get_create (fndecl);
      node->local.local = false;
      node->definition = true;
      node->force_output = true;
      if (TREE_PUBLIC (fndecl))
	node->externally_visible = true;
      if (!lowered && symtab->state == EXPANSION)
	{
	  push_cfun (DECL_STRUCT_FUNCTION (fndecl));
	  gimple_register_cfg_hooks ();
	  bitmap_obstack_initialize (NULL);
	  execute_pass_list (cfun, passes->all_lowering_passes);
	  passes->execute_early_local_passes ();
	  bitmap_obstack_release (NULL);
	  pop_cfun ();

	  lowered = true;
	}
      if (lowered)
	node->lowered = true;
      cgraph_new_nodes.safe_push (node);
      break;

    case FINISHED:
      /* At the very end of compilation we have to do all the work up
	 to expansion.  */
      node = cgraph_node::create (fndecl);
      if (lowered)
	node->lowered = true;
      node->definition = true;
      node->analyze ();
      push_cfun (DECL_STRUCT_FUNCTION (fndecl));
      gimple_register_cfg_hooks ();
      bitmap_obstack_initialize (NULL);
      if (!gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
	g->get_passes ()->execute_early_local_passes ();
      bitmap_obstack_release (NULL);
      pop_cfun ();
      node->expand ();
      break;

    default:
      gcc_unreachable ();
    }

  /* Set a personality if required and we already passed EH lowering.  */
  if (lowered
      && (function_needs_eh_personality (DECL_STRUCT_FUNCTION (fndecl))
	  == eh_personality_lang))
    DECL_FUNCTION_PERSONALITY (fndecl) = lang_hooks.eh_personality ();
}
/* Analyze the function scheduled to be output.  */

void
cgraph_node::analyze (void)
{
  if (native_rtl_p ())
    {
      analyzed = true;
      return;
    }

  tree decl = this->decl;
  location_t saved_loc = input_location;
  input_location = DECL_SOURCE_LOCATION (decl);

  if (thunk.thunk_p)
    {
      cgraph_node *t = cgraph_node::get (thunk.alias);

      create_edge (t, NULL, t->count);
      callees->can_throw_external = !TREE_NOTHROW (t->decl);
      /* Target code in expand_thunk may need the thunk's target
	 to be analyzed, so recurse here.  */
      if (!t->analyzed && t->definition)
	t->analyze ();
      if (t->alias)
	{
	  t = t->get_alias_target ();
	  if (!t->analyzed && t->definition)
	    t->analyze ();
	}
      bool ret = expand_thunk (false, false);
      thunk.alias = NULL;
      if (!ret)
	return;
    }
  if (alias)
    resolve_alias (cgraph_node::get (alias_target), transparent_alias);
  else if (dispatcher_function)
    {
      /* Generate the dispatcher body of multi-versioned functions.  */
      cgraph_function_version_info *dispatcher_version_info
	= function_version ();
      if (dispatcher_version_info != NULL
	  && (dispatcher_version_info->dispatcher_resolver
	      == NULL_TREE))
	{
	  tree resolver = NULL_TREE;
	  gcc_assert (targetm.generate_version_dispatcher_body);
	  resolver = targetm.generate_version_dispatcher_body (this);
	  gcc_assert (resolver != NULL_TREE);
	}
    }
  else
    {
      push_cfun (DECL_STRUCT_FUNCTION (decl));

      assign_assembler_name_if_needed (decl);

      /* Make sure to gimplify bodies only once.  During analyzing a
	 function we lower it, which will require gimplified nested
	 functions, so we can end up here with an already gimplified
	 body.  */
      if (!gimple_has_body_p (decl))
	gimplify_function_tree (decl);

      /* Lower the function.  */
      if (!lowered)
	{
	  if (nested)
	    lower_nested_functions (decl);
	  gcc_assert (!nested);

	  gimple_register_cfg_hooks ();
	  bitmap_obstack_initialize (NULL);
	  execute_pass_list (cfun, g->get_passes ()->all_lowering_passes);
	  free_dominance_info (CDI_POST_DOMINATORS);
	  free_dominance_info (CDI_DOMINATORS);
	  compact_blocks ();
	  bitmap_obstack_release (NULL);
	  lowered = true;
	}

      pop_cfun ();
    }
  analyzed = true;

  input_location = saved_loc;
}
/* The C++ frontend produces same body aliases all over the place, even before
   PCH gets streamed out.  It relies on us linking the aliases with their
   function in order to do the fixups, but ipa-ref is not PCH safe.
   Consequently we first produce aliases without links, but once the C++ FE is
   sure it won't stream PCH we build the links via this function.  */

void
symbol_table::process_same_body_aliases (void)
{
  symtab_node *node;
  FOR_EACH_SYMBOL (node)
    if (node->cpp_implicit_alias && !node->analyzed)
      node->resolve_alias
	(VAR_P (node->alias_target)
	 ? (symtab_node *)varpool_node::get_create (node->alias_target)
	 : (symtab_node *)cgraph_node::get_create (node->alias_target));
  cpp_implicit_aliases_done = true;
}
/* Process attributes common for vars and functions.  */

static void
process_common_attributes (symtab_node *node, tree decl)
{
  tree weakref = lookup_attribute ("weakref", DECL_ATTRIBUTES (decl));

  if (weakref && !lookup_attribute ("alias", DECL_ATTRIBUTES (decl)))
    {
      warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
		  "%<weakref%> attribute should be accompanied with"
		  " an %<alias%> attribute");
      DECL_WEAK (decl) = 0;
      DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
						 DECL_ATTRIBUTES (decl));
    }

  if (lookup_attribute ("no_reorder", DECL_ATTRIBUTES (decl)))
    node->no_reorder = 1;
}
/* Look for externally_visible and used attributes and mark cgraph nodes
   accordingly.

   We cannot mark the nodes at the point the attributes are processed (in
   handle_*_attribute) because the copy of the declarations available at that
   point may not be canonical.  For example, in:

    void f();
    void f() __attribute__((used));

   the declaration we see in handle_used_attribute will be the second
   declaration -- but the front end will subsequently merge that declaration
   with the original declaration and discard the second declaration.

   Furthermore, we can't mark these nodes in finalize_function because:

    void f() {}
    void f() __attribute__((externally_visible));

   is valid.

   So, we walk the nodes at the end of the translation unit, applying the
   attributes at that point.  */

static void
process_function_and_variable_attributes (cgraph_node *first,
					  varpool_node *first_var)
{
  cgraph_node *node;
  varpool_node *vnode;

  for (node = symtab->first_function (); node != first;
       node = symtab->next_function (node))
    {
      tree decl = node->decl;
      if (DECL_PRESERVE_P (decl))
	node->mark_force_output ();
      else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
	{
	  if (! TREE_PUBLIC (node->decl))
	    warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
			"%<externally_visible%>"
			" attribute have effect only on public objects");
	}
      if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
	  && (node->definition && !node->alias))
	{
	  warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
		      "%<weakref%> attribute ignored"
		      " because function is defined");
	  DECL_WEAK (decl) = 0;
	  DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
						     DECL_ATTRIBUTES (decl));
	}

      if (lookup_attribute ("always_inline", DECL_ATTRIBUTES (decl))
	  && !DECL_DECLARED_INLINE_P (decl)
	  /* redefining extern inline function makes it DECL_UNINLINABLE.  */
	  && !DECL_UNINLINABLE (decl))
	warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
		    "always_inline function might not be inlinable");

      process_common_attributes (node, decl);
    }
  for (vnode = symtab->first_variable (); vnode != first_var;
       vnode = symtab->next_variable (vnode))
    {
      tree decl = vnode->decl;
      if (DECL_EXTERNAL (decl)
	  && DECL_INITIAL (decl))
	varpool_node::finalize_decl (decl);
      if (DECL_PRESERVE_P (decl))
	vnode->force_output = true;
      else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
	{
	  if (! TREE_PUBLIC (vnode->decl))
	    warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
			"%<externally_visible%>"
			" attribute have effect only on public objects");
	}
      if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
	  && vnode->definition
	  && DECL_INITIAL (decl))
	{
	  warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
		      "%<weakref%> attribute ignored"
		      " because variable is initialized");
	  DECL_WEAK (decl) = 0;
	  DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
						     DECL_ATTRIBUTES (decl));
	}
      process_common_attributes (vnode, decl);
    }
}
/* Mark DECL as finalized.  By finalizing the declaration, the frontend
   instructs the middle end to output the variable to the asm file, if needed
   or externally visible.  */

void
varpool_node::finalize_decl (tree decl)
{
  varpool_node *node = varpool_node::get_create (decl);

  gcc_assert (TREE_STATIC (decl) || DECL_EXTERNAL (decl));

  if (node->definition)
    return;
  /* Set definition first before calling notice_global_symbol so that
     it is available to notice_global_symbol.  */
  node->definition = true;
  notice_global_symbol (decl);
  if (!flag_toplevel_reorder)
    node->no_reorder = true;
  if (TREE_THIS_VOLATILE (decl) || DECL_PRESERVE_P (decl)
      /* Traditionally we do not eliminate static variables when not
	 optimizing and when not doing toplevel reorder.  */
      || (node->no_reorder && !DECL_COMDAT (node->decl)
	  && !DECL_ARTIFICIAL (node->decl)))
    node->force_output = true;

  if (symtab->state == CONSTRUCTION
      && (node->needed_p () || node->referred_to_p ()))
    enqueue_node (node);
  if (symtab->state >= IPA_SSA)
    node->analyze ();
  /* Some frontends produce various interface variables after compilation
     finished.  */
  if (symtab->state == FINISHED
      || (node->no_reorder
	  && symtab->state == EXPANSION))
    node->assemble_decl ();
}
/* EDGE is a polymorphic call.  Mark all possible targets as reachable
   and if there is only one target, perform trivial devirtualization.
   REACHABLE_CALL_TARGETS collects target lists we already walked to
   avoid duplicate work.  */

static void
walk_polymorphic_call_targets (hash_set<void *> *reachable_call_targets,
			       cgraph_edge *edge)
{
  unsigned int i;
  void *cache_token;
  bool final;
  vec <cgraph_node *> targets
    = possible_polymorphic_call_targets
	(edge, &final, &cache_token);

  if (!reachable_call_targets->add (cache_token))
    {
      if (symtab->dump_file)
	dump_possible_polymorphic_call_targets
	  (symtab->dump_file, edge);

      for (i = 0; i < targets.length (); i++)
	{
	  /* Do not bother to mark virtual methods in anonymous namespace;
	     either we will find use of the virtual table defining it, or it
	     is unused.  */
	  if (targets[i]->definition
	      && TREE_CODE
		   (TREE_TYPE (targets[i]->decl))
		 == METHOD_TYPE
	      && !type_in_anonymous_namespace_p
		    (TYPE_METHOD_BASETYPE (TREE_TYPE (targets[i]->decl))))
	    enqueue_node (targets[i]);
	}
    }

  /* Very trivial devirtualization; when the type is
     final or anonymous (so we know all its derivation)
     and there is only one possible virtual call target,
     make the edge direct.  */
  if (final)
    {
      if (targets.length () <= 1 && dbg_cnt (devirt))
	{
	  cgraph_node *target;
	  if (targets.length () == 1)
	    target = targets[0];
	  else
	    target = cgraph_node::create
		       (builtin_decl_implicit (BUILT_IN_UNREACHABLE));

	  if (symtab->dump_file)
	    {
	      fprintf (symtab->dump_file,
		       "Devirtualizing call: ");
	      print_gimple_stmt (symtab->dump_file,
				 edge->call_stmt, 0,
				 TDF_SLIM);
	    }
	  if (dump_enabled_p ())
	    {
	      dump_printf_loc (MSG_OPTIMIZED_LOCATIONS, edge->call_stmt,
			       "devirtualizing call in %s to %s\n",
			       edge->caller->name (), target->name ());
	    }

	  edge->make_direct (target);
	  edge->redirect_call_stmt_to_callee ();

	  if (symtab->dump_file)
	    {
	      fprintf (symtab->dump_file,
		       "Devirtualized as: ");
	      print_gimple_stmt (symtab->dump_file,
				 edge->call_stmt, 0,
				 TDF_SLIM);
	    }
	}
    }
}
/* Issue appropriate warnings for the global declaration DECL.  */

static void
check_global_declaration (symtab_node *snode)
{
  const char *decl_file;
  tree decl = snode->decl;

  /* Warn about any function declared static but not defined.  We don't
     warn about variables, because many programs have static variables
     that exist only to get some text into the object file.  */
  if (TREE_CODE (decl) == FUNCTION_DECL
      && DECL_INITIAL (decl) == 0
      && DECL_EXTERNAL (decl)
      && ! DECL_ARTIFICIAL (decl)
      && ! TREE_NO_WARNING (decl)
      && ! TREE_PUBLIC (decl)
      && (warn_unused_function
	  || snode->referred_to_p (/*include_self=*/false)))
    {
      if (snode->referred_to_p (/*include_self=*/false))
	pedwarn (input_location, 0, "%q+F used but never defined", decl);
      else
	warning (OPT_Wunused_function, "%q+F declared %<static%> but never "
		 "defined", decl);
      /* This symbol is effectively an "extern" declaration now.  */
      TREE_PUBLIC (decl) = 1;
    }

  /* Warn about static fns or vars defined but not used.  */
  if (((warn_unused_function && TREE_CODE (decl) == FUNCTION_DECL)
       || (((warn_unused_variable && ! TREE_READONLY (decl))
	    || (warn_unused_const_variable > 0 && TREE_READONLY (decl)
		&& (warn_unused_const_variable == 2
		    || (main_input_filename != NULL
			&& (decl_file = DECL_SOURCE_FILE (decl)) != NULL
			&& filename_cmp (main_input_filename,
					 decl_file) == 0))))
	   && VAR_P (decl)))
      && ! DECL_IN_SYSTEM_HEADER (decl)
      && ! snode->referred_to_p (/*include_self=*/false)
      /* This TREE_USED check is needed in addition to referred_to_p
	 above, because the `__unused__' attribute is not being
	 considered for referred_to_p.  */
      && ! TREE_USED (decl)
      /* The TREE_USED bit for file-scope decls is kept in the identifier,
	 to handle multiple external decls in different scopes.  */
      && ! (DECL_NAME (decl) && TREE_USED (DECL_NAME (decl)))
      && ! DECL_EXTERNAL (decl)
      && ! DECL_ARTIFICIAL (decl)
      && ! DECL_ABSTRACT_ORIGIN (decl)
      && ! TREE_PUBLIC (decl)
      /* A volatile variable might be used in some non-obvious way.  */
      && (! VAR_P (decl) || ! TREE_THIS_VOLATILE (decl))
      /* Global register variables must be declared to reserve them.  */
      && ! (VAR_P (decl) && DECL_REGISTER (decl))
      /* Global ctors and dtors are called by the runtime.  */
      && (TREE_CODE (decl) != FUNCTION_DECL
	  || (!DECL_STATIC_CONSTRUCTOR (decl)
	      && !DECL_STATIC_DESTRUCTOR (decl)))
      /* Otherwise, ask the language.  */
      && lang_hooks.decls.warn_unused_global (decl))
    warning_at (DECL_SOURCE_LOCATION (decl),
		(TREE_CODE (decl) == FUNCTION_DECL)
		? OPT_Wunused_function
		: (TREE_READONLY (decl)
		   ? OPT_Wunused_const_variable_
		   : OPT_Wunused_variable),
		"%qD defined but not used", decl);
}
/* Discover all functions and variables that are trivially needed; analyze
   them as well as all functions and variables referred to by them.  */
static cgraph_node *first_analyzed;
static varpool_node *first_analyzed_var;

/* FIRST_TIME is set to TRUE the first time we are called for a
   translation unit from finalize_compilation_unit () and false
   otherwise.  */
static void
analyze_functions (bool first_time)
{
  /* Keep track of already processed nodes when called multiple times for
     intermodule optimization.  */
  cgraph_node *first_handled = first_analyzed;
  varpool_node *first_handled_var = first_analyzed_var;
  hash_set<void *> reachable_call_targets;

  symtab_node *node;
  symtab_node *next;
  int i;
  ipa_ref *ref;
  bool changed = true;
  location_t saved_loc = input_location;

  bitmap_obstack_initialize (NULL);
  symtab->state = CONSTRUCTION;
  input_location = UNKNOWN_LOCATION;

  /* Ugly, but the fixup cannot happen at the time the same-body alias is
     created; the C++ FE is confused about the COMDAT groups being right.  */
  if (symtab->cpp_implicit_aliases_done)
    FOR_EACH_SYMBOL (node)
      if (node->cpp_implicit_alias)
	node->fixup_same_cpp_alias_visibility (node->get_alias_target ());
  build_type_inheritance_graph ();

  /* Analysis adds static variables that in turn add references to new
     functions.  So we need to iterate the process until it stabilizes.  */
  while (changed)
    {
      changed = false;
      process_function_and_variable_attributes (first_analyzed,
						first_analyzed_var);

      /* First identify the trivially needed symbols.  */
      for (node = symtab->first_symbol ();
	   node != first_analyzed
	   && node != first_analyzed_var; node = node->next)
	{
	  /* Convert COMDAT group designators to IDENTIFIER_NODEs.  */
	  node->get_comdat_group_id ();
	  if (node->needed_p ())
	    {
	      enqueue_node (node);
	      if (!changed && symtab->dump_file)
		fprintf (symtab->dump_file, "Trivially needed symbols:");
	      changed = true;
	      if (symtab->dump_file)
		fprintf (symtab->dump_file, " %s", node->asm_name ());
	      if (!changed && symtab->dump_file)
		fprintf (symtab->dump_file, "\n");
	    }
	  if (node == first_analyzed
	      || node == first_analyzed_var)
	    break;
	}
      symtab->process_new_functions ();
      first_analyzed_var = symtab->first_variable ();
      first_analyzed = symtab->first_function ();

      if (changed && symtab->dump_file)
	fprintf (symtab->dump_file, "\n");

      /* Lower representation, build callgraph edges and references for all
	 trivially needed symbols and all symbols referred to by them.  */
      while (queued_nodes != &symtab_terminator)
	{
	  changed = true;
	  node = queued_nodes;
	  queued_nodes = (symtab_node *) queued_nodes->aux;
	  cgraph_node *cnode = dyn_cast <cgraph_node *> (node);
	  if (cnode && cnode->definition)
	    {
	      cgraph_edge *edge;
	      tree decl = cnode->decl;

	      /* ??? It is possible to create an extern inline function and
		 later use the weak alias attribute to kill its body.
		 See gcc.c-torture/compile/20011119-1.c  */
	      if (!DECL_STRUCT_FUNCTION (decl)
		  && !cnode->alias
		  && !cnode->thunk.thunk_p
		  && !cnode->dispatcher_function)
		{
		  cnode->reset ();
		  cnode->local.redefined_extern_inline = true;
		  continue;
		}

	      if (!cnode->analyzed)
		cnode->analyze ();

	      for (edge = cnode->callees; edge; edge = edge->next_callee)
		if (edge->callee->definition
		    && (!DECL_EXTERNAL (edge->callee->decl)
			/* When not optimizing, do not try to analyze extern
			   inline functions.  Doing so is pointless.  */
			|| opt_for_fn (edge->callee->decl, optimize)
			/* Weakrefs need to be preserved.  */
			|| edge->callee->alias
			/* always_inline functions are inlined even at -O0.  */
			|| lookup_attribute
			     ("always_inline",
			      DECL_ATTRIBUTES (edge->callee->decl))
			/* Multiversioned functions need the dispatcher to
			   be produced locally even for extern functions.  */
			|| edge->callee->function_version ()))
		  enqueue_node (edge->callee);
	      if (opt_for_fn (cnode->decl, optimize)
		  && opt_for_fn (cnode->decl, flag_devirtualize))
		{
		  cgraph_edge *next;

		  for (edge = cnode->indirect_calls; edge; edge = next)
		    {
		      next = edge->next_callee;
		      if (edge->indirect_info->polymorphic)
			walk_polymorphic_call_targets (&reachable_call_targets,
						       edge);
		    }
		}

	      /* If decl is a clone of an abstract function,
		 mark that abstract function so that we don't release its body.
		 The DECL_INITIAL() of that abstract function declaration
		 will be later needed to output debug info.  */
	      if (DECL_ABSTRACT_ORIGIN (decl))
		{
		  cgraph_node *origin_node
		    = cgraph_node::get_create (DECL_ABSTRACT_ORIGIN (decl));
		  origin_node->used_as_abstract_origin = true;
		}
	      /* Preserve a function's context node.  It will later be
		 needed to output debug info.  */
	      if (tree fn = decl_function_context (decl))
		{
		  cgraph_node *origin_node = cgraph_node::get_create (fn);
		  enqueue_node (origin_node);
		}
	    }
	  else
	    {
	      varpool_node *vnode = dyn_cast <varpool_node *> (node);
	      if (vnode && vnode->definition && !vnode->analyzed)
		vnode->analyze ();
	    }

	  if (node->same_comdat_group)
	    {
	      symtab_node *next;
	      for (next = node->same_comdat_group;
		   next != node;
		   next = next->same_comdat_group)
		if (!next->comdat_local_p ())
		  enqueue_node (next);
	    }
	  for (i = 0; node->iterate_reference (i, ref); i++)
	    if (ref->referred->definition
		&& (!DECL_EXTERNAL (ref->referred->decl)
		    || ((TREE_CODE (ref->referred->decl) != FUNCTION_DECL
			 && optimize)
			|| (TREE_CODE (ref->referred->decl) == FUNCTION_DECL
			    && opt_for_fn (ref->referred->decl, optimize))
			|| node->alias
			|| ref->referred->alias)))
	      enqueue_node (ref->referred);
	  symtab->process_new_functions ();
	}
    }
  update_type_inheritance_graph ();

  /* Collect entry points to the unit.  */
  if (symtab->dump_file)
    {
      fprintf (symtab->dump_file, "\n\nInitial ");
      symtab->dump (symtab->dump_file);
    }

  if (first_time)
    {
      symtab_node *snode;
      FOR_EACH_SYMBOL (snode)
	check_global_declaration (snode);
    }

  if (symtab->dump_file)
    fprintf (symtab->dump_file, "\nRemoving unused symbols:");

  for (node = symtab->first_symbol ();
       node != first_handled
       && node != first_handled_var; node = next)
    {
      next = node->next;
      if (!node->aux && !node->referred_to_p ())
	{
	  if (symtab->dump_file)
	    fprintf (symtab->dump_file, " %s", node->name ());

	  /* See if the debugger can use anything before the DECL
	     passes away.  Perhaps it can notice a DECL that is now a
	     constant and can tag the early DIE with an appropriate
	     attribute.

	     Otherwise, this is the last chance the debug_hooks have
	     at looking at optimized away DECLs, since
	     late_global_decl will subsequently be called from the
	     contents of the now pruned symbol table.  */
	  if (VAR_P (node->decl)
	      && !decl_function_context (node->decl))
	    {
	      /* We are reclaiming totally unreachable code and variables
		 so they effectively appear as readonly.  Show that to
		 the debug machinery.  */
	      TREE_READONLY (node->decl) = 1;
	      node->definition = false;
	      (*debug_hooks->late_global_decl) (node->decl);
	    }

	  node->remove ();
	  continue;
	}
      if (cgraph_node *cnode = dyn_cast <cgraph_node *> (node))
	{
	  tree decl = node->decl;

	  if (cnode->definition && !gimple_has_body_p (decl)
	      && !cnode->alias
	      && !cnode->thunk.thunk_p)
	    cnode->reset ();

	  gcc_assert (!cnode->definition || cnode->thunk.thunk_p
		      || cnode->alias
		      || gimple_has_body_p (decl)
		      || cnode->native_rtl_p ());
	  gcc_assert (cnode->analyzed == cnode->definition);
	}
      node->aux = NULL;
    }
  for (; node; node = node->next)
    node->aux = NULL;
  first_analyzed = symtab->first_function ();
  first_analyzed_var = symtab->first_variable ();
  if (symtab->dump_file)
    {
      fprintf (symtab->dump_file, "\n\nReclaimed ");
      symtab->dump (symtab->dump_file);
    }
  bitmap_obstack_release (NULL);
  ggc_collect ();
  /* Initialize assembler name hash, in particular we want to trigger C++
     mangling and same body alias creation before we free DECL_ARGUMENTS
     used by it.  */
  if (!seen_error ())
    symtab->symtab_initialize_asm_name_hash ();

  input_location = saved_loc;
}
/* Check the declaration of the type of ALIAS for compatibility with its
   TARGET (which may be an ifunc resolver) and issue a diagnostic when they
   are not compatible according to language rules (plus a C++ extension for
   non-static member functions).  */

static void
maybe_diag_incompatible_alias (tree alias, tree target)
{
  tree altype = TREE_TYPE (alias);
  tree targtype = TREE_TYPE (target);

  bool ifunc = cgraph_node::get (alias)->ifunc_resolver;
  tree funcptr = altype;

  if (ifunc)
    {
      /* Handle attribute ifunc first.  */
      if (TREE_CODE (altype) == METHOD_TYPE)
	{
	  /* Set FUNCPTR to the type of the alias target.  If the type
	     is a non-static member function of class C, construct a type
	     of an ordinary function taking C* as the first argument,
	     followed by the member function argument list, and use it
	     instead to check for incompatibility.  This conversion is
	     not defined by the language but an extension provided by
	     G++.  */

	  tree rettype = TREE_TYPE (altype);
	  tree args = TYPE_ARG_TYPES (altype);
	  altype = build_function_type (rettype, args);
	  funcptr = altype;
	}

      targtype = TREE_TYPE (targtype);

      if (POINTER_TYPE_P (targtype))
	{
	  targtype = TREE_TYPE (targtype);

	  /* Only issue Wattribute-alias for conversions to void* with
	     -Wextra.  */
	  if (VOID_TYPE_P (targtype) && !extra_warnings)
	    return;

	  /* Proceed to handle incompatible ifunc resolvers below.  */
	}
      else
	{
	  funcptr = build_pointer_type (funcptr);

	  error_at (DECL_SOURCE_LOCATION (target),
		    "%<ifunc%> resolver for %qD must return %qT",
		    alias, funcptr);
	  inform (DECL_SOURCE_LOCATION (alias),
		  "resolver indirect function declared here");
	  return;
	}
    }

  if ((!FUNC_OR_METHOD_TYPE_P (targtype)
       || (prototype_p (altype)
	   && prototype_p (targtype)
	   && !types_compatible_p (altype, targtype))))
    {
      /* Warn for incompatibilities.  Avoid warning for functions
	 without a prototype to make it possible to declare aliases
	 without knowing the exact type, as libstdc++ does.  */
      if (ifunc)
	{
	  funcptr = build_pointer_type (funcptr);

	  auto_diagnostic_group d;
	  if (warning_at (DECL_SOURCE_LOCATION (target),
			  OPT_Wattribute_alias,
			  "%<ifunc%> resolver for %qD should return %qT",
			  alias, funcptr))
	    inform (DECL_SOURCE_LOCATION (alias),
		    "resolver indirect function declared here");
	}
      else
	{
	  auto_diagnostic_group d;
	  if (warning_at (DECL_SOURCE_LOCATION (alias),
			  OPT_Wattribute_alias,
			  "%qD alias between functions of incompatible "
			  "types %qT and %qT", alias, altype, targtype))
	    inform (DECL_SOURCE_LOCATION (target),
		    "aliased declaration here");
	}
    }
}
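/* For reference, a well-formed ifunc resolver as validated above looks
   like this (an illustrative sketch, not code used by GCC itself):

     static void f_impl (void) { }

     static void (*resolve_f (void)) (void)
     {
       return f_impl;
     }

     void f (void) __attribute__ ((ifunc ("resolve_f")));

   The resolver must return a pointer to a function type compatible with
   F; otherwise the routine above issues an error or a -Wattribute-alias
   warning.  */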
/* Translate the ugly representation of aliases as alias pairs into the nice
   representation in the callgraph.  We don't handle all cases yet,
   unfortunately.  */

static void
handle_alias_pairs (void)
{
  alias_pair *p;
  unsigned i;

  for (i = 0; alias_pairs && alias_pairs->iterate (i, &p);)
    {
      symtab_node *target_node = symtab_node::get_for_asmname (p->target);

      /* Weakrefs with a target not defined in the current unit are easy to
	 handle: they behave just like external variables except we need to
	 note the alias flag to later output the weakref pseudo-op into the
	 asm file.  */
      if (!target_node
	  && lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)) != NULL)
	{
	  symtab_node *node = symtab_node::get (p->decl);
	  if (node)
	    {
	      node->alias_target = p->target;
	      node->weakref = true;
	      node->alias = true;
	      node->transparent_alias = true;
	    }
	  alias_pairs->unordered_remove (i);
	  continue;
	}
      else if (!target_node)
	{
	  error ("%q+D aliased to undefined symbol %qE", p->decl, p->target);
	  symtab_node *node = symtab_node::get (p->decl);
	  if (node)
	    node->alias = false;
	  alias_pairs->unordered_remove (i);
	  continue;
	}

      if (DECL_EXTERNAL (target_node->decl)
	  /* We use local aliases for C++ thunks to force the tailcall
	     to bind locally.  This is a hack; to keep it working, do
	     the following (which is not strictly correct).  */
	  && (TREE_CODE (target_node->decl) != FUNCTION_DECL
	      || ! DECL_VIRTUAL_P (target_node->decl))
	  && ! lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)))
	{
	  error ("%q+D aliased to external symbol %qE",
		 p->decl, p->target);
	}

      if (TREE_CODE (p->decl) == FUNCTION_DECL
	  && target_node && is_a <cgraph_node *> (target_node))
	{
	  maybe_diag_incompatible_alias (p->decl, target_node->decl);

	  cgraph_node *src_node = cgraph_node::get (p->decl);
	  if (src_node && src_node->definition)
	    src_node->reset ();
	  cgraph_node::create_alias (p->decl, target_node->decl);
	  alias_pairs->unordered_remove (i);
	}
      else if (VAR_P (p->decl)
	       && target_node && is_a <varpool_node *> (target_node))
	{
	  varpool_node::create_alias (p->decl, target_node->decl);
	  alias_pairs->unordered_remove (i);
	}
      else
	{
	  error ("%q+D alias between function and variable is not supported",
		 p->decl);
	  inform (DECL_SOURCE_LOCATION (target_node->decl),
		  "aliased declaration here");

	  alias_pairs->unordered_remove (i);
	}
    }
  vec_free (alias_pairs);
}
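/* For reference, a source-level alias as processed above looks like this
   (an illustrative sketch, not code used by GCC itself):

     static int f_impl (void) { return 42; }
     int f (void) __attribute__ ((alias ("f_impl")));

   The front end records the pair (f, "f_impl") in ALIAS_PAIRS; the loop
   above resolves the assembler name "f_impl" to its symbol table node and
   converts the pair into a cgraph or varpool alias, diagnosing undefined
   or external targets.  */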
/* Figure out what functions we want to assemble.  */

static void
mark_functions_to_output (void)
{
  bool check_same_comdat_groups = false;
  cgraph_node *node;

  if (flag_checking)
    FOR_EACH_FUNCTION (node)
      gcc_assert (!node->process);

  FOR_EACH_FUNCTION (node)
    {
      tree decl = node->decl;

      gcc_assert (!node->process || node->same_comdat_group);
      if (node->process)
	continue;

      /* We need to output all local functions that are used and not
	 always inlined, as well as those that are reachable from
	 outside the current compilation unit.  */
      if (node->analyzed
	  && !node->thunk.thunk_p
	  && !node->alias
	  && !node->global.inlined_to
	  && !TREE_ASM_WRITTEN (decl)
	  && !DECL_EXTERNAL (decl))
	{
	  node->process = 1;
	  if (node->same_comdat_group)
	    {
	      cgraph_node *next;
	      for (next = dyn_cast<cgraph_node *> (node->same_comdat_group);
		   next != node;
		   next = dyn_cast<cgraph_node *> (next->same_comdat_group))
		if (!next->thunk.thunk_p && !next->alias
		    && !next->comdat_local_p ())
		  next->process = 1;
	    }
	}
      else if (node->same_comdat_group)
	{
	  if (flag_checking)
	    check_same_comdat_groups = true;
	}
      else
	{
	  /* We should've reclaimed all functions that are not needed.  */
	  if (flag_checking
	      && !node->global.inlined_to
	      && gimple_has_body_p (decl)
	      /* FIXME: in an ltrans unit when the offline copy is outside a
		 partition but inline copies are inside a partition, we can
		 end up not removing the body since we no longer have an
		 analyzed node pointing to it.  */
	      && !node->in_other_partition
	      && !node->alias
	      && !node->clones
	      && !DECL_EXTERNAL (decl))
	    {
	      node->debug ();
	      internal_error ("failed to reclaim unneeded function");
	    }
	  gcc_assert (node->global.inlined_to
		      || !gimple_has_body_p (decl)
		      || node->in_other_partition
		      || node->clones
		      || DECL_ARTIFICIAL (decl)
		      || DECL_EXTERNAL (decl));
	}
    }

  if (flag_checking && check_same_comdat_groups)
    FOR_EACH_FUNCTION (node)
      if (node->same_comdat_group && !node->process)
	{
	  tree decl = node->decl;
	  if (!node->global.inlined_to
	      && gimple_has_body_p (decl)
	      /* FIXME: in an ltrans unit when the offline copy is outside a
		 partition but inline copies are inside a partition, we can
		 end up not removing the body since we no longer have an
		 analyzed node pointing to it.  */
	      && !node->in_other_partition
	      && !node->clones
	      && !DECL_EXTERNAL (decl))
	    {
	      node->debug ();
	      internal_error ("failed to reclaim unneeded function in same "
			      "comdat group");
	    }
	}
}
/* DECL is a FUNCTION_DECL.  Initialize datastructures so DECL is a function
   in lowered gimple form.  IN_SSA is true if the gimple is in SSA.

   Set current_function_decl and cfun to a newly constructed empty function
   body.  Return the basic block in the function body.  */

basic_block
init_lowered_empty_function (tree decl, bool in_ssa, profile_count count)
{
  basic_block bb;
  edge e;

  current_function_decl = decl;
  allocate_struct_function (decl, false);
  gimple_register_cfg_hooks ();
  init_empty_tree_cfg ();
  init_tree_ssa (cfun);

  if (in_ssa)
    {
      init_ssa_operands (cfun);
      cfun->gimple_df->in_ssa_p = true;
      cfun->curr_properties |= PROP_ssa;
    }

  DECL_INITIAL (decl) = make_node (BLOCK);
  BLOCK_SUPERCONTEXT (DECL_INITIAL (decl)) = decl;

  DECL_SAVED_TREE (decl) = error_mark_node;
  cfun->curr_properties |= (PROP_gimple_lcf | PROP_gimple_leh | PROP_gimple_any
			    | PROP_cfg | PROP_loops);

  set_loops_for_fn (cfun, ggc_cleared_alloc<loops> ());
  init_loops_structure (cfun, loops_for_fn (cfun), 1);
  loops_for_fn (cfun)->state |= LOOPS_MAY_HAVE_MULTIPLE_LATCHES;

  /* Create a BB for the body of the function and connect it properly.  */
  ENTRY_BLOCK_PTR_FOR_FN (cfun)->count = count;
  EXIT_BLOCK_PTR_FOR_FN (cfun)->count = count;
  bb = create_basic_block (NULL, ENTRY_BLOCK_PTR_FOR_FN (cfun));
  bb->count = count;
  e = make_edge (ENTRY_BLOCK_PTR_FOR_FN (cfun), bb, EDGE_FALLTHRU);
  e->probability = profile_probability::always ();
  e = make_edge (bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
  e->probability = profile_probability::always ();
  add_bb_to_loop (bb, ENTRY_BLOCK_PTR_FOR_FN (cfun)->loop_father);

  return bb;
}
/* Adjust PTR by the constant FIXED_OFFSET, by the vtable offset indicated by
   VIRTUAL_OFFSET, and by the indirect offset indicated by INDIRECT_OFFSET, if
   it is non-null.  THIS_ADJUSTING is nonzero for a this-adjusting thunk and
   zero for a result-adjusting thunk.  */

tree
thunk_adjust (gimple_stmt_iterator *bsi,
	      tree ptr, bool this_adjusting,
	      HOST_WIDE_INT fixed_offset, tree virtual_offset,
	      HOST_WIDE_INT indirect_offset)
{
  gassign *stmt;
  tree ret;

  if (this_adjusting
      && fixed_offset != 0)
    {
      stmt = gimple_build_assign
	       (ptr, fold_build_pointer_plus_hwi_loc (input_location,
						      ptr,
						      fixed_offset));
      gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
    }

  if (!vtable_entry_type && (virtual_offset || indirect_offset != 0))
    {
      tree vfunc_type = make_node (FUNCTION_TYPE);
      TREE_TYPE (vfunc_type) = integer_type_node;
      TYPE_ARG_TYPES (vfunc_type) = NULL_TREE;
      layout_type (vfunc_type);

      vtable_entry_type = build_pointer_type (vfunc_type);
    }

  /* If there's a virtual offset, look up that value in the vtable and
     adjust the pointer again.  */
  if (virtual_offset)
    {
      tree vtabletmp;
      tree vtabletmp2;
      tree vtabletmp3;

      vtabletmp
	= create_tmp_reg (build_pointer_type
			  (build_pointer_type (vtable_entry_type)), "vptr");

      /* The vptr is always at offset zero in the object.  */
      stmt = gimple_build_assign (vtabletmp,
				  build1 (NOP_EXPR, TREE_TYPE (vtabletmp),
					  ptr));
      gsi_insert_after (bsi, stmt, GSI_NEW_STMT);

      /* Form the vtable address.  */
      vtabletmp2 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp)),
				   "vtableaddr");
      stmt = gimple_build_assign (vtabletmp2,
				  build_simple_mem_ref (vtabletmp));
      gsi_insert_after (bsi, stmt, GSI_NEW_STMT);

      /* Find the entry with the vcall offset.  */
      stmt = gimple_build_assign (vtabletmp2,
				  fold_build_pointer_plus_loc (input_location,
							       vtabletmp2,
							       virtual_offset));
      gsi_insert_after (bsi, stmt, GSI_NEW_STMT);

      /* Get the offset itself.  */
      vtabletmp3 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp2)),
				   "vcalloffset");
      stmt = gimple_build_assign (vtabletmp3,
				  build_simple_mem_ref (vtabletmp2));
      gsi_insert_after (bsi, stmt, GSI_NEW_STMT);

      /* Adjust the `this' pointer.  */
      ptr = fold_build_pointer_plus_loc (input_location, ptr, vtabletmp3);
      ptr = force_gimple_operand_gsi (bsi, ptr, true, NULL_TREE, false,
				      GSI_CONTINUE_LINKING);
    }

  /* Likewise for an offset that is stored in the object that contains the
     vtable.  */
  if (indirect_offset != 0)
    {
      tree offset_ptr, offset_tree;

      /* Get the address of the offset.  */
      offset_ptr
	= create_tmp_reg (build_pointer_type
			  (build_pointer_type (vtable_entry_type)),
			  "offset_ptr");
      stmt = gimple_build_assign (offset_ptr,
				  build1 (NOP_EXPR, TREE_TYPE (offset_ptr),
					  ptr));
      gsi_insert_after (bsi, stmt, GSI_NEW_STMT);

      stmt = gimple_build_assign
	       (offset_ptr,
		fold_build_pointer_plus_hwi_loc (input_location, offset_ptr,
						 indirect_offset));
      gsi_insert_after (bsi, stmt, GSI_NEW_STMT);

      /* Get the offset itself.  */
      offset_tree = create_tmp_reg (TREE_TYPE (TREE_TYPE (offset_ptr)),
				    "offset");
      stmt = gimple_build_assign (offset_tree,
				  build_simple_mem_ref (offset_ptr));
      gsi_insert_after (bsi, stmt, GSI_NEW_STMT);

      /* Adjust the `this' pointer.  */
      ptr = fold_build_pointer_plus_loc (input_location, ptr, offset_tree);
      ptr = force_gimple_operand_gsi (bsi, ptr, true, NULL_TREE, false,
				      GSI_CONTINUE_LINKING);
    }

  if (!this_adjusting
      && fixed_offset != 0)
    /* Adjust the pointer by the constant.  */
    {
      tree ptrtmp;

      if (VAR_P (ptr))
	ptrtmp = ptr;
      else
	{
	  ptrtmp = create_tmp_reg (TREE_TYPE (ptr), "ptr");
	  stmt = gimple_build_assign (ptrtmp, ptr);
	  gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
	}
      ptr = fold_build_pointer_plus_hwi_loc (input_location,
					     ptrtmp, fixed_offset);
    }

  /* Emit the statement and gimplify the adjustment expression.  */
  ret = create_tmp_reg (TREE_TYPE (ptr), "adjusted_this");
  stmt = gimple_build_assign (ret, ptr);
  gsi_insert_after (bsi, stmt, GSI_NEW_STMT);

  return ret;
}
/* Expand thunk NODE to gimple if possible.
   When FORCE_GIMPLE_THUNK is true, a gimple thunk is created and
   no assembler is produced.
   When OUTPUT_ASM_THUNKS is true, also produce assembler for
   thunks that are not lowered.  */

bool
cgraph_node::expand_thunk (bool output_asm_thunks, bool force_gimple_thunk)
{
  bool this_adjusting = thunk.this_adjusting;
  HOST_WIDE_INT fixed_offset = thunk.fixed_offset;
  HOST_WIDE_INT virtual_value = thunk.virtual_value;
  HOST_WIDE_INT indirect_offset = thunk.indirect_offset;
  tree virtual_offset = NULL;
  tree alias = callees->callee->decl;
  tree thunk_fndecl = decl;
  tree a;

  /* An instrumentation thunk is the same function with
     a different signature.  We never need to expand it.  */
  if (thunk.add_pointer_bounds_args)
    return false;

  if (!force_gimple_thunk
      && this_adjusting
      && indirect_offset == 0
      && !DECL_EXTERNAL (alias)
      && !DECL_STATIC_CHAIN (alias)
      && targetm.asm_out.can_output_mi_thunk (thunk_fndecl, fixed_offset,
					      virtual_value, alias))
    {
      const char *fnname;
      tree fn_block;
      tree restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));

      if (!output_asm_thunks)
	{
	  analyzed = true;
	  return false;
	}

      if (in_lto_p)
	get_untransformed_body ();
      a = DECL_ARGUMENTS (thunk_fndecl);

      current_function_decl = thunk_fndecl;

      /* Ensure thunks are emitted in their correct sections.  */
      resolve_unique_section (thunk_fndecl, 0,
			      flag_function_sections);

      DECL_RESULT (thunk_fndecl)
	= build_decl (DECL_SOURCE_LOCATION (thunk_fndecl),
		      RESULT_DECL, 0, restype);
      DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
      fnname = IDENTIFIER_POINTER (DECL_ASSEMBLER_NAME (thunk_fndecl));

      /* The back end expects DECL_INITIAL to contain a BLOCK, so we
	 create one.  */
      fn_block = make_node (BLOCK);
      BLOCK_VARS (fn_block) = a;
      DECL_INITIAL (thunk_fndecl) = fn_block;
      BLOCK_SUPERCONTEXT (fn_block) = thunk_fndecl;
      allocate_struct_function (thunk_fndecl, false);
      init_function_start (thunk_fndecl);
      cfun->is_thunk = 1;
      insn_locations_init ();
      set_curr_insn_location (DECL_SOURCE_LOCATION (thunk_fndecl));
      prologue_location = curr_insn_location ();
      assemble_start_function (thunk_fndecl, fnname);

      targetm.asm_out.output_mi_thunk (asm_out_file, thunk_fndecl,
				       fixed_offset, virtual_value, alias);

      assemble_end_function (thunk_fndecl, fnname);
      insn_locations_finalize ();
      init_insn_lengths ();
      free_after_compilation (cfun);
      TREE_ASM_WRITTEN (thunk_fndecl) = 1;
      thunk.thunk_p = false;
      analyzed = false;
    }
  else if (stdarg_p (TREE_TYPE (thunk_fndecl)))
    {
      error ("generic thunk code fails for method %qD which uses %<...%>",
	     thunk_fndecl);
      TREE_ASM_WRITTEN (thunk_fndecl) = 1;
      analyzed = true;
      return false;
    }
  else
    {
      tree restype;
      basic_block bb, then_bb, else_bb, return_bb;
      gimple_stmt_iterator bsi;
      int nargs = 0;
      tree arg;
      int i;
      tree resdecl;
      tree restmp = NULL;

      gcall *call;
      greturn *ret;
      bool alias_is_noreturn = TREE_THIS_VOLATILE (alias);

      /* We may be called from expand_thunk, which releases the body except
	 for DECL_ARGUMENTS.  In this case force_gimple_thunk is true.  */
      if (in_lto_p && !force_gimple_thunk)
	get_untransformed_body ();
      a = DECL_ARGUMENTS (thunk_fndecl);

      current_function_decl = thunk_fndecl;

      /* Ensure thunks are emitted in their correct sections.  */
      resolve_unique_section (thunk_fndecl, 0,
			      flag_function_sections);

      DECL_IGNORED_P (thunk_fndecl) = 1;
      bitmap_obstack_initialize (NULL);

      if (thunk.virtual_offset_p)
	virtual_offset = size_int (virtual_value);

      /* Build the return declaration for the function.  */
      restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
      if (DECL_RESULT (thunk_fndecl) == NULL_TREE)
	{
	  resdecl = build_decl (input_location, RESULT_DECL, 0, restype);
	  DECL_ARTIFICIAL (resdecl) = 1;
	  DECL_IGNORED_P (resdecl) = 1;
	  DECL_CONTEXT (resdecl) = thunk_fndecl;
	  DECL_RESULT (thunk_fndecl) = resdecl;
	}
      else
	resdecl = DECL_RESULT (thunk_fndecl);

      profile_count cfg_count = count;
      if (!cfg_count.initialized_p ())
	cfg_count = profile_count::from_gcov_type (BB_FREQ_MAX).guessed_local ();

      bb = then_bb = else_bb = return_bb
	= init_lowered_empty_function (thunk_fndecl, true, cfg_count);

      bsi = gsi_start_bb (bb);

      /* Build the call to the function being thunked.  */
      if (!VOID_TYPE_P (restype)
	  && (!alias_is_noreturn
	      || TREE_ADDRESSABLE (restype)
	      || TREE_CODE (TYPE_SIZE_UNIT (restype)) != INTEGER_CST))
	{
	  if (DECL_BY_REFERENCE (resdecl))
	    {
	      restmp = gimple_fold_indirect_ref (resdecl);
	      if (!restmp)
		restmp = build2 (MEM_REF,
				 TREE_TYPE (TREE_TYPE (DECL_RESULT (alias))),
				 resdecl,
				 build_int_cst (TREE_TYPE
						(DECL_RESULT (alias)), 0));
	    }
	  else if (!is_gimple_reg_type (restype))
	    {
	      if (aggregate_value_p (resdecl, TREE_TYPE (thunk_fndecl)))
		{
		  restmp = resdecl;

		  if (VAR_P (restmp))
		    {
		      add_local_decl (cfun, restmp);
		      BLOCK_VARS (DECL_INITIAL (current_function_decl))
			= restmp;
		    }
		}
	      else
		restmp = create_tmp_var (restype, "retval");
	    }
	  else
	    restmp = create_tmp_reg (restype, "retval");
	}

      for (arg = a; arg; arg = DECL_CHAIN (arg))
	nargs++;
      auto_vec<tree> vargs (nargs);
      i = 0;
      arg = a;
      if (this_adjusting)
	{
	  vargs.quick_push (thunk_adjust (&bsi, a, 1, fixed_offset,
					  virtual_offset, indirect_offset));
	  arg = DECL_CHAIN (a);
	  i = 1;
	}

      if (nargs)
	for (; i < nargs; i++, arg = DECL_CHAIN (arg))
	  {
	    tree tmp = arg;
	    if (VECTOR_TYPE_P (TREE_TYPE (arg))
		|| TREE_CODE (TREE_TYPE (arg)) == COMPLEX_TYPE)
	      DECL_GIMPLE_REG_P (arg) = 1;

	    if (!is_gimple_val (arg))
	      {
		tmp = create_tmp_reg (TYPE_MAIN_VARIANT
				      (TREE_TYPE (arg)), "arg");
		gimple *stmt = gimple_build_assign (tmp, arg);
		gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
	      }
	    vargs.quick_push (tmp);
	  }
      call = gimple_build_call_vec (build_fold_addr_expr_loc (0, alias), vargs);
      callees->call_stmt = call;
      gimple_call_set_from_thunk (call, true);
      if (DECL_STATIC_CHAIN (alias))
	{
	  tree p = DECL_STRUCT_FUNCTION (alias)->static_chain_decl;
	  tree type = TREE_TYPE (p);
	  tree decl = build_decl (DECL_SOURCE_LOCATION (thunk_fndecl),
				  PARM_DECL, create_tmp_var_name ("CHAIN"),
				  type);
	  DECL_ARTIFICIAL (decl) = 1;
	  DECL_IGNORED_P (decl) = 1;
	  TREE_USED (decl) = 1;
	  DECL_CONTEXT (decl) = thunk_fndecl;
	  DECL_ARG_TYPE (decl) = type;
	  TREE_READONLY (decl) = 1;

	  struct function *sf = DECL_STRUCT_FUNCTION (thunk_fndecl);
	  sf->static_chain_decl = decl;

	  gimple_call_set_chain (call, decl);
	}

      /* Return slot optimization is always possible and in fact required to
	 return values with DECL_BY_REFERENCE.  */
      if (aggregate_value_p (resdecl, TREE_TYPE (thunk_fndecl))
	  && (!is_gimple_reg_type (TREE_TYPE (resdecl))
	      || DECL_BY_REFERENCE (resdecl)))
	gimple_call_set_return_slot_opt (call, true);

      if (restmp)
	{
	  gimple_call_set_lhs (call, restmp);
	  gcc_assert (useless_type_conversion_p (TREE_TYPE (restmp),
						 TREE_TYPE (TREE_TYPE (alias))));
	}
      gsi_insert_after (&bsi, call, GSI_NEW_STMT);
      if (!alias_is_noreturn)
	{
	  if (restmp && !this_adjusting
	      && (fixed_offset || virtual_offset))
	    {
	      tree true_label = NULL_TREE;

	      if (TREE_CODE (TREE_TYPE (restmp)) == POINTER_TYPE)
		{
		  gimple *stmt;
		  edge e;
		  /* If the return type is a pointer, we need to
		     protect against NULL.  We know there will be an
		     adjustment, because that's why we're emitting a
		     thunk.  */
		  then_bb = create_basic_block (NULL, bb);
		  then_bb->count = cfg_count - cfg_count.apply_scale (1, 16);
		  return_bb = create_basic_block (NULL, then_bb);
		  return_bb->count = cfg_count;
		  else_bb = create_basic_block (NULL, else_bb);
		  else_bb->count = cfg_count.apply_scale (1, 16);
		  add_bb_to_loop (then_bb, bb->loop_father);
		  add_bb_to_loop (return_bb, bb->loop_father);
		  add_bb_to_loop (else_bb, bb->loop_father);
		  remove_edge (single_succ_edge (bb));
		  true_label = gimple_block_label (then_bb);
		  stmt = gimple_build_cond (NE_EXPR, restmp,
					    build_zero_cst (TREE_TYPE (restmp)),
					    NULL_TREE, NULL_TREE);
		  gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
		  e = make_edge (bb, then_bb, EDGE_TRUE_VALUE);
		  e->probability = profile_probability::guessed_always ()
				     .apply_scale (1, 16);
		  e = make_edge (bb, else_bb, EDGE_FALSE_VALUE);
		  e->probability = profile_probability::guessed_always ()
				     .apply_scale (1, 16);
		  make_single_succ_edge (return_bb,
					 EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
		  make_single_succ_edge (then_bb, return_bb, EDGE_FALLTHRU);
		  e = make_edge (else_bb, return_bb, EDGE_FALLTHRU);
		  e->probability = profile_probability::always ();
		  bsi = gsi_last_bb (then_bb);
		}

	      restmp = thunk_adjust (&bsi, restmp, /*this_adjusting=*/0,
				     fixed_offset, virtual_offset,
				     indirect_offset);
	      if (true_label)
		{
		  gimple *stmt;
		  bsi = gsi_last_bb (else_bb);
		  stmt = gimple_build_assign (restmp,
					      build_zero_cst (TREE_TYPE (restmp)));
		  gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
		  bsi = gsi_last_bb (return_bb);
		}
	    }
	  else
	    gimple_call_set_tail (call, true);

	  /* Build the return value.  */
	  if (!DECL_BY_REFERENCE (resdecl))
	    ret = gimple_build_return (restmp);
	  else
	    ret = gimple_build_return (resdecl);

	  gsi_insert_after (&bsi, ret, GSI_NEW_STMT);
	}
      else
	{
	  gimple_call_set_tail (call, true);
	  remove_edge (single_succ_edge (bb));
	}

      cfun->gimple_df->in_ssa_p = true;
      update_max_bb_count ();
      profile_status_for_fn (cfun)
	= cfg_count.initialized_p () && cfg_count.ipa_p ()
	  ? PROFILE_READ : PROFILE_GUESSED;
      /* FIXME: C++ FE should stop setting TREE_ASM_WRITTEN on thunks.  */
      TREE_ASM_WRITTEN (thunk_fndecl) = false;
      delete_unreachable_blocks ();
      update_ssa (TODO_update_ssa);
      checking_verify_flow_info ();
      free_dominance_info (CDI_DOMINATORS);

      /* Since we want to emit the thunk, we explicitly mark its name as
	 referenced.  */
      thunk.thunk_p = false;
      lowered = true;
      bitmap_obstack_release (NULL);
    }
  current_function_decl = NULL;
  set_cfun (NULL);
  return true;
}
/* Assemble thunks and aliases associated to node.  */

void
cgraph_node::assemble_thunks_and_aliases (void)
{
  cgraph_edge *e;
  ipa_ref *ref;

  for (e = callers; e;)
    if (e->caller->thunk.thunk_p
	&& !e->caller->global.inlined_to
	&& !e->caller->thunk.add_pointer_bounds_args)
      {
	cgraph_node *thunk = e->caller;

	e = e->next_caller;
	thunk->expand_thunk (true, false);
	thunk->assemble_thunks_and_aliases ();
      }
    else
      e = e->next_caller;

  FOR_EACH_ALIAS (this, ref)
    {
      cgraph_node *alias = dyn_cast <cgraph_node *> (ref->referring);
      if (!alias->transparent_alias)
	{
	  bool saved_written = TREE_ASM_WRITTEN (decl);

	  /* Force assemble_alias to really output the alias this time instead
	     of buffering it in the alias pairs list.  */
	  TREE_ASM_WRITTEN (decl) = 1;
	  do_assemble_alias (alias->decl,
			     DECL_ASSEMBLER_NAME (decl));
	  alias->assemble_thunks_and_aliases ();
	  TREE_ASM_WRITTEN (decl) = saved_written;
	}
    }
}
/* Expand function specified by node.  */

void
cgraph_node::expand (void)
{
  location_t saved_loc;

  /* We ought to not compile any inline clones.  */
  gcc_assert (!global.inlined_to);

  /* __RTL functions are compiled as soon as they are parsed, so don't
     do it again.  */
  if (native_rtl_p ())
    return;

  announce_function (decl);
  process = 0;
  gcc_assert (lowered);
  get_untransformed_body ();

  /* Generate RTL for the body of DECL.  */

  timevar_push (TV_REST_OF_COMPILATION);

  gcc_assert (symtab->global_info_ready);

  /* Initialize the default bitmap obstack.  */
  bitmap_obstack_initialize (NULL);

  /* Initialize the RTL code for the function.  */
  saved_loc = input_location;
  input_location = DECL_SOURCE_LOCATION (decl);

  gcc_assert (DECL_STRUCT_FUNCTION (decl));
  push_cfun (DECL_STRUCT_FUNCTION (decl));
  init_function_start (decl);

  gimple_register_cfg_hooks ();

  bitmap_obstack_initialize (&reg_obstack); /* FIXME, only at RTL generation.  */

  execute_all_ipa_transforms ();

  /* Perform all tree transforms and optimizations.  */

  /* Signal the start of passes.  */
  invoke_plugin_callbacks (PLUGIN_ALL_PASSES_START, NULL);

  execute_pass_list (cfun, g->get_passes ()->all_passes);

  /* Signal the end of passes.  */
  invoke_plugin_callbacks (PLUGIN_ALL_PASSES_END, NULL);

  bitmap_obstack_release (&reg_obstack);

  /* Release the default bitmap obstack.  */
  bitmap_obstack_release (NULL);

  /* If requested, warn about function definitions where the function will
     return a value (usually of some struct or union type) which itself will
     take up a lot of stack space.  */
  if (!DECL_EXTERNAL (decl) && TREE_TYPE (decl))
    {
      tree ret_type = TREE_TYPE (TREE_TYPE (decl));

      if (ret_type && TYPE_SIZE_UNIT (ret_type)
	  && TREE_CODE (TYPE_SIZE_UNIT (ret_type)) == INTEGER_CST
	  && compare_tree_int (TYPE_SIZE_UNIT (ret_type),
			       warn_larger_than_size) > 0)
	{
	  unsigned int size_as_int
	    = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (ret_type));

	  if (compare_tree_int (TYPE_SIZE_UNIT (ret_type), size_as_int) == 0)
	    warning (OPT_Wlarger_than_,
		     "size of return value of %q+D is %u bytes",
		     decl, size_as_int);
	  else
	    warning (OPT_Wlarger_than_,
		     "size of return value of %q+D is larger than %wu bytes",
		     decl, warn_larger_than_size);
	}
    }

  gimple_set_body (decl, NULL);
  if (DECL_STRUCT_FUNCTION (decl) == 0
      && !cgraph_node::get (decl)->origin)
    {
      /* Stop pointing to the local nodes about to be freed.
	 But DECL_INITIAL must remain nonzero so we know this
	 was an actual function definition.
	 For a nested function, this is done in c_pop_function_context.
	 If rest_of_compilation set this to 0, leave it 0.  */
      if (DECL_INITIAL (decl) != 0)
	DECL_INITIAL (decl) = error_mark_node;
    }

  input_location = saved_loc;

  ggc_collect ();
  timevar_pop (TV_REST_OF_COMPILATION);

  /* Make sure that the backend didn't give up on compiling.  */
  gcc_assert (TREE_ASM_WRITTEN (decl));
  if (cfun)
    pop_cfun ();

  /* It would make a lot more sense to output thunks before the function body
     to get more forward and fewer backward jumps.  This however would need
     solving a problem with comdats; see PR48668.  Also aliases must come after
     the function itself to keep one-pass assemblers, like the one on AIX,
     happy; see PR 50689.
     FIXME: Perhaps thunks should be moved before the function IFF they are
     not in comdat groups.  */
  assemble_thunks_and_aliases ();
  release_body ();
  /* Eliminate all call edges.  This is important so the GIMPLE_CALL no longer
     points to the dead function body.  */
  remove_callees ();
  remove_all_references ();
}
/* Node comparer that is responsible for the order that corresponds
   to the time when a function was launched for the first time.  */

static int
node_cmp (const void *pa, const void *pb)
{
  const cgraph_node *a = *(const cgraph_node * const *) pa;
  const cgraph_node *b = *(const cgraph_node * const *) pb;

  /* Functions with time profile must come before those without it.  */
  if (!a->tp_first_run || !b->tp_first_run)
    return a->tp_first_run - b->tp_first_run;

  return a->tp_first_run != b->tp_first_run
	 ? b->tp_first_run - a->tp_first_run
	 : b->order - a->order;
}
/* Expand all functions that must be output.

   Attempt to topologically sort the nodes so function is output when
   all called functions are already assembled to allow data to be
   propagated across the callgraph.  Use a stack to get smaller distance
   between a function and its callees (later we may choose to use a more
   sophisticated algorithm for function reordering; we will likely want
   to use subsections to make the output functions appear in top-down
   order).  */

static void
expand_all_functions (void)
{
  cgraph_node *node;
  cgraph_node **order = XCNEWVEC (cgraph_node *,
				  symtab->cgraph_count);
  unsigned int expanded_func_count = 0, profiled_func_count = 0;
  int order_pos, new_order_pos = 0;
  int i;

  order_pos = ipa_reverse_postorder (order);
  gcc_assert (order_pos == symtab->cgraph_count);

  /* The garbage collector may remove inline clones we eliminate during
     optimization.  So we must be sure to not reference them.  */
  for (i = 0; i < order_pos; i++)
    if (order[i]->process)
      order[new_order_pos++] = order[i];

  if (flag_profile_reorder_functions)
    qsort (order, new_order_pos, sizeof (cgraph_node *), node_cmp);

  for (i = new_order_pos - 1; i >= 0; i--)
    {
      node = order[i];

      if (node->process)
	{
	  expanded_func_count++;
	  if (node->tp_first_run)
	    profiled_func_count++;

	  if (symtab->dump_file)
	    fprintf (symtab->dump_file,
		     "Time profile order in expand_all_functions:%s:%d\n",
		     node->asm_name (), node->tp_first_run);
	  node->process = 0;
	  node->expand ();
	}
    }

  if (dump_file)
    fprintf (dump_file, "Expanded functions with time profile (%s):%u/%u\n",
	     main_input_filename, profiled_func_count, expanded_func_count);

  if (symtab->dump_file && flag_profile_reorder_functions)
    fprintf (symtab->dump_file, "Expanded functions with time profile:%u/%u\n",
	     profiled_func_count, expanded_func_count);

  symtab->process_new_functions ();
  free_gimplify_stack ();

  free (order);
}
/* This is used to sort the node types by the cgraph order number.  */

enum cgraph_order_sort_kind
{
  ORDER_UNDEFINED = 0,
  ORDER_FUNCTION,
  ORDER_VAR,
  ORDER_VAR_UNDEF,
  ORDER_ASM
};

struct cgraph_order_sort
{
  enum cgraph_order_sort_kind kind;
  union
  {
    cgraph_node *f;
    varpool_node *v;
    asm_node *a;
  } u;
};
/* Output all functions, variables, and asm statements in the order
   according to their order fields, which is the order in which they
   appeared in the file.  This implements -fno-toplevel-reorder.  In
   this mode we may output functions and variables which don't really
   need to be output.  */

static void
output_in_order (void)
{
  int max;
  cgraph_order_sort *nodes;
  int i;
  cgraph_node *pf;
  varpool_node *pv;
  asm_node *pa;
  max = symtab->order;
  nodes = XCNEWVEC (cgraph_order_sort, max);

  FOR_EACH_DEFINED_FUNCTION (pf)
    {
      if (pf->process && !pf->thunk.thunk_p && !pf->alias)
	{
	  if (!pf->no_reorder)
	    continue;
	  i = pf->order;
	  gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
	  nodes[i].kind = ORDER_FUNCTION;
	  nodes[i].u.f = pf;
	}
    }

  /* There is a similar loop in symbol_table::output_variables.
     Please keep them in sync.  */
  FOR_EACH_VARIABLE (pv)
    {
      if (!pv->no_reorder)
	continue;
      if (DECL_HARD_REGISTER (pv->decl)
	  || DECL_HAS_VALUE_EXPR_P (pv->decl))
	continue;
      i = pv->order;
      gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
      nodes[i].kind = pv->definition ? ORDER_VAR : ORDER_VAR_UNDEF;
      nodes[i].u.v = pv;
    }

  for (pa = symtab->first_asm_symbol (); pa; pa = pa->next)
    {
      i = pa->order;
      gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
      nodes[i].kind = ORDER_ASM;
      nodes[i].u.a = pa;
    }

  /* In toplevel reorder mode we output all statics; mark them as needed.  */

  for (i = 0; i < max; ++i)
    if (nodes[i].kind == ORDER_VAR)
      nodes[i].u.v->finalize_named_section_flags ();

  for (i = 0; i < max; ++i)
    {
      switch (nodes[i].kind)
	{
	case ORDER_FUNCTION:
	  nodes[i].u.f->process = 0;
	  nodes[i].u.f->expand ();
	  break;

	case ORDER_VAR:
	  nodes[i].u.v->assemble_decl ();
	  break;

	case ORDER_VAR_UNDEF:
	  assemble_undefined_decl (nodes[i].u.v->decl);
	  break;

	case ORDER_ASM:
	  assemble_asm (nodes[i].u.a->asm_str);
	  break;

	case ORDER_UNDEFINED:
	  break;

	default:
	  gcc_unreachable ();
	}
    }

  symtab->clear_asm_symbols ();

  free (nodes);
}
static void
ipa_passes (void)
{
  gcc::pass_manager *passes = g->get_passes ();

  set_cfun (NULL);
  current_function_decl = NULL;
  gimple_register_cfg_hooks ();
  bitmap_obstack_initialize (NULL);

  invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_START, NULL);

  if (!in_lto_p)
    {
      execute_ipa_pass_list (passes->all_small_ipa_passes);
      if (seen_error ())
	return;
    }

  /* This extra symtab_remove_unreachable_nodes pass tends to catch some
     devirtualization and other changes where removals iterate.  */
  symtab->remove_unreachable_nodes (symtab->dump_file);

  /* If pass_all_early_optimizations was not scheduled, the state of
     the cgraph will not be properly updated.  Update it now.  */
  if (symtab->state < IPA_SSA)
    symtab->state = IPA_SSA;

  if (!in_lto_p)
    {
      /* Generate coverage variables and constructors.  */
      coverage_finish ();

      /* Process new functions added.  */
      set_cfun (NULL);
      current_function_decl = NULL;
      symtab->process_new_functions ();

      execute_ipa_summary_passes
	((ipa_opt_pass_d *) passes->all_regular_ipa_passes);
    }

  /* Some targets need to handle LTO assembler output specially.  */
  if (flag_generate_lto || flag_generate_offload)
    targetm.asm_out.lto_start ();

  if (!in_lto_p
      || flag_incremental_link == INCREMENTAL_LINK_LTO)
    {
      if (!quiet_flag)
	fprintf (stderr, "Streaming LTO\n");
      if (g->have_offload)
	{
	  section_name_prefix = OFFLOAD_SECTION_NAME_PREFIX;
	  lto_stream_offload_p = true;
	  ipa_write_summaries ();
	  lto_stream_offload_p = false;
	}
      if (flag_lto)
	{
	  section_name_prefix = LTO_SECTION_NAME_PREFIX;
	  lto_stream_offload_p = false;
	  ipa_write_summaries ();
	}
    }

  if (flag_generate_lto || flag_generate_offload)
    targetm.asm_out.lto_end ();

  if (!flag_ltrans
      && ((in_lto_p && flag_incremental_link != INCREMENTAL_LINK_LTO)
	  || !flag_lto || flag_fat_lto_objects))
    execute_ipa_pass_list (passes->all_regular_ipa_passes);
  invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_END, NULL);

  bitmap_obstack_release (NULL);
}
/* Return the symbol that DECL's "alias" attribute refers to.  */

static tree
get_alias_symbol (tree decl)
{
  tree alias = lookup_attribute ("alias", DECL_ATTRIBUTES (decl));
  return get_identifier (TREE_STRING_POINTER
			 (TREE_VALUE (TREE_VALUE (alias))));
}


/* Weakrefs may be associated to external decls and thus not output
   at expansion time.  Emit all necessary aliases.  */

void
symbol_table::output_weakrefs (void)
{
  symtab_node *node;
  FOR_EACH_SYMBOL (node)
    if (node->alias
	&& !TREE_ASM_WRITTEN (node->decl)
	&& node->weakref)
      {
	tree target;

	/* Weakrefs are special in not requiring the target to be defined in
	   the current compilation unit, so it is a bit hard to work out what
	   we want to alias.
	   When the alias target is defined, we need to fetch it from the
	   symtab reference; otherwise it is pointed to by alias_target.  */
	if (node->alias_target)
	  target = (DECL_P (node->alias_target)
		    ? DECL_ASSEMBLER_NAME (node->alias_target)
		    : node->alias_target);
	else if (node->analyzed)
	  target = DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl);
	else
	  {
	    gcc_unreachable ();
	    target = get_alias_symbol (node->decl);
	  }
	do_assemble_alias (node->decl, target);
      }
}
/* Perform simple optimizations based on the callgraph.  */

void
symbol_table::compile (void)
{
  if (seen_error ())
    return;

  symtab_node::checking_verify_symtab_nodes ();

  timevar_push (TV_CGRAPHOPT);
  if (pre_ipa_mem_report)
    {
      fprintf (stderr, "Memory consumption before IPA\n");
      dump_memory_report (false);
    }
  if (!quiet_flag)
    fprintf (stderr, "Performing interprocedural optimizations\n");
  state = IPA;

  /* If LTO is enabled, initialize the streamer hooks needed by GIMPLE.  */
  if (flag_generate_lto || flag_generate_offload)
    lto_streamer_hooks_init ();

  /* Don't run the IPA passes if there was any error or sorry messages.  */
  if (!seen_error ())
    ipa_passes ();

  /* Do nothing else if any IPA pass found errors or if we are just streaming
     LTO.  */
  if (seen_error ()
      || ((!in_lto_p || flag_incremental_link == INCREMENTAL_LINK_LTO)
	  && flag_lto && !flag_fat_lto_objects))
    {
      timevar_pop (TV_CGRAPHOPT);
      return;
    }

  global_info_ready = true;
  if (dump_file)
    {
      fprintf (dump_file, "Optimized ");
      symtab->dump (dump_file);
    }
  if (post_ipa_mem_report)
    {
      fprintf (stderr, "Memory consumption after IPA\n");
      dump_memory_report (false);
    }
  timevar_pop (TV_CGRAPHOPT);

  /* Output everything.  */
  switch_to_section (text_section);
  (*debug_hooks->assembly_start) ();
  if (!quiet_flag)
    fprintf (stderr, "Assembling functions:\n");
  symtab_node::checking_verify_symtab_nodes ();

  bitmap_obstack_initialize (NULL);
  execute_ipa_pass_list (g->get_passes ()->all_late_ipa_passes);
  bitmap_obstack_release (NULL);
  mark_functions_to_output ();

  /* When weakref support is missing, we automatically translate all
     references to NODE to references to its ultimate alias target.
     The renaming mechanism uses the flag IDENTIFIER_TRANSPARENT_ALIAS and
     TREE_CHAIN.

     Set up this mapping before we output any assembler, but once we are sure
     that all symbol renaming is done.

     FIXME: All this ugliness can go away if we just do renaming at GIMPLE
     level by physically rewriting the IL.  At the moment we can only redirect
     calls, so we need infrastructure for renaming references as well.  */
#ifndef ASM_OUTPUT_WEAKREF
  symtab_node *node;

  FOR_EACH_SYMBOL (node)
    if (node->alias
	&& lookup_attribute ("weakref", DECL_ATTRIBUTES (node->decl)))
      {
	IDENTIFIER_TRANSPARENT_ALIAS
	  (DECL_ASSEMBLER_NAME (node->decl)) = 1;
	TREE_CHAIN (DECL_ASSEMBLER_NAME (node->decl))
	  = (node->alias_target ? node->alias_target
	     : DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl));
      }
#endif

  state = EXPANSION;

  /* Output first asm statements and anything ordered.  The process
     flag is cleared for these nodes, so we skip them later.  */
  output_in_order ();
  expand_all_functions ();
  output_variables ();

  process_new_functions ();
  state = FINISHED;
  output_weakrefs ();

  if (dump_file)
    {
      fprintf (dump_file, "\nFinal ");
      symtab->dump (dump_file);
    }
  if (!flag_checking)
    return;
  symtab_node::verify_symtab_nodes ();
  /* Double check that all inline clones are gone and that all
     function bodies have been released from memory.  */
  if (!seen_error ())
    {
      cgraph_node *node;
      bool error_found = false;

      FOR_EACH_DEFINED_FUNCTION (node)
	if (node->global.inlined_to
	    || gimple_has_body_p (node->decl))
	  {
	    error_found = true;
	    node->debug ();
	  }
      if (error_found)
	internal_error ("nodes with unreleased memory found");
    }
}
/* Earlydebug dump file, flags, and number.  */

static int debuginfo_early_dump_nr;
static FILE *debuginfo_early_dump_file;
static dump_flags_t debuginfo_early_dump_flags;

/* Debug dump file, flags, and number.  */

static int debuginfo_dump_nr;
static FILE *debuginfo_dump_file;
static dump_flags_t debuginfo_dump_flags;

/* Register the debug and earlydebug dump files.  */

void
debuginfo_early_init (void)
{
  gcc::dump_manager *dumps = g->get_dumps ();
  debuginfo_early_dump_nr = dumps->dump_register (".earlydebug", "earlydebug",
						  "earlydebug", DK_tree,
						  OPTGROUP_NONE,
						  false);
  debuginfo_dump_nr = dumps->dump_register (".debug", "debug",
					    "debug", DK_tree,
					    OPTGROUP_NONE,
					    false);
}

/* Initialize the debug and earlydebug dump files.  */

void
debuginfo_init (void)
{
  gcc::dump_manager *dumps = g->get_dumps ();
  debuginfo_dump_file = dump_begin (debuginfo_dump_nr, NULL);
  debuginfo_dump_flags = dumps->get_dump_file_info (debuginfo_dump_nr)->pflags;
  debuginfo_early_dump_file = dump_begin (debuginfo_early_dump_nr, NULL);
  debuginfo_early_dump_flags
    = dumps->get_dump_file_info (debuginfo_early_dump_nr)->pflags;
}

/* Finalize the debug and earlydebug dump files.  */

void
debuginfo_fini (void)
{
  if (debuginfo_dump_file)
    dump_end (debuginfo_dump_nr, debuginfo_dump_file);
  if (debuginfo_early_dump_file)
    dump_end (debuginfo_early_dump_nr, debuginfo_early_dump_file);
}

/* Set dump_file to the debug dump file.  */

void
debuginfo_start (void)
{
  set_dump_file (debuginfo_dump_file);
}

/* Undo setting dump_file to the debug dump file.  */

void
debuginfo_stop (void)
{
  set_dump_file (NULL);
}

/* Set dump_file to the earlydebug dump file.  */

void
debuginfo_early_start (void)
{
  set_dump_file (debuginfo_early_dump_file);
}

/* Undo setting dump_file to the earlydebug dump file.  */

void
debuginfo_early_stop (void)
{
  set_dump_file (NULL);
}
/* Analyze the whole compilation unit once it is parsed completely.  */

void
symbol_table::finalize_compilation_unit (void)
{
  timevar_push (TV_CGRAPH);

  /* If we're here there's no current function anymore.  Some frontends
     are lazy in clearing these.  */
  current_function_decl = NULL;
  set_cfun (NULL);

  /* Do not skip analyzing the functions if there were errors; we would
     miss diagnostics for the following functions otherwise.  */

  /* Emit size functions we didn't inline.  */
  finalize_size_functions ();

  /* Mark alias targets necessary and emit diagnostics.  */
  handle_alias_pairs ();

  if (!quiet_flag)
    {
      fprintf (stderr, "\nAnalyzing compilation unit\n");
      fflush (stderr);
    }

  if (flag_dump_passes)
    dump_passes ();

  /* Gimplify and lower all functions, compute reachability and
     remove unreachable nodes.  */
  analyze_functions (/*first_time=*/true);

  /* Mark alias targets necessary and emit diagnostics.  */
  handle_alias_pairs ();

  /* Gimplify and lower thunks.  */
  analyze_functions (/*first_time=*/false);

  /* Offloading requires LTO infrastructure.  */
  if (!in_lto_p && g->have_offload)
    flag_generate_offload = 1;

  if (!seen_error ())
    {
      /* Emit early debug for reachable functions, and by consequence,
	 locally scoped symbols.  */
      struct cgraph_node *cnode;
      FOR_EACH_FUNCTION_WITH_GIMPLE_BODY (cnode)
	(*debug_hooks->early_global_decl) (cnode->decl);

      /* Clean up anything that needs cleaning up after initial debug
	 generation.  */
      debuginfo_early_start ();
      (*debug_hooks->early_finish) (main_input_filename);
      debuginfo_early_stop ();
    }

  /* Finally drive the pass manager.  */
  compile ();

  timevar_pop (TV_CGRAPH);
}
/* Reset all state within cgraphunit.c so that we can rerun the compiler
   within the same process.  For use by toplev::finalize.  */

void
cgraphunit_c_finalize (void)
{
  gcc_assert (cgraph_new_nodes.length () == 0);
  cgraph_new_nodes.truncate (0);

  vtable_entry_type = NULL;
  queued_nodes = &symtab_terminator;

  first_analyzed = NULL;
  first_analyzed_var = NULL;
}
/* Creates a wrapper from cgraph_node to TARGET node.  A thunk is used for
   this kind of wrapper method.  */

void
cgraph_node::create_wrapper (cgraph_node *target)
{
  /* Preserve DECL_RESULT so we get the right by-reference flag.  */
  tree decl_result = DECL_RESULT (decl);

  /* Remove the function's body but keep arguments to be reused
     for the thunk.  */
  release_body (true);
  reset ();

  DECL_UNINLINABLE (decl) = false;
  DECL_RESULT (decl) = decl_result;
  DECL_INITIAL (decl) = NULL;
  allocate_struct_function (decl, false);
  set_cfun (NULL);

  /* Turn alias into thunk and expand it into GIMPLE representation.  */
  definition = true;

  memset (&thunk, 0, sizeof (cgraph_thunk_info));
  thunk.thunk_p = true;
  create_edge (target, NULL, count);
  callees->can_throw_external = !TREE_NOTHROW (target->decl);

  tree arguments = DECL_ARGUMENTS (decl);

  while (arguments)
    {
      TREE_ADDRESSABLE (arguments) = false;
      arguments = TREE_CHAIN (arguments);
    }

  expand_thunk (false, true);

  /* Inline summary set-up.  */
  analyze ();
  inline_analyze_function (this);
}

#include "gt-cgraphunit.h"