gcc/cgraphunit.c
1 /* Driver of optimization process
2 Copyright (C) 2003-2014 Free Software Foundation, Inc.
3 Contributed by Jan Hubicka
5 This file is part of GCC.
7 GCC is free software; you can redistribute it and/or modify it under
8 the terms of the GNU General Public License as published by the Free
9 Software Foundation; either version 3, or (at your option) any later
10 version.
12 GCC is distributed in the hope that it will be useful, but WITHOUT ANY
13 WARRANTY; without even the implied warranty of MERCHANTABILITY or
14 FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
15 for more details.
17 You should have received a copy of the GNU General Public License
18 along with GCC; see the file COPYING3. If not see
19 <http://www.gnu.org/licenses/>. */
21 /* This module implements main driver of compilation process.
23 The main purpose of this file is to act as an interface between
24 tree-based front ends and the back end.
26 The front end is supposed to use the following functionality:
28 - cgraph_finalize_function
30 This function is called once the front end has parsed the whole body of a function
31 and it is certain that neither the function body nor the declaration will change.
33 (There is one exception needed for implementing GCC extern inline
34 function.)
36 - varpool_finalize_decl
38 This function has the same behavior as the above but is used for static
39 variables.
41 - add_asm_node
43 Inserts a new toplevel ASM statement.
45 - finalize_compilation_unit
47 This function is called once the (source level) compilation unit is finalized
48 and it will no longer change.
50 The symbol table is constructed starting from the trivially needed
51 symbols finalized by the frontend. Functions are lowered into
52 GIMPLE representation and callgraph/reference lists are constructed.
53 Those are used to discover other necessary functions and variables.
55 At the end the bodies of unreachable functions are removed.
57 The function can be called multiple times when multiple source level
58 compilation units are combined.
60 - compile
62 This passes control to the back-end. Optimizations are performed and
63 final assembler is generated. This is done in the following way. Note
64 that with link time optimization the process is split into three
65 stages (compile time, linktime analysis and parallel linktime as
66 indicated below).
68 Compile time:
70 1) Inter-procedural optimization.
71 (ipa_passes)
73 This part is further split into:
75 a) early optimizations. These are local passes executed in
76 the topological order on the callgraph.
78 The purpose of early optimizations is to optimize away simple
79 things that may otherwise confuse IP analysis. Very simple
80 propagation across the callgraph is done, e.g. to discover
81 functions without side effects, and simple inlining is performed.
83 b) early small interprocedural passes.
85 Those are interprocedural passes executed only at compilation
86 time. These include, for example, transactional memory lowering,
87 unreachable code removal and other simple transformations.
89 c) IP analysis stage. All interprocedural passes do their
90 analysis.
92 Interprocedural passes differ from small interprocedural
93 passes by their ability to operate across the whole program
94 at linktime. Their analysis stage is performed early to
95 both reduce linking times and linktime memory usage by
96 not having to represent the whole program in memory.
98 d) LTO streaming. When doing LTO, everything important gets
99 streamed into the object file.
101 Compile time and/or linktime analysis stage (WPA):
103 At linktime units get streamed back and the symbol table is
104 merged. Function bodies are not streamed in and are not
105 available.
106 e) IP propagation stage. All IP passes execute their
107 IP propagation. This is done based on the earlier analysis
108 without having function bodies at hand.
109 f) Ltrans streaming. When doing WHOPR LTO, the program
110 is partitioned and streamed into multiple object files.
112 Compile time and/or parallel linktime stage (ltrans)
114 Each of the object files is streamed back and compiled
115 separately. Now the function bodies become available
116 again.
118 2) Virtual clone materialization
119 (cgraph_materialize_clone)
121 IP passes can produce copies of existing functions (such
122 as versioned clones or inline clones) without actually
123 manipulating their bodies by creating virtual clones in
124 the callgraph. At this time the virtual clones are
125 turned into real functions.
126 3) IP transformation
128 All IP passes transform function bodies based on earlier
129 decisions of the IP propagation.
131 4) late small IP passes
133 Simple IP passes working within single program partition.
135 5) Expansion
136 (expand_all_functions)
138 At this stage functions that need to be output into
139 assembler are identified and compiled in topological order.
140 6) Output of variables and aliases
141 Now it is known which variable references were not optimized
142 out and thus all variables are output to the file.
144 Note that with -fno-toplevel-reorder passes 5 and 6
145 are combined together in output_in_order.
147 Finally there are functions to manipulate the callgraph from
148 the back end.
149 - cgraph_add_new_function is used to add backend produced
150 functions introduced after the unit is finalized.
151 The functions are enqueued for later processing and inserted
152 into callgraph with cgraph_process_new_functions.
154 - cgraph_function_versioning
156 produces a copy of a function into a new one (a version)
157 and applies simple transformations. */
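/* As a quick orientation, here is a minimal sketch (not code from this file;
   the finish_unit helper and the decl arrays are hypothetical) of how a
   front end is expected to drive the interface described above:

     static void
     finish_unit (tree *fndecls, unsigned n_fns, tree *vardecls, unsigned n_vars)
     {
       unsigned i;

       for (i = 0; i < n_fns; i++)
         cgraph_finalize_function (fndecls[i], false);
       for (i = 0; i < n_vars; i++)
         varpool_finalize_decl (vardecls[i]);

       finalize_compilation_unit ();
     }

   finalize_compilation_unit then performs the analysis described above and
   passes control on to compile.  */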
160 #include "config.h"
161 #include "system.h"
162 #include "coretypes.h"
163 #include "tm.h"
164 #include "tree.h"
165 #include "varasm.h"
166 #include "stor-layout.h"
167 #include "stringpool.h"
168 #include "output.h"
169 #include "rtl.h"
170 #include "basic-block.h"
171 #include "tree-ssa-alias.h"
172 #include "internal-fn.h"
173 #include "gimple-fold.h"
174 #include "gimple-expr.h"
175 #include "is-a.h"
176 #include "gimple.h"
177 #include "gimplify.h"
178 #include "gimple-iterator.h"
179 #include "gimplify-me.h"
180 #include "gimple-ssa.h"
181 #include "tree-cfg.h"
182 #include "tree-into-ssa.h"
183 #include "tree-ssa.h"
184 #include "tree-inline.h"
185 #include "langhooks.h"
186 #include "toplev.h"
187 #include "flags.h"
188 #include "debug.h"
189 #include "target.h"
190 #include "diagnostic.h"
191 #include "params.h"
192 #include "fibheap.h"
193 #include "intl.h"
194 #include "function.h"
195 #include "ipa-prop.h"
196 #include "tree-iterator.h"
197 #include "tree-pass.h"
198 #include "tree-dump.h"
199 #include "gimple-pretty-print.h"
200 #include "output.h"
201 #include "coverage.h"
202 #include "plugin.h"
203 #include "ipa-inline.h"
204 #include "ipa-utils.h"
205 #include "lto-streamer.h"
206 #include "except.h"
207 #include "cfgloop.h"
208 #include "regset.h" /* FIXME: For reg_obstack. */
209 #include "context.h"
210 #include "pass_manager.h"
211 #include "tree-nested.h"
212 #include "gimplify.h"
214 /* Queue of cgraph nodes scheduled to be added into cgraph. This is a
215 secondary queue used during optimization to accommodate passes that
216 may generate new functions that need to be optimized and expanded. */
217 cgraph_node_set cgraph_new_nodes;
219 static void expand_all_functions (void);
220 static void mark_functions_to_output (void);
221 static void expand_function (struct cgraph_node *);
222 static void handle_alias_pairs (void);
224 FILE *cgraph_dump_file;
226 /* Linked list of cgraph asm nodes. */
227 struct asm_node *asm_nodes;
229 /* Last node in cgraph_asm_nodes. */
230 static GTY(()) struct asm_node *asm_last_node;
232 /* Used for vtable lookup in thunk adjusting. */
233 static GTY (()) tree vtable_entry_type;
235 /* Determine if symbol DECL is needed. That is, visible to something
236 either outside this translation unit or something magic in the system
237 configury.  */
238 bool
239 decide_is_symbol_needed (symtab_node *node)
241 tree decl = node->decl;
243 /* Double-check that no one has output the function into the assembly file
244 early. */
245 gcc_checking_assert (!DECL_ASSEMBLER_NAME_SET_P (decl)
246 || !TREE_SYMBOL_REFERENCED (DECL_ASSEMBLER_NAME (decl)));
248 if (!node->definition)
249 return false;
251 if (DECL_EXTERNAL (decl))
252 return false;
254 /* If the user told us it is used, then it must be so. */
255 if (node->force_output)
256 return true;
258 /* ABI forced symbols are needed when they are external. */
259 if (node->forced_by_abi && TREE_PUBLIC (decl))
260 return true;
262 /* Keep constructors, destructors and virtual functions. */
263 if (TREE_CODE (decl) == FUNCTION_DECL
264 && (DECL_STATIC_CONSTRUCTOR (decl) || DECL_STATIC_DESTRUCTOR (decl)))
265 return true;
267 /* Externally visible variables must be output. The exception is
268 COMDAT variables that must be output only when they are needed. */
269 if (TREE_PUBLIC (decl) && !DECL_COMDAT (decl))
270 return true;
272 return false;
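/* For illustration only (a sketch of the cases handled above, not an
   exhaustive specification), consider a unit containing

     static int i __attribute__ ((used));
     __attribute__ ((constructor)) static void c (void) {}
     int j;

   The "used" variable ends up with force_output set (see
   process_function_and_variable_attributes), the constructor matches the
   static constructor/destructor check, and j is public and not COMDAT, so
   all three are needed; an unreferenced plain "static int k;" is not and
   may be reclaimed later as unreachable.  */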
275 /* Head and terminator of the queue of nodes to be processed while building
276 callgraph. */
278 static symtab_node symtab_terminator;
279 static symtab_node *queued_nodes = &symtab_terminator;
281 /* Add NODE to queue starting at QUEUED_NODES.
282 The queue is linked via AUX pointers and terminated by the address of symtab_terminator. */
284 static void
285 enqueue_node (symtab_node *node)
287 if (node->aux)
288 return;
289 gcc_checking_assert (queued_nodes);
290 node->aux = queued_nodes;
291 queued_nodes = node;
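/* A simplified sketch (mirroring the loop in analyze_functions below, not a
   separate helper in this file) of how the AUX-linked queue is drained:

     while (queued_nodes != &symtab_terminator)
       {
         symtab_node *node = queued_nodes;
         queued_nodes = (symtab_node *) node->aux;
         ... analyze NODE and enqueue everything it refers to ...
       }

   The AUX pointers are cleared again once the walk has finished.  */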
294 /* Process CGRAPH_NEW_FUNCTIONS and perform actions necessary to add these
295 functions into the callgraph in a way such that they look like ordinary reachable
296 functions inserted into the callgraph already at construction time. */
298 void
299 cgraph_process_new_functions (void)
301 tree fndecl;
302 struct cgraph_node *node;
303 cgraph_node_set_iterator csi;
305 if (!cgraph_new_nodes)
306 return;
307 handle_alias_pairs ();
308 /* Note that this queue may grow as it is being processed, as the new
309 functions may generate new ones. */
310 for (csi = csi_start (cgraph_new_nodes); !csi_end_p (csi); csi_next (&csi))
312 node = csi_node (csi);
313 fndecl = node->decl;
314 switch (cgraph_state)
316 case CGRAPH_STATE_CONSTRUCTION:
317 /* At construction time we just need to finalize function and move
318 it into reachable functions list. */
320 cgraph_finalize_function (fndecl, false);
321 cgraph_call_function_insertion_hooks (node);
322 enqueue_node (node);
323 break;
325 case CGRAPH_STATE_IPA:
326 case CGRAPH_STATE_IPA_SSA:
327 /* When IPA optimization has already started, do all essential
328 transformations that have already been performed on the whole
329 cgraph but not on this function. */
331 gimple_register_cfg_hooks ();
332 if (!node->analyzed)
333 cgraph_analyze_function (node);
334 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
335 if (cgraph_state == CGRAPH_STATE_IPA_SSA
336 && !gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
337 g->get_passes ()->execute_early_local_passes ();
338 else if (inline_summary_vec != NULL)
339 compute_inline_parameters (node, true);
340 free_dominance_info (CDI_POST_DOMINATORS);
341 free_dominance_info (CDI_DOMINATORS);
342 pop_cfun ();
343 cgraph_call_function_insertion_hooks (node);
344 break;
346 case CGRAPH_STATE_EXPANSION:
347 /* Functions created during expansion shall be compiled
348 directly. */
349 node->process = 0;
350 cgraph_call_function_insertion_hooks (node);
351 expand_function (node);
352 break;
354 default:
355 gcc_unreachable ();
356 break;
359 free_cgraph_node_set (cgraph_new_nodes);
360 cgraph_new_nodes = NULL;
363 /* As a GCC extension we allow redefinition of the function. The
364 semantics when the two bodies differ are not well defined.
365 We replace the old body with the new body, so in unit-at-a-time mode
366 we always use the new body, while in normal mode we may end up with the
367 old body inlined into some functions and the new body expanded and
368 inlined in others.
370 ??? It may make more sense to use one body for inlining and the other
371 body for expanding the function, but this is difficult to do. */
373 void
374 cgraph_reset_node (struct cgraph_node *node)
376 /* If node->process is set, then we have already begun whole-unit analysis.
377 This is *not* testing for whether we've already emitted the function.
378 That case can be sort-of legitimately seen with real function redefinition
379 errors. I would argue that the front end should never present us with
380 such a case, but don't enforce that for now. */
381 gcc_assert (!node->process);
383 /* Reset our data structures so we can analyze the function again. */
384 memset (&node->local, 0, sizeof (node->local));
385 memset (&node->global, 0, sizeof (node->global));
386 memset (&node->rtl, 0, sizeof (node->rtl));
387 node->analyzed = false;
388 node->definition = false;
389 node->alias = false;
390 node->weakref = false;
391 node->cpp_implicit_alias = false;
393 cgraph_node_remove_callees (node);
394 ipa_remove_all_references (&node->ref_list);
397 /* Return true when there are references to NODE. */
399 static bool
400 referred_to_p (symtab_node *node)
402 struct ipa_ref *ref;
404 /* See if there are any references at all. */
405 if (ipa_ref_list_referring_iterate (&node->ref_list, 0, ref))
406 return true;
407 /* For functions check also calls. */
408 cgraph_node *cn = dyn_cast <cgraph_node> (node);
409 if (cn && cn->callers)
410 return true;
411 return false;
414 /* DECL has been parsed. Take it, queue it, compile it at the whim of the
415 logic in effect. If NO_COLLECT is true, then our caller cannot stand to have
416 the garbage collector run at the moment. We would need to either create
417 a new GC context, or just not compile right now. */
419 void
420 cgraph_finalize_function (tree decl, bool no_collect)
422 struct cgraph_node *node = cgraph_get_create_node (decl);
424 if (node->definition)
426 /* Nested functions should only be defined once. */
427 gcc_assert (!DECL_CONTEXT (decl)
428 || TREE_CODE (DECL_CONTEXT (decl)) != FUNCTION_DECL);
429 cgraph_reset_node (node);
430 node->local.redefined_extern_inline = true;
433 notice_global_symbol (decl);
434 node->definition = true;
435 node->lowered = DECL_STRUCT_FUNCTION (decl)->cfg != NULL;
437 /* With -fkeep-inline-functions we are keeping all inline functions except
438 for extern inline ones. */
439 if (flag_keep_inline_functions
440 && DECL_DECLARED_INLINE_P (decl)
441 && !DECL_EXTERNAL (decl)
442 && !DECL_DISREGARD_INLINE_LIMITS (decl))
443 node->force_output = 1;
445 /* When not optimizing, also output the static functions (see
446 PR24561), but don't do so for always_inline functions, functions
447 declared inline and nested functions. These were optimized out
448 in the original implementation and it is unclear whether we want
449 to change the behavior here. */
450 if ((!optimize
451 && !node->cpp_implicit_alias
452 && !DECL_DISREGARD_INLINE_LIMITS (decl)
453 && !DECL_DECLARED_INLINE_P (decl)
454 && !(DECL_CONTEXT (decl)
455 && TREE_CODE (DECL_CONTEXT (decl)) == FUNCTION_DECL))
456 && !DECL_COMDAT (decl) && !DECL_EXTERNAL (decl))
457 node->force_output = 1;
459 /* If we've not yet emitted decl, tell the debug info about it. */
460 if (!TREE_ASM_WRITTEN (decl))
461 (*debug_hooks->deferred_inline_function) (decl);
463 /* Possibly warn about unused parameters. */
464 if (warn_unused_parameter)
465 do_warn_unused_parameter (decl);
467 if (!no_collect)
468 ggc_collect ();
470 if (cgraph_state == CGRAPH_STATE_CONSTRUCTION
471 && (decide_is_symbol_needed (node)
472 || referred_to_p (node)))
473 enqueue_node (node);
476 /* Add the function FNDECL to the call graph.
477 Unlike cgraph_finalize_function, this function is intended to be used
478 by the middle end and allows insertion of a new function at an arbitrary point
479 of compilation. The function can be either in high, low or SSA form
480 GIMPLE.
482 The function is assumed to be reachable and to have its address taken (so no
483 API breaking optimizations are performed on it).
485 The main work done by this function is to enqueue the function for later
486 processing, avoiding the need for the passes to be re-entrant. */
488 void
489 cgraph_add_new_function (tree fndecl, bool lowered)
491 gcc::pass_manager *passes = g->get_passes ();
492 struct cgraph_node *node;
493 switch (cgraph_state)
495 case CGRAPH_STATE_PARSING:
496 cgraph_finalize_function (fndecl, false);
497 break;
498 case CGRAPH_STATE_CONSTRUCTION:
499 /* Just enqueue the function to be processed at the nearest opportunity. */
500 node = cgraph_create_node (fndecl);
501 if (lowered)
502 node->lowered = true;
503 if (!cgraph_new_nodes)
504 cgraph_new_nodes = cgraph_node_set_new ();
505 cgraph_node_set_add (cgraph_new_nodes, node);
506 break;
508 case CGRAPH_STATE_IPA:
509 case CGRAPH_STATE_IPA_SSA:
510 case CGRAPH_STATE_EXPANSION:
511 /* Bring the function into finalized state and enqueue for later
512 analyzing and compilation. */
513 node = cgraph_get_create_node (fndecl);
514 node->local.local = false;
515 node->definition = true;
516 node->force_output = true;
517 if (!lowered && cgraph_state == CGRAPH_STATE_EXPANSION)
519 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
520 gimple_register_cfg_hooks ();
521 bitmap_obstack_initialize (NULL);
522 execute_pass_list (passes->all_lowering_passes);
523 passes->execute_early_local_passes ();
524 bitmap_obstack_release (NULL);
525 pop_cfun ();
527 lowered = true;
529 if (lowered)
530 node->lowered = true;
531 if (!cgraph_new_nodes)
532 cgraph_new_nodes = cgraph_node_set_new ();
533 cgraph_node_set_add (cgraph_new_nodes, node);
534 break;
536 case CGRAPH_STATE_FINISHED:
537 /* At the very end of compilation we have to do all the work up
538 to expansion. */
539 node = cgraph_create_node (fndecl);
540 if (lowered)
541 node->lowered = true;
542 node->definition = true;
543 cgraph_analyze_function (node);
544 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
545 gimple_register_cfg_hooks ();
546 bitmap_obstack_initialize (NULL);
547 if (!gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
548 g->get_passes ()->execute_early_local_passes ();
549 bitmap_obstack_release (NULL);
550 pop_cfun ();
551 expand_function (node);
552 break;
554 default:
555 gcc_unreachable ();
558 /* Set a personality if required and we already passed EH lowering. */
559 if (lowered
560 && (function_needs_eh_personality (DECL_STRUCT_FUNCTION (fndecl))
561 == eh_personality_lang))
562 DECL_FUNCTION_PERSONALITY (fndecl) = lang_hooks.eh_personality ();
565 /* Add a top-level asm statement to the list. */
567 struct asm_node *
568 add_asm_node (tree asm_str)
570 struct asm_node *node;
572 node = ggc_alloc_cleared_asm_node ();
573 node->asm_str = asm_str;
574 node->order = symtab_order++;
575 node->next = NULL;
576 if (asm_nodes == NULL)
577 asm_nodes = node;
578 else
579 asm_last_node->next = node;
580 asm_last_node = node;
581 return node;
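/* For example, a toplevel asm in the source such as

     __asm__ (".symver foo_v1, foo@@VER_1");

   is handed to add_asm_node by the front end (typically as a STRING_CST).
   The nodes are kept in source order via their order field and are later
   emitted by output_asm_statements, or by output_in_order when
   -fno-toplevel-reorder is in effect.  */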
584 /* Output all asm statements we have stored up to be output. */
586 static void
587 output_asm_statements (void)
589 struct asm_node *can;
591 if (seen_error ())
592 return;
594 for (can = asm_nodes; can; can = can->next)
595 assemble_asm (can->asm_str);
596 asm_nodes = NULL;
599 /* Analyze the function scheduled to be output. */
600 void
601 cgraph_analyze_function (struct cgraph_node *node)
603 tree decl = node->decl;
604 location_t saved_loc = input_location;
605 input_location = DECL_SOURCE_LOCATION (decl);
607 if (node->thunk.thunk_p)
609 cgraph_create_edge (node, cgraph_get_node (node->thunk.alias),
610 NULL, 0, CGRAPH_FREQ_BASE);
611 if (!expand_thunk (node, false))
613 node->thunk.alias = NULL;
614 node->analyzed = true;
615 return;
617 node->thunk.alias = NULL;
619 if (node->alias)
620 symtab_resolve_alias
621 (node, cgraph_get_node (node->alias_target));
622 else if (node->dispatcher_function)
624 /* Generate the dispatcher body of multi-versioned functions. */
625 struct cgraph_function_version_info *dispatcher_version_info
626 = get_cgraph_node_version (node);
627 if (dispatcher_version_info != NULL
628 && (dispatcher_version_info->dispatcher_resolver
629 == NULL_TREE))
631 tree resolver = NULL_TREE;
632 gcc_assert (targetm.generate_version_dispatcher_body);
633 resolver = targetm.generate_version_dispatcher_body (node);
634 gcc_assert (resolver != NULL_TREE);
637 else
639 push_cfun (DECL_STRUCT_FUNCTION (decl));
641 assign_assembler_name_if_neeeded (node->decl);
643 /* Make sure to gimplify bodies only once. While analyzing a
644 function we lower it, which will require gimplified nested
645 functions, so we can end up here with an already gimplified
646 body. */
647 if (!gimple_has_body_p (decl))
648 gimplify_function_tree (decl);
649 dump_function (TDI_generic, decl);
651 /* Lower the function. */
652 if (!node->lowered)
654 if (node->nested)
655 lower_nested_functions (node->decl);
656 gcc_assert (!node->nested);
658 gimple_register_cfg_hooks ();
659 bitmap_obstack_initialize (NULL);
660 execute_pass_list (g->get_passes ()->all_lowering_passes);
661 free_dominance_info (CDI_POST_DOMINATORS);
662 free_dominance_info (CDI_DOMINATORS);
663 compact_blocks ();
664 bitmap_obstack_release (NULL);
665 node->lowered = true;
668 pop_cfun ();
670 node->analyzed = true;
672 input_location = saved_loc;
675 /* The C++ front end produces same-body aliases all over the place, even before
676 PCH gets streamed out. It relies on us linking the aliases with their function
677 in order to do the fixups, but ipa-ref is not PCH safe. Consequently we
678 first produce aliases without links, but once the C++ FE is sure it won't stream
679 PCH we build the links via this function. */
681 void
682 cgraph_process_same_body_aliases (void)
684 symtab_node *node;
685 FOR_EACH_SYMBOL (node)
686 if (node->cpp_implicit_alias && !node->analyzed)
687 symtab_resolve_alias
688 (node,
689 TREE_CODE (node->alias_target) == VAR_DECL
690 ? (symtab_node *)varpool_node_for_decl (node->alias_target)
691 : (symtab_node *)cgraph_get_create_node (node->alias_target));
692 cpp_implicit_aliases_done = true;
695 /* Process attributes common for vars and functions. */
697 static void
698 process_common_attributes (tree decl)
700 tree weakref = lookup_attribute ("weakref", DECL_ATTRIBUTES (decl));
702 if (weakref && !lookup_attribute ("alias", DECL_ATTRIBUTES (decl)))
704 warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
705 "%<weakref%> attribute should be accompanied with"
706 " an %<alias%> attribute");
707 DECL_WEAK (decl) = 0;
708 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
709 DECL_ATTRIBUTES (decl));
713 /* Look for externally_visible and used attributes and mark cgraph nodes
714 accordingly.
716 We cannot mark the nodes at the point the attributes are processed (in
717 handle_*_attribute) because the copy of the declarations available at that
718 point may not be canonical. For example, in:
720 void f();
721 void f() __attribute__((used));
723 the declaration we see in handle_used_attribute will be the second
724 declaration -- but the front end will subsequently merge that declaration
725 with the original declaration and discard the second declaration.
727 Furthermore, we can't mark these nodes in cgraph_finalize_function because:
729 void f() {}
730 void f() __attribute__((externally_visible));
732 is valid.
734 So, we walk the nodes at the end of the translation unit, applying the
735 attributes at that point. */
737 static void
738 process_function_and_variable_attributes (struct cgraph_node *first,
739 varpool_node *first_var)
741 struct cgraph_node *node;
742 varpool_node *vnode;
744 for (node = cgraph_first_function (); node != first;
745 node = cgraph_next_function (node))
747 tree decl = node->decl;
748 if (DECL_PRESERVE_P (decl))
749 cgraph_mark_force_output_node (node);
750 else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
752 if (! TREE_PUBLIC (node->decl))
753 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
754 "%<externally_visible%>"
755 " attribute have effect only on public objects");
757 if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
758 && (node->definition && !node->alias))
760 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
761 "%<weakref%> attribute ignored"
762 " because function is defined");
763 DECL_WEAK (decl) = 0;
764 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
765 DECL_ATTRIBUTES (decl));
768 if (lookup_attribute ("always_inline", DECL_ATTRIBUTES (decl))
769 && !DECL_DECLARED_INLINE_P (decl)
770 /* redefining extern inline function makes it DECL_UNINLINABLE. */
771 && !DECL_UNINLINABLE (decl))
772 warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
773 "always_inline function might not be inlinable");
775 process_common_attributes (decl);
777 for (vnode = varpool_first_variable (); vnode != first_var;
778 vnode = varpool_next_variable (vnode))
780 tree decl = vnode->decl;
781 if (DECL_EXTERNAL (decl)
782 && DECL_INITIAL (decl))
783 varpool_finalize_decl (decl);
784 if (DECL_PRESERVE_P (decl))
785 vnode->force_output = true;
786 else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
788 if (! TREE_PUBLIC (vnode->decl))
789 warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
790 "%<externally_visible%>"
791 " attribute have effect only on public objects");
793 if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
794 && vnode->definition
795 && DECL_INITIAL (decl))
797 warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
798 "%<weakref%> attribute ignored"
799 " because variable is initialized");
800 DECL_WEAK (decl) = 0;
801 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
802 DECL_ATTRIBUTES (decl));
804 process_common_attributes (decl);
808 /* Mark DECL as finalized. By finalizing the declaration, the front end instructs the
809 middle end to output the variable to the asm file, if needed or externally
810 visible. */
812 void
813 varpool_finalize_decl (tree decl)
815 varpool_node *node = varpool_node_for_decl (decl);
817 gcc_assert (TREE_STATIC (decl) || DECL_EXTERNAL (decl));
819 if (node->definition)
820 return;
821 notice_global_symbol (decl);
822 node->definition = true;
823 if (TREE_THIS_VOLATILE (decl) || DECL_PRESERVE_P (decl)
824 /* Traditionally we do not eliminate static variables when not
825 optimizing and when not doing toplevel reorder. */
826 || (!flag_toplevel_reorder && !DECL_COMDAT (node->decl)
827 && !DECL_ARTIFICIAL (node->decl)))
828 node->force_output = true;
830 if (cgraph_state == CGRAPH_STATE_CONSTRUCTION
831 && (decide_is_symbol_needed (node)
832 || referred_to_p (node)))
833 enqueue_node (node);
834 if (cgraph_state >= CGRAPH_STATE_IPA_SSA)
835 varpool_analyze_node (node);
836 /* Some front ends produce various interface variables after compilation
837 has finished. */
838 if (cgraph_state == CGRAPH_STATE_FINISHED
839 || (!flag_toplevel_reorder && cgraph_state == CGRAPH_STATE_EXPANSION))
840 varpool_assemble_decl (node);
843 /* EDGE is a polymorphic call. Mark all possible targets as reachable
844 and if there is only one target, perform trivial devirtualization.
845 REACHABLE_CALL_TARGETS collects target lists we already walked to
846 avoid duplicate work. */
848 static void
849 walk_polymorphic_call_targets (pointer_set_t *reachable_call_targets,
850 struct cgraph_edge *edge)
852 unsigned int i;
853 void *cache_token;
854 bool final;
855 vec <cgraph_node *>targets
856 = possible_polymorphic_call_targets
857 (edge, &final, &cache_token);
859 if (!pointer_set_insert (reachable_call_targets,
860 cache_token))
862 if (cgraph_dump_file)
863 dump_possible_polymorphic_call_targets
864 (cgraph_dump_file, edge);
866 for (i = 0; i < targets.length (); i++)
868 /* Do not bother to mark virtual methods in anonymous namespace;
869 either we will find a use of the virtual table defining it, or it is
870 unused. */
871 if (targets[i]->definition
872 && TREE_CODE
873 (TREE_TYPE (targets[i]->decl))
874 == METHOD_TYPE
875 && !type_in_anonymous_namespace_p
876 (method_class_type
877 (TREE_TYPE (targets[i]->decl))))
878 enqueue_node (targets[i]);
882 /* Very trivial devirtualization; when the type is
883 final or anonymous (so we know all its derivations)
884 and there is only one possible virtual call target,
885 make the edge direct. */
886 if (final)
888 if (targets.length () <= 1)
890 cgraph_node *target;
891 if (targets.length () == 1)
892 target = targets[0];
893 else
894 target = cgraph_get_create_node
895 (builtin_decl_implicit (BUILT_IN_UNREACHABLE));
897 if (cgraph_dump_file)
899 fprintf (cgraph_dump_file,
900 "Devirtualizing call: ");
901 print_gimple_stmt (cgraph_dump_file,
902 edge->call_stmt, 0,
903 TDF_SLIM);
905 cgraph_make_edge_direct (edge, target);
906 cgraph_redirect_edge_call_stmt_to_callee (edge);
907 if (cgraph_dump_file)
909 fprintf (cgraph_dump_file,
910 "Devirtualized as: ");
911 print_gimple_stmt (cgraph_dump_file,
912 edge->call_stmt, 0,
913 TDF_SLIM);
920 /* Discover all functions and variables that are trivially needed, analyze
921 them as well as all functions and variables referred to by them. */
923 static void
924 analyze_functions (void)
926 /* Keep track of already processed nodes when called multiple times for
927 intermodule optimization. */
928 static struct cgraph_node *first_analyzed;
929 struct cgraph_node *first_handled = first_analyzed;
930 static varpool_node *first_analyzed_var;
931 varpool_node *first_handled_var = first_analyzed_var;
932 struct pointer_set_t *reachable_call_targets = pointer_set_create ();
934 symtab_node *node;
935 symtab_node *next;
936 int i;
937 struct ipa_ref *ref;
938 bool changed = true;
939 location_t saved_loc = input_location;
941 bitmap_obstack_initialize (NULL);
942 cgraph_state = CGRAPH_STATE_CONSTRUCTION;
943 input_location = UNKNOWN_LOCATION;
945 /* Ugly, but the fixup cannot happen at the time the same-body alias is created;
946 the C++ FE is confused about the COMDAT groups being right. */
947 if (cpp_implicit_aliases_done)
948 FOR_EACH_SYMBOL (node)
949 if (node->cpp_implicit_alias)
950 fixup_same_cpp_alias_visibility (node, symtab_alias_target (node));
951 if (optimize && flag_devirtualize)
952 build_type_inheritance_graph ();
954 /* Analysis adds static variables that in turn add references to new functions.
955 So we need to iterate the process until it stabilizes. */
956 while (changed)
958 changed = false;
959 process_function_and_variable_attributes (first_analyzed,
960 first_analyzed_var);
962 /* First identify the trivially needed symbols. */
963 for (node = symtab_nodes;
964 node != first_analyzed
965 && node != first_analyzed_var; node = node->next)
967 if (decide_is_symbol_needed (node))
969 enqueue_node (node);
970 if (!changed && cgraph_dump_file)
971 fprintf (cgraph_dump_file, "Trivially needed symbols:");
972 changed = true;
973 if (cgraph_dump_file)
974 fprintf (cgraph_dump_file, " %s", node->asm_name ());
975 if (!changed && cgraph_dump_file)
976 fprintf (cgraph_dump_file, "\n");
978 if (node == first_analyzed
979 || node == first_analyzed_var)
980 break;
982 cgraph_process_new_functions ();
983 first_analyzed_var = varpool_first_variable ();
984 first_analyzed = cgraph_first_function ();
986 if (changed && dump_file)
987 fprintf (cgraph_dump_file, "\n");
989 /* Lower representation, build callgraph edges and references for all trivially
990 needed symbols and all symbols referred to by them. */
991 while (queued_nodes != &symtab_terminator)
993 changed = true;
994 node = queued_nodes;
995 queued_nodes = (symtab_node *)queued_nodes->aux;
996 cgraph_node *cnode = dyn_cast <cgraph_node> (node);
997 if (cnode && cnode->definition)
999 struct cgraph_edge *edge;
1000 tree decl = cnode->decl;
1002 /* ??? It is possible to create an extern inline function
1003 and later use the weak alias attribute to kill its body.
1004 See gcc.c-torture/compile/20011119-1.c */
1005 if (!DECL_STRUCT_FUNCTION (decl)
1006 && !cnode->alias
1007 && !cnode->thunk.thunk_p
1008 && !cnode->dispatcher_function)
1010 cgraph_reset_node (cnode);
1011 cnode->local.redefined_extern_inline = true;
1012 continue;
1015 if (!cnode->analyzed)
1016 cgraph_analyze_function (cnode);
1018 for (edge = cnode->callees; edge; edge = edge->next_callee)
1019 if (edge->callee->definition)
1020 enqueue_node (edge->callee);
1021 if (optimize && flag_devirtualize)
1023 struct cgraph_edge *next;
1025 for (edge = cnode->indirect_calls; edge; edge = next)
1027 next = edge->next_callee;
1028 if (edge->indirect_info->polymorphic)
1029 walk_polymorphic_call_targets (reachable_call_targets,
1030 edge);
1034 /* If decl is a clone of an abstract function,
1035 mark that abstract function so that we don't release its body.
1036 The DECL_INITIAL() of that abstract function declaration
1037 will be later needed to output debug info. */
1038 if (DECL_ABSTRACT_ORIGIN (decl))
1040 struct cgraph_node *origin_node
1041 = cgraph_get_create_node (DECL_ABSTRACT_ORIGIN (decl));
1042 origin_node->used_as_abstract_origin = true;
1045 else
1047 varpool_node *vnode = dyn_cast <varpool_node> (node);
1048 if (vnode && vnode->definition && !vnode->analyzed)
1049 varpool_analyze_node (vnode);
1052 if (node->same_comdat_group)
1054 symtab_node *next;
1055 for (next = node->same_comdat_group;
1056 next != node;
1057 next = next->same_comdat_group)
1058 enqueue_node (next);
1060 for (i = 0; ipa_ref_list_reference_iterate (&node->ref_list, i, ref); i++)
1061 if (ref->referred->definition)
1062 enqueue_node (ref->referred);
1063 cgraph_process_new_functions ();
1066 if (optimize && flag_devirtualize)
1067 update_type_inheritance_graph ();
1069 /* Collect entry points to the unit. */
1070 if (cgraph_dump_file)
1072 fprintf (cgraph_dump_file, "\n\nInitial ");
1073 dump_symtab (cgraph_dump_file);
1076 if (cgraph_dump_file)
1077 fprintf (cgraph_dump_file, "\nRemoving unused symbols:");
1079 for (node = symtab_nodes;
1080 node != first_handled
1081 && node != first_handled_var; node = next)
1083 next = node->next;
1084 if (!node->aux && !referred_to_p (node))
1086 if (cgraph_dump_file)
1087 fprintf (cgraph_dump_file, " %s", node->name ());
1088 symtab_remove_node (node);
1089 continue;
1091 if (cgraph_node *cnode = dyn_cast <cgraph_node> (node))
1093 tree decl = node->decl;
1095 if (cnode->definition && !gimple_has_body_p (decl)
1096 && !cnode->alias
1097 && !cnode->thunk.thunk_p)
1098 cgraph_reset_node (cnode);
1100 gcc_assert (!cnode->definition || cnode->thunk.thunk_p
1101 || cnode->alias
1102 || gimple_has_body_p (decl));
1103 gcc_assert (cnode->analyzed == cnode->definition);
1105 node->aux = NULL;
1107 for (;node; node = node->next)
1108 node->aux = NULL;
1109 first_analyzed = cgraph_first_function ();
1110 first_analyzed_var = varpool_first_variable ();
1111 if (cgraph_dump_file)
1113 fprintf (cgraph_dump_file, "\n\nReclaimed ");
1114 dump_symtab (cgraph_dump_file);
1116 bitmap_obstack_release (NULL);
1117 pointer_set_destroy (reachable_call_targets);
1118 ggc_collect ();
1119 /* Initialize assembler name hash, in particular we want to trigger C++
1120 mangling and same body alias creation before we free DECL_ARGUMENTS
1121 used by it. */
1122 if (!seen_error ())
1123 symtab_initialize_asm_name_hash ();
1125 input_location = saved_loc;
1128 /* Translate the ugly representation of aliases as alias pairs into a nice
1129 representation in the callgraph. We don't handle all cases yet,
1130 unfortunately. */
1132 static void
1133 handle_alias_pairs (void)
1135 alias_pair *p;
1136 unsigned i;
1138 for (i = 0; alias_pairs && alias_pairs->iterate (i, &p);)
1140 symtab_node *target_node = symtab_node_for_asm (p->target);
1142 /* Weakrefs with a target not defined in the current unit are easy to handle:
1143 they behave just like external variables except that we need to note the
1144 alias flag to later output the weakref pseudo-op into the asm file. */
1145 if (!target_node
1146 && lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)) != NULL)
1148 symtab_node *node = symtab_get_node (p->decl);
1149 if (node)
1151 node->alias_target = p->target;
1152 node->weakref = true;
1153 node->alias = true;
1155 alias_pairs->unordered_remove (i);
1156 continue;
1158 else if (!target_node)
1160 error ("%q+D aliased to undefined symbol %qE", p->decl, p->target);
1161 symtab_node *node = symtab_get_node (p->decl);
1162 if (node)
1163 node->alias = false;
1164 alias_pairs->unordered_remove (i);
1165 continue;
1168 if (DECL_EXTERNAL (target_node->decl)
1169 /* We use local aliases for C++ thunks to force the tailcall
1170 to bind locally. This is a hack - to keep it working do
1171 the following (which is not strictly correct). */
1172 && (TREE_CODE (target_node->decl) != FUNCTION_DECL
1173 || ! DECL_VIRTUAL_P (target_node->decl))
1174 && ! lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)))
1176 error ("%q+D aliased to external symbol %qE",
1177 p->decl, p->target);
1180 if (TREE_CODE (p->decl) == FUNCTION_DECL
1181 && target_node && is_a <cgraph_node> (target_node))
1183 struct cgraph_node *src_node = cgraph_get_node (p->decl);
1184 if (src_node && src_node->definition)
1185 cgraph_reset_node (src_node);
1186 cgraph_create_function_alias (p->decl, target_node->decl);
1187 alias_pairs->unordered_remove (i);
1189 else if (TREE_CODE (p->decl) == VAR_DECL
1190 && target_node && is_a <varpool_node> (target_node))
1192 varpool_create_variable_alias (p->decl, target_node->decl);
1193 alias_pairs->unordered_remove (i);
1195 else
1197 error ("%q+D alias in between function and variable is not supported",
1198 p->decl);
1199 warning (0, "%q+D aliased declaration",
1200 target_node->decl);
1201 alias_pairs->unordered_remove (i);
1204 vec_free (alias_pairs);
1208 /* Figure out what functions we want to assemble. */
1210 static void
1211 mark_functions_to_output (void)
1213 struct cgraph_node *node;
1214 #ifdef ENABLE_CHECKING
1215 bool check_same_comdat_groups = false;
1217 FOR_EACH_FUNCTION (node)
1218 gcc_assert (!node->process);
1219 #endif
1221 FOR_EACH_FUNCTION (node)
1223 tree decl = node->decl;
1225 gcc_assert (!node->process || node->same_comdat_group);
1226 if (node->process)
1227 continue;
1229 /* We need to output all local functions that are used and not
1230 always inlined, as well as those that are reachable from
1231 outside the current compilation unit. */
1232 if (node->analyzed
1233 && !node->thunk.thunk_p
1234 && !node->alias
1235 && !node->global.inlined_to
1236 && !TREE_ASM_WRITTEN (decl)
1237 && !DECL_EXTERNAL (decl))
1239 node->process = 1;
1240 if (node->same_comdat_group)
1242 struct cgraph_node *next;
1243 for (next = cgraph (node->same_comdat_group);
1244 next != node;
1245 next = cgraph (next->same_comdat_group))
1246 if (!next->thunk.thunk_p && !next->alias
1247 && !symtab_comdat_local_p (next))
1248 next->process = 1;
1251 else if (node->same_comdat_group)
1253 #ifdef ENABLE_CHECKING
1254 check_same_comdat_groups = true;
1255 #endif
1257 else
1259 /* We should've reclaimed all functions that are not needed. */
1260 #ifdef ENABLE_CHECKING
1261 if (!node->global.inlined_to
1262 && gimple_has_body_p (decl)
1263 /* FIXME: in an ltrans unit when the offline copy is outside a partition but inline
1264 copies are inside a partition, we can end up not removing the body since we no
1265 longer have an analyzed node pointing to it. */
1266 && !node->in_other_partition
1267 && !node->alias
1268 && !node->clones
1269 && !DECL_EXTERNAL (decl))
1271 dump_cgraph_node (stderr, node);
1272 internal_error ("failed to reclaim unneeded function");
1274 #endif
1275 gcc_assert (node->global.inlined_to
1276 || !gimple_has_body_p (decl)
1277 || node->in_other_partition
1278 || node->clones
1279 || DECL_ARTIFICIAL (decl)
1280 || DECL_EXTERNAL (decl));
1285 #ifdef ENABLE_CHECKING
1286 if (check_same_comdat_groups)
1287 FOR_EACH_FUNCTION (node)
1288 if (node->same_comdat_group && !node->process)
1290 tree decl = node->decl;
1291 if (!node->global.inlined_to
1292 && gimple_has_body_p (decl)
1293 /* FIXME: in an ltrans unit when the offline copy is outside a
1294 partition but inline copies are inside a partition, we can
1295 end up not removing the body since we no longer have an
1296 analyzed node pointing to it. */
1297 && !node->in_other_partition
1298 && !node->clones
1299 && !DECL_EXTERNAL (decl))
1301 dump_cgraph_node (stderr, node);
1302 internal_error ("failed to reclaim unneeded function in same "
1303 "comdat group");
1306 #endif
1309 /* DECL is a FUNCTION_DECL. Initialize data structures so DECL is a function
1310 in lowered GIMPLE form. IN_SSA is true if the GIMPLE is in SSA form.
1312 Set current_function_decl and cfun to the newly constructed empty function body.
1313 Return the basic block in the function body. */
1315 basic_block
1316 init_lowered_empty_function (tree decl, bool in_ssa)
1318 basic_block bb;
1320 current_function_decl = decl;
1321 allocate_struct_function (decl, false);
1322 gimple_register_cfg_hooks ();
1323 init_empty_tree_cfg ();
1325 if (in_ssa)
1327 init_tree_ssa (cfun);
1328 init_ssa_operands (cfun);
1329 cfun->gimple_df->in_ssa_p = true;
1330 cfun->curr_properties |= PROP_ssa;
1333 DECL_INITIAL (decl) = make_node (BLOCK);
1335 DECL_SAVED_TREE (decl) = error_mark_node;
1336 cfun->curr_properties |= (PROP_gimple_lcf | PROP_gimple_leh | PROP_gimple_any
1337 | PROP_cfg | PROP_loops);
1339 set_loops_for_fn (cfun, ggc_alloc_cleared_loops ());
1340 init_loops_structure (cfun, loops_for_fn (cfun), 1);
1341 loops_for_fn (cfun)->state |= LOOPS_MAY_HAVE_MULTIPLE_LATCHES;
1343 /* Create BB for body of the function and connect it properly. */
1344 bb = create_basic_block (NULL, (void *) 0, ENTRY_BLOCK_PTR_FOR_FN (cfun));
1345 make_edge (ENTRY_BLOCK_PTR_FOR_FN (cfun), bb, EDGE_FALLTHRU);
1346 make_edge (bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
1347 add_bb_to_loop (bb, ENTRY_BLOCK_PTR_FOR_FN (cfun)->loop_father);
1349 return bb;
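/* Typical use (see expand_thunk below for the real thing): create an empty
   SSA function body and then populate its single basic block, e.g.

     bb = init_lowered_empty_function (decl, true);
     bsi = gsi_start_bb (bb);
     ... build statements and add them with gsi_insert_after ...  */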
1352 /* Adjust PTR by the constant FIXED_OFFSET, and by the vtable
1353 offset indicated by VIRTUAL_OFFSET, if that is
1354 non-null. THIS_ADJUSTING is nonzero for a this-adjusting thunk and
1355 zero for a result-adjusting thunk. */
1357 static tree
1358 thunk_adjust (gimple_stmt_iterator * bsi,
1359 tree ptr, bool this_adjusting,
1360 HOST_WIDE_INT fixed_offset, tree virtual_offset)
1362 gimple stmt;
1363 tree ret;
1365 if (this_adjusting
1366 && fixed_offset != 0)
1368 stmt = gimple_build_assign
1369 (ptr, fold_build_pointer_plus_hwi_loc (input_location,
1370 ptr,
1371 fixed_offset));
1372 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1375 /* If there's a virtual offset, look up that value in the vtable and
1376 adjust the pointer again. */
1377 if (virtual_offset)
1379 tree vtabletmp;
1380 tree vtabletmp2;
1381 tree vtabletmp3;
1383 if (!vtable_entry_type)
1385 tree vfunc_type = make_node (FUNCTION_TYPE);
1386 TREE_TYPE (vfunc_type) = integer_type_node;
1387 TYPE_ARG_TYPES (vfunc_type) = NULL_TREE;
1388 layout_type (vfunc_type);
1390 vtable_entry_type = build_pointer_type (vfunc_type);
1393 vtabletmp =
1394 create_tmp_reg (build_pointer_type
1395 (build_pointer_type (vtable_entry_type)), "vptr");
1397 /* The vptr is always at offset zero in the object. */
1398 stmt = gimple_build_assign (vtabletmp,
1399 build1 (NOP_EXPR, TREE_TYPE (vtabletmp),
1400 ptr));
1401 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1403 /* Form the vtable address. */
1404 vtabletmp2 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp)),
1405 "vtableaddr");
1406 stmt = gimple_build_assign (vtabletmp2,
1407 build_simple_mem_ref (vtabletmp));
1408 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1410 /* Find the entry with the vcall offset. */
1411 stmt = gimple_build_assign (vtabletmp2,
1412 fold_build_pointer_plus_loc (input_location,
1413 vtabletmp2,
1414 virtual_offset));
1415 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1417 /* Get the offset itself. */
1418 vtabletmp3 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp2)),
1419 "vcalloffset");
1420 stmt = gimple_build_assign (vtabletmp3,
1421 build_simple_mem_ref (vtabletmp2));
1422 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1424 /* Adjust the `this' pointer. */
1425 ptr = fold_build_pointer_plus_loc (input_location, ptr, vtabletmp3);
1426 ptr = force_gimple_operand_gsi (bsi, ptr, true, NULL_TREE, false,
1427 GSI_CONTINUE_LINKING);
1430 if (!this_adjusting
1431 && fixed_offset != 0)
1432 /* Adjust the pointer by the constant. */
1434 tree ptrtmp;
1436 if (TREE_CODE (ptr) == VAR_DECL)
1437 ptrtmp = ptr;
1438 else
1440 ptrtmp = create_tmp_reg (TREE_TYPE (ptr), "ptr");
1441 stmt = gimple_build_assign (ptrtmp, ptr);
1442 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1444 ptr = fold_build_pointer_plus_hwi_loc (input_location,
1445 ptrtmp, fixed_offset);
1448 /* Emit the statement and gimplify the adjustment expression. */
1449 ret = create_tmp_reg (TREE_TYPE (ptr), "adjusted_this");
1450 stmt = gimple_build_assign (ret, ptr);
1451 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1453 return ret;
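/* Informally, and only as a sketch of the GIMPLE built above (not code that
   is emitted literally), the adjustment for a this-adjusting thunk amounts
   to:

     this += fixed_offset;
     if (virtual_offset)
       this += *(ptrdiff_t *) (*(char **) this + virtual_offset);

   For a result-adjusting thunk the virtual adjustment is applied first and
   the fixed offset last; expand_thunk below additionally guards the whole
   adjustment against a NULL return value when the return type is a
   pointer.  */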
1456 /* Expand thunk NODE to gimple if possible.
1457 When OUTPUT_ASM_THUNKS is true, also produce assembler for
1458 thunks that are not lowered. */
1460 bool
1461 expand_thunk (struct cgraph_node *node, bool output_asm_thunks)
1463 bool this_adjusting = node->thunk.this_adjusting;
1464 HOST_WIDE_INT fixed_offset = node->thunk.fixed_offset;
1465 HOST_WIDE_INT virtual_value = node->thunk.virtual_value;
1466 tree virtual_offset = NULL;
1467 tree alias = node->callees->callee->decl;
1468 tree thunk_fndecl = node->decl;
1469 tree a;
1472 if (this_adjusting
1473 && targetm.asm_out.can_output_mi_thunk (thunk_fndecl, fixed_offset,
1474 virtual_value, alias))
1476 const char *fnname;
1477 tree fn_block;
1478 tree restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1480 if (!output_asm_thunks)
1481 return false;
1483 if (in_lto_p)
1484 cgraph_get_body (node);
1485 a = DECL_ARGUMENTS (thunk_fndecl);
1487 current_function_decl = thunk_fndecl;
1489 /* Ensure thunks are emitted in their correct sections. */
1490 resolve_unique_section (thunk_fndecl, 0, flag_function_sections);
1492 DECL_RESULT (thunk_fndecl)
1493 = build_decl (DECL_SOURCE_LOCATION (thunk_fndecl),
1494 RESULT_DECL, 0, restype);
1495 DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
1496 fnname = IDENTIFIER_POINTER (DECL_ASSEMBLER_NAME (thunk_fndecl));
1498 /* The back end expects DECL_INITIAL to contain a BLOCK, so we
1499 create one. */
1500 fn_block = make_node (BLOCK);
1501 BLOCK_VARS (fn_block) = a;
1502 DECL_INITIAL (thunk_fndecl) = fn_block;
1503 init_function_start (thunk_fndecl);
1504 cfun->is_thunk = 1;
1505 insn_locations_init ();
1506 set_curr_insn_location (DECL_SOURCE_LOCATION (thunk_fndecl));
1507 prologue_location = curr_insn_location ();
1508 assemble_start_function (thunk_fndecl, fnname);
1510 targetm.asm_out.output_mi_thunk (asm_out_file, thunk_fndecl,
1511 fixed_offset, virtual_value, alias);
1513 assemble_end_function (thunk_fndecl, fnname);
1514 insn_locations_finalize ();
1515 init_insn_lengths ();
1516 free_after_compilation (cfun);
1517 set_cfun (NULL);
1518 TREE_ASM_WRITTEN (thunk_fndecl) = 1;
1519 node->thunk.thunk_p = false;
1520 node->analyzed = false;
1522 else
1524 tree restype;
1525 basic_block bb, then_bb, else_bb, return_bb;
1526 gimple_stmt_iterator bsi;
1527 int nargs = 0;
1528 tree arg;
1529 int i;
1530 tree resdecl;
1531 tree restmp = NULL;
1533 gimple call;
1534 gimple ret;
1536 if (in_lto_p)
1537 cgraph_get_body (node);
1538 a = DECL_ARGUMENTS (thunk_fndecl);
1540 current_function_decl = thunk_fndecl;
1542 /* Ensure thunks are emitted in their correct sections. */
1543 resolve_unique_section (thunk_fndecl, 0, flag_function_sections);
1545 DECL_IGNORED_P (thunk_fndecl) = 1;
1546 bitmap_obstack_initialize (NULL);
1548 if (node->thunk.virtual_offset_p)
1549 virtual_offset = size_int (virtual_value);
1551 /* Build the return declaration for the function. */
1552 restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1553 if (DECL_RESULT (thunk_fndecl) == NULL_TREE)
1555 resdecl = build_decl (input_location, RESULT_DECL, 0, restype);
1556 DECL_ARTIFICIAL (resdecl) = 1;
1557 DECL_IGNORED_P (resdecl) = 1;
1558 DECL_RESULT (thunk_fndecl) = resdecl;
1559 DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
1561 else
1562 resdecl = DECL_RESULT (thunk_fndecl);
1564 bb = then_bb = else_bb = return_bb = init_lowered_empty_function (thunk_fndecl, true);
1566 bsi = gsi_start_bb (bb);
1568 /* Build call to the function being thunked. */
1569 if (!VOID_TYPE_P (restype))
1571 if (DECL_BY_REFERENCE (resdecl))
1572 restmp = gimple_fold_indirect_ref (resdecl);
1573 else if (!is_gimple_reg_type (restype))
1575 if (aggregate_value_p (resdecl, TREE_TYPE (thunk_fndecl)))
1577 restmp = resdecl;
1578 add_local_decl (cfun, restmp);
1579 BLOCK_VARS (DECL_INITIAL (current_function_decl)) = restmp;
1581 else
1582 restmp = create_tmp_var (restype, "retval");
1584 else
1585 restmp = create_tmp_reg (restype, "retval");
1588 for (arg = a; arg; arg = DECL_CHAIN (arg))
1589 nargs++;
1590 auto_vec<tree> vargs (nargs);
1591 if (this_adjusting)
1592 vargs.quick_push (thunk_adjust (&bsi, a, 1, fixed_offset,
1593 virtual_offset));
1594 else if (nargs)
1595 vargs.quick_push (a);
1597 if (nargs)
1598 for (i = 1, arg = DECL_CHAIN (a); i < nargs; i++, arg = DECL_CHAIN (arg))
1600 tree tmp = arg;
1601 if (!is_gimple_val (arg))
1603 tmp = create_tmp_reg (TYPE_MAIN_VARIANT
1604 (TREE_TYPE (arg)), "arg");
1605 gimple stmt = gimple_build_assign (tmp, arg);
1606 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1608 vargs.quick_push (tmp);
1610 call = gimple_build_call_vec (build_fold_addr_expr_loc (0, alias), vargs);
1611 node->callees->call_stmt = call;
1612 gimple_call_set_from_thunk (call, true);
1613 if (restmp)
1615 gimple_call_set_lhs (call, restmp);
1616 gcc_assert (useless_type_conversion_p (TREE_TYPE (restmp),
1617 TREE_TYPE (TREE_TYPE (alias))));
1619 gsi_insert_after (&bsi, call, GSI_NEW_STMT);
1620 if (!(gimple_call_flags (call) & ECF_NORETURN))
1622 if (restmp && !this_adjusting
1623 && (fixed_offset || virtual_offset))
1625 tree true_label = NULL_TREE;
1627 if (TREE_CODE (TREE_TYPE (restmp)) == POINTER_TYPE)
1629 gimple stmt;
1630 /* If the return type is a pointer, we need to
1631 protect against NULL. We know there will be an
1632 adjustment, because that's why we're emitting a
1633 thunk. */
1634 then_bb = create_basic_block (NULL, (void *) 0, bb);
1635 return_bb = create_basic_block (NULL, (void *) 0, then_bb);
1636 else_bb = create_basic_block (NULL, (void *) 0, else_bb);
1637 add_bb_to_loop (then_bb, bb->loop_father);
1638 add_bb_to_loop (return_bb, bb->loop_father);
1639 add_bb_to_loop (else_bb, bb->loop_father);
1640 remove_edge (single_succ_edge (bb));
1641 true_label = gimple_block_label (then_bb);
1642 stmt = gimple_build_cond (NE_EXPR, restmp,
1643 build_zero_cst (TREE_TYPE (restmp)),
1644 NULL_TREE, NULL_TREE);
1645 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1646 make_edge (bb, then_bb, EDGE_TRUE_VALUE);
1647 make_edge (bb, else_bb, EDGE_FALSE_VALUE);
1648 make_edge (return_bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
1649 make_edge (then_bb, return_bb, EDGE_FALLTHRU);
1650 make_edge (else_bb, return_bb, EDGE_FALLTHRU);
1651 bsi = gsi_last_bb (then_bb);
1654 restmp = thunk_adjust (&bsi, restmp, /*this_adjusting=*/0,
1655 fixed_offset, virtual_offset);
1656 if (true_label)
1658 gimple stmt;
1659 bsi = gsi_last_bb (else_bb);
1660 stmt = gimple_build_assign (restmp,
1661 build_zero_cst (TREE_TYPE (restmp)));
1662 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1663 bsi = gsi_last_bb (return_bb);
1666 else
1667 gimple_call_set_tail (call, true);
1669 /* Build return value. */
1670 ret = gimple_build_return (restmp);
1671 gsi_insert_after (&bsi, ret, GSI_NEW_STMT);
1673 else
1675 gimple_call_set_tail (call, true);
1676 remove_edge (single_succ_edge (bb));
1679 cfun->gimple_df->in_ssa_p = true;
1680 /* FIXME: C++ FE should stop setting TREE_ASM_WRITTEN on thunks. */
1681 TREE_ASM_WRITTEN (thunk_fndecl) = false;
1682 delete_unreachable_blocks ();
1683 update_ssa (TODO_update_ssa);
1684 #ifdef ENABLE_CHECKING
1685 verify_flow_info ();
1686 #endif
1687 free_dominance_info (CDI_DOMINATORS);
1689 /* Since we want to emit the thunk, we explicitly mark its name as
1690 referenced. */
1691 node->thunk.thunk_p = false;
1692 node->lowered = true;
1693 bitmap_obstack_release (NULL);
1695 current_function_decl = NULL;
1696 set_cfun (NULL);
1697 return true;
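/* Thunks typically come from C++ multiple inheritance, e.g.

     struct A { virtual void f (); };
     struct B { virtual void f (); };
     struct C : A, B { virtual void f (); };

   Calling C::f through a B* requires a this-adjusting thunk that converts
   the B* back into a C* (a negative fixed offset) before transferring
   control to the real body of C::f.  (This is only an illustration of where
   thunks come from; the thunk data itself is supplied by the front end.)  */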
1700 /* Assemble thunks and aliases associated to NODE. */
1702 static void
1703 assemble_thunks_and_aliases (struct cgraph_node *node)
1705 struct cgraph_edge *e;
1706 int i;
1707 struct ipa_ref *ref;
1709 for (e = node->callers; e;)
1710 if (e->caller->thunk.thunk_p)
1712 struct cgraph_node *thunk = e->caller;
1714 e = e->next_caller;
1715 assemble_thunks_and_aliases (thunk);
1716 expand_thunk (thunk, true);
1718 else
1719 e = e->next_caller;
1720 for (i = 0; ipa_ref_list_referring_iterate (&node->ref_list,
1721 i, ref); i++)
1722 if (ref->use == IPA_REF_ALIAS)
1724 struct cgraph_node *alias = ipa_ref_referring_node (ref);
1725 bool saved_written = TREE_ASM_WRITTEN (node->decl);
1727 /* Force assemble_alias to really output the alias this time instead
1728 of buffering it in same alias pairs. */
1729 TREE_ASM_WRITTEN (node->decl) = 1;
1730 do_assemble_alias (alias->decl,
1731 DECL_ASSEMBLER_NAME (node->decl));
1732 assemble_thunks_and_aliases (alias);
1733 TREE_ASM_WRITTEN (node->decl) = saved_written;
1737 /* Expand function specified by NODE. */
1739 static void
1740 expand_function (struct cgraph_node *node)
1742 tree decl = node->decl;
1743 location_t saved_loc;
1745 /* We ought to not compile any inline clones. */
1746 gcc_assert (!node->global.inlined_to);
1748 announce_function (decl);
1749 node->process = 0;
1750 gcc_assert (node->lowered);
1751 cgraph_get_body (node);
1753 /* Generate RTL for the body of DECL. */
1755 timevar_push (TV_REST_OF_COMPILATION);
1757 gcc_assert (cgraph_global_info_ready);
1759 /* Initialize the default bitmap obstack. */
1760 bitmap_obstack_initialize (NULL);
1762 /* Initialize the RTL code for the function. */
1763 current_function_decl = decl;
1764 saved_loc = input_location;
1765 input_location = DECL_SOURCE_LOCATION (decl);
1766 init_function_start (decl);
1768 gimple_register_cfg_hooks ();
1770 bitmap_obstack_initialize (&reg_obstack); /* FIXME, only at RTL generation*/
1772 execute_all_ipa_transforms ();
1774 /* Perform all tree transforms and optimizations. */
1776 /* Signal the start of passes. */
1777 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_START, NULL);
1779 execute_pass_list (g->get_passes ()->all_passes);
1781 /* Signal the end of passes. */
1782 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_END, NULL);
1784 bitmap_obstack_release (&reg_obstack);
1786 /* Release the default bitmap obstack. */
1787 bitmap_obstack_release (NULL);
1789 /* If requested, warn about function definitions where the function will
1790 return a value (usually of some struct or union type) which itself will
1791 take up a lot of stack space. */
1792 if (warn_larger_than && !DECL_EXTERNAL (decl) && TREE_TYPE (decl))
1794 tree ret_type = TREE_TYPE (TREE_TYPE (decl));
1796 if (ret_type && TYPE_SIZE_UNIT (ret_type)
1797 && TREE_CODE (TYPE_SIZE_UNIT (ret_type)) == INTEGER_CST
1798 && 0 < compare_tree_int (TYPE_SIZE_UNIT (ret_type),
1799 larger_than_size))
1801 unsigned int size_as_int
1802 = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (ret_type));
1804 if (compare_tree_int (TYPE_SIZE_UNIT (ret_type), size_as_int) == 0)
1805 warning (OPT_Wlarger_than_, "size of return value of %q+D is %u bytes",
1806 decl, size_as_int);
1807 else
1808 warning (OPT_Wlarger_than_, "size of return value of %q+D is larger than %wd bytes",
1809 decl, larger_than_size);
1813 gimple_set_body (decl, NULL);
1814 if (DECL_STRUCT_FUNCTION (decl) == 0
1815 && !cgraph_get_node (decl)->origin)
1817 /* Stop pointing to the local nodes about to be freed.
1818 But DECL_INITIAL must remain nonzero so we know this
1819 was an actual function definition.
1820 For a nested function, this is done in c_pop_function_context.
1821 If rest_of_compilation set this to 0, leave it 0. */
1822 if (DECL_INITIAL (decl) != 0)
1823 DECL_INITIAL (decl) = error_mark_node;
1826 input_location = saved_loc;
1828 ggc_collect ();
1829 timevar_pop (TV_REST_OF_COMPILATION);
1831 /* Make sure that the back end didn't give up on compiling. */
1832 gcc_assert (TREE_ASM_WRITTEN (decl));
1833 set_cfun (NULL);
1834 current_function_decl = NULL;
1836 /* It would make a lot more sense to output thunks before the function body to get more
1837 forward and fewer backward jumps. This however would require solving a problem
1838 with comdats. See PR48668. Also aliases must come after the function itself to
1839 make one-pass assemblers, like the one on AIX, happy. See PR 50689.
1840 FIXME: Perhaps thunks should be moved before the function IFF they are not in comdat
1841 groups. */
1842 assemble_thunks_and_aliases (node);
1843 cgraph_release_function_body (node);
1844 /* Eliminate all call edges. This is important so the GIMPLE_CALL no longer
1845 points to the dead function body. */
1846 cgraph_node_remove_callees (node);
1847 ipa_remove_all_references (&node->ref_list);
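/* Illustrative sketch, not part of GCC itself: user-level code of the kind
   the warn_larger_than check in expand_function above is meant to catch.
   With -Wlarger-than=64 the by-value return below would be expected to
   trigger the "size of return value of ... is N bytes" warning.  The type
   and function names are invented for the example, and the block is
   guarded out so it does not affect this file.  */
#if 0
struct big_result
{
  char payload[1024];		/* Well above a 64-byte limit.  */
};

struct big_result
make_big_result (void)		/* Return value is 1024 bytes.  */
{
  struct big_result r = { { 0 } };
  return r;
}
#endif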
1850 /* Node comparator responsible for producing the order that corresponds
1851 to the time when a function was launched for the first time. */
1853 static int
1854 node_cmp (const void *pa, const void *pb)
1856 const struct cgraph_node *a = *(const struct cgraph_node * const *) pa;
1857 const struct cgraph_node *b = *(const struct cgraph_node * const *) pb;
1859 /* Functions with a time profile must come before those without one. */
1860 if (!a->tp_first_run || !b->tp_first_run)
1861 return a->tp_first_run - b->tp_first_run;
1863 return a->tp_first_run != b->tp_first_run
1864 ? b->tp_first_run - a->tp_first_run
1865 : b->order - a->order;
1868 /* Expand all functions that must be output.
1870 Attempt to topologically sort the nodes so that a function is output when
1871 all the functions it calls have already been assembled, allowing data to be
1872 propagated across the callgraph. Use a stack to get a smaller distance
1873 between a function and its callees (later we may choose to use a more
1874 sophisticated algorithm for function reordering; we will likely want
1875 to use subsections to make the output functions appear in top-down
1876 order). */
1878 static void
1879 expand_all_functions (void)
1881 struct cgraph_node *node;
1882 struct cgraph_node **order = XCNEWVEC (struct cgraph_node *, cgraph_n_nodes);
1883 unsigned int expanded_func_count = 0, profiled_func_count = 0;
1884 int order_pos, new_order_pos = 0;
1885 int i;
1887 order_pos = ipa_reverse_postorder (order);
1888 gcc_assert (order_pos == cgraph_n_nodes);
1890 /* The garbage collector may remove the inline clones we eliminate during
1891 optimization, so we must be sure not to reference them. */
1892 for (i = 0; i < order_pos; i++)
1893 if (order[i]->process)
1894 order[new_order_pos++] = order[i];
1896 if (flag_profile_reorder_functions)
1897 qsort (order, new_order_pos, sizeof (struct cgraph_node *), node_cmp);
1899 for (i = new_order_pos - 1; i >= 0; i--)
1901 node = order[i];
1903 if (node->process)
1905 expanded_func_count++;
1906 if (node->tp_first_run)
1907 profiled_func_count++;
1909 if (cgraph_dump_file)
1910 fprintf (cgraph_dump_file, "Time profile order in expand_all_functions:%s:%d\n", node->asm_name (), node->tp_first_run);
1912 node->process = 0;
1913 expand_function (node);
1917 if (dump_file)
1918 fprintf (dump_file, "Expanded functions with time profile (%s):%u/%u\n",
1919 main_input_filename, profiled_func_count, expanded_func_count);
1921 if (cgraph_dump_file && flag_profile_reorder_functions)
1922 fprintf (cgraph_dump_file, "Expanded functions with time profile:%u/%u\n",
1923 profiled_func_count, expanded_func_count);
1925 cgraph_process_new_functions ();
1926 free_gimplify_stack ();
1928 free (order);
1931 /* This is used to sort the node types by the cgraph order number. */
1933 enum cgraph_order_sort_kind
1935 ORDER_UNDEFINED = 0,
1936 ORDER_FUNCTION,
1937 ORDER_VAR,
1938 ORDER_ASM
1941 struct cgraph_order_sort
1943 enum cgraph_order_sort_kind kind;
1944 union
1946 struct cgraph_node *f;
1947 varpool_node *v;
1948 struct asm_node *a;
1949 } u;
1952 /* Output all functions, variables, and asm statements in the order
1953 given by their order fields, which is the order in which they
1954 appeared in the file. This implements -fno-toplevel-reorder. In
1955 this mode we may output functions and variables which don't really
1956 need to be output. */
1958 static void
1959 output_in_order (void)
1961 int max;
1962 struct cgraph_order_sort *nodes;
1963 int i;
1964 struct cgraph_node *pf;
1965 varpool_node *pv;
1966 struct asm_node *pa;
1968 max = symtab_order;
1969 nodes = XCNEWVEC (struct cgraph_order_sort, max);
1971 FOR_EACH_DEFINED_FUNCTION (pf)
1973 if (pf->process && !pf->thunk.thunk_p && !pf->alias)
1975 i = pf->order;
1976 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1977 nodes[i].kind = ORDER_FUNCTION;
1978 nodes[i].u.f = pf;
1982 FOR_EACH_DEFINED_VARIABLE (pv)
1983 if (!DECL_EXTERNAL (pv->decl))
1985 i = pv->order;
1986 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1987 nodes[i].kind = ORDER_VAR;
1988 nodes[i].u.v = pv;
1991 for (pa = asm_nodes; pa; pa = pa->next)
1993 i = pa->order;
1994 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1995 nodes[i].kind = ORDER_ASM;
1996 nodes[i].u.a = pa;
1999 /* In toplevel reorder mode we output all statics; mark them as needed. */
2001 for (i = 0; i < max; ++i)
2002 if (nodes[i].kind == ORDER_VAR)
2003 varpool_finalize_named_section_flags (nodes[i].u.v);
2005 for (i = 0; i < max; ++i)
2007 switch (nodes[i].kind)
2009 case ORDER_FUNCTION:
2010 nodes[i].u.f->process = 0;
2011 expand_function (nodes[i].u.f);
2012 break;
2014 case ORDER_VAR:
2015 varpool_assemble_decl (nodes[i].u.v);
2016 break;
2018 case ORDER_ASM:
2019 assemble_asm (nodes[i].u.a->asm_str);
2020 break;
2022 case ORDER_UNDEFINED:
2023 break;
2025 default:
2026 gcc_unreachable ();
2030 asm_nodes = NULL;
2031 free (nodes);
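/* Illustrative sketch, not part of GCC itself: the kind of translation unit
   -fno-toplevel-reorder (handled by output_in_order above) is meant for.
   Each toplevel entity maps onto one of the ORDER_* kinds and is emitted in
   exactly this source order, so the asm fragment stays between the variable
   and the function it sits next to.  The names are invented for the example
   and the block is guarded out of this file.  */
#if 0
int counter = 0;					/* ORDER_VAR       */

asm ("\t.ident\t\"toplevel asm kept in place\"");	/* ORDER_ASM       */

int
bump_counter (void)					/* ORDER_FUNCTION  */
{
  return ++counter;
}
#endif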
2034 static void
2035 ipa_passes (void)
2037 gcc::pass_manager *passes = g->get_passes ();
2039 set_cfun (NULL);
2040 current_function_decl = NULL;
2041 gimple_register_cfg_hooks ();
2042 bitmap_obstack_initialize (NULL);
2044 invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_START, NULL);
2046 if (!in_lto_p)
2048 execute_ipa_pass_list (passes->all_small_ipa_passes);
2049 if (seen_error ())
2050 return;
2053 /* We never run removal of unreachable nodes after the early passes. This is
2054 because the TODO is run before the subpasses. It is important to remove
2055 the unreachable functions to save work at the IPA level and to get the LTO
2056 symbol tables right. */
2057 symtab_remove_unreachable_nodes (true, cgraph_dump_file);
2059 /* If pass_all_early_optimizations was not scheduled, the state of
2060 the cgraph will not be properly updated. Update it now. */
2061 if (cgraph_state < CGRAPH_STATE_IPA_SSA)
2062 cgraph_state = CGRAPH_STATE_IPA_SSA;
2064 if (!in_lto_p)
2066 /* Generate coverage variables and constructors. */
2067 coverage_finish ();
2069 /* Process new functions added. */
2070 set_cfun (NULL);
2071 current_function_decl = NULL;
2072 cgraph_process_new_functions ();
2074 execute_ipa_summary_passes
2075 ((ipa_opt_pass_d *) passes->all_regular_ipa_passes);
2078 /* Some targets need to handle LTO assembler output specially. */
2079 if (flag_generate_lto)
2080 targetm.asm_out.lto_start ();
2082 if (!in_lto_p)
2083 ipa_write_summaries ();
2085 if (flag_generate_lto)
2086 targetm.asm_out.lto_end ();
2088 if (!flag_ltrans && (in_lto_p || !flag_lto || flag_fat_lto_objects))
2089 execute_ipa_pass_list (passes->all_regular_ipa_passes);
2090 invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_END, NULL);
2092 bitmap_obstack_release (NULL);
2096 /* Return, as an identifier, the symbol name that DECL's "alias" attribute says it is an alias of. */
2098 static tree
2099 get_alias_symbol (tree decl)
2101 tree alias = lookup_attribute ("alias", DECL_ATTRIBUTES (decl));
2102 return get_identifier (TREE_STRING_POINTER
2103 (TREE_VALUE (TREE_VALUE (alias))));
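/* Illustrative sketch, not part of GCC itself: the source-level form that
   produces the "alias" attribute which get_alias_symbol above reads.  The
   attribute value is a string naming the target symbol; the function digs
   that string out of DECL_ATTRIBUTES and turns it into an identifier.  The
   names are invented for the example and the block is guarded out of this
   file.  */
#if 0
void
real_implementation (void)
{
}

/* "old_name" becomes an alias whose attribute string is "real_implementation".  */
void old_name (void) __attribute__ ((alias ("real_implementation")));
#endif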
2107 /* Weakrefs may be associated with external decls and thus not output
2108 at expansion time. Emit all necessary aliases. */
2110 static void
2111 output_weakrefs (void)
2113 symtab_node *node;
2114 FOR_EACH_SYMBOL (node)
2115 if (node->alias
2116 && !TREE_ASM_WRITTEN (node->decl)
2117 && node->weakref)
2119 tree target;
2121 /* Weakrefs are special in that they do not require the target to be
2122 defined in the current compilation unit, so it is a bit hard to work
2123 out what we want to alias.
2124 When the alias target is defined, we need to fetch it from the symtab
2125 reference; otherwise it is pointed to by alias_target. */
2126 if (node->alias_target)
2127 target = (DECL_P (node->alias_target)
2128 ? DECL_ASSEMBLER_NAME (node->alias_target)
2129 : node->alias_target);
2130 else if (node->analyzed)
2131 target = DECL_ASSEMBLER_NAME (symtab_alias_target (node)->decl);
2132 else
2134 gcc_unreachable ();
2135 target = get_alias_symbol (node->decl);
2137 do_assemble_alias (node->decl, target);
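/* Illustrative sketch, not part of GCC itself: a weakref of the kind
   output_weakrefs above has to emit.  Unlike a plain alias, the target need
   not be defined in this compilation unit, which is why the code above may
   have to fall back to alias_target or to the attribute string instead of a
   symtab reference.  The names are invented for the example and the block
   is guarded out of this file.  */
#if 0
/* Weak reference to a symbol that may or may not exist at link time.  */
static void helper_hook (void) __attribute__ ((weakref ("external_hook")));

void
call_hook_if_present (void)
{
  if (helper_hook)		/* Null when external_hook is absent.  */
    helper_hook ();
}
#endif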
2141 /* Initialize callgraph dump file. */
2143 void
2144 init_cgraph (void)
2146 if (!cgraph_dump_file)
2147 cgraph_dump_file = dump_begin (TDI_cgraph, NULL);
2151 /* Perform simple optimizations based on callgraph. */
2153 void
2154 compile (void)
2156 if (seen_error ())
2157 return;
2159 #ifdef ENABLE_CHECKING
2160 verify_symtab ();
2161 #endif
2163 timevar_push (TV_CGRAPHOPT);
2164 if (pre_ipa_mem_report)
2166 fprintf (stderr, "Memory consumption before IPA\n");
2167 dump_memory_report (false);
2169 if (!quiet_flag)
2170 fprintf (stderr, "Performing interprocedural optimizations\n");
2171 cgraph_state = CGRAPH_STATE_IPA;
2173 /* If LTO is enabled, initialize the streamer hooks needed by GIMPLE. */
2174 if (flag_lto)
2175 lto_streamer_hooks_init ();
2177 /* Don't run the IPA passes if there were any errors or sorry messages. */
2178 if (!seen_error ())
2179 ipa_passes ();
2181 /* Do nothing else if any IPA pass found errors or if we are just streaming LTO. */
2182 if (seen_error ()
2183 || (!in_lto_p && flag_lto && !flag_fat_lto_objects))
2185 timevar_pop (TV_CGRAPHOPT);
2186 return;
2189 /* This pass removes the bodies of extern inline functions we never inlined.
2190 Do this later so other IPA passes see what is really going on. */
2191 symtab_remove_unreachable_nodes (false, dump_file);
2192 cgraph_global_info_ready = true;
2193 if (cgraph_dump_file)
2195 fprintf (cgraph_dump_file, "Optimized ");
2196 dump_symtab (cgraph_dump_file);
2198 if (post_ipa_mem_report)
2200 fprintf (stderr, "Memory consumption after IPA\n");
2201 dump_memory_report (false);
2203 timevar_pop (TV_CGRAPHOPT);
2205 /* Output everything. */
2206 (*debug_hooks->assembly_start) ();
2207 if (!quiet_flag)
2208 fprintf (stderr, "Assembling functions:\n");
2209 #ifdef ENABLE_CHECKING
2210 verify_symtab ();
2211 #endif
2213 cgraph_materialize_all_clones ();
2214 bitmap_obstack_initialize (NULL);
2215 execute_ipa_pass_list (g->get_passes ()->all_late_ipa_passes);
2216 symtab_remove_unreachable_nodes (true, dump_file);
2217 #ifdef ENABLE_CHECKING
2218 verify_symtab ();
2219 #endif
2220 bitmap_obstack_release (NULL);
2221 mark_functions_to_output ();
2223 /* When weakref support is missing, we automatically translate all
2224 references to NODE to references to its ultimate alias target.
2225 The renaming mechanism uses the flag IDENTIFIER_TRANSPARENT_ALIAS and
2226 TREE_CHAIN.
2228 Set up this mapping before we output any assembler, but once we are sure
2229 that all symbol renaming is done.
2231 FIXME: All this ugliness can go away if we just do renaming at the gimple
2232 level by physically rewriting the IL. At the moment we can only redirect
2233 calls, so we need infrastructure for renaming references as well. */
2234 #ifndef ASM_OUTPUT_WEAKREF
2235 symtab_node *node;
2237 FOR_EACH_SYMBOL (node)
2238 if (node->alias
2239 && lookup_attribute ("weakref", DECL_ATTRIBUTES (node->decl)))
2241 IDENTIFIER_TRANSPARENT_ALIAS
2242 (DECL_ASSEMBLER_NAME (node->decl)) = 1;
2243 TREE_CHAIN (DECL_ASSEMBLER_NAME (node->decl))
2244 = (node->alias_target ? node->alias_target
2245 : DECL_ASSEMBLER_NAME (symtab_alias_target (node)->decl));
2247 #endif
2249 cgraph_state = CGRAPH_STATE_EXPANSION;
2251 if (!flag_toplevel_reorder)
2252 output_in_order ();
2253 else
2255 output_asm_statements ();
2257 expand_all_functions ();
2258 varpool_output_variables ();
2261 cgraph_process_new_functions ();
2262 cgraph_state = CGRAPH_STATE_FINISHED;
2263 output_weakrefs ();
2265 if (cgraph_dump_file)
2267 fprintf (cgraph_dump_file, "\nFinal ");
2268 dump_symtab (cgraph_dump_file);
2270 #ifdef ENABLE_CHECKING
2271 verify_symtab ();
2272 /* Double check that all inline clones are gone and that all
2273 function bodies have been released from memory. */
2274 if (!seen_error ())
2276 struct cgraph_node *node;
2277 bool error_found = false;
2279 FOR_EACH_DEFINED_FUNCTION (node)
2280 if (node->global.inlined_to
2281 || gimple_has_body_p (node->decl))
2283 error_found = true;
2284 dump_cgraph_node (stderr, node);
2286 if (error_found)
2287 internal_error ("nodes with unreleased memory found");
2289 #endif
2293 /* Analyze the whole compilation unit once it is parsed completely. */
2295 void
2296 finalize_compilation_unit (void)
2298 timevar_push (TV_CGRAPH);
2300 /* If we're here there's no current function anymore. Some frontends
2301 are lazy about clearing these. */
2302 current_function_decl = NULL;
2303 set_cfun (NULL);
2305 /* Do not skip analyzing the functions if there were errors; we
2306 would miss diagnostics for the following functions otherwise. */
2308 /* Emit size functions we didn't inline. */
2309 finalize_size_functions ();
2311 /* Mark alias targets necessary and emit diagnostics. */
2312 handle_alias_pairs ();
2314 if (!quiet_flag)
2316 fprintf (stderr, "\nAnalyzing compilation unit\n");
2317 fflush (stderr);
2320 if (flag_dump_passes)
2321 dump_passes ();
2323 /* Gimplify and lower all functions, compute reachability and
2324 remove unreachable nodes. */
2325 analyze_functions ();
2327 /* Mark alias targets necessary and emit diagnostics. */
2328 handle_alias_pairs ();
2330 /* Gimplify and lower thunks. */
2331 analyze_functions ();
2333 /* Finally drive the pass manager. */
2334 compile ();
2336 timevar_pop (TV_CGRAPH);
2340 #include "gt-cgraphunit.h"