/* Copyright (C) 2013-2015 Free Software Foundation, Inc.

This file is part of GCC.

GCC is free software; you can redistribute it and/or modify it under
the terms of the GNU General Public License as published by the Free
Software Foundation; either version 3, or (at your option) any later
version.

GCC is distributed in the hope that it will be useful, but WITHOUT ANY
WARRANTY; without even the implied warranty of MERCHANTABILITY or
FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License
for more details.

You should have received a copy of the GNU General Public License
along with GCC; see the file COPYING3.  If not see
<http://www.gnu.org/licenses/>.  */

/* Virtual Table Pointer Security Pass - Detect corruption of vtable pointers
   before using them for virtual method dispatches.  */

/* This file is part of the vtable security feature implementation.
   The vtable security feature is designed to detect when a virtual
   call is about to be made through an invalid vtable pointer
   (possibly due to data corruption or malicious attacks).  The
   compiler finds every virtual call, and inserts a verification call
   before the virtual call.  The verification call takes the actual
   vtable pointer value in the object through which the virtual call
   is being made, and compares the vtable pointer against a set of all
   valid vtable pointers that the object could contain (this set is
   based on the declared type of the object).  If the pointer is in
   the valid set, execution is allowed to continue; otherwise the
   program is halted.

   There are several pieces needed in order to make this work:

   1.  For every virtual class in the program (i.e. a class that
       contains virtual methods), we need to build the set of all
       possible valid vtables that an object of that class could point
       to.  This includes vtables for any class(es) that inherit from
       the class under consideration.

   2.  For every such data set we build up, we need a way to find and
       reference the data set.  This is complicated by the fact that
       the real vtable addresses are not known until runtime, when the
       program is loaded into memory, but we need to reference the
       sets at compile time when we are inserting verification calls
       into the program.

   3.  We need to find every virtual call in the program, and insert
       the verification call (with the appropriate arguments) before
       the virtual call.

   4.  We need some runtime library pieces: the code to build up the
       data sets at runtime; the code to actually perform the
       verification using the data sets; and some code to set
       protections on the data sets, so they themselves do not become
       hacker targets.

   To find and reference the set of valid vtable pointers for any given
   virtual class, we create a special global variable for each virtual
   class.  We refer to this as the "vtable map variable" for that
   class.  The vtable map variable has the type "void *", and is
   initialized by the compiler to NULL.  At runtime when the set of
   valid vtable pointers for a virtual class, e.g. class Foo, is built,
   the vtable map variable for class Foo is made to point to the set.
   During compile time, when the compiler is inserting verification
   calls into the program, it passes the vtable map variable for the
   appropriate class to the verification call, so that at runtime the
   verification call can find the appropriate data set.
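
   As a minimal illustrative sketch (the variable name below is
   invented; the compiler actually emits a mangled name), the vtable
   map variable generated for a class Foo is conceptually just

     void *_vtable_map_Foo = NULL;

   which is later made to point, at runtime, to Foo's set of valid
   vtable pointers.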

   The actual set of valid vtable pointers for a virtual class,
   e.g. class Foo, cannot be built until runtime, when the vtables get
   loaded into memory and their addresses are known.  But the knowledge
   about which vtables belong in which class' hierarchy is only known
   at compile time.  Therefore at compile time we collect class
   hierarchy and vtable information about every virtual class, and we
   generate calls to build up the data sets at runtime.  To build the
   data sets, we call one of the functions we add to the runtime
   library, __VLTRegisterPair.  __VLTRegisterPair takes two arguments,
   a vtable map variable and the address of a vtable.  If the vtable
   map variable is currently NULL, it creates a new data set (hash
   table), makes the vtable map variable point to the new data set, and
   inserts the vtable address into the data set.  If the vtable map
   variable is not NULL, it just inserts the vtable address into the
   data set.  In order to make sure that our data sets are built before
   any verification calls happen, we create a special constructor
   initialization function for each compilation unit, give it a very
   high initialization priority, and insert all of our calls to
   __VLTRegisterPair into our special constructor initialization
   function.
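
   As an illustrative sketch only (the function and variable names here
   are invented; the real calls are generated by the C++ front end with
   mangled names), the constructor initialization function for a
   compilation unit defining classes Base and Derived : public Base
   conceptually does

     static void
     __VLT_constructor_init (void)
     {
       __VLTRegisterPair (&_vtable_map_Base, &_ZTV4Base);
       __VLTRegisterPair (&_vtable_map_Base, &_ZTV7Derived);
       __VLTRegisterPair (&_vtable_map_Derived, &_ZTV7Derived);
     }

   i.e. the data set for Base contains both Base's and Derived's
   vtables (since a Base pointer may legitimately point to a Derived
   object), while the data set for Derived contains only Derived's
   vtable.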

   The vtable verification feature is controlled by the flag
   '-fvtable-verify='.  There are three flavors of this:
   '-fvtable-verify=std', '-fvtable-verify=preinit', and
   '-fvtable-verify=none'.  If the option '-fvtable-verify=preinit' is
   used, then our constructor initialization function gets put into the
   preinit array.  This is necessary if there are data sets that need
   to be built very early in execution.  If the constructor
   initialization function gets put into the preinit array, then we
   also add calls to __VLTChangePermission at the beginning and end of
   the function.  The call at the beginning sets the permissions on the
   data sets and vtable map variables to read/write, and the one at the
   end makes them read-only.  If the '-fvtable-verify=std' option is
   used, the constructor initialization functions are executed at their
   normal time, and the __VLTChangePermission calls are handled
   differently (see the comments in libstdc++-v3/libsupc++/vtv_rts.cc).
   The option '-fvtable-verify=none' turns off vtable verification.

   This file contains code for the tree pass that goes through all the
   statements in each basic block, looking for virtual calls, and
   inserting a call to __VLTVerifyVtablePointer (with appropriate
   arguments) before each one.  It also contains the hash table
   functions for the data structures used for collecting the class
   hierarchy data and for building/maintaining the vtable map variable
   data; those data structures are defined in gcc/vtable-verify.h and
   are shared with the code in the C++ front end that collects the
   class hierarchy & vtable information and generates the vtable map
   variables (see cp/vtable-class-hierarchy.c).  This tree pass should
   run just before the gimple is converted to RTL.

   Some implementation details for this pass:

   To find all of the virtual calls, we iterate through all the
   gimple statements in each basic block, looking for any call
   statement with the code "OBJ_TYPE_REF".  Once we have found the
   virtual call, we need to find the vtable pointer through which the
   call is being made, and the type of the object containing the
   pointer (to find the appropriate vtable map variable).  We then use
   these to build a call to __VLTVerifyVtablePointer, passing the
   vtable map variable, and the vtable pointer.  We insert the
   verification call just after the gimple statement that gets the
   vtable pointer out of the object, and we update the next
   statement to depend on the result returned from
   __VLTVerifyVtablePointer (the vtable pointer value), to ensure
   subsequent compiler phases don't remove or reorder the call (it's no
   good to have the verification occur after the virtual call, for
   example).  To find the vtable pointer being used (and the type of
   the object) we search backwards through the def_stmts chain from the
   virtual call (see verify_bb_vtables for more details).  */
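
/* As a purely illustrative sketch of the transformation this pass performs
   (the SSA names, the vtable slot offset and the vtable map variable name
   below are invented, and the gimple is only approximate), a virtual
   dispatch through an object whose declared type is Event, roughly

     _4 = MEM[(struct Event *)this_1(D)]._vptr.Event;
     _5 = *(_4 + 16);
     OBJ_TYPE_REF(_5;(struct Event *)this_1(D)->2) (this_1(D));

   conceptually becomes

     _4 = MEM[(struct Event *)this_1(D)]._vptr.Event;
     VTV_6 = __VLTVerifyVtablePointer (&_vtable_map_Event, _4);
     _5 = *(VTV_6 + 16);
     OBJ_TYPE_REF(_5;(struct Event *)this_1(D)->2) (this_1(D));

   i.e. the verification call is inserted immediately after the statement
   that loads the vptr, and later uses of the raw vptr are rewritten to
   use the verified value returned by __VLTVerifyVtablePointer.  */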

#include "config.h"
#include "system.h"
#include "coretypes.h"
#include "alias.h"
#include "backend.h"
#include "tree.h"
#include "gimple.h"
#include "hard-reg-set.h"
#include "ssa.h"
#include "options.h"
#include "fold-const.h"
#include "internal-fn.h"
#include "gimple-iterator.h"
#include "tree-pass.h"
#include "cfgloop.h"

#include "vtable-verify.h"

unsigned num_vtable_map_nodes = 0;
int total_num_virtual_calls = 0;
int total_num_verified_vcalls = 0;

extern GTY(()) tree verify_vtbl_ptr_fndecl;
tree verify_vtbl_ptr_fndecl = NULL_TREE;

/* Keep track of whether or not any virtual calls were verified.  */
static bool any_verification_calls_generated = false;

unsigned int vtable_verify_main (void);

/* The following few functions are for the vtbl pointer hash table
   in the 'registered' field of the struct vtable_map_node.  The hash
   table keeps track of which vtable pointers have been used in
   calls to __VLTRegisterPair with that particular vtable map variable.  */

/* This function checks to see if a particular VTABLE_DECL and OFFSET are
   already in the 'registered' hash table for NODE.  */

bool
vtbl_map_node_registration_find (struct vtbl_map_node *node,
                                 tree vtable_decl,
                                 unsigned offset)
{
  struct vtable_registration key;
  struct vtable_registration **slot;

  gcc_assert (node && node->registered);

  key.vtable_decl = vtable_decl;
  slot = node->registered->find_slot (&key, NO_INSERT);

  if (slot && (*slot))
    {
      unsigned i;
      for (i = 0; i < ((*slot)->offsets).length (); ++i)
        if ((*slot)->offsets[i] == offset)
          return true;
    }

  return false;
}

/* This function inserts VTABLE_DECL and OFFSET into the 'registered'
   hash table for NODE.  It returns a boolean indicating whether or not
   it actually inserted anything.  */

bool
vtbl_map_node_registration_insert (struct vtbl_map_node *node,
                                   tree vtable_decl,
                                   unsigned offset)
{
  struct vtable_registration key;
  struct vtable_registration **slot;
  bool inserted_something = false;

  if (!node || !node->registered)
    return false;

  key.vtable_decl = vtable_decl;
  slot = node->registered->find_slot (&key, INSERT);

  if (! *slot)
    {
      struct vtable_registration *node;
      node = XNEW (struct vtable_registration);
      node->vtable_decl = vtable_decl;

      (node->offsets).create (10);
      (node->offsets).safe_push (offset);
      *slot = node;
      inserted_something = true;
    }
  else
    {
      /* We found the vtable_decl slot; we need to see if it already
         contains the offset.  If not, we need to add the offset.  */
      unsigned i;
      bool found = false;
      for (i = 0; i < ((*slot)->offsets).length () && !found; ++i)
        if ((*slot)->offsets[i] == offset)
          found = true;

      if (!found)
        {
          ((*slot)->offsets).safe_push (offset);
          inserted_something = true;
        }
    }
  return inserted_something;
}

/* Hashtable functions for vtable_registration hashtables.  */

inline hashval_t
registration_hasher::hash (const vtable_registration *p)
{
  const struct vtable_registration *n = (const struct vtable_registration *) p;
  return (hashval_t) (DECL_UID (n->vtable_decl));
}

inline bool
registration_hasher::equal (const vtable_registration *p1,
                            const vtable_registration *p2)
{
  const struct vtable_registration *n1 =
                                    (const struct vtable_registration *) p1;
  const struct vtable_registration *n2 =
                                    (const struct vtable_registration *) p2;
  return (DECL_UID (n1->vtable_decl) == DECL_UID (n2->vtable_decl));
}

/* End of hashtable functions for "registered" hashtables.  */

/* Hashtable definition and functions for vtbl_map_hash.  */

struct vtbl_map_hasher : nofree_ptr_hash <struct vtbl_map_node>
{
  static inline hashval_t hash (const vtbl_map_node *);
  static inline bool equal (const vtbl_map_node *, const vtbl_map_node *);
};

/* Returns a hash code for P.  */

inline hashval_t
vtbl_map_hasher::hash (const vtbl_map_node *p)
{
  const struct vtbl_map_node n = *((const struct vtbl_map_node *) p);
  return (hashval_t) IDENTIFIER_HASH_VALUE (n.class_name);
}

/* Returns nonzero if P1 and P2 are equal.  */

inline bool
vtbl_map_hasher::equal (const vtbl_map_node *p1, const vtbl_map_node *p2)
{
  const struct vtbl_map_node n1 = *((const struct vtbl_map_node *) p1);
  const struct vtbl_map_node n2 = *((const struct vtbl_map_node *) p2);
  return (IDENTIFIER_HASH_VALUE (n1.class_name) ==
          IDENTIFIER_HASH_VALUE (n2.class_name));
}

/* Here are the two structures into which we insert vtable map nodes.
   We use two data structures because of the vastly different ways we need
   to find the nodes for various tasks (see the comments in vtable-verify.h
   for more details).  */

typedef hash_table<vtbl_map_hasher> vtbl_map_table_type;
typedef vtbl_map_table_type::iterator vtbl_map_iterator_type;

/* Vtable map variable nodes stored in a hash table.  */
static vtbl_map_table_type *vtbl_map_hash;

/* Vtable map variable nodes stored in a vector.  */
vec<struct vtbl_map_node *> vtbl_map_nodes_vec;

/* Vector of mangled names for anonymous classes.  */
extern GTY(()) vec<tree, va_gc> *vtbl_mangled_name_types;
extern GTY(()) vec<tree, va_gc> *vtbl_mangled_name_ids;
vec<tree, va_gc> *vtbl_mangled_name_types;
vec<tree, va_gc> *vtbl_mangled_name_ids;

/* Look up CLASS_TYPE (a type decl for record types) in the
   vtbl_mangled_name_* vectors.  This is a linear lookup.  Return the
   associated mangled name for the class type.  This is for handling types
   from anonymous namespaces, whose DECL_ASSEMBLER_NAME ends up being
   "<anon>", which is useless for our purposes.

   We use two vectors of trees to keep track of the mangled names: One is a
   vector of class types and the other is a vector of the mangled names.
   The assumption is that these two vectors are kept in perfect lock-step
   so that vtbl_mangled_name_ids[i] is the mangled name for
   vtbl_mangled_name_types[i].  */

static tree
vtbl_find_mangled_name (tree class_type)
{
  tree result = NULL_TREE;
  unsigned i;

  if (!vtbl_mangled_name_types || !vtbl_mangled_name_ids)
    return result;

  if (vtbl_mangled_name_types->length() != vtbl_mangled_name_ids->length())
    return result;

  for (i = 0; i < vtbl_mangled_name_types->length(); ++i)
    if ((*vtbl_mangled_name_types)[i] == class_type)
      {
        result = (*vtbl_mangled_name_ids)[i];
        break;
      }

  return result;
}

/* Store a class type decl and its mangled name, for an anonymous RECORD_TYPE,
   in the vtbl_mangled_names vectors.  Make sure there is not already an
   entry for the class type before adding it.  */

void
vtbl_register_mangled_name (tree class_type, tree mangled_name)
{
  if (!vtbl_mangled_name_types)
    vec_alloc (vtbl_mangled_name_types, 10);

  if (!vtbl_mangled_name_ids)
    vec_alloc (vtbl_mangled_name_ids, 10);

  gcc_assert (vtbl_mangled_name_types->length() ==
              vtbl_mangled_name_ids->length());

  if (vtbl_find_mangled_name (class_type) == NULL_TREE)
    {
      vec_safe_push (vtbl_mangled_name_types, class_type);
      vec_safe_push (vtbl_mangled_name_ids, mangled_name);
    }
}

/* Return the vtbl_map node for CLASS_TYPE without creating a new one.  */

struct vtbl_map_node *
vtbl_map_get_node (tree class_type)
{
  struct vtbl_map_node key;
  struct vtbl_map_node **slot;

  tree class_type_decl;
  tree class_name;
  unsigned int type_quals;

  if (!vtbl_map_hash)
    return NULL;

  gcc_assert (TREE_CODE (class_type) == RECORD_TYPE);

  /* Find the TYPE_DECL for the class.  */
  class_type_decl = TYPE_NAME (class_type);

  /* Verify that there aren't any qualifiers on the type.  */
  type_quals = TYPE_QUALS (TREE_TYPE (class_type_decl));
  gcc_assert (type_quals == TYPE_UNQUALIFIED);

  /* Get the mangled name for the unqualified type.  */
  gcc_assert (HAS_DECL_ASSEMBLER_NAME_P (class_type_decl));
  class_name = DECL_ASSEMBLER_NAME (class_type_decl);

  if (strstr (IDENTIFIER_POINTER (class_name), "<anon>") != NULL)
    class_name = vtbl_find_mangled_name (class_type_decl);

  key.class_name = class_name;
  slot = (struct vtbl_map_node **) vtbl_map_hash->find_slot (&key, NO_INSERT);
  if (!slot)
    return NULL;

  return *slot;
}

/* Return the vtbl_map node assigned to BASE_CLASS_TYPE.  Create a new one
   when needed.  */

struct vtbl_map_node *
find_or_create_vtbl_map_node (tree base_class_type)
{
  struct vtbl_map_node key;
  struct vtbl_map_node *node;
  struct vtbl_map_node **slot;
  tree class_type_decl;
  unsigned int type_quals;

  if (!vtbl_map_hash)
    vtbl_map_hash = new vtbl_map_table_type (10);

  /* Find the TYPE_DECL for the class.  */
  class_type_decl = TYPE_NAME (base_class_type);

  /* Verify that there aren't any type qualifiers on the type.  */
  type_quals = TYPE_QUALS (TREE_TYPE (class_type_decl));
  gcc_assert (type_quals == TYPE_UNQUALIFIED);

  gcc_assert (HAS_DECL_ASSEMBLER_NAME_P (class_type_decl));
  key.class_name = DECL_ASSEMBLER_NAME (class_type_decl);

  if (strstr (IDENTIFIER_POINTER (key.class_name), "<anon>") != NULL)
    key.class_name = vtbl_find_mangled_name (class_type_decl);

  slot = (struct vtbl_map_node **) vtbl_map_hash->find_slot (&key, INSERT);

  if (*slot)
    return *slot;

  node = XNEW (struct vtbl_map_node);
  node->vtbl_map_decl = NULL_TREE;
  node->class_name = key.class_name;
  node->uid = num_vtable_map_nodes++;

  node->class_info = XNEW (struct vtv_graph_node);
  node->class_info->class_type = base_class_type;
  node->class_info->class_uid = node->uid;
  node->class_info->num_processed_children = 0;

  (node->class_info->parents).create (4);
  (node->class_info->children).create (4);

  node->registered = new register_table_type (16);

  node->is_used = false;

  vtbl_map_nodes_vec.safe_push (node);
  gcc_assert (vtbl_map_nodes_vec[node->uid] == node);

  *slot = node;
  return node;
}

/* End of hashtable functions for the vtable_map variables hash table.  */

/* Given a gimple STMT, this function checks to see if the statement
   is an assignment, the rhs of which is getting the vtable pointer
   value out of an object.  (I.e. it's the value we need to verify,
   because it's the vtable pointer that will be used for a virtual
   call.)  */

static bool
is_vtable_assignment_stmt (gimple *stmt)
{
  if (gimple_code (stmt) != GIMPLE_ASSIGN)
    return false;
  else
    {
      tree lhs = gimple_assign_lhs (stmt);
      tree rhs = gimple_assign_rhs1 (stmt);

      if (TREE_CODE (lhs) != SSA_NAME)
        return false;

      if (TREE_CODE (rhs) != COMPONENT_REF)
        return false;

      if (! (TREE_OPERAND (rhs, 1))
          || (TREE_CODE (TREE_OPERAND (rhs, 1)) != FIELD_DECL))
        return false;

      if (! DECL_VIRTUAL_P (TREE_OPERAND (rhs, 1)))
        return false;
    }

  return true;
}

/* This function attempts to recover the declared class of an object
   that is used in making a virtual call.  We try to get the type from
   the type cast in the gimple assignment statement that extracts the
   vtable pointer from the object (DEF_STMT).  The gimple statement
   usually looks something like this:

   D.2201_4 = MEM[(struct Event *)this_1(D)]._vptr.Event */

static tree
extract_object_class_type (tree rhs)
{
  tree result = NULL_TREE;

  /* Try to find and extract the type cast from that stmt.  */
  if (TREE_CODE (rhs) == COMPONENT_REF)
    {
      tree op0 = TREE_OPERAND (rhs, 0);
      tree op1 = TREE_OPERAND (rhs, 1);

      if (TREE_CODE (op1) == FIELD_DECL
          && DECL_VIRTUAL_P (op1))
        {
          if (TREE_CODE (op0) == COMPONENT_REF
              && TREE_CODE (TREE_OPERAND (op0, 0)) == MEM_REF
              && TREE_CODE (TREE_TYPE (TREE_OPERAND (op0, 0))) == RECORD_TYPE)
            result = TREE_TYPE (TREE_OPERAND (op0, 0));
          else
            result = TREE_TYPE (op0);
        }
      else if (TREE_CODE (op0) == COMPONENT_REF)
        {
          result = extract_object_class_type (op0);
          if (result == NULL_TREE
              && TREE_CODE (op1) == COMPONENT_REF)
            result = extract_object_class_type (op1);
        }
    }

  return result;
}

/* This function traces forward through the def-use chain of an SSA
   variable to see if it ever gets used in a virtual function call.  It
   returns a boolean indicating whether or not it found a virtual call in
   the use chain.  */

static bool
var_is_used_for_virtual_call_p (tree lhs, int *mem_ref_depth,
                                int *recursion_depth)
{
  imm_use_iterator imm_iter;
  bool found_vcall = false;
  use_operand_p use_p;

  if (TREE_CODE (lhs) != SSA_NAME)
    return false;

  if (*mem_ref_depth > 2)
    return false;

  if (*recursion_depth > 25)
    /* If we've recursed this far the chances are pretty good that
       we're not going to find what we're looking for, and that we've
       gone down a recursion black hole.  Time to stop.  */
    return false;

  *recursion_depth = *recursion_depth + 1;

  /* Iterate through the immediate uses of the current variable.  If
     it's a virtual function call, we're done.  Otherwise, if there's
     an LHS for the use stmt, add the ssa var to the work list
     (assuming it's not already in the list and is not a variable
     we've already examined).  */

  FOR_EACH_IMM_USE_FAST (use_p, imm_iter, lhs)
    {
      gimple *stmt2 = USE_STMT (use_p);

      if (is_gimple_call (stmt2))
        {
          tree fncall = gimple_call_fn (stmt2);
          if (fncall && TREE_CODE (fncall) == OBJ_TYPE_REF)
            found_vcall = true;
          else
            return false;
        }
      else if (gimple_code (stmt2) == GIMPLE_PHI)
        {
          found_vcall = var_is_used_for_virtual_call_p
                                                   (gimple_phi_result (stmt2),
                                                    mem_ref_depth,
                                                    recursion_depth);
        }
      else if (is_gimple_assign (stmt2))
        {
          tree rhs = gimple_assign_rhs1 (stmt2);
          if (TREE_CODE (rhs) == ADDR_EXPR
              || TREE_CODE (rhs) == MEM_REF)
            *mem_ref_depth = *mem_ref_depth + 1;

          if (TREE_CODE (rhs) == COMPONENT_REF)
            {
              while (TREE_CODE (TREE_OPERAND (rhs, 0)) == COMPONENT_REF)
                rhs = TREE_OPERAND (rhs, 0);

              if (TREE_CODE (TREE_OPERAND (rhs, 0)) == ADDR_EXPR
                  || TREE_CODE (TREE_OPERAND (rhs, 0)) == MEM_REF)
                *mem_ref_depth = *mem_ref_depth + 1;
            }

          if (*mem_ref_depth < 3)
            found_vcall = var_is_used_for_virtual_call_p
                                                    (gimple_assign_lhs (stmt2),
                                                     mem_ref_depth,
                                                     recursion_depth);
        }
      else
        break;

      if (found_vcall)
        return true;
    }

  return false;
}

/* Search through all the statements in a basic block (BB), searching
   for virtual method calls.  For each virtual method dispatch, find
   the vptr value used, and the statically declared type of the
   object; retrieve the vtable map variable for the type of the
   object; generate a call to __VLTVerifyVtablePointer; and insert the
   generated call into the basic block, after the point where the vptr
   value is gotten out of the object and before the virtual method
   dispatch.  Make the virtual method dispatch depend on the return
   value from the verification call, so that subsequent optimizations
   cannot reorder the two calls.  */

static void
verify_bb_vtables (basic_block bb)
{
  gimple_seq stmts;
  gimple *stmt = NULL;
  gimple_stmt_iterator gsi_vtbl_assign;
  gimple_stmt_iterator gsi_virtual_call;

  stmts = bb_seq (bb);
  gsi_virtual_call = gsi_start (stmts);
  for (; !gsi_end_p (gsi_virtual_call); gsi_next (&gsi_virtual_call))
    {
      stmt = gsi_stmt (gsi_virtual_call);

      /* Count virtual calls.  */
      if (is_gimple_call (stmt))
        {
          tree fncall = gimple_call_fn (stmt);
          if (fncall && TREE_CODE (fncall) == OBJ_TYPE_REF)
            total_num_virtual_calls++;
        }

      if (is_vtable_assignment_stmt (stmt))
        {
          tree lhs = gimple_assign_lhs (stmt);
          tree vtbl_var_decl = NULL_TREE;
          struct vtbl_map_node *vtable_map_node;
          tree vtbl_decl = NULL_TREE;
          gcall *call_stmt;
          const char *vtable_name = "<unknown>";
          tree tmp0;
          bool found;
          int mem_ref_depth = 0;
          int recursion_depth = 0;

          /* Make sure this vptr field access is for a virtual call.  */
          if (!var_is_used_for_virtual_call_p (lhs, &mem_ref_depth,
                                               &recursion_depth))
            continue;

          /* Now we have found the virtual method dispatch and
             the preceding access of the _vptr.* field... Next
             we need to find the statically declared type of
             the object, so we can find and use the right
             vtable map variable in the verification call.  */
          tree class_type = extract_object_class_type
                                                  (gimple_assign_rhs1 (stmt));

          gsi_vtbl_assign = gsi_for_stmt (stmt);

          if (class_type
              && (TREE_CODE (class_type) == RECORD_TYPE)
              && TYPE_BINFO (class_type))
            {
              /* Get the vtable VAR_DECL for the type.  */
              vtbl_var_decl = BINFO_VTABLE (TYPE_BINFO (class_type));

              if (TREE_CODE (vtbl_var_decl) == POINTER_PLUS_EXPR)
                vtbl_var_decl = TREE_OPERAND (TREE_OPERAND (vtbl_var_decl, 0),
                                              0);

              gcc_assert (vtbl_var_decl);

              vtbl_decl = vtbl_var_decl;
              vtable_map_node = vtbl_map_get_node
                                             (TYPE_MAIN_VARIANT (class_type));

              gcc_assert (verify_vtbl_ptr_fndecl);

              /* Given the vtable pointer for the base class of the
                 object, build the call to __VLTVerifyVtablePointer to
                 verify that the object's vtable pointer (contained in
                 lhs) is in the set of valid vtable pointers for the
                 base class.  */

              if (vtable_map_node && vtable_map_node->vtbl_map_decl)
                {
                  vtable_map_node->is_used = true;
                  vtbl_var_decl = vtable_map_node->vtbl_map_decl;

                  if (TREE_CODE (vtbl_decl) == VAR_DECL)
                    vtable_name = IDENTIFIER_POINTER (DECL_NAME (vtbl_decl));

                  /* Call different routines if we are interested in
                     trace information to debug problems.  */
                  if (flag_vtv_debug)
                    {
                      int len1 = IDENTIFIER_LENGTH
                                                  (DECL_NAME (vtbl_var_decl));
                      int len2 = strlen (vtable_name);

                      call_stmt = gimple_build_call
                                     (verify_vtbl_ptr_fndecl, 4,
                                      build1 (ADDR_EXPR,
                                              TYPE_POINTER_TO
                                                (TREE_TYPE (vtbl_var_decl)),
                                              vtbl_var_decl),
                                      lhs,
                                      build_string_literal
                                                  (len1 + 1,
                                                   IDENTIFIER_POINTER
                                                       (DECL_NAME
                                                            (vtbl_var_decl))),
                                      build_string_literal (len2 + 1,
                                                            vtable_name));
                    }
                  else
                    call_stmt = gimple_build_call
                                     (verify_vtbl_ptr_fndecl, 2,
                                      build1 (ADDR_EXPR,
                                              TYPE_POINTER_TO
                                                (TREE_TYPE (vtbl_var_decl)),
                                              vtbl_var_decl),
                                      lhs);

                  /* Create a new SSA_NAME var to hold the call's
                     return value, and make the call_stmt use the
                     variable for that purpose.  */
                  tmp0 = make_temp_ssa_name (TREE_TYPE (lhs), NULL, "VTV");
                  gimple_call_set_lhs (call_stmt, tmp0);
                  update_stmt (call_stmt);

                  /* Replace all uses of lhs with tmp0.  */
                  found = false;
                  imm_use_iterator iterator;
                  gimple *use_stmt;
                  FOR_EACH_IMM_USE_STMT (use_stmt, iterator, lhs)
                    {
                      use_operand_p use_p;
                      if (use_stmt == call_stmt)
                        continue;
                      FOR_EACH_IMM_USE_ON_STMT (use_p, iterator)
                        SET_USE (use_p, tmp0);
                      update_stmt (use_stmt);
                      found = true;
                    }
                  gcc_assert (found);

                  /* Insert the new verification call just after the
                     statement that gets the vtable pointer out of the
                     object.  */
                  gcc_assert (gsi_stmt (gsi_vtbl_assign) == stmt);
                  gsi_insert_after (&gsi_vtbl_assign, call_stmt,
                                    GSI_NEW_STMT);

                  any_verification_calls_generated = true;
                  total_num_verified_vcalls++;
                }
            }
        }
    }
}

/* Definition of this optimization pass.  */

namespace {

const pass_data pass_data_vtable_verify =
{
  GIMPLE_PASS, /* type */
  "vtable-verify", /* name */
  OPTGROUP_NONE, /* optinfo_flags */
  TV_VTABLE_VERIFICATION, /* tv_id */
  ( PROP_cfg | PROP_ssa ), /* properties_required */
  0, /* properties_provided */
  0, /* properties_destroyed */
  0, /* todo_flags_start */
  TODO_update_ssa, /* todo_flags_finish */
};

class pass_vtable_verify : public gimple_opt_pass
{
public:
  pass_vtable_verify (gcc::context *ctxt)
    : gimple_opt_pass (pass_data_vtable_verify, ctxt)
  {}

  /* opt_pass methods: */
  virtual bool gate (function *) { return (flag_vtable_verify); }
  virtual unsigned int execute (function *);

}; // class pass_vtable_verify

/* Loop through all the basic blocks in the current function, passing them to
   verify_bb_vtables, which searches for virtual calls, and inserts
   calls to __VLTVerifyVtablePointer.  */

unsigned int
pass_vtable_verify::execute (function *fun)
{
  unsigned int ret = 1;
  basic_block bb;

  FOR_ALL_BB_FN (bb, fun)
    verify_bb_vtables (bb);

  return ret;
}

} // anon namespace

gimple_opt_pass *
make_pass_vtable_verify (gcc::context *ctxt)
{
  return new pass_vtable_verify (ctxt);
}

#include "gt-vtable-verify.h"