<chapter id="testing">
  <title>Writing Conformance tests</title>

  <sect1 id="testing-intro">
    <title>Introduction</title>
    <para>
      The Windows API follows no formal standard: it is itself a de facto
      standard, and deviations from that standard, even small ones, often
      cause applications to crash or misbehave in some way.
    </para>
    <para>
      The question becomes, "How do we ensure compliance with that standard?"
      The answer is, "By using the API documentation available to us and
      backing that up with conformance tests." Furthermore, a conformance
      test suite is the most accurate (if not necessarily the most complete)
      form of API documentation and can be used to supplement the Windows
      API documentation.
    </para>
    <para>
      Writing a conformance test suite for more than 10000 APIs is no small
      undertaking. Fortunately it can prove very useful to the development
      of Wine well before it is complete.
      <itemizedlist>
        <listitem>
          <para>
            The conformance test suite must run on Windows. This is
            necessary to provide a reasonable way to verify its accuracy.
            Furthermore the tests must pass successfully on all Windows
            platforms (tests not relevant to a given platform should be
            skipped).
          </para>
          <para>
            A consequence of this is that the test suite will provide a
            great way to detect variations in the API between different
            Windows versions. For instance, this can provide insights
            into the often undocumented differences between the Win9x and
            NT Windows families.
          </para>
          <para>
            However, one must remember that the goal of Wine is to run
            Windows applications on Linux, not to be a clone of any specific
            Windows version. So such variations must only be tested for when
            relevant to that goal.
          </para>
        </listitem>
        <listitem>
          <para>
            Writing conformance tests is also an easy way to discover
            bugs in Wine. Of course, before fixing the bugs discovered in
            this way, one must first make sure that the new tests do pass
            successfully on at least one Windows 9x and one Windows NT
            version.
          </para>
          <para>
            Bugs discovered this way should also be easier to fix. Unlike
            some mysterious application crashes, when a conformance test
            fails the expected behavior and the APIs being tested are known,
            which greatly simplifies the diagnosis.
          </para>
        </listitem>
        <listitem>
          <para>
            To detect regressions. Simply running the test suite regularly
            in Wine turns it into a great tool to detect regressions.
            When a test fails, one immediately knows what the expected
            behavior was and which APIs are involved. Thus regressions caught
            this way should be detected earlier, because it is easy to run
            all tests on a regular basis, and should be easier to fix because
            of the reduced diagnosis work.
          </para>
        </listitem>
        <listitem>
          <para>
            Tests written in advance of the Wine development (possibly even
            by non Wine developers) can also simplify the work of the
            future implementer by making it easier for him to check the
            correctness of his code.
          </para>
        </listitem>
        <listitem>
          <para>
            Conformance tests will also come in handy when testing Wine on
            new (or not as widely used) architectures such as FreeBSD,
            Solaris x86 or even non-x86 systems. Even when the port does
            not involve any significant change in the thread management,
            exception handling or other low-level aspects of Wine, new
            architectures can expose subtle bugs that can be hard to
            diagnose when debugging regular (complex) applications.
          </para>
        </listitem>
      </itemizedlist>
    </para>
  </sect1>

  <sect1 id="testing-what">
    <title>What to test for?</title>
    <para>
      The first thing to test for is the documented behavior of APIs
      such as CreateFile. For instance one can create a file using a
      long pathname, check that the behavior is correct when the file
      already exists, try to open the file using the corresponding short
      pathname, convert the filename to Unicode and try to open it using
      CreateFileW, and all other things which are documented and that
      applications rely on.
    </para>
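    <para>
      For illustration, here is a minimal sketch of what such a check could
      look like (the file name is arbitrary, and as always the expected error
      codes should be verified on Windows first):
      <screen>
HANDLE hFile;

/* Creating a file that does not exist yet should succeed */
hFile = CreateFileA( "winetest_example.tmp", GENERIC_WRITE, 0, NULL,
                     CREATE_NEW, FILE_ATTRIBUTE_NORMAL, NULL );
ok( hFile != INVALID_HANDLE_VALUE, "CreateFileA failed, error=%ld\n", GetLastError() );
CloseHandle( hFile );

/* ...while CREATE_NEW on an existing file should fail with ERROR_FILE_EXISTS */
SetLastError( 0xdeadbeef );
hFile = CreateFileA( "winetest_example.tmp", GENERIC_WRITE, 0, NULL,
                     CREATE_NEW, FILE_ATTRIBUTE_NORMAL, NULL );
ok( hFile == INVALID_HANDLE_VALUE && GetLastError() == ERROR_FILE_EXISTS,
    "expected ERROR_FILE_EXISTS, got %ld\n", GetLastError() );
DeleteFileA( "winetest_example.tmp" );
      </screen>
    </para>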
    <para>
      While the testing framework is not specifically geared towards this
      type of test, it is also possible to test the behavior of Windows
      messages. To do so, create a window, preferably a hidden one so that
      it does not steal the focus when running the tests, and send messages
      to that window or to controls in that window. Then, in the message
      procedure, check that you receive the expected messages and with the
      correct parameters.
    </para>
    <para>
      For instance you could create an edit control and use WM_SETTEXT to
      set its contents, possibly check length restrictions, and verify the
      results using WM_GETTEXT. Similarly one could create a listbox and
      check the effect of LB_DELETESTRING on the list's number of items,
      selected items list, highlighted item, etc. For concrete examples,
      see <filename>dlls/user/tests/win.c</> and the related tests.
    </para>
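    <para>
      As a rough sketch (assuming the standard "EDIT" window class; the text
      and sizes are arbitrary), such a check could look like:
      <screen>
char buffer[16];
HWND hwndEdit;
LRESULT len;

hwndEdit = CreateWindowA( "EDIT", NULL, WS_POPUP, 0, 0, 100, 20,
                          NULL, NULL, GetModuleHandleA(NULL), NULL );
ok( hwndEdit != NULL, "failed to create the edit control, error=%ld\n", GetLastError() );

/* Set the text, then read it back and compare */
SendMessageA( hwndEdit, WM_SETTEXT, 0, (LPARAM)"hello" );
len = SendMessageA( hwndEdit, WM_GETTEXT, sizeof(buffer), (LPARAM)buffer );
ok( len == 5 && !lstrcmpA( buffer, "hello" ),
    "got len=%d text=\"%s\"\n", (int)len, buffer );

DestroyWindow( hwndEdit );
      </screen>
    </para>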
    <para>
      However, undocumented behavior should not be tested for unless an
      application relies on it, in which case the test should mention that
      application, or unless applications can reasonably be expected to rely
      on it, as is typically the case for APIs that return the required
      buffer size when the buffer pointer is NULL.
    </para>
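    <para>
      A sketch of such a check, using GetCurrentDirectoryA purely as an
      illustration of this buffer-size convention (for this particular API
      the behavior is in fact documented):
      <screen>
DWORD size;

/* When called with a zero-sized NULL buffer, the function returns the
 * required buffer size, including the terminating null character.
 */
size = GetCurrentDirectoryA( 0, NULL );
ok( size != 0, "GetCurrentDirectoryA returned %ld\n", size );
      </screen>
    </para>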
  </sect1>

  <sect1 id="testing-wine">
    <title>Running the tests in Wine</title>
    <para>
      The simplest way to run the tests in Wine is to type 'make test' in
      the top-level directory of the Wine sources. This will run all the
      Wine conformance tests.
    </para>
    <para>
      The tests for a specific Wine library are located in a 'tests'
      directory in that library's directory. Each test is contained in a
      file (e.g. <filename>dlls/kernel/tests/thread.c</>). Each
      file itself contains many checks concerning one or more related APIs.
    </para>
    <para>
      So to run all the tests related to a given Wine library, go to the
      corresponding 'tests' directory and type 'make test'. This will
      compile the tests, run them, and create an '<replaceable>xxx</>.ok'
      file for each test that passes successfully. If you only want to
      run the tests contained in the <filename>thread.c</> file of the
      kernel library, you would do:
      <screen>
<prompt>$ </>cd dlls/kernel/tests
<prompt>$ </>make thread.ok
      </screen>
    </para>
    <para>
      Note that if the test has already been run and is up to date (i.e. if
      neither the kernel library nor the <filename>thread.c</> file has
      changed since the <filename>thread.ok</> file was created), then make
      will say so. To force the test to be re-run, delete the
      <filename>thread.ok</> file, and run the make command again.
    </para>
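    <para>
      For instance, to force the thread test to be re-run:
      <screen>
<prompt>$ </>rm thread.ok
<prompt>$ </>make thread.ok
      </screen>
    </para>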
    <para>
      You can also run tests manually using a command similar to the
      following:
      <screen>
<prompt>$ </>../../../tools/runtest -q -M kernel32.dll -p kernel32_test.exe.so thread.c
<prompt>$ </>../../../tools/runtest -P wine -p kernel32_test.exe.so thread.c
thread.c: 86 tests executed, 5 marked as todo, 0 failures.
      </screen>
      The '-P wine' option defines the platform that is currently being
      tested and is used in conjunction with the 'todo' statements (see
      below). Remove the '-q' option if you want the testing framework
      to report statistics about the number of successful and failed tests.
      Run <command>runtest -h</> for more details.
    </para>
  </sect1>

  <sect1 id="cross-compiling-tests">
    <title>Cross-compiling the tests with MinGW</title>
    <sect2>
      <title>Setup of the MinGW cross-compiling environment</title>
      <para>
        Here are some instructions to set up MinGW on different Linux
        distributions and *BSD.
      </para>
      <sect3>
        <title>Debian GNU/Linux</title>
        <para>
          On Debian do <command>apt-get install mingw32</>.
        </para>
        <para>
          The standard MinGW libraries will probably be incomplete, causing
          'undefined symbol' errors. So get the latest
          <ulink url="http://mirzam.it.vu.nl/mingw/">mingw-w32api RPM</>
          and use <command>alien</> to either convert it to a .tar.gz file
          from which to extract just the relevant files, or to convert it
          to a Debian package that you will install.
        </para>
      </sect3>
      <sect3>
        <title>Red Hat Linux and other rpm-based systems</title>
        <para>
          This includes Fedora Core, Red Hat Enterprise Linux, Mandrake,
          most probably SuSE Linux too, etc. The list isn't exhaustive;
          the following steps should probably work on any rpm-based system.
        </para>
        <para>
          Download and install the latest rpm's from the
          <ulink url="http://mirzam.it.vu.nl/mingw/">MinGW RPM packages</>
          page. Alternatively you can follow the instructions on that page
          and build your own packages from the source rpm's listed there as
          well.
        </para>
      </sect3>
      <sect3>
        <title>*BSD</title>
        <para>
          The *BSD systems have in their ports collection a port for the
          MinGW cross-compiling environment. Please see the documentation
          of your system about how to build and install a port.
        </para>
      </sect3>
    </sect2>
    <sect2>
      <title>Compiling the tests</title>
      <para>
        Once the cross-compiling environment is set up, generating the
        Windows executables is easy using the Wine build system.
      </para>
      <para>
        If you had already run <command>configure</>, then delete
        <filename>config.cache</> and re-run <command>configure</>.
        You can then run <command>make crosstest</>. To sum up:
        <screen>
<prompt>$ </><userinput>rm config.cache</>
<prompt>$ </><userinput>./configure</>
<prompt>$ </><userinput>make crosstest</>
        </screen>
      </para>
    </sect2>
  </sect1>

  <sect1 id="testing-windows">
    <title>Building and running the tests on Windows</title>
    <sect2>
      <title>Using pre-compiled binaries</title>
      <para>
        The simplest solution is to download the
        <ulink url="http://www.astro.gla.ac.uk/users/paulm/WRT/CrossBuilt/winetest-latest.exe">latest
        version of winetest</>. This executable contains all the Wine
        conformance tests, runs them and reports the results.
      </para>
      <para>
        You can also get the older versions from
        <ulink url="http://www.astro.gla.ac.uk/users/paulm/WRT/CrossBuilt/">Paul
        Millar's website</>.
      </para>
    </sect2>
    <sect2>
      <title>With Visual C++</title>
      <itemizedlist>
        <listitem><para>
          If you are using Visual Studio 6, make sure you have the
          "processor pack" from
          <ulink url="http://msdn.microsoft.com/vstudio/downloads/tools/ppack/default.aspx">http://msdn.microsoft.com/vstudio/downloads/tools/ppack/default.aspx</>.
          The processor pack fixes <emphasis>"error C2520: conversion from
          unsigned __int64 to double not implemented, use signed __int64"</>.
          However note that the "processor pack" is incompatible with
          Visual Studio 6.0 Standard Edition, and with the Visual Studio 6
          Service Pack 6. If you are using Visual Studio 7 or greater you
          do not need the processor pack. In either case it is recommended
          to install the most recent compatible Visual Studio
          <ulink url="http://msdn.microsoft.com/vstudio/downloads/updates/sp/">service pack</>.
        </para></listitem>
        <listitem><para>
          Get the Wine sources.
        </para></listitem>
        <listitem><para>
          Run msvcmaker to generate Visual C++ project files for the tests.
          'msvcmaker' is a perl script so you may be able to run it on
          Windows.
          <screen>
<prompt>$ </>./tools/winapi/msvcmaker --no-wine
          </screen>
        </para></listitem>
        <listitem><para>
          If the previous steps were done on your Linux development
          machine, make the Wine sources accessible to the Windows machine
          on which you are going to compile them. Typically you would do
          this using Samba, but copying them over would work too.
        </para></listitem>
        <listitem><para>
          On the Windows machine, open the <filename>winetest.dsw</>
          workspace. This will load each test's project. For each test there
          are two configurations: one compiles the test with the Wine
          headers, and the other uses the Microsoft headers.
        </para></listitem>
        <listitem><para>
          Most of the tests will not compile with the regular Visual Studio
          headers, so to use the "Win32 MSVC Headers" configuration, download
          and install a recent
          <ulink url="http://www.microsoft.com/msdownload/platformsdk/sdkupdate/">Platform SDK</>
          (the download requires Internet Explorer!), as well as the latest
          <ulink url="http://msdn.microsoft.com/library/default.asp?url=/downloads/list/directx.asp">DirectX SDK</>.
          Then, <ulink url="http://msdn.microsoft.com/library/default.asp?url=/library/EN-US/sdkintro/sdkintro/installing_the_platform_sdk_with_visual_studio.asp">configure Visual Studio</>
          to use the headers and libraries of these SDKs. Alternatively you
          could go to the <menuchoice><guimenu>Project</> <guimenu>Settings...</></>
          menu and modify the settings appropriately, but you would then
          have to redo this whenever you rerun msvcmaker.
        </para></listitem>
        <listitem><para>
          Open the <menuchoice><guimenu>Build</> <guimenu>Batch
          build...</></> menu and select the tests and build configurations
          you want to build. Then click on <guibutton>Build</>.
        </para></listitem>
        <listitem><para>
          To run a specific test from Visual C++, go to
          <menuchoice><guimenu>Project</> <guimenu>Settings...</></>. There
          select that test's project and build configuration and go to the
          <guilabel>Debug</> tab. There type the name of the specific test
          to run (e.g. 'thread') in the <guilabel>Program arguments</>
          field. Validate your change by clicking on <guibutton>Ok</> and
          start the test by clicking the red exclamation mark (or hitting
          'F5' or any other usual method).
        </para></listitem>
        <listitem><para>
          You can also run the tests from the command line. You will find
          them in either <filename>Output\Win32_Wine_Headers</> or
          <filename>Output\Win32_MSVC_Headers</> depending on the build
          method. So to run the kernel 'path' tests you would do:
          <screen>
<prompt>C:\&gt;</>cd dlls\kernel\tests\Output\Win32_MSVC_Headers
<prompt>C:\wine\dlls\kernel\tests\Output\Win32_MSVC_Headers&gt;</> kernel32_test path
          </screen>
        </para></listitem>
      </itemizedlist>
    </sect2>
    <sect2>
      <title>With MinGW</title>
      <para>
        Wine's build system already has support for building tests with a
        MinGW cross-compiler. See the section above called 'Setup of the
        MinGW cross-compiling environment' for instructions on how to set
        things up. When you have a MinGW environment installed, all you need
        to do is rerun configure and it should detect the MinGW compiler and
        tools. Then run 'make crosstest' to start building the tests.
      </para>
    </sect2>
  </sect1>

  <sect1 id="testing-test">
    <title>Inside a test</title>

    <para>
      When writing new checks you can either modify an existing test file or
      add a new one. If your tests are related to the tests performed by an
      existing file, then add them to that file. Otherwise create a new .c
      file in the tests directory and add that file to the
      <varname>CTESTS</> variable in <filename>Makefile.in</>.
    </para>
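    <para>
      For instance, if the new file were called <filename>paths.c</>, the
      <varname>CTESTS</> variable in <filename>Makefile.in</> would be
      extended along these lines (the other file names are just placeholders):
      <screen>
CTESTS = \
	process.c \
	thread.c \
	paths.c
      </screen>
    </para>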
    <para>
      A new test file will look something like the following:
      <screen>
#include &lt;wine/test.h&gt;
#include &lt;winbase.h&gt;

/* Maybe auxiliary functions and definitions here */

START_TEST(paths)
{
   /* Write your checks there or put them in functions you will call from
    * there
    */
}
      </screen>
    </para>
    <para>
      The test's entry point is the START_TEST section. This is where
      execution will start. You can put all your tests in that section but
      it may be better to split related checks into functions you will call
      from the START_TEST section. The parameter to START_TEST must match
      the name of the C file. So in the above example the C file would be
      called <filename>paths.c</>.
    </para>
    <para>
      Tests should start by including the <filename>wine/test.h</> header.
      This header will provide you access to all the testing framework
      functions. You can then include the Windows headers you need, but make
      sure not to include any Unix or Wine specific header: tests must
      compile on Windows.
    </para>
    <para>
      You can use <function>trace</> to print informational messages. Note
      that these messages will only be printed if 'runtest -v' is being used.
      <screen>
trace("testing GlobalAddAtomA\n");
trace("foo=%d\n",foo);
      </screen>
    </para>
    <para>
      Then just call functions and use <function>ok</> to make sure that
      they behaved as expected:
      <screen>
ATOM atom = GlobalAddAtomA( "foobar" );
ok( GlobalFindAtomA( "foobar" ) == atom, "could not find atom foobar\n" );
ok( GlobalFindAtomA( "FOOBAR" ) == atom, "could not find atom FOOBAR\n" );
      </screen>
      The first parameter of <function>ok</> is an expression which must
      evaluate to true if the test was successful. The next parameter is a
      printf-compatible format string which is displayed in case the test
      failed, and the following optional parameters depend on the format
      string.
    </para>
  </sect1>

  <sect1 id="testing-error-messages">
    <title>Writing good error messages</title>
    <para>
      The message that is printed when a test fails is
      <emphasis>extremely</> important.
    </para>
    <para>
      Someone will take your test, run it on a Windows platform that
      you don't have access to, and discover that it fails. They will then
      post an email with the output of the test, and in particular your
      error message. Someone, maybe you, will then have to figure out from
      this error message why the test failed.
    </para>
    <para>
      If the error message contains all the relevant information, that will
      be easy. If not, then it will require modifying the test, finding
      someone to compile it on Windows, sending the modified version to the
      original tester and waiting for his reply. In other words, it will
      be long and painful.
    </para>
    <para>
      So how do you write a good error message? Let's start with an example
      of a bad error message:
      <screen>
ok(GetThreadPriorityBoost(curthread,&amp;disabled)!=0,
   "GetThreadPriorityBoost Failed\n");
      </screen>
      This will yield:
      <screen>
thread.c:123: Test failed: GetThreadPriorityBoost Failed
      </screen>
    </para>
    <para>
      Did you notice how the error message provides no information about
      why the test failed? We already know from the line number exactly
      which test failed. In fact the error message gives no information
      that cannot already be obtained by reading the code. In other words,
      it provides no more information than an empty string!
    </para>
    <para>
      Let's look at how to rewrite it:
      <screen>
BOOL rc;

rc=GetThreadPriorityBoost(curthread,&amp;disabled);
ok(rc!=0 && disabled==0,"rc=%d error=%ld disabled=%d\n",
   rc,GetLastError(),disabled);
      </screen>
      This will yield:
      <screen>
thread.c:123: Test failed: rc=0 error=120 disabled=0
      </screen>
    </para>
    <para>
      When receiving such a message, one can check the source and see that
      it is a call to GetThreadPriorityBoost, and that the test failed not
      because the API returned the wrong value, but because it returned an
      error code. Furthermore we see that GetLastError() returned 120, which
      winerror.h defines as ERROR_CALL_NOT_IMPLEMENTED. So the source of
      the problem is obvious: this Windows platform (here Windows 98) does
      not support this API, and thus the test must be modified to detect
      such a condition and skip this check.
    </para>
    <para>
      So a good error message should provide all the information which
      cannot be obtained by reading the source, typically the function
      return value, error codes, and any function output parameter. Even if
      more information is needed to fully understand a problem,
      systematically providing the above is easy and will help cut down the
      number of iterations required to get to a resolution.
    </para>
    <para>
      It may also be a good idea to dump items that may be hard to retrieve
      from the source, like the expected value in a test if it is the
      result of an earlier computation, or comes from a large array of test
      values (e.g. index 112 of _pTestStrA in vartest.c). In that respect,
      for some tests you may want to define a macro such as the following:
      <screen>
#define eq(received, expected, label, type) \
        ok((received) == (expected), "%s: got " type " instead of " type "\n", (label),(received),(expected))

eq( b, curr_val, "SPI_{GET,SET}BEEP", "%d" );
      </screen>
    </para>
  </sect1>

  <sect1 id="testing-platforms">
    <title>Handling platform issues</title>
    <para>
      Some checks may be written before they pass successfully in Wine.
      Without some mechanism, such checks would potentially generate
      hundreds of known failures for months each time the tests are run.
      This would make it hard to detect new failures caused by a regression,
      or to detect that a patch fixed a long-standing issue.
    </para>
    <para>
      Thus the Wine testing framework has the concept of platforms, and
      groups of checks can be declared as expected to fail on some of them.
      In the most common case, one would declare a group of tests as
      expected to fail in Wine. To do so, use the following construct:
      <screen>
todo_wine {
    SetLastError( 0xdeadbeef );
    ok( GlobalAddAtomA(0) == 0 && GetLastError() == 0xdeadbeef, "failed to add atom 0\n" );
}
      </screen>
      On Windows the above check would be performed normally, but on Wine it
      would be expected to fail, and not cause the failure of the whole
      test. However, if that check were to succeed in Wine, it would
      cause the test to fail, thus making it easy to detect when something
      has changed that fixes a bug. Also note that todo checks are accounted
      for separately from regular checks so that the testing statistics remain
      meaningful. Finally, note that todo sections can be nested so that if
      a test only fails on the cygwin and reactos platforms, one would
      write:
      <screen>
todo("cygwin") {
    todo("reactos") {
        ...
    }
}
      </screen>
      <!-- FIXME: Would we really have platforms such as reactos, cygwin, freebsd & co? -->
      But specific platforms should not be nested inside a todo_wine section
      since that would be redundant.
    </para>
    <para>
      When writing tests you will also encounter differences between Windows
      9x and Windows NT platforms. Such differences should be treated
      differently from the platform issues mentioned above. In particular
      you should remember that the goal of Wine is not to be a clone of any
      specific Windows version but to run Windows applications on Unix.
    </para>
    <para>
      So, if an API returns a different error code on Windows 9x and
      Windows NT, your check should just verify that Wine returns one or
      the other:
      <screen>
ok ( GetLastError() == WIN9X_ERROR || GetLastError() == NT_ERROR, ...);
      </screen>
    </para>
    <para>
      If an API is only present on some Windows platforms, then use
      LoadLibrary and GetProcAddress to check if it is implemented and
      invoke it. Remember, tests must run on all Windows platforms.
      Similarly, conformance tests should not try to correlate the Windows
      version returned by GetVersion with whether given APIs are
      implemented or not. Again, the goal of Wine is to run Windows
      applications (which do not do such checks), and not be a clone of a
      specific Windows version.
    </para>
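    <para>
      A minimal sketch of this technique, using OpenThread (which is missing
      on older Windows versions) as an example:
      <screen>
typedef HANDLE (WINAPI *OpenThread_t)( DWORD, BOOL, DWORD );
OpenThread_t pOpenThread;
HMODULE hkernel32 = LoadLibraryA( "kernel32.dll" );

pOpenThread = (OpenThread_t)GetProcAddress( hkernel32, "OpenThread" );
if (pOpenThread)
{
    HANDLE handle = pOpenThread( THREAD_QUERY_INFORMATION, FALSE, GetCurrentThreadId() );
    ok( handle != NULL, "OpenThread failed, error=%ld\n", GetLastError() );
    if (handle) CloseHandle( handle );
}
else
    trace( "OpenThread is not available on this platform\n" );
      </screen>
    </para>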
    <!--para>
      FIXME: What about checks that cause the process to crash due to a bug?
    </para-->
  </sect1>

  <!-- FIXME: Strategies for testing threads, testing network stuff,
       file handling... -->

</chapter>

<!-- Keep this comment at the end of the file
Local variables:
mode: sgml
sgml-parent-document:("wine-devel.sgml" "set" "book" "part" "chapter" "")
End:
-->