Purpose

This tool mentor provides an overview of how to use Rational Quantify to quickly pinpoint performance bottlenecks in Visual C/C++, Visual Basic, and Java programs. To learn more about Quantify, including how to project performance improvements, interpret source-code annotations, compare program runs, and fine-tune data collection, read Getting Ahead with Rational Quantify. For step-by-step information about using Quantify, see the Quantify online Help.

Related Rational Unified Process activity: section "Execute Unit Test" in Activity: Perform Unit Tests.

Overview

Quantify provides a complete, accurate set of performance data for your program and its components, presented in an understandable and usable format so that you can see exactly where your program spends most of its time. To profile a program's performance:

    1. Run a program using Quantify to collect performance data
    2. Use Quantify's data analysis windows and tools to analyze the performance data
    3. Run the program again and use Quantify's Compare Runs tool to find performance changes

Tool Steps

  1. Run a program using Quantify to collect performance data

The first step in profiling your program's performance is to collect performance data.

For Visual C++, instrument and run a program, either directly from Microsoft Developer Studio using the Quantify integration, or from Quantify. Quantify instruments copies of the executable and its associated modules, inserting additional code to collect counted and timed performance data, and shows you its progress as it instruments each file.

For Java, run Java applets, class files, or code launched by container programs from Quantify (using the Run Program dialog) or from the command line. When you profile Java code, Quantify puts the Microsoft virtual machine for Java into a special mode that enables Quantify to monitor the VM's operation and directly collect counted and timed performance data as the applet, class file, or code runs.

For Visual Basic, run Visual Basic projects or p-code programs (Visual Basic 6.0) or Visual Basic native-code programs (Visual Basic 5.0 or later), either directly from Microsoft Visual Basic using the Quantify integration, or from Quantify. When you profile projects or p-code programs, Quantify puts the Visual Basic for Applications (VBA) interpreter engine into a special mode that enables Quantify to monitor the engine's operation and directly collect timed performance data as your code runs. For native-code programs, Quantify instruments the program and then collects counted and timed performance data.

When Quantify starts profiling, it displays the Run Summary window so you can monitor the activity of threads and fibers, and check other information about the run. As you exercise your code, Quantify records data about its performance. You can pause and resume data recording at any time, enabling you to profile specific portions of code. You can also take a snapshot of the current data, enabling you to examine performance in stages.

When you exit your program, Quantify has a complete profile of its performance. Because this base dataset can be very large, Quantify uses filters to automatically filter out non-critical data from system libraries and other modules before it displays the performance profile. As you analyze the performance data, you can display more or less data and detail from the original dataset.

Tip: In addition to using Quantify interactively, you can also use Quantify with your test scripts, makefiles, and batch files for automated testing. For more information, look up scripts in the Quantify online Help index.

    More information? Look up developer studio, visual basic, java, run summary, and recording data in the Quantify online Help index.
     
     

  2. Use Quantify's data analysis windows and tools to analyze the performance data

The second step in profiling your program's performance is to analyze the performance data that Quantify has collected.

When you exit the program for which Quantify has been collecting data, Quantify displays the Call Graph window, graphically depicting the calling structure and performance of the functions, procedures, or methods (collectively referred to here as functions) in the program. By default, the call graph displays the top 20 functions in the current dataset by function + descendants (F+D) time. Quantify's results include virtually no overhead from the profiling process itself. The numbers you see are the time your program would take without Quantify.

The call graph also highlights the most expensive path; thicker lines indicate more expensive paths. You can highlight other functions based on various criteria, including performance, calling relationships, and possible causes for bottlenecks. You can also show additional functions, hide functions, and move functions around to better view the call graph.

You can use Quantify's other data analysis windows to further examine the program's performance. To review all functions in the current dataset, and to sort them by various criteria, use the Function List window. The Function Detail window displays data for a specific function, and data about its callers and descendants, in both tabular and graphical formats. If debug data was available when you ran the program and you measured functions at line level, you can use the Annotated Source window to analyze a specific function's performance line by line.

Quantify provides several ways to reduce large datasets and display only the data you're interested in. For example, you can specify filters to remove functions based on module, pattern (for example, functions with CWnd in their name), or measurement type (for example, all waiting and blocking functions). You can also focus on a specific subtree.

You can easily analyze the performance of the program over several runs by merging the separate runs to create a new dataset.

    More information? Look up call graph window, function list window, function detail window, annotated source window, highlighting functions, filtering data, and subtrees in the Quantify online Help index.

         

  3. Run the program again and use Quantify's Compare Runs tool to find performance changes

The third and final step in profiling your program's performance is to compare performance data from two runs, to see whether code changes have caused the performance to improve or regress.

After you make changes to your code, you can rerun the updated program and compare the new results to a previous run. The Diff Call Graph highlights performance improvements in green and regressions in red, to help you pinpoint performance changes quickly. The Diff Function List displays the differences between the two runs, as well as original data from the two runs.

Use the Navigator window to keep track of all the runs you're working with. You can save performance data as a Quantify data file (.qfy), to use for further analysis or to share with other Quantify users. You can save data to a tab-delimited ASCII text file (.txt) to use outside of Quantify, for example, in test scripts or in Microsoft Excel. You can also copy data directly from the Function List window to use in Excel.

    More information? Look up comparing runs, navigator window, and saving data in the Quantify online Help index.

 

Copyright © 1987 - 2000 Rational Software Corporation
