
In this article by Bayo Erinle, author of JMeter Cookbook, we will cover the following recipes:

  • Using the View Results Tree listener
  • Using the Aggregate Report listener
  • Debugging with Debug Sampler
  • Using Constant Throughput Timer
  • Using the JSR223 postprocessor
  • Analyzing Response Times Over Time
  • Analyzing transactions per second


One of the critical aspects of performance testing is knowing the right tools to use to attain your desired targets. Even when you settle on a tool, it is helpful to understand its features, component sets, and extensions, and appropriately apply them when needed.

In this article, we will go over some helpful components that will aid you in recording robust and realistic test plans while effectively analyzing reported results. We will also cover some components to help you debug test plans.

Using the View Results Tree listener

One of the most frequently used listeners in JMeter is the View Results Tree listener. This listener shows a tree of all sample responses, letting you quickly inspect any sample's response time, response code, response content, and so on. The component offers several ways to view the response data, some of which allow you to debug CSS/jQuery selectors, regular expressions, and XPath queries, among other things. In addition, the component can save responses to a file, in case you need to store them for offline viewing or run other processes on them. Along with the various bundled testers, the component provides a search function that allows you to quickly search responses for relevant content.

How to do it…

In this recipe, we will cover how to add the View Results Tree listener to a test plan and then use its in-built testers to test the response and derive expressions that we can use in postprocessor components. Perform the following steps:

  1. Launch JMeter.
  2. Add Thread Group to the test plan by navigating to Test Plan | Add | Threads (Users) | Thread Group.
  3. Add HTTP Request to the thread group by navigating to Thread Group | Add | Sampler | HTTP Request.
  4. Fill in the following details:
    •    Server Name or IP: dailyjs.com
  5. Add the View Results Tree listener to the test plan by navigating to Test Plan | Add | Listener | View Results Tree.
  6. Save and run the test plan.
  7. Once done, navigate to the View Results Tree component and click on the Response Data tab.
  8. Observe some of the built-in renders.
  9. Switch to the HTML render view by clicking on the dropdown and use the search textbox to search for any word on the page.
  10. Switch to the HTML (download resources) render view by clicking on the dropdown.
  11. Switch to the XML render view by clicking on the dropdown. Notice that the entire HTML DOM structure is presented as XML node elements.
  12. Switch to the RegExp Tester render view by clicking on the dropdown and try out some regular expression queries.
  13. Switch to the XPath Query Tester render view and try out some XPath queries.
  14. Switch to the CSS/jQuery Tester render view and try out some jQuery queries, for example, selecting all links inside divs marked with a class preview (Selector: div.preview a, Attribute: href, CSS/jQuery Implementation: JSOUP).

How it works…

As your test plans execute, the View Results Tree listener reports each sampler in your test plan individually. The Sampler Result tab of the component gives you a summarized view of the request and response, including information such as load time, latency, response headers, body content sizes, response code and message, and so on. The Request tab shows the actual request that the sampler sent, which could be any of the requests the server can fulfill (for example, GET, POST, PUT, DELETE, and so on), along with details of the request headers. Finally, the Response Data tab gives the rendered view of the response received from the server. The component includes several built-in renderers along with tester components (CSS/jQuery, RegExp, and XPath) that let us work out the right expressions or queries to use in postprocessor components within our test plans. This is a huge time saver, as it means we don't have to rerun the same tests repeatedly to nail down such expressions.

There’s more…

As with most things bundled with JMeter, additional view renderers can be added to the View Results Tree component. The defaults included are Document, HTML, HTML (download resources), JSON, Text, and XML. Should any of these not suit your needs, you can create additional ones by implementing the org.apache.jmeter.visualizers.ResultRenderer interface and/or extending the org.apache.jmeter.visualizers.SamplerResultTab abstract class, bundling the compiled classes into a JAR file, and placing it in the $JMETER_HOME/lib/ext directory to make them available to JMeter.

The View Results Tree listener consumes a lot of memory and CPU resources, and should not be used during load testing. Use it only to debug and validate test plans.


Using the Aggregate Report listener

Another frequently used listener in JMeter is the Aggregate Report listener. This listener creates a row for each uniquely named request in the test plan. Each row gives a summarized view of useful information, including the sample count, Average, Median, Min, Max, 90% Line, Error %, Throughput, and KB/sec. The 90% Line column is particularly worth paying close attention to as you execute your tests. This figure, measured in milliseconds, gives you the time within which the majority of threads/users complete a particular request. Higher numbers here are indicative of slow requests and/or components within the application under test.

Equally important is the Error % column, which reports the failure rate of each sampled request. It is reasonable to have some level of failure during test runs, but too high a number indicates errors either in the scripts or in certain components of the application under test. Finally, of interest to stakeholders might be the number of requests per second, which the Throughput column reports. The throughput values are approximate and tell you roughly how many requests per second the server is able to handle.
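To make these columns concrete, the following Python sketch (illustrative only, not JMeter code; JMeter's exact percentile algorithm may differ slightly) computes the 90% Line, Error %, and Throughput figures from a hypothetical set of sampler results:

```python
import math

# Hypothetical sampler results: (elapsed_ms, success) pairs from a 10-second run.
samples = [(120, True), (180, True), (200, True), (250, True), (300, True),
           (310, True), (340, True), (420, True), (600, False), (950, False)]

elapsed = sorted(ms for ms, _ in samples)

# 90% Line (nearest-rank percentile): the response time under which
# 90% of the samples completed, in milliseconds.
idx = math.ceil(0.9 * len(elapsed)) - 1
line_90 = elapsed[idx]

# Error %: the share of failed samples.
error_pct = 100.0 * sum(1 for _, ok in samples if not ok) / len(samples)

# Throughput: requests per second over the test duration.
test_duration_s = 10
throughput = len(samples) / test_duration_s

print(line_90, error_pct, throughput)  # 600 20.0 1.0
```

Here, 90% of requests finished within 600 ms, 20% failed, and the run sustained one request per second.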

How to do it…

In this recipe, we will cover how to add an Aggregate Report listener to a test plan and then see the summarized view of our execution:

  1. Launch JMeter.
  2. Open the ch7_shoutbox.jmx script bundled with the code samples. Alternatively, you can download it from https://github.com/jmeter-cookbook/bundled-code/scripts/ch7/ch7_shoutbox.jmx.
  3. Add the Aggregate Report listener to Thread Group by navigating to Thread Group | Add | Listener | Aggregate Report.
  4. Save and run the test plan.
  5. Observe the real-time summary of results in the listener as the test proceeds.

How it works…

As your test plans execute, the Aggregate Report listener reports each sampler in your test plan on a separate row. Each row is packed with useful information. The Label column reflects the sampler name, # Samples gives the number of samples recorded for each sampler, and Average, Median, Min, and Max give you the respective response times of each sampler. As mentioned earlier, you should pay close attention to the 90% Line and Error % columns. These can help you quickly pinpoint problematic components within the application under test and/or your scripts. The Throughput column gives an idea of the responsiveness of the application under test and/or server. This can also be indicative of the capacity of the underlying server that the application under test runs on. This entire process is demonstrated in the following screenshot:

[Screenshot: Using the Aggregate Report listener]


Debugging with Debug Sampler

Often, while recording a new test plan or modifying an existing one, you will need to debug the scripts to get your desired results. Without the right tools, this becomes a time-consuming exercise in trial and error. Debug Sampler is a nifty little component that generates a sample containing the values of all JMeter variables and properties. The generated values can then be seen in the Response Data tab of the View Results Tree listener, so to use this component, you need a View Results Tree listener in your test plan. Debug Sampler is especially useful when dealing with postprocessor components, as it helps verify that the correct or expected values were extracted during the test run.

How to do it…

In this recipe, we will see how we can use Debug Sampler to debug a postprocessor in our test plans. Perform the following steps:

  1. Launch JMeter.
  2. Open the prerecorded script ch7_debug_sampler.jmx bundled with the book. Alternatively, you can download it from http://git.io/debug_sampler.
  3. Add Debug Sampler to the test Thread Group by navigating to Thread Group | Add | Sampler | Debug Sampler.
  4. Save and run the test.
  5. Navigate to the View Results Tree listener component.
  6. Switch to RegExp Tester by clicking on the dropdown.
  7. Observe the response data of the Get All Requests sampler.
  8. What we want is a regular expression that will help us extract the ID of entries within this response. After a few attempts, we settle on "id":(\d+).
  9. Enable all the currently disabled samplers, that is, Request/Create Holiday Request, Modify Holiday, Get All Requests, and Delete Holiday Request.
  10. You can achieve this by selecting all the disabled components, right-clicking on them, and clicking on Enable.
  11. Add the Regular Expression Extractor postprocessor to the Request/Create Holiday Request sampler by navigating to Request/Create Holiday Request | Add | Post Processors | Regular Expression Extractor.
  12. Fill in the following details:
    •    Reference Name: id
    •    Regular Expression: "id":(\d+)
    •    Template: $1$
    •    Match No.: 0
    •    Default Value: NOT_FOUND
  13. Save and rerun the test.
  14. Observe the ID of the newly created holiday request and whether it was correctly extracted and reported in Debug Sampler.

How it works…

Our goal was to test a REST API endpoint that allows us to list, modify, and delete existing resources or create new ones. When we create a new resource, the identifier (ID) is autogenerated by the server. To perform any other operations on the newly created resource, we need to grab its autogenerated ID, store it in a JMeter variable, and use it further down the execution chain. In step 7, we observed the format of the server response for the resource when we executed the Get All Requests sampler. With the aid of RegExp Tester, we were able to nail down the right regular expression for extracting the ID of a resource, that is, "id":(\d+). Armed with this information, we added a Regular Expression Extractor postprocessor component to the Request/Create Holiday Request sampler and used the derived expression to get the ID of the newly created resource. We then used the stored ID to modify and delete the resource further down the execution chain. After test completion, with the help of Debug Sampler, we were able to verify that the resource ID was properly extracted by the Regular Expression Extractor component and stored in the JMeter variable id.
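The extraction logic itself is easy to verify outside JMeter. This Python sketch (illustrative only; the response body shown is a hypothetical example shaped like the recipe's holiday API) applies the same pattern and falls back to the same default value configured in the extractor:

```python
import re

# A hypothetical response body shaped like the one the holiday API returns.
response = '{"id":42,"title":"Summer leave","status":"PENDING"}'

# The same pattern derived with the RegExp Tester: "id":(\d+)
pattern = re.compile(r'"id":(\d+)')

match = pattern.search(response)
# Fall back to the extractor's configured Default Value when nothing matches.
holiday_id = match.group(1) if match else "NOT_FOUND"

print(holiday_id)  # 42
```

Note that the backslash in \d matters: without it, the pattern would match a literal letter d rather than digits.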

Using Constant Throughput Timer

While running test simulations, it is sometimes necessary to specify the throughput in terms of the number of requests per minute. This is the function of Constant Throughput Timer. The component introduces pauses into the test plan in such a way as to keep the throughput as close as possible to the specified target value. Though the name implies the throughput is constant, various factors affect the behavior, such as server capacity and other timers or time-consuming elements in the test plan. As a result, the actual throughput may fall below the target.

How to do it…

In this recipe, we will add Constant Throughput Timer to our test plan and see how we can specify the expected throughput with it. Perform the following steps:

  1. Launch JMeter.
  2. Open the prerecorded script ch7_constant_throughput.jmx bundled with the book. Alternatively, you can download it from http://git.io/constant_throughput.
  3. Add Constant Throughput Timer to Thread Group by navigating to Thread Group | Add | Timer | Constant Throughput Timer.
  4. Fill in the following details:
    •    Target throughput (in samples per minute): 200
    •    Calculate Throughput based on: this thread only
  5. Save and run the test plan.
  6. Allow the test to run for about 5 minutes.
  7. Observe the results in the Aggregate Report listener as the test is going on.
  8. Stop the test manually as it is currently set to run forever.

How it works…

The goal of the Constant Throughput Timer component is to get your test plan samples as close as possible to a specified desired throughput. It achieves this by introducing variable pauses to the test plan in such a manner that will keep numbers as close as possible to the desired throughput. That said, throughput will be lowered if the server resources of the system under test can’t handle the load. Also, other elements (for example, other timers, the number of specified threads, and so on) within the test plan can affect attaining the desired throughput.
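The arithmetic behind the timer is straightforward. For the settings used in this recipe (200 samples per minute, calculated per thread), each thread has to space its samples roughly as follows; this is an illustrative approximation, not JMeter's exact scheduling algorithm:

```python
target_per_minute = 200.0

# Ideal interval between sample starts for a single thread.
interval_ms = 60_000 / target_per_minute  # 300 ms

# If the sample itself took some time, the timer only needs to
# pause for the remainder of the interval.
sample_elapsed_ms = 120
pause_ms = max(0, interval_ms - sample_elapsed_ms)

print(interval_ms, pause_ms)  # 300.0 180.0
```

This also shows why the target cannot always be met: once a sample itself takes longer than the interval (here, more than 300 ms), no pause can compensate, and the achieved throughput drops below the target.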

In our recipe, we have specified the throughput rate to be calculated based on a single thread, but Constant Throughput Timer also allows throughput to be calculated based on all active threads and all active threads in the current thread group. Each of these settings can be used to alter the behavior of the desired throughput.

As a rule of thumb, avoid using other timers at the same time you use Constant Throughput Timer, since you’ll not achieve the desired throughput.


Using the JSR223 postprocessor

The JSR223 postprocessor allows you to use precompiled scripts within test plans. Because the scripts are compiled before they are actually used, this brings a significant performance boost compared to other scripting postprocessors. The component also supports a variety of programming languages, including Java, Groovy, BeanShell, JEXL, and so on, allowing us to harness the powerful features of those languages within our test plans.

JSR223 components, for example, can serve as preprocessors, postprocessors, or samplers, giving us more control over how values are extracted from responses and stored as JMeter variables.

How to do it…

In this recipe, we will see how to use a JSR223 postprocessor within our test plan. We have chosen Groovy (http://groovy.codehaus.org/) as our scripting language, but any of the other supported languages will do:

  1. Download the standard set of plugins from http://jmeter-plugins.org/.
  2. Install the plugins by doing the following:
    •    Extract the ZIP archive to the location of your chosen directory
    •    Copy the lib folder in the extracted directory into the $JMETER_HOME directory
  3. Download the groovy-all JAR file from http://devbucket-afriq.s3.amazonaws.com/jmeter-cookbook/groovy-all-2.3.3.jar and add it to the $JMETER_HOME/lib directory.
  4. Launch JMeter.
  5. Add Thread Group by navigating to Test Plan | Add | Threads(Users) | Thread Group.
  6. Add Dummy Sampler to Thread Group by navigating to Thread Group | Add | Sampler | jp@gc – Dummy Sampler.
  7. In the Response Data text area, add the following content:
    <records>
       <car name='HSV Maloo' make='Holden' year='2006'>
           <country>Australia</country>
           <record type='speed'>Production Pickup Truck with speed of 271kph</record>
       </car>
       <car name='P50' make='Peel' year='1962'>
           <country>Isle of Man</country>
           <record type='size'>Smallest Street-Legal Car at 99cm wide and 59 kg in weight</record>
       </car>
       <car name='Royale' make='Bugatti' year='1931'>
           <country>France</country>
           <record type='price'>Most Valuable Car at $15 million</record>
       </car>
    </records>
  8. Download the Groovy script file from http://git.io/8jCXMg to any location of your choice. Alternatively, you can get it from the code sample bundle accompanying the book (ch7_jsr223.groovy).
  9. Add JSR223 PostProcessor as a child of Dummy Sampler by navigating to jp@gc – Dummy Sampler | Add | Post Processors | JSR223 PostProcessor.
  10. Select Groovy as the language of choice in the Language drop-down box.
  11. In the File Name textbox, put in the absolute path to where the Groovy script file is, for example, /tmp/scripts/ch7/ch7_jsr223.groovy.
  12. Add the View Results Tree listener to the test plan by navigating to Test Plan | Add | Listener | View Results Tree.
  13. Add Debug Sampler to Thread Group by navigating to Thread Group | Add | Sampler | Debug Sampler.
  14. Save and run the test.
  15. Observe the Response Data tab of Debug Sampler and see how we now have the JMeter variables car_0, car_1, and car_2, all extracted from the Response Data tab and populated by our JSR223 postprocessor component.

How it works…

JMeter exposes certain variables to the JSR223 component, allowing it to get hold of sample details and information, perform logic, and store the results as JMeter variables. The exposed attributes include log, Label, Filename, Parameters, args[], ctx, vars, props, prev, sampler, and OUT. Each of these gives access to important and useful information that can be used during the postprocessing of sampler responses.

The log attribute gives access to a Logger instance (an Apache Commons Logging logger; see http://bit.ly/1xt5dmd), which can be used to write statements to the logfile. The Label and Filename attributes give us access to the sample label and script file name, respectively. The Parameters and args[] attributes give us access to parameters sent to the script. The ctx attribute gives access to the current thread's JMeter context (http://bit.ly/1lM31MC). vars allows us to write values into JMeter variables (http://bit.ly/1o5DDBr), exposing them to the rest of the test plan. The props attribute gives us access to JMeter properties. The sampler attribute gives us access to the current sampler, while OUT allows us to write log statements to standard output, that is, System.out. Finally, the prev attribute gives access to the previous sample result (http://bit.ly/1rKn8Cs), letting us get at useful information such as the response data, headers, assertion results, and so on.

In our script, we made use of the prev and vars attributes. With prev, we were able to get hold of the XML response from the sample. Using Groovy’s XmlSlurper (http://bit.ly/1AoRMnb), we were able to effortlessly process the XML response and compose the interesting bits, storing them as JMeter variables using the vars attribute.
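The bundled Groovy script is not reproduced here, but its core logic can be sketched. The following Python equivalent (illustrative only; the recipe's actual script uses Groovy's XmlSlurper and JMeter's vars object, and what exactly it stores per car is an assumption) parses the same response data and builds the car_0, car_1, and car_2 variables:

```python
import xml.etree.ElementTree as ET

# The same response body entered into Dummy Sampler in step 7.
response_data = """
<records>
   <car name='HSV Maloo' make='Holden' year='2006'>
       <country>Australia</country>
       <record type='speed'>Production Pickup Truck with speed of 271kph</record>
   </car>
   <car name='P50' make='Peel' year='1962'>
       <country>Isle of Man</country>
       <record type='size'>Smallest Street-Legal Car at 99cm wide and 59 kg in weight</record>
   </car>
   <car name='Royale' make='Bugatti' year='1931'>
       <country>France</country>
       <record type='price'>Most Valuable Car at $15 million</record>
   </car>
</records>
"""

# Stand-in for JMeter's vars object.
jmeter_vars = {}

root = ET.fromstring(response_data)
for i, car in enumerate(root.findall("car")):
    # Store each car's name attribute under car_0, car_1, car_2.
    jmeter_vars[f"car_{i}"] = car.get("name")

print(jmeter_vars)
# {'car_0': 'HSV Maloo', 'car_1': 'P50', 'car_2': 'Royale'}
```

In the real postprocessor, the loop body would call vars.put("car_" + i, ...) so the values become visible to Debug Sampler and the rest of the test plan.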

Using this technique, we are able to accomplish tasks that might have otherwise been cumbersome to achieve using any other postprocessor elements we have seen in other recipes. We are able to take full advantage of the language features of any chosen scripting language. In our case, we used Groovy, but any other supported scripting languages you are comfortable with will do as well.


Analyzing Response Times Over Time

An important aspect of performance testing is the response times of the application under test. As such, it is often important to visualize response times over the duration of a test run. Out of the box, JMeter comes with the Response Time Graph listener for this purpose, but it is limited: it lacks the ability to focus on a particular sample when viewing chart results, control the granularity of timeline values, selectively choose which samples appear in the resulting chart, control whether or not to use relative graphs, and so on. To address all this and more, the Response Times Over Time listener extension from the JMeter Plugins project comes to the rescue. It shines in the areas where the Response Time Graph falls short.

How to do it…

In this recipe, we will see how to use the Response Times Over Time listener extension in our test plan and get the response times of our samples over time. Perform the following steps:

  1. Download the standard set of plugins from http://jmeter-plugins.org/.
  2. Install the plugins by doing the following:
    •    Extract the ZIP archive to the location of your chosen directory
    •    Copy the lib folder in the extracted directory into the $JMETER_HOME directory
  3. Launch JMeter.
  4. Open any of your existing prerecorded scripts or record a new one. Alternatively, you can open the ch7_response_times_over_time.jmx script accompanying the book or download it from http://git.io/response_times_over_time.
  5. Add the Response Times Over Time listener to the test plan by navigating to Test Plan | Add | Listener | jp@gc – Response Times Over Time.
  6. Save and execute the test plan.
  7. View the resulting chart in the tab by clicking on the Response Times Over Time component.
  8. Observe the time elapsed on the x axis and the response time in milliseconds on the y axis for all samples contained in the test plan.
  9. Navigate to the Rows tab and exclude some of the samples from the chart by unchecking the selection boxes next to the samples.
  10. Switch back to the Chart tab and observe that the chart now reflects your changes, allowing you to focus on the samples of interest.
  11. Switch to the Settings tab and see all the available configuration options.
  12. Change some options and repeat the test execution. This is shown in the following screenshot:
[Screenshot: Analyzing Response Times Over Time]

How it works…

Just like its name implies, the Response Times Over Time listener extension displays the average response time in milliseconds for each sampler in the test plan. It comes with various configuration options that allow you to customize the resulting graph to your heart’s content. More importantly, it allows you to focus in on specific samples in your test plan, helping you pinpoint potential bottlenecks or problematic modules within the application under test.

For graphs to be more meaningful, it helps to give samples sensible descriptive names and tweak the granularity of the elapsed time to a higher number in the Settings tab if you have long running tests. After test execution, data of any chart can also be exported to a CSV file for further analysis or use as you desire.
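The granularity setting mentioned above controls the width of the time buckets that samples are averaged into before being plotted. This Python sketch (illustrative only, not the plugin's implementation) shows the idea:

```python
from collections import defaultdict

# Hypothetical (timestamp_ms, response_time_ms) pairs from a test run.
samples = [(100, 210), (450, 190), (1100, 340), (1900, 360), (2500, 220)]

granularity_ms = 1000  # one chart point per second

# Group samples by which granularity-sized bucket their timestamp falls in.
buckets = defaultdict(list)
for ts, rt in samples:
    buckets[ts // granularity_ms * granularity_ms].append(rt)

# Each chart point is the average response time within its bucket.
points = {start: sum(rts) / len(rts) for start, rts in sorted(buckets.items())}
print(points)  # {0: 200.0, 1000: 350.0, 2000: 220.0}
```

A larger granularity means fewer, smoother points, which is why raising it helps for long-running tests.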

Any listener that charts results will have some impact on performance and shouldn’t be used during high volume load testing.

Analyzing transactions per second

Sometimes we are tasked with testing backend services, application programming interfaces (APIs), or other components that do not necessarily have a graphical user interface (GUI) attached to them, unlike a classic web application. At such times, the measure of a module's responsiveness is, for example, how many transactions per second it can withstand before slowness is observed. Transactions Per Second (TPS) is useful information for stakeholders who provide services consumed by various third-party components or other services. Good examples include the Google search engine, which can be consumed by third parties, and the Twitter and Facebook APIs, which allow developers to integrate their applications with Twitter and Facebook, respectively.

The Transactions Per Second listener extension component from the JMeter plugins project allows us to measure the transactions per second. It plots a chart of the transactions per second over an elapsed duration of time.

How to do it…

In this recipe, we will see how to use the Transactions Per Second listener extension in our test plan and get the transactions per second for a test API service:

  1. Download the standard set of plugins from http://jmeter-plugins.org/.
  2. Install the plugins by doing the following:
    •    Extract the ZIP archive to the location of your chosen directory
    •    Copy the lib folder in the extracted directory into the $JMETER_HOME directory
  3. Launch JMeter.
  4. Open the ch7_transaction_per_sec.jmx script accompanying the book or download it from http://git.io/trans_per_sec.
  5. Add the Transactions Per Second listener to the test plan by navigating to Test Plan | Add | Listener | jp@gc – Transactions per Second.
  6. Save and execute the test plan.
  7. View the resulting chart in the tab by clicking on the Transactions Per Second component.
  8. Observe the time elapsed on the x axis and the transactions/sec on the y axis for all samples contained in the test plan.
  9. Navigate to the Rows tab and exclude some of the samples from the chart by unchecking the selection boxes next to the samples.
  10. Switch back to the Chart tab and observe that the chart now reflects your changes, allowing you to focus on the samples of interest.
  11. Switch to the Settings tab and see all the available configuration options.
  12. Change some options and repeat the test execution.

How it works…

The Transactions Per Second listener extension displays the transactions per second for each sampler in the test plan by counting the number of successfully completed transactions each second. It comes with various configuration options that allow you to customize the resulting graph. These options let you focus on specific samples of interest in your test plan, helping you pinpoint impending bottlenecks within the application under test.
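Counting transactions per second is a simple grouping operation. Here is a Python sketch of the idea (illustrative only, not the plugin's code); as described above, only successful completions are counted:

```python
from collections import Counter

# Hypothetical (end_timestamp_ms, success) pairs for completed transactions.
results = [(120, True), (480, True), (900, False), (1050, True),
           (1200, True), (1800, True), (2300, True)]

# Group successful completions by the second in which they finished.
tps = Counter(ts // 1000 for ts, ok in results if ok)

print(dict(tps))  # {0: 2, 1: 3, 2: 1}
```

Each key is a one-second bucket of elapsed time and each value is the TPS figure plotted for that second.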

It is helpful to give your samples sensible descriptive names to help make better sense of the resulting graphs and data points. This is shown in the following screenshot:

[Screenshot: Analyzing transactions per second]

Summary

In this article, you learned how to build test plans using the steps in each recipe, as well as how to debug and analyze their results after building them.
