This sampler lets you send an HTTP/HTTPS request to a web server. It also lets you control whether or not JMeter parses HTML files for images and other embedded resources and sends HTTP requests to retrieve them. The following types of embedded resource are retrieved:
JMeter defaults to the SSL protocol level TLS. If the server needs a different level, e.g. SSLv3, change the JMeter property, for example: https.default.protocol=SSLv3. JMeter also allows one to enable additional protocols, by changing the property https.socket.protocols. By default, since version 5.0, the SSL context is retained during a Thread Group iteration and reset for each test iteration (httpclient.reset_state_on_thread_group_iteration=true). If in your test plan the same user iterates multiple times, then you should set this property to false. The property https.sessioncontext.shared=true makes all threads share a single SSL context instead of using one per thread.

If the request uses cookies, then you will also need an HTTP Cookie Manager. You can add either of these elements to the Thread Group or the HTTP Request. If you have more than one HTTP Request that needs authorizations or cookies, then add the elements to the Thread Group. That way, all HTTP Request controllers will share the same Authorization Manager and Cookie Manager elements. If the request uses a technique called "URL Rewriting" to maintain sessions, then see section 6.1 Handling User Sessions With URL Rewriting for additional configuration steps.
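For convenience, these properties can be collected in user.properties. The snippet below is only an illustrative sketch using the property names mentioned above; the values shown are examples, not recommendations:

# Change the SSL protocol level (example value)
https.default.protocol=SSLv3
# Enable additional protocols (example value)
https.socket.protocols=SSLv3 TLSv1.2
# Keep the SSL state when the same user iterates multiple times
httpclient.reset_state_on_thread_group_iteration=false
# Share a single SSL context between all threads
https.sessioncontext.shared=true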
"Redirect requested but followRedirects is disabled"This can be ignored. JMeter will collapse paths of the form ' /../segment ' in both absolute and relative redirect URLs. For example http://host/one/../two will be collapsed into http://host/two . If necessary, this behaviour can be suppressed by setting the JMeter property httpsampler.redirect.removeslashdotdot=false
http.java.sampler.retries=3

Note: Certificates does not conform to algorithm constraints

You may encounter the error java.security.cert.CertificateException: Certificates does not conform to algorithm constraints if you run an HTTPS request against a web site whose SSL certificate (itself or one of the SSL certificates in its chain of trust) uses a signature algorithm based on MD2 (like md2WithRSAEncryption) or has a size lower than 1024 bits. This error is related to increased security in Java 8. To allow your HTTPS request to be performed, you can downgrade the security of your Java installation by editing the Java jdk.certpath.disabledAlgorithms property. Remove the MD2 value or the constraint on size, depending on your case. This property is in this file:
JAVA_HOME/jre/lib/security/java.security

See 56357 for details.
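For illustration only - the exact default value of jdk.certpath.disabledAlgorithms differs between Java versions - removing the MD2 constraint might look like this (abbreviated):

# Before (abbreviated example; your file will list more entries)
jdk.certpath.disabledAlgorithms=MD2, RSA keySize < 1024
# After removing the MD2 entry
jdk.certpath.disabledAlgorithms=RSA keySize < 1024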
"Dbl-Quote: "" and Comma: ,"There must be as many values as there are placeholders in the statement even if your parameters are OUT ones. Be sure to set a value even if the value will not be used (for example in a CallableStatement).
columnValue = vars.getObject("resultObject").get(0).get("Column Name");
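For example, assuming the JDBC Request's Result Variable Name was set to resultObject, a JSR223/BeanShell script could walk the whole result set like this (a sketch; each row is a Map keyed by the column names your query returns):

// Iterate over all rows returned by a JDBC Request whose
// "Result Variable Name" is resultObject.
java.util.List rows = (java.util.List) vars.getObject("resultObject");
for (int i = 0; i < rows.size(); i++) {
    java.util.Map row = (java.util.Map) rows.get(i);
    log.info("Row " + i + ", Column Name = " + row.get("Column Name"));
}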
totalSleepTime = SleepTime + (System.currentTimeMillis() % SleepMask)
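As a quick illustration of the formula (the values are purely made up): with SleepTime = 100 and SleepMask = 0xFF, the total pause ends up between 100 and 354 ms.

// Illustrative only - reproduces the calculation described above.
long sleepTime = 100;     // fixed part of the pause, in ms
long sleepMask = 0xFF;    // currentTimeMillis() % 0xFF adds 0-254 ms of jitter
long totalSleepTime = sleepTime + (System.currentTimeMillis() % sleepMask);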
This will add a pre-defined entry in the LDAP Server and calculate the execution time. After execution of the test, the created entry will be deleted from the LDAP Server.
This will add an entry to the LDAP Server. The user has to enter all the attributes in the table; the entries are collected from the table and added. The execution time is calculated. The created entry will not be deleted after the test.
This will create a pre-defined entry first, then modify the created entry in the LDAP Server, and calculate the execution time. After execution of the test, the created entry will be deleted from the LDAP Server.
User defined test: This will modify the entry in the LDAP Server. The user has to enter all the attributes in the table. The entries are collected from the table to modify. The execution time is calculated. The entry will not be deleted from the LDAP Server.
This will create the entry first, then search for it to check whether the attributes are available. It calculates the execution time of the search query. At the end of the execution, the created entry will be deleted from the LDAP Server.
This will search for the user-defined entry (Search filter) in the Search base (again, defined by the user). The entries should be available in the LDAP Server. The execution time is calculated.
User defined test: This will delete the user-defined entry in the LDAP Server. The entries should be available in the LDAP Server. The execution time is calculated.
Domain name or IP address of the LDAP server. JMeter assumes the LDAP server is listening on the default port (389).
Port to connect to (default is 389).

AccessLogSampler was designed to read access logs and generate HTTP requests. For those not familiar with the access log, it is the log the webserver maintains of every request it accepted. This means every image, CSS file, JavaScript file, HTML file, …
Tomcat uses the common log format for access logs, which means any webserver that uses the common log format can be used with the AccessLogSampler. Servers that use the common log format include Tomcat, Resin, WebLogic, and SunONE. The common log format looks like this:
127.0.0.1 - - [21/Oct/2003:05:37:21 -0500] "GET /index.jsp?%2Findex.jsp= HTTP/1.1" 200 8343

The current implementation of the parser only looks at the text within the quotes that contains one of the HTTP protocol methods (GET, PUT, POST, DELETE, …). Everything else is stripped out and ignored. For example, the response code is completely ignored by the parser. For the future, it might be nice to filter out entries that do not have a response code of 200.

Extending the sampler should be fairly simple. There are two interfaces you have to implement: LogParser and Generator. The current implementation of AccessLogSampler uses the generator to create a new HTTPSampler. The server name, port and get images are set by AccessLogSampler. Next, the parser is called with integer 1, telling it to parse one entry. After that, HTTPSampler.sample() is called to make the request.

samp = (HTTPSampler) GENERATOR.generateRequest();
samp.setDomain(this.getDomain());
samp.setPort(this.getPort());
samp.setImageParser(this.isImageParser());
PARSER.parse(1);
res = samp.sample();
res.setSampleLabel(samp.toString());

Classes implementing the LogParser or Generator interfaces should provide concrete implementations for all of their methods. For an example of how to implement either interface, refer to StandardGenerator and TCLogParser.

The TCLogParser processes the access log independently for each thread. The SharedTCLogParser and OrderPreservingLogParser share access to the file, i.e. each thread gets the next entry in the log.

The SessionFilter is intended to handle cookies across threads. It does not filter out any entries, but modifies the cookie manager so that the cookies for a given IP are processed by a single thread at a time. If two threads try to process samples from the same client IP address, then one will be forced to wait until the other has completed.

The LogFilter is intended to allow access log entries to be filtered by filename and regex, as well as allowing for the replacement of file extensions. However, it is not currently possible to configure this via the GUI, so it cannot really be used.

The test element supports the ThreadListener and TestListener interface methods. These must be defined in the initialisation file. See the file BeanShellListeners.bshrc for example definitions.
JMeter processes function and variable references before passing the script field to the interpreter, so the references will only be resolved once. Variable and function references in script files will be passed verbatim to the interpreter, which is likely to cause a syntax error. In order to use runtime variables, please use the appropriate props methods, e.g. props.get("START.HMS"); props.put("PROP1","1234"); BeanShell does not currently support Java 5 syntax such as generics and the enhanced for loop.
Before invoking the script, some variables are set up in the BeanShell interpreter:
The contents of the Parameters field are put into the variable "Parameters". The string is also split into separate tokens using a single space as the separator, and the resulting list is stored in the String array bsh.args. The full list of BeanShell variables that is set up is as follows:
log - the Logger
Label - the Sampler label
FileName - the file name, if any
Parameters - text from the Parameters field
bsh.args - the parameters, split as described above
SampleResult - pointer to the current SampleResult
ResponseCode - defaults to 200
ResponseMessage - defaults to "OK"
IsSuccess - defaults to true
ctx - JMeterContext
vars - JMeterVariables - e.g. vars.get("VAR1"); vars.put("VAR2","value"); vars.remove("VAR3"); vars.putObject("OBJ1",new Object());
props - JMeterProperties (class java.util.Properties) - e.g.
props.get("START.HMS"); props.put("PROP1","1234");

When the script completes, control is returned to the Sampler, and it copies the contents of the script variables ResponseCode, ResponseMessage and IsSuccess into the corresponding variables in the SampleResult.

The SampleResult ResponseData is set from the return value of the script. If the script returns null, it can set the response directly, by using the method SampleResult.setResponseData(data), where data is either a String or a byte array. The data type defaults to "text", but can be set to binary by using the method SampleResult.setDataType(SampleResult.BINARY).

The SampleResult variable gives the script full access to all the fields and methods in the SampleResult. For example, the script has access to the methods setStopThread(boolean) and setStopTest(boolean). Here is a simple (not very useful!) example script:

if (bsh.args[0].equalsIgnoreCase("StopThread")) {
    log.info("Stop Thread detected!");
    SampleResult.setStopThread(true);
}
return "Data from sample with Label "+Label;
// or, to set the response directly:
// SampleResult.setResponseData("My data");
// return null;

Another example: ensure that the property beanshell.sampler.init=BeanShellSampler.bshrc is defined in jmeter.properties. The following script will show the values of all the variables in the ResponseData field:

return getVariables();

For details on the methods available for the various classes (JMeterVariables, SampleResult etc.) please check the Javadoc or the source code. Beware however that misuse of any methods can cause subtle faults that may be difficult to find. If you don't want to generate a SampleResult when this sampler is run, call the following method:
SampleResult.setIgnore();

This call will have the following impact:
props.get("START.HMS"); props.put("PROP1","1234");

Languages other than those that appear in the drop-down list may also be supported if the appropriate jar is installed in the JMeter lib directory. Note that some languages, such as Velocity, use a different syntax for JSR223 variables, for example:

$log.debug("Hello " + $vars.get("a"));

for Velocity.
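For comparison, the same statement in a Java-syntax JSR223 language such as Groovy uses the bindings directly, without the $ prefix:

log.debug("Hello " + vars.get("a"));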
Before invoking the script, some variables are set up. Note that these are JSR223 variables - i.e. they can be used directly in the script.

The SampleResult ResponseData is set from the return value of the script. If the script returns null, it can set the response directly, by using the method SampleResult.setResponseData(data), where data is either a String or a byte array. The data type defaults to "text", but can be set to binary by using the method SampleResult.setDataType(SampleResult.BINARY).

The SampleResult variable gives the script full access to all the fields and methods in the SampleResult. For example, the script has access to the methods setStopThread(boolean) and setStopTest(boolean). Unlike the BeanShell Sampler, the JSR223 Sampler does not set the ResponseCode, ResponseMessage and sample status via script variables. Currently the only way to change these is via the SampleResult methods.

The TCP Sampler opens a TCP/IP connection to the specified server. It then sends the text, and waits for a response.

If "Re-use connection" is selected, connections are shared between Samplers in the same thread, provided that the exact same host name string and port are used. Different host/port combinations will use different connections, as will different threads. If both "Re-use connection" and "Close connection" are selected, the socket will be closed after running the sampler. On the next sampler, another socket will be created. You may want to close a socket at the end of each thread loop. If an error is detected - or "Re-use connection" is not selected - the socket is closed. Another socket will be reopened on the next sample.

Several JMeter properties control its operation, as described below.

The class that handles the connection is defined by the GUI or, failing that, by the property tcp.handler. If not found, the class is then searched for in the package org.apache.jmeter.protocol.tcp.sampler. Users can provide their own implementation; the class must extend org.apache.jmeter.protocol.tcp.sampler.TCPClient.

The default implementation, TCPClientImpl, is fairly basic. When reading the response, it reads until the end-of-line byte, if this is defined by setting the property tcp.eolByte, otherwise until the end of the input stream. You can control the charset encoding by setting tcp.charset, which defaults to the platform default encoding.

BinaryTCPClientImpl converts the GUI input, which must be a hex-encoded string, into binary, and performs the reverse when reading the response. When reading the response, it reads until the end-of-message byte, if this is defined by setting the property tcp.BinaryTCPClient.eomByte, otherwise until the end of the input stream.

LengthPrefixedBinaryTCPClientImpl extends BinaryTCPClientImpl by prefixing the binary message data with a binary length byte. The length prefix defaults to 2 bytes. This can be changed by setting the property tcp.binarylength.prefix.length.

If the timeout is set, the read will be terminated when this expires. So if you are using an eolByte/eomByte, make sure the timeout is sufficiently long, otherwise the read will be terminated early.

If tcp.status.prefix is defined, then the response message is searched for the text following that prefix up to the suffix. If any such text is found, it is used to set the response code. The response message is then fetched from the properties file (if provided). For example, if the prefix = "[" and the suffix = "]", then the following response:
[J28] XI123,23,GBP,CR

would have the response code J28. Response codes in the range "400"-"499" and "500"-"599" are currently regarded as failures; all others are successful. [This needs to be made configurable!]
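To reproduce the example above, the corresponding entries in jmeter.properties might look roughly like this. This is only a sketch: tcp.status.prefix is cited above, while tcp.status.suffix is assumed to be the name of the matching suffix property.

# Extract the response code from between "[" and "]"
tcp.status.prefix=[
tcp.status.suffix=]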
This sampler can also be useful in conjunction with the Transaction Controller, as it allows pauses to be included without needing to generate a sample. For variable delays, set the pause time to zero, and add a Timer as a child.
The "Stop" action stops the thread or test after completing any samples that are in progress. The "Stop Now" action stops the test without waiting for samples to complete; it will interrupt any active samples. If some threads fail to stop within the 5 second time-limit, a message will be displayed in GUI mode. You can try using the Stop command to see if this will stop the threads, but if not, you should exit JMeter. In CLI mode, JMeter will exit if some threads fail to stop within the 5 second time limit. The time to wait can be changed using the JMeter property jmeterengine.threadstop.wait. The time is given in milliseconds.

The SMTP Sampler can send mail messages using the SMTP/SMTPS protocol. It is possible to set security protocols for the connection (SSL and TLS), as well as user authentication. If a security protocol is used, a verification of the server certificate will occur. Two alternatives to handle this verification are available:

Every request uses a connection acquired from the pool and returns it to the pool when the sampler completes. The connection pool size defaults to 100 and is configurable.
The measured response time corresponds to the "full" query execution, including both the time to execute the cypher query AND the time to consume the results sent back by the database.
The Simple Logic Controller lets you organize your Samplers and other Logic Controllers. Unlike other Logic Controllers, this controller provides no functionality beyond that of a storage device.
Download this example (see Figure 6). In this example, we created a Test Plan that sends two Ant HTTP requests and two Log4J HTTP requests. We grouped the Ant and Log4J requests by placing them inside Simple Logic Controllers. Remember, the Simple Logic Controller has no effect on how JMeter processes the controller(s) you add to it. So, in this example, JMeter sends the requests in the following order: Ant Home Page, Ant News Page, Log4J Home Page, Log4J History Page. Note, the File Reporter is configured to store the results in a file named "simple-test.dat" in the current directory.

If you add Generative or Logic Controllers to a Loop Controller, JMeter will loop through them a certain number of times, in addition to the loop value you specified for the Thread Group. For example, if you add one HTTP Request to a Loop Controller with a loop count of two, and configure the Thread Group loop count to three, JMeter will send a total of 2 * 3 = 6 HTTP Requests.

JMeter will expose the looping index as a variable named __jm__<Name of your element>__idx. So for example, if your Loop Controller is named LC, then you can access the looping index through ${__jm__LC__idx}. The index starts at 0.

We configured the Thread Group for a single thread and a loop count value of one. Instead of letting the Thread Group control the looping, we used a Loop Controller. You can see that we added one HTTP Request to the Thread Group and another HTTP Request to a Loop Controller. We configured the Loop Controller with a loop count value of five.
JMeter will send the requests in the following order: Home Page, News Page, News Page, News Page, News Page, and News Page.
Note, the File Reporter is configured to store the results in a file named "loop-test.dat" in the current directory.

The Once Only Logic Controller tells JMeter to process the controller(s) inside it only once per Thread, and pass over any requests under it during further iterations through the test plan.
The Once Only Controller will now always execute during the first iteration of any looping parent controller. Thus, if the Once Only Controller is placed under a Loop Controller specified to loop 5 times, then the Once Only Controller will execute only on the first iteration through the Loop Controller (i.e. once every 5 iterations).
Note this means the Once Only Controller will still behave as previously expected if put under a Thread Group (runs only once per test per Thread), but now the user has more flexibility in the use of the Once Only Controller.
For testing that requires a login, consider placing the login request in this controller since each thread only needs to login once to establish a session.
Download this example (see Figure 5). In this example, we created a Test Plan with two threads that send HTTP requests. Each thread sends one request to the Home Page, followed by three requests to the Bug Page. Although we configured the Thread Group to iterate three times, each JMeter thread only sends one request to the Home Page because this request lives inside a Once Only Controller.

Each JMeter thread will send the requests in the following order: Home Page, Bug Page, Bug Page, Bug Page.
Note, the File Reporter is configured to store the results in a file named "loop-test.dat" in the current directory.

If you add Generative or Logic Controllers to an Interleave Controller, JMeter will alternate among each of the other controllers for each loop iteration.
The outer Interleave Controller alternates between the two inner ones. Then, each inner Interleave Controller alternates between each of the HTTP Requests. Each JMeter thread will send the requests in the following order: Home Page, Interleaved, Bug Page, Interleaved, CVS Page, Interleaved, and FAQ Page, Interleaved.
Note, the File Reporter is configured to store the results in a file named "interleave-test2.dat" in the current directory.

If the two interleave controllers under the main interleave controller were instead simple controllers, then the order would be: Home Page, CVS Page, Interleaved, Bug Page, FAQ Page, Interleaved.
However, if "ignore sub-controller blocks" was checked on the main interleave controller, then the order would be: Home Page, Interleaved, Bug Page, Interleaved, CVS Page, Interleaved, and FAQ Page, Interleaved.

The Random Logic Controller acts similarly to the Interleave Controller, except that instead of going in order through its sub-controllers and samplers, it picks one at random at each pass.
The Random Order Controller is much like a Simple Controller in that it will execute each child element at most once, but the order of execution of the nodes will be random.
The If Controller allows the user to control whether the test elements below it (its children) are run or not.
By default, the condition is evaluated only once on initial entry, but you have the option to have it evaluated for every runnable element contained in the controller. The best option (the default) is to check Interpret Condition as Variable Expression?; the condition field then accepts either a variable that evaluates to "true" or "false", or a function or expression (such as __jexl3) that evaluates to the string "true". For example, previously one could use the condition ${__jexl3(${VAR} == 23)} and this would be evaluated as true/false; the result would then be passed to JavaScript, which would then return true/false. If the Variable Expression option is selected, then the expression is evaluated and compared with "true", without needing to use JavaScript.

To test whether a variable is undefined (or null), supposing the variable is named myVar, the expression would be:

"${myVar}" == "\${myVar}"

Or use:

"${myVar}" != "\${myVar}"

to test whether a variable is defined and is not null.

If you uncheck Interpret Condition as Variable Expression?, the If Controller will internally use JavaScript to evaluate the condition, which incurs a performance penalty that can be very large and makes your test less scalable.
If the switch value is out of range, it will run the zeroth element, which therefore acts as the default for the numeric case. It also runs the zeroth element if the value is the empty string.
If the value is non-numeric (and non-empty), then the Switch Controller looks for the element with the same name (case is significant). If none of the names match, then the element named "default" (case not significant) is selected. If there is no default, then no element is selected, and the controller will not run anything.

We configured the Thread Group for a single thread and a loop count value of one. You can see that we added one HTTP Request to the Thread Group and another HTTP Request to the ForEach Controller.
After the first HTTP request, a Regular Expression Extractor is added, which extracts all the HTML links out of the returned page and puts them in the inputVar variable.

In the ForEach loop, an HTTP sampler is added which requests all the links that were extracted from the first returned HTML page.

Here is another example you can download. This has two Regular Expressions and ForEach Controllers. The first RE matches, but the second does not match, so no samples are run by the second ForEach Controller. The Regex Extractor uses the expression (\w)\s which matches a letter followed by a space, and returns the letter (not the space). Any matches are prefixed with the string "inputVar". The ForEach Controller extracts all variables with the prefix "inputVar_", and executes its sample, passing the value in the variable "returnVar". In this case it will set the variable to the values "a", "b" and "c" in turn. The For 1 Sampler is another Java Sampler which uses the return variable "returnVar" as part of the sample Label and as the sampler Data. Sample 2, Regex 2 and For 2 are almost identical, except that the Regex has been changed to "(\w)\sx", which clearly won't match. Thus the For 2 Sampler will not be run.

A test plan fragment consists of a Controller and all the test elements (samplers etc.) contained in it. The fragment can be located in any Thread Group. If the fragment is located in a Thread Group, then its Controller can be disabled to prevent the fragment being run except by the Module Controller. Or you can store the fragments in a dummy Thread Group, and disable the entire Thread Group.

There can be multiple fragments, each with a different series of samplers under them. The Module Controller can then be used to easily switch between these multiple test cases simply by choosing the appropriate controller in its drop-down box. This provides convenience for running many alternate test plans quickly and easily.

A fragment name is made up of the Controller name and all its parent names. For example: Test Plan / Protocol: JDBC / Control / Interleave Controller (Module1)

Any fragments used by the Module Controller must have a unique name, as the name is used to find the target controller when a test plan is reloaded. For this reason it is best to ensure that the Controller name is changed from the default - as shown in the example above - otherwise a duplicate may be accidentally created when new elements are added to the test plan.
The generated sample is only regarded as successful if all its sub-samples are successful.
In parent mode, the individual samples can still be seen in the Tree View Listener, but no longer appear as separate entries in other Listeners. Also, the sub-samples do not appear in CSV log files, but they can be saved to XML files. In parent mode, Assertions (etc.) can be added to the Transaction Controller. However, by default they will be applied to both the individual samples and the overall transaction sample. To limit the scope of the Assertions, use a Simple Controller to contain the samples, and add the Assertions to the Simple Controller. Parent mode controllers do not currently properly support nested transaction controllers of either type.

The Critical Section Controller ensures that its children elements (samplers/controllers, etc.) will be executed by only one thread, as a named lock is taken before executing the children of the controller.
The figure below shows an example of using the Critical Section Controller; in the figure, two Critical Section Controllers ensure that:

Note that Listeners are processed at the end of the scope in which they are found.
The saving and reading of test results is generic. The various listeners have a panel whereby one can specify the file to which the results will be written (or read from). By default, the results are stored as XML files, typically with a ".jtl" extension. Storing as CSV is the most efficient option, but it is less detailed than XML (the other available option).

Listeners do not process sample data in CLI mode, but the raw data will be saved if an output file has been configured. In order to analyse the data generated by a CLI run, you need to load the file into the appropriate Listener. If you want to clear any current data before loading a new file, use the menu item Run → Clear (Ctrl + Shift + E) or Run → Clear All (Ctrl + E) before loading the file.

Results can be read from XML or CSV format files. When reading from CSV results files, the header (if present) is used to determine which fields are present. In order to interpret a header-less CSV file correctly, the appropriate properties must be set in jmeter.properties.

XML files written by JMeter declare version 1.0 in the header while the actual file is serialized with 1.1 rules. (This is done for historical compatibility reasons; see 59973 and 58679.) This causes strict XML parsers to fail. Consider using non-strict XML parsers to read JTL files.

The file name can contain function and/or variable references. However variable references do not work in client-server mode (functions work OK). This is because the file is created on the client, and the client does not run the test locally so does not set up variables.

The following Listeners no longer need to keep copies of every single sample. Instead, samples with the same elapsed time are aggregated. Less memory is now needed, especially if most samples only take a second or two at most.

JMeter variables can be saved to the output files. This can only be specified using a property. See the Listener Sample Variables for details. For full details on setting up the default items to be saved, see the Listener Default Configuration documentation. For details of the contents of the output files, see the CSV log format or the XML log format.

The entries in jmeter.properties are used to define the defaults; these can be overridden for individual listeners by using the Configure button, as shown below. The settings in jmeter.properties also apply to the listener that is added by using the -l command-line flag.

Filename - Name of the file containing sample results. The file name can be specified using either a relative or an absolute path name. Relative paths are resolved relative to the current working directory (which defaults to the bin/ directory). JMeter also supports paths relative to the directory containing the current test plan (JMX file). If the path name begins with "~/" (or whatever is in the jmeter.save.saveservice.base_prefix JMeter property), then the path is assumed to be relative to the JMX file location.

The Graph Results listener generates a simple graph that plots all sample times. Along the bottom of the graph, the current sample (black), the current average of all samples (blue), the current standard deviation (red), and the current throughput rate (green) are displayed in milliseconds.
The throughput number represents the actual number of requests/minute the server handled. This calculation includes any delays you added to your test and JMeter's own internal processing time. The advantage of doing the calculation like this is that this number represents something real - your server in fact handled that many requests per minute, and you can increase the number of threads and/or decrease the delays to discover your server's maximum throughput. Whereas if you made calculations that factored out delays and JMeter's processing, it would be unclear what you could conclude from that number.
The following table briefly describes the items on the graph. Further details on the precise meaning of the statistical terms can be found on the web - e.g. Wikipedia - or by consulting a book on statistics. The individual figures at the bottom of the display are the current values. "Latest Sample" is the current elapsed sample time, shown on the graph as "Data". The value displayed at the top left of the graph is the maximum of the 90th percentile of response time.

Assertion Results MUST NOT BE USED during a load test as it consumes a lot of resources (memory and CPU). Use it only for functional testing or during Test Plan debugging and validation. The Assertion Results visualizer shows the Label of each sample taken. It also reports failures of any Assertions that are part of the test plan.

View Results Tree MUST NOT BE USED during a load test as it consumes a lot of resources (memory and CPU). Use it only for functional testing or during Test Plan debugging and validation. The View Results Tree shows a tree of all sample responses, allowing you to view the response for any sample. In addition to showing the response, you can see the time it took to get this response, and some response codes. Note that the Request panel only shows the headers added by JMeter. It does not show any headers (such as Host) that may be added by the HTTP protocol implementation. There are several ways to view the response, selectable by a drop-down box at the bottom of the left-hand panel.
CSS/JQuery Tester - The CSS/JQuery Tester only works for text responses. It shows the plain text in the upper panel. The "Test" button allows the user to apply the CSS/JQuery expression to the upper panel and the results will be displayed in the lower panel. The CSS/JQuery expression engine can be JSoup or Jodd; the syntax of these two implementations differs slightly. For example, the Selector a[class=sectionlink] with attribute href applied to the current JMeter functions page gives the following output:

Document - The Document view will show the extracted text from various types of documents like Microsoft Office (Word, Excel, PowerPoint) 97-2003 and 2007-2010 (openxml), Apache OpenOffice (Writer, Calc, Impress), HTML, gzip, jar/zip files (list of content), and some meta-data on "multimedia" files like mp3, mp4, flv, etc. The complete list of supported formats is available on the Apache Tika format page. A requirement for the Document view is to download the Apache Tika binary package (tika-app-x.x.jar) and put it in the JMETER_HOME/lib directory. If the document is larger than 10 MB, then it won't be displayed. To change this limit, set the JMeter property document.max_size (unit is bytes), or set it to 0 to remove the limit.

HTML - The HTML view attempts to render the response as HTML. The rendered HTML is likely to compare poorly to the view one would get in any web browser; however, it does provide a quick approximation that is helpful for initial result evaluation. Images, style-sheets, etc. aren't downloaded.

HTML (download resources) - If the HTML (download resources) view option is selected, the renderer may download images, style-sheets, etc. referenced by the HTML code.

Regexp Tester - The Regexp Tester view only works for text responses. It shows the plain text in the upper panel. The "Test" button allows the user to apply the Regular Expression to the upper panel and the results will be displayed in the lower panel. The regular expression engine is the same as that used in the Regular Expression Extractor. For example, the RE (JMeter\w*).* applied to the current JMeter home page gives the following output:

Match[1][0]=JMeter - Apache JMeter</title>
Match[1][1]=JMeter
Match[2][0]=JMeter" title="JMeter" border="0"/></a>
Match[2][1]=JMeter
Match[3][0]=JMeterCommitters">Contributors</a>
Match[3][1]=JMeterCommitters
… and so on …

The first number in [] is the match number; the second number is the group. Group [0] is whatever matched the whole RE. Group [1] is whatever matched the 1st group, i.e. (JMeter\w*) in this case. See Figure 9b (below).

Text - The default Text view shows all of the text contained in the response. Note that this will only work if the response content-type is considered to be text. If the content-type begins with any of the following, it is considered binary, otherwise it is considered to be text: image/, audio/, video/.

XML - The XML view will show the response in tree style. Any DTD nodes or Prolog nodes will not show up in the tree; however, the response may contain those nodes. You can right-click on any node and expand or collapse all nodes below it.

XPath Tester - The XPath Tester only works for text responses. It shows the plain text in the upper panel. The "Test" button allows the user to apply the XPath query to the upper panel and the results will be displayed in the lower panel.

Boundary Extractor Tester - The Boundary Extractor Tester only works for text responses. It shows the plain text in the upper panel.
The "Test" button allows the user to apply the Boundary Extractor query to the upper panel and the results will be displayed in the lower panel.

Starting with version 3.2, the number of entries in the View is restricted to the value of the property view.results.tree.max_results, which defaults to 500 entries. The old behaviour can be restored by setting the property to 0. Beware that this might consume a lot of memory.

With the Search option, most of the views also allow the displayed data to be searched; the result of the search will be highlighted in the display above. For example, the Control Panel screenshot below shows one result of searching for "Java". Note that the search operates on the visible text, so you may get different results when searching the Text and HTML views. Note: the regular expression uses the Java engine (not the ORO engine like the Regular Expression Extractor or Regexp Tester view).

If there is no content-type provided, then the content will not be displayed in any of the Response Data panels. You can use Save Responses to a file to save the data in this case. Note that the response data will still be available in the sample result, so it can still be accessed using Post-Processors.

If the response data is larger than 200K, then it won't be displayed. To change this limit, set the JMeter property view.results.tree.max_size. You can also save the entire response to a file using Save Responses to a file.

Additional renderers can be created. The class must implement the interface org.apache.jmeter.visualizers.ResultRenderer and/or extend the abstract class org.apache.jmeter.visualizers.SamplerResultTab, and the compiled code must be available to JMeter (e.g. by adding it to the lib/ext directory).

The Control Panel (above) shows an example of an HTML display. Figure 9 (below) shows an example of an XML display. Figure 9a (below) shows an example of a Regexp Tester display. Figure 9b (below) shows an example of a Document display.

The Aggregate Report creates a table row for each differently named request in your test. For each request, it totals the response information and provides request count, min, max, average, error rate, approximate throughput (requests/second) and kilobytes-per-second throughput. Once the test is done, the throughput is the actual throughput for the duration of the entire test.

The throughput is calculated from the point of view of the sampler target (e.g. the remote server in the case of HTTP samples). JMeter takes into account the total time over which the requests have been generated. If other samplers and timers are in the same thread, these will increase the total time, and therefore reduce the throughput value. So two identical samplers with different names will have half the throughput of two samplers with the same name. It is important to choose the sampler names correctly to get the best results from the Aggregate Report.

Calculation of the Median and 90% Line (90th percentile) values requires additional memory. JMeter now combines samples with the same elapsed time, so far less memory is used. However, for samples that take more than a few seconds, the probability is that fewer samples will have identical times, in which case more memory will be needed. Note you can use this listener afterwards to reload a CSV or XML results file, which is the recommended way to avoid performance impacts. See the Summary Report for a similar Listener that does not store individual samples and so needs constant memory.
Label - The label of the sample. If "Include group name in label?" is selected, then the name of the thread group is added as a prefix. This allows identical labels from different thread groups to be collated separately if required.
Throughput - the Throughput is measured in requests per second/minute/hour. The time unit is chosen so that the displayed rate is at least 1.0. When the throughput is saved to a CSV file, it is expressed in requests/second, i.e. 30.0 requests/minute is saved as 0.5.

The View Results in Table visualizer creates a row for every sample result. Like the View Results Tree, this visualizer uses a lot of memory. By default, it only displays the main (parent) samples; it does not display the sub-samples (child samples). JMeter has a "Child Samples?" check-box. If this is selected, then the sub-samples are displayed instead of the main samples.

The Simple Data Writer listener can record results to a file but not to the UI. It is meant to provide an efficient means of recording data by eliminating GUI overhead. When running in CLI mode, the -l flag can be used to create a data file. The fields to save are defined by JMeter properties. See the jmeter.properties file for details.

Before invoking the script, some variables are set up in the BeanShell interpreter:
log - (Logger) - can be used to write to the log file
ctx - (JMeterContext) - gives access to the context
vars - (JMeterVariables) - gives read/write access to variables: vars.get(key); vars.put(key,val); vars.putObject("OBJ1",new Object());

For details of all the methods available on each of the above variables, please check the Javadoc.
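As a minimal sketch, the following BeanShell Listener script uses only the variables listed above; the variable name sampleCount is hypothetical:

// Keep a per-thread running count of samples in a JMeter variable.
count = vars.get("sampleCount");
if (count == null) {
    count = "0";
}
count = String.valueOf(Integer.parseInt(count) + 1);
vars.put("sampleCount", count);
log.info("Thread " + ctx.getThreadNum() + " has recorded " + count + " samples so far");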
If the property beanshell.listener.init is defined, this is used to load an initialisation file, which can be used to define methods etc. for use in the BeanShell script.

The Summary Report creates a table row for each differently named request in your test. This is similar to the Aggregate Report, except that it uses less memory. The throughput is calculated from the point of view of the sampler target (e.g. the remote server in the case of HTTP samples). JMeter takes into account the total time over which the requests have been generated. If other samplers and timers are in the same thread, these will increase the total time, and therefore reduce the throughput value. So two identical samplers with different names will have half the throughput of two samplers with the same name. It is important to choose the sampler labels correctly to get the best results from the Report.

Label - The label of the sample. If "Include group name in label?" is selected, then the name of the thread group is added as a prefix. This allows identical labels from different thread groups to be collated separately if required.
Throughput - the Throughput is measured in requests per second/minute/hour. The time unit is chosen so that the displayed rate is at least 1.0. When the throughput is saved to a CSV file, it is expressed in requests/second, i.e. 30.0 requests/minute is saved as 0.5.

The Save Responses to a file test element can be placed anywhere in the test plan. For each sample in its scope, it will create a file of the response data. The primary use for this is in creating functional tests, but it can also be useful where the response is too large to be displayed in the View Results Tree Listener. The file name is created from the specified prefix, plus a number (unless this is disabled, see below). The file extension is created from the document type, if known. If not known, the file extension is set to 'unknown'. If numbering is disabled, and adding a suffix is disabled, then the file prefix is taken as the entire file name. This allows a fixed file name to be generated if required. The generated file name is stored in the sample response, and can be saved in the test log output file if required. The current sample is saved first, followed by any sub-samples (child samples). If a variable name is provided, then the names of the files are saved in the order that the sub-samples appear. See below.

Filename Prefix (can include folders) - Prefix for the generated file names; this can include a directory name. Relative paths are resolved relative to the current working directory (which defaults to the bin/ directory). JMeter also supports paths relative to the directory containing the current test plan (JMX file). If the path name begins with "~/" (or whatever is in the jmeter.save.saveservice.base_prefix JMeter property), then the path is assumed to be relative to the JMX file location. If parent folders in the prefix do not exist, JMeter will create them and stop the test if that fails. Please note that the Filename Prefix must not contain thread-related data, so don't use any variable (${varName}) or functions like ${__threadNum} in this field.
Variable Name containing saved file name - Name of a variable in which to save the generated file name (so it can be used later in the test plan). If there are sub-samples then a numeric suffix is added to the variable name. E.g.
if the variable name is FILENAME, then the parent sample file name is saved in the variable FILENAME, and the filenames for the child samplers are saved in FILENAME1, FILENAME2 etc.

Minimum Length of sequence number - If "Don't add number to prefix" is not checked, then numbers added to the prefix will be padded with 0 so that the number has the length given by this value. Defaults to 0.
Save Failed Responses only - If selected, then only failed responses are saved.
Save Successful Responses only - If selected, then only successful responses are saved.
Don't add number to prefix - If selected, then no number is added to the prefix. If you select this option, make sure that the prefix is unique or the file may be overwritten.
Don't add content type suffix - If selected, then no suffix is added. If you select this option, make sure that the prefix is unique or the file may be overwritten.
Add timestamp - If selected, then the date will be included in the file suffix, following the format yyyyMMdd-HHmm_.
Don't Save Transaction Controller SampleResult - If selected, then the SampleResult generated by a Transaction Controller will be ignored.
Script file - A file containing the script to run; if a relative file path is used, then it will be relative to the directory referenced by the "user.dir" System property.
Script compilation caching - Unique String across the Test Plan that JMeter will use to cache the result of the script compilation if the language used supports the Compilable interface (Groovy is one of these; java, beanshell and javascript are not). See the note in the JSR223 Sampler about the Java System property if you're using Groovy without checking this option.
Script - The script to run (required unless a script file is provided).

vars - (JMeterVariables) - gives read/write access to variables: vars.get(key); vars.put(key,val); vars.putObject("OBJ1",new Object()); vars.getObject("OBJ2");
props - (JMeterProperties - class java.util.Properties) - e.g. props.get("START.HMS"); props.put("PROP1","1234");
sampleResult, prev - (SampleResult) - gives access to the SampleResult
sampleEvent - (SampleEvent) - gives access to the SampleEvent
sampler - (Sampler) - gives access to the last sampler
OUT - System.out - e.g. OUT.println("message")

For details of all the methods available on each of the above variables, please check the Javadoc.
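As a minimal sketch (written in Java-style syntax, which Groovy accepts), a JSR223 Listener script could use the prev, vars and OUT bindings listed above to keep a running failure count; the variable name failureCount is hypothetical:

// Count failed samples in a JMeter variable and report them on System.out.
if (!prev.isSuccessful()) {
    String previous = vars.get("failureCount");
    int failures = (previous == null) ? 0 : Integer.parseInt(previous);
    failures = failures + 1;
    vars.put("failureCount", String.valueOf(failures));
    OUT.println("Sample '" + prev.getSampleLabel() + "' failed (failures so far: " + failures + ")");
}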
This test element can be placed anywhere in the test plan. It generates a summary of the test run so far to the log file and/or standard output. Both running and differential totals are shown. Output is generated every n seconds (default 30 seconds) on the appropriate time boundary, so that multiple test runs running at the same time will be synchronised.

Since a summary/differential line is written only if there are samples emitted, the interval for generation may not be respected if your test has no samples generated within the interval.

See the jmeter.properties file for the summariser configuration items:

# Define the following property to automatically start a summariser with that name
# (applies to CLI mode only)
#summariser.name=summary
# interval between summaries (in seconds) default 3 minutes
#summariser.interval=30
# Write messages to log file
#summariser.log=true
# Write messages to System.out
#summariser.out=true

This element is mainly intended for batch (CLI) runs. The output looks like the following:

label +  16 in 0:00:12 = 1.3/s Avg: 1608 Min: 1163 Max: 2009 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label +  82 in 0:00:30 = 2.7/s Avg: 1518 Min: 1003 Max: 2020 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label =  98 in 0:00:42 = 2.3/s Avg: 1533 Min: 1003 Max: 2020 Err: 0 (0.00%)
label +  85 in 0:00:30 = 2.8/s Avg: 1505 Min: 1008 Max: 2005 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label = 183 in 0:01:13 = 2.5/s Avg: 1520 Min: 1003 Max: 2020 Err: 0 (0.00%)
label +  79 in 0:00:30 = 2.7/s Avg: 1578 Min: 1089 Max: 2012 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label = 262 in 0:01:43 = 2.6/s Avg: 1538 Min: 1003 Max: 2020 Err: 0 (0.00%)
label +  80 in 0:00:30 = 2.7/s Avg: 1531 Min: 1013 Max: 2014 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label = 342 in 0:02:12 = 2.6/s Avg: 1536 Min: 1003 Max: 2020 Err: 0 (0.00%)
label +  83 in 0:00:31 = 2.7/s Avg: 1512 Min: 1003 Max: 1982 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label = 425 in 0:02:43 = 2.6/s Avg: 1531 Min: 1003 Max: 2020 Err: 0 (0.00%)
label +  83 in 0:00:29 = 2.8/s Avg: 1487 Min: 1023 Max: 2013 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label = 508 in 0:03:12 = 2.6/s Avg: 1524 Min: 1003 Max: 2020 Err: 0 (0.00%)
label +  78 in 0:00:30 = 2.6/s Avg: 1594 Min: 1013 Max: 2016 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label = 586 in 0:03:43 = 2.6/s Avg: 1533 Min: 1003 Max: 2020 Err: 0 (0.00%)
label +  80 in 0:00:30 = 2.7/s Avg: 1516 Min: 1013 Max: 2005 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label = 666 in 0:04:12 = 2.6/s Avg: 1531 Min: 1003 Max: 2020 Err: 0 (0.00%)
label +  86 in 0:00:30 = 2.9/s Avg: 1449 Min: 1004 Max: 2017 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label = 752 in 0:04:43 = 2.7/s Avg: 1522 Min: 1003 Max: 2020 Err: 0 (0.00%)
label +  65 in 0:00:24 = 2.7/s Avg: 1579 Min: 1007 Max: 2003 Err: 0 (0.00%) Active: 0 Started: 5 Finished: 5
label = 817 in 0:05:07 = 2.7/s Avg: 1526 Min: 1003 Max: 2020 Err: 0 (0.00%)

The "label" is the name of the element. The "+" means that the line is a delta line, i.e. it shows the changes since the last output. The "=" means that the line is a total line, i.e. it shows the running total. Entries in the JMeter log file also include time-stamps. The example "817 in 0:05:07 = 2.7/s" means that there were 817 samples recorded in 5 minutes and 7 seconds, and that works out at 2.7 samples per second. The Avg (Average), Min (Minimum) and Max (Maximum) times are in milliseconds. "Err" means the number of errors (also shown as a percentage).
The last two lines will appear at the end of a test. They will not be synchronised to the appropriate time boundary. Note that the initial and final deltas may be for less than the interval (in the example above this is 30 seconds). The first delta will generally be lower, as JMeter synchronizes to the interval boundary. The last delta will be lower, as the test will generally not finish on an exact interval boundary.

The label is used to group sample results together. So if you have multiple Thread Groups and want to summarize across them all, then use the same label - or add the summariser to the Test Plan (so all thread groups are in scope). Different summary groupings can be implemented by using suitable labels and adding the summarisers to appropriate parts of the test plan.

In CLI mode, by default a Generate Summary Results listener named "summariser" is configured. If you have already added one to your Test Plan, ensure you name it differently, otherwise results will be accumulated under this label (summary), leading to wrong results (sum of total samples + samples located under the parent of the Generate Summary Results listener). This is not a bug but a design choice allowing summarizing across thread groups.

Name - Descriptive name for this element that is shown in the tree. It appears as the "label" in the output. Details for all elements with the same label will be added together.

The Backend Listener is an asynchronous listener that enables you to plug in custom implementations of BackendListenerClient. By default, a Graphite implementation is provided.

Async Queue size - Size of the queue that holds the SampleResults while they are processed asynchronously.
Parameters - Parameters of the BackendListenerClient implementation.
graphiteMetricsSender - org.apache.jmeter.visualizers.backend.graphite.TextGraphiteMetricsSender or org.apache.jmeter.visualizers.backend.graphite.PickleGraphiteMetricsSender
graphiteHost - Graphite or InfluxDB (with Graphite plugin enabled) server host
graphitePort - Graphite or InfluxDB (with Graphite plugin enabled) server port, defaults to 2003. Note that PickleGraphiteMetricsSender (port 2004) can only talk to a Graphite server.
rootMetricsPrefix - Prefix of metrics sent to the backend. Defaults to "jmeter." Note that JMeter does not add a separator between the root prefix and the samplerName, which is why the trailing dot is currently needed.
summaryOnly - Only send a summary with no detail. Defaults to true.
samplersList - Defines the names (labels) of sample results to be sent to the back end. If useRegexpForSamplersList=false this is a list of semicolon-separated names. If useRegexpForSamplersList=true this is a regular expression which will be matched against the names.
useRegexpForSamplersList - Consider samplersList as a regular expression to select the samplers for which you want to report metrics to the backend. Defaults to false.
percentiles - The percentiles you want to send to the backend. A percentile may contain a fractional part, for example 12.5. (The separator is always ".") The list must be semicolon-separated. Generally 3 or 4 values should be sufficient.

Since JMeter 3.2, an implementation that allows writing directly to InfluxDB with a custom schema is provided. It is called InfluxdbBackendListenerClient. The following parameters apply to the InfluxdbBackendListenerClient implementation:

influxdbToken - InfluxDB 2 authentication token (example: HE9yIdAPzWJDspH_tCc2UvdKZpX==); since 5.2.
application - Name of the tested application.
This value is stored in the 'events' measurement as a tag named 'application'.
measurement - Measurement as per the Influx Line Protocol Reference. Defaults to "jmeter".
summaryOnly - Only send a summary with no detail. Defaults to true.
samplersRegex - Regular expression which will be matched against the names of samples and sent to the back end.
testTitle - Test name. Defaults to Test name. This value is stored in the 'events' measurement as a field named 'text'. JMeter automatically generates an annotation with this value at the start and the end of the test, ending with ' started' and ' ended'.
eventTags - Grafana allows displaying tags for each annotation. You can fill them in here. This value is stored in the 'events' measurement as a tag named 'tags'.
percentiles - The percentiles you want to send to the backend. A percentile may contain a fractional part, for example 12.5 (the separator is always "."). The list must be semicolon-separated. Generally three or four values should be sufficient.
TAG_WhatEverYouWant - You can add as many custom tags as you want. For each of them, create a new line and prefix its name with "TAG_".

Since JMeter 5.4, an implementation that writes all sample results to InfluxDB is provided. It is called InfluxDBRawBackendListenerClient. It is worth noting that this will use more resources than the InfluxdbBackendListenerClient, both by JMeter and InfluxDB, due to the increase in data and individual writes. The following parameters apply to the InfluxDBRawBackendListenerClient implementation:

influxdbUrl - Influx URL (e.g. http://influxHost:8086/write?db=jmeter or, for the cloud, https://eu-central-1-1.aws.cloud2.influxdata.com/api/v2/write?org=org-id&bucket=jmeter)
influxdbToken - InfluxDB 2 authentication token (e.g. HE9yIdAPzWJDspH_tCc2UvdKZpX==)
measurement - Measurement as per the Influx Line Protocol Reference. Defaults to "jmeter".

Configuration elements can be used to set up defaults and variables for later use by samplers. Note that these elements are processed at the start of the scope in which they are found, i.e. before any samplers in the same scope.

CSV Data Set Config is used to read lines from a file, and split them into variables. It is easier to use than the __CSVRead() and __StringFromFile() functions. It is well suited to handling large numbers of variables, and is also useful for testing with "random" and unique values. Generating unique random values at run-time is expensive in terms of CPU and memory, so just create the data in advance of the test. If necessary, the "random" data from the file can be used in conjunction with a run-time parameter to create different sets of values from each run - e.g. using concatenation - which is much cheaper than generating everything at run-time.

JMeter allows values to be quoted; this allows the value to contain a delimiter. If "allow quoted data" is enabled, a value may be enclosed in double-quotes. These are removed. To include double-quotes within a quoted field, use two double-quotes. For example, the line 1,"2,3","4""5" contains the three values 1, 2,3 and 4"5.

By default, the file is only opened once, and each thread will use a different line from the file. However the order in which lines are passed to threads depends on the order in which they execute, which may vary between iterations. Lines are read at the start of each test iteration. The file name and mode are resolved in the first iteration. See the description of the Share mode below for additional options. If you want each thread to have its own set of values, then you will need to create a set of files, one for each thread. For example test1.csv, test2.csv, …, testn.csv.
Use the filename test${__threadNum}.csv and set the "Sharing mode" to "Current thread".

CSV Dataset variables are defined at the start of each test iteration. As this is after configuration processing is completed, they cannot be used for some configuration items - such as JDBC Config - that process their contents at configuration time (see 40394). However, the variables do work in the HTTP Auth Manager, as the username etc. are processed at run-time.

If the recycle option is false, and stopThread is false, then all the variables are set to <EOF> when the end of file is reached. This value can be changed by setting the JMeter property csvdataset.eofstring.

Filename - Name of the file to be read. Relative file names are resolved with respect to the path of the active test plan. For distributed testing, the CSV file must be stored on the server host system in the correct relative directory to where the JMeter server is started. Absolute file names are also supported, but note that they are unlikely to work in remote mode, unless the remote server has the same directory structure. If the same physical file is referenced in two different ways - e.g. csvdata.txt and ./csvdata.txt - then these are treated as different files. If the OS does not distinguish between upper and lower case, csvData.TXT would also be opened separately.
File EncodingThe encoding to be used to read the file, if not the platform default.Variable NamesList of variable names. The names must be separated by the delimiter character. They can be quoted using double-quotes. JMeter supports CSV header lines: if the variable name field is empty, then the first line of the file is read and interpreted as the list of column names.Use first line as Variable NamesIgnore the first line of the CSV file; it is only used if Variable Names is not empty. If Variable Names is empty, the first line must contain the headers.DelimiterDelimiter to be used to split the records in the file. If there are fewer values on the line than there are variables, the remaining variables are not updated - so they will retain their previous value (if any).Allow quoted data?Should the CSV file allow values to be quoted? If enabled, then values can be enclosed in " - double-quote - allowing values to contain a delimiter.Recycle on EOF?Should the file be re-read from the beginning on reaching EOF? (default is true)Stop thread on EOF?Should the thread be stopped on EOF, if Recycle is false? (default is false)Sharing modeIdentifier - all threads sharing the same identifier share the same file. So for example if you have 4 thread groups, you could use a common id for two or more of the groups to share the file between them. Or you could use the thread number to share the file between the same thread numbers in different thread groups.
The DNS Cache Manager element allows you to test applications that have several servers behind load balancers (CDN, etc.), where users receive content from different IPs. By default JMeter uses the JVM DNS cache, which is why only one server from the cluster receives load. The DNS Cache Manager resolves names for each thread separately on each iteration and saves the results in its internal DNS cache, which is independent of both the JVM and OS DNS caches. A mapping for static hosts can be used to simulate something like an /etc/hosts file. These entries are preferred over the custom resolver. Use custom DNS resolver has to be enabled if you want to use this mapping. Say you have a test server that you want to reach with a name that is not (yet) set up in your DNS servers. For our example, this would be www.example.com for the server name, which you want to reach at the IP of the server a123.another.example.org. You could change your workstation and add an entry to your /etc/hosts file - or the equivalent for your OS - or you could add an entry to the Static Host Table of the DNS Cache Manager. You would type www.example.com into the first column (Host) and a123.another.example.org into the second column (Hostname or IP address). As the name of the second column implies, you could even use the IP address of your test server there.
The IP address for the test server will be looked up by using the custom DNS resolver. When none is given, the system DNS resolver will be used. Now you can use www.example.com in your HTTPClient4 samplers and the requests will be made against a123.another.example.org with all headers set to www.example.com.
Clear cache each IterationIf selected, the DNS cache of every thread is cleared each time a new iteration is started.Use system DNS resolverThe system DNS resolver will be used. For this to work correctly, edit $JAVA_HOME/jre/lib/security/java.security and add networkaddress.cache.ttl=0Use custom DNS resolverA custom DNS resolver (from the dnsjava library) will be used.Hostname or IP addressList of DNS servers to use. If empty, the DNS servers from the network configuration will be used.Add ButtonAdd an entry to the DNS servers table.Delete ButtonDelete the currently selected table entry.Host and Hostname or IP addressMapping of hostnames to a static host entry which will be resolved using the custom DNS resolver.Add static host ButtonAdd an entry to the static hosts table.Delete static host ButtonDelete the currently selected static host in the table.
The Authorization Manager lets you specify one or more user logins for web pages that are restricted using server authentication. You see this type of authentication when you use your browser to access a restricted page, and your browser displays a login dialog box. JMeter transmits the login information when it encounters this type of page.
The Authorization headers may not be shown in the Tree View Listener "Request" tab. The Java implementation does pre-emptive authentication, but it does not return the Authorization header when JMeter fetches the headers. The HttpComponents (HC 4.5.X) implementation defaults to pre-emptive since 3.2 and the header will be shown. To disable this, set the values as below, in which case authentication will only be performed in response to a challenge. When looking for a match against a URL, JMeter checks each entry in turn, and stops when it finds the first match. Thus the most specific URLs should appear first in the list, followed by less specific ones. Duplicate URLs will be ignored. If you want to use different usernames/passwords for different threads, you can use variables. These can be set up using a CSV Data Set Config Element (for example).Clear auth on each iteration?Used by Kerberos authentication. If checked, authentication will be done on each iteration of Main Thread Group loop even if it has already been done in a previous one. This is usually useful if each main thread group iteration represents behaviour of one Virtual User.Base URLA partial or complete URL that matches one or more HTTP Request URLs. As an example, say you specify a Base URL of "http://localhost/restricted/" with a Username of "jmeter" and a Password of "jmeter". If you send an HTTP request to the URL "http://localhost/restricted/ant/myPage.html", the Authorization Manager sends the login information for the user named, "jmeter".UsernameThe username to authorize.PasswordThe password for the user. (N.B. this is stored unencrypted in the test plan)DomainThe domain to use for NTLM.RealmThe realm to use for NTLM.MechanismType of authentication to perform. JMeter can perform different types of authentications based on used Http Samplers: BASICHttpClient 4 BASIC, DIGEST and Kerberos You can also configure those two properties in the file bin/system.properties. Look at the two sample configuration files (krb5.conf and jaas.conf) located in the JMeter bin folder for references to more documentation, and tweak them to match your Kerberos configuration. Delegation of credentials is disabled by default for SPNEGO. If you want to enable it, you can do so by setting the property kerberos.spnego.delegate_cred to true. When generating a SPN for Kerberos SPNEGO authentication IE and Firefox will omit the port number from the URL. Chrome has an option (--enable-auth-negotiate-port) to include the port number if it differs from the standard ones (80 and 443). That behavior can be emulated by setting the following JMeter property as below. In jmeter.properties or user.properties, set: Load Button - Load a previously saved authorization table and add the entries to the existing authorization table entries. Save As Button - Save the current authorization table to a file. Download this example. In this example, we created a Test Plan on a local server that sends three HTTP requests, two requiring a login and the other is open to everyone. See figure 10 to see the makeup of our Test Plan. On our server, we have a restricted directory named, "secret", which contains two files, "index.html" and "index2.html". We created a login id named, "kevin", which has a password of "spot". So, in our Authorization Manager, we created an entry for the restricted directory and a username and password (see figure 11). The two HTTP requests named "SecretPage1" and "SecretPage2" make requests to "/secret/index.html" and "/secret/index2.html". 
The other HTTP request, named "NoSecretPage" makes a request to "/index.html".When we run the Test Plan, JMeter looks in the Authorization table for the URL it is requesting. If the Base URL matches the URL, then JMeter passes this information along with the request.
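For reference, the pre-emptive authentication and SPNEGO port behaviour mentioned above are controlled by JMeter properties. The property names below reflect recent JMeter versions and are given as an illustration - check them against your own jmeter.properties before relying on them:
# perform HttpClient4 authentication only in response to a challenge (disable pre-emptive auth)
httpclient4.auth.preemptive=false
# keep the port number in the SPN for Kerberos SPNEGO (emulates Chrome's --enable-auth-negotiate-port)
kerberos.spnego.strip_port=false
Set these in user.properties (or jmeter.properties) and restart JMeter for them to take effect.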
You can download the Test Plan, but since it is built as a test for our local server, you will not be able to run it. However, you can use it as a reference in constructing your own Test Plan.The HTTP Cache Manager is used to add caching functionality to HTTP requests within its scope to simulate browser cache feature. Each Virtual User thread has its own Cache. By default, Cache Manager will store up to 5000 items in cache per Virtual User thread, using LRU algorithm. Use property "maxSize" to modify this value. Note that the more you increase this value the more HTTP Cache Manager will consume memory, so be sure to adapt the -Xmx JVM option accordingly. If a sample is successful (i.e. has response code 2xx) then the Last-Modified and Etag (and Expired if relevant) values are saved for the URL. Before executing the next sample, the sampler checks to see if there is an entry in the cache, and if so, the If-Last-Modified and If-None-Match conditional headers are set for the request. Additionally, if the "Use Cache-Control/Expires header" option is selected, then the Cache-Control/Expires value is checked against the current time. If the request is a GET request, and the timestamp is in the future, then the sampler returns immediately, without requesting the URL from the remote server. This is intended to emulate browser behaviour. Note that if Cache-Control header is "no-cache", the response will be stored in cache as pre-expired, so will generate a conditional GET request. If Cache-Control has any other value, the "max-age" expiry option is processed to compute entry lifetime, if missing then expire header will be used, if also missing entry will be cached as specified in RFC 2616 section 13.2.4 using Last-Modified time and response Date. If the requested document has not changed since it was cached, then the response body will be empty. Likewise if the Expires date is in the future. This may cause problems for Assertions.Clear cache each iterationIf selected, then the cache is cleared at the start of the thread.Use Cache Control/Expires header when processing GET requestsSee description above.Max Number of elements in cacheSee description above.The Cookie Manager element has two functions: First, it stores and sends cookies just like a web browser. If you have an HTTP Request and the response contains a cookie, the Cookie Manager automatically stores that cookie and will use it for all future requests to that particular web site. Each JMeter thread has its own "cookie storage area". So, if you are testing a web site that uses a cookie for storing session information, each JMeter thread will have its own session. Note that such cookies do not appear on the Cookie Manager display, but they can be seen using the View Results Tree Listener. JMeter checks that received cookies are valid for the URL. This means that cross-domain cookies are not stored. If you have bugged behaviour or want Cross-Domain cookies to be used, define the JMeter property "CookieManager.check.cookies=false". Received Cookies can be stored as JMeter thread variables. To save cookies as variables, define the property "CookieManager.save.cookies=true". Also, cookies names are prefixed with "COOKIE_" before they are stored (this avoids accidental corruption of local variables) To revert to the original behaviour, define the property "CookieManager.name.prefix= " (one or more spaces). 
If enabled, the value of a cookie with the name TEST can be referred to as ${COOKIE_TEST}.Second, you can manually add a cookie to the Cookie Manager. However, if you do this, the cookie will be shared by all JMeter threads.
Note that such Cookies are created with an Expiration time far in the future
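As a small illustration of the cookie-to-variable feature described above (assuming CookieManager.save.cookies=true and that the server actually sets a cookie named TEST), a JSR223 element can read the stored value:
// received cookies are exposed as variables prefixed with COOKIE_
String testCookie = vars.get("COOKIE_TEST");
log.info("Value of the TEST cookie: " + testCookie);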
Cookies with null values are ignored by default. This can be changed by setting the JMeter property: CookieManager.delete_null_cookies=false. Note that this also applies to manually defined cookies - any such cookies will be removed from the display when it is updated. Note also that the cookie name must be unique - if a second cookie is defined with the same name, it will replace the first.If there is more than one Cookie Manager in the scope of a Sampler, there is currently no way to specify which one is to be used. Also, a cookie stored in one cookie manager is not available to any other manager, so use multiple Cookie Managers with care.ParametersAttributeDescriptionRequiredDescriptive name for this element that is shown in the tree.Clear Cookies each IterationIf selected, all server-defined cookies are cleared each time the main Thread Group loop is executed. Any cookie defined in the GUI are not cleared.Cookie PolicyThe cookie policy that will be used to manage the cookies. "standard" is the default since 3.0, and should work in most cases. See Cookie specifications and CookieSpec implementations [Note: "ignoreCookies" is equivalent to omitting the CookieManager.]ImplementationHC4CookieHandler (HttpClient 4.5.X API). Default is HC4CookieHandler since 3.0. [Note: If you have a website to test with IPv6 address, choose HC4CookieHandler (IPv6 compliant)]User-Defined Cookiesgives you the opportunity to use hardcoded cookies that will be used by all threads during the test execution. The "domain" is the hostname of the server (without http://); the port is currently ignored.No (discouraged, unless you know what you're doing)Add ButtonAdd an entry to the cookie table.Delete ButtonDelete the currently selected table entry.Load ButtonLoad a previously saved cookie table and add the entries to the existing cookie table entries.Save As ButtonSave the current cookie table to a file (does not save any cookies extracted from HTTP Responses). This element lets you set default values that your HTTP Request controllers use. For example, if you are creating a Test Plan with 25 HTTP Request controllers and all of the requests are being sent to the same server, you could add a single HTTP Request Defaults element with the "Server Name or IP" field filled in. Then, when you add the 25 HTTP Request controllers, leave the "Server Name or IP" field empty. The controllers will inherit this field value from the HTTP Request Defaults element.ServerDomain name or IP address of the web server. E.g. www.example.com. [Do not include the http:// prefix.Port the web server is listening to.Connect TimeoutConnection Timeout. Number of milliseconds to wait for a connection to open.Response TimeoutResponse Timeout. Number of milliseconds to wait for a response.ImplementationJava, HttpClient4. If not specified the default depends on the value of the JMeter property jmeter.httpsampler, failing that, the Java implementation is used.ProtocolHTTP or HTTPS.Content encodingThe encoding to be used for the request.The path to resource (for example, /servlets/myServlet). If the resource requires query string parameters, add them below in the "Send Parameters With the Request" section. Note that the path is the default for the full path, not a prefix to be applied to paths specified on the HTTP Request screens.Send Parameters With the RequestThe query string will be generated from the list of parameters you provide. Each parameter has a name and value. 
The query string will be generated in the correct fashion, depending on the choice of "Method" you made (i.e. if you chose GET, the query string will be appended to the URL, if POST, then it will be sent separately). Also, if you are sending a file using a multipart form, the query string will be created using the multipart form specifications.Server (proxy)Hostname or IP address of a proxy server to perform request. [Do not include the http:// prefix.]Port the proxy server is listening to.No, unless proxy hostname is specifiedUsername(Optional) username for proxy server.Password(Optional) password for proxy server. (N.B. this is stored unencrypted in the test plan)Retrieve All Embedded Resources from HTML FilesTell JMeter to parse the HTML file and send HTTP/HTTPS requests for all images, Java applets, JavaScript files, CSSs, etc. referenced in the file.Use concurrent poolUse a pool of concurrent connections to get embedded resources.Pool size for concurrent connections used to get embedded resources.URLs must match:If present, this must be a regular expression that is used to match against any embedded URLs found. So if you only want to download embedded resources from http://example.invalid/, use the expression: http://example\.invalid/.*URLs must not match:If present, this must be a regular expression that is used to filter out any embedded URLs found. So if you don't want to download PNG or SVG files from any source, use the expression: .*\.(?i:svg|png) Note: radio buttons only have two states - on or off. This makes it impossible to override settings consistently - does off mean off, or does it mean use the current default? JMeter uses the latter (otherwise defaults would not work at all). So if the button is off, then a later element can set it on, but if the button is on, a later element cannot set it off. JMeter now supports multiple Header Managers. The header entries are merged to form the list for the sampler. If an entry to be merged matches an existing header name, it replaces the previous entry. This allows one to set up a default set of headers, and apply adjustments to particular samplers. Note that an empty value for a header does not remove an existing header, it justs replace its value. Name of the request header. Two common request headers you may want to experiment with are "User-Agent" and "Referer".No (You should have at least one, however)ValueRequest header value.No (You should have at least one, however)Add ButtonAdd an entry to the header table.Delete ButtonDelete the currently selected table entry.Load ButtonLoad a previously saved header table and add the entries to the existing header table entries.Save As ButtonSave the current header table to a file.Download this example. In this example, we created a Test Plan that tells JMeter to override the default "User-Agent" request header and use a particular Internet Explorer agent string instead. (see figures 12 and 13). Creates a database connection (used by JDBC RequestSampler) from the supplied JDBC Connection settings. The connection may be optionally pooled between threads. Otherwise each thread gets its own connection. The connection configuration name is used by the JDBC Sampler to select the appropriate connection. The used pool is DBCP, see BasicDataSource Configuration Parameters The name of the variable the connection is tied to. Multiple connections can be used, each tied to a different variable, allowing JDBC Samplers to select the appropriate connection.Each name must be different. 
If there are two configuration elements using the same name, only one will be saved. JMeter logs a message if a duplicate name is detected.Max Number of ConnectionsMaximum number of connections allowed in the pool. In most cases, set this to zero (0). This means that each thread will get its own pool with a single connection in it, i.e. the connections are not shared between threads. If you really want to use shared pooling (why?), then set the max count to the same as the number of threads to ensure threads don't wait on each other.Max Wait (ms)Pool throws an error if the timeout period is exceeded in the process of trying to retrieve a connection, see BasicDataSource.html#getMaxWaitMillisTime Between Eviction Runs (ms)The number of milliseconds to sleep between runs of the idle object evictor thread. When non-positive, no idle object evictor thread will be run. (Defaults to "60000", 1 minute). See BasicDataSource.html#getTimeBetweenEvictionRunsMillisAuto CommitTurn auto commit on or off for the connections.Transaction isolationTransaction isolation levelPool Prepared StatementsMax number of Prepared Statements to pool per connection. "-1" disables the pooling and "0" means unlimited number of Prepared Statements to pool. (Defaults to "-1")Preinit PoolThe connection pool can be initialized instantly. If set to False (default), the JDBC request samplers using this pool might measure higher response times for the first queries – as the connection establishment time for the whole pool is included.Init SQL statements separated by new lineA Collection of SQL statements that will be used to initialize physical connections when they are first created. These statements are executed only once - when the configured connection factory creates the connection.Test While IdleTest idle connections of the pool, see BasicDataSource.html#getTestWhileIdle. Validation Query will be used to test it.Soft Min Evictable Idle Time(ms)Minimum amount of time a connection may sit idle in the pool before it is eligible for eviction by the idle object evictor, with the extra condition that at least minIdle connections remain in the pool. See BasicDataSource.html#getSoftMinEvictableIdleTimeMillis. Defaults to 5000 (5 seconds)Validation QueryA simple query used to determine if the database is still responding. This defaults to the 'isValid()' method of the jdbc driver, which is suitable for many databases. However some may require a different query; for example Oracle something like 'SELECT 1 FROM DUAL' could be used. The list of the validation queries can be configured with jdbc.config.check.query property and are by default: Note this validation query is used on pool creation to validate it even if "Test While Idle" suggests query would only be used on idle connections. This is DBCP behaviour. Fully qualified name of driver class. (Must be in JMeter's classpath - easiest to copy .jar file into JMeter's /lib directory). 
The list of preconfigured JDBC driver classes can be configured with the jdbc.config.jdbc.driver.class property and contains by default:
Microsoft SQL Server (MS JDBC driver): com.microsoft.sqlserver.jdbc.SQLServerDriver or com.microsoft.jdbc.sqlserver.SQLServerDriver
PostgreSQL: org.postgresql.Driver
Ingres: com.ingres.jdbc.IngresDriver
Apache Derby: org.apache.derby.jdbc.ClientDriver
H2: org.h2.Driver
Firebird: org.firebirdsql.jdbc.FBDriver
MariaDB: org.mariadb.jdbc.Driver
SQLite: org.sqlite.JDBC
Sybase ASE (jTDS): net.sourceforge.jtds.jdbc.Driver
Exasol: com.exasol.jdbc.EXADriver
Connection PropertiesConnection Properties to set when establishing the connection (such as internal_logon=sysdba for Oracle)Different databases and JDBC drivers require different JDBC settings. The Database URL and JDBC Driver class are defined by the provider of the JDBC implementation.
Some possible settings are shown below. Please check the exact details in the JDBC driver documentation.
If JMeter reports No suitable driver, then this could mean either:
The driver class was not found. In this case, there will be a log message such as DataSourceElement: Could not load driver: {classname} java.lang.ClassNotFoundException: {classname}
The driver class was found, but the class does not support the connection string. This could be because of a syntax error in the connection string, or because the wrong classname was used.
If the database server is not running or is not accessible, then JMeter will report a java.net.ConnectException. Some examples for databases and their parameters are given below.
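For instance, typical settings look like the following (these URLs are only illustrative - the exact form depends on your driver version and server setup, so check the driver documentation):
MySQL: Database URL jdbc:mysql://host[:port]/dbname, JDBC Driver class com.mysql.cj.jdbc.Driver (older Connector/J releases use com.mysql.jdbc.Driver)
PostgreSQL: Database URL jdbc:postgresql://host[:port]/dbname, JDBC Driver class org.postgresql.Driver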
The Keystore Config Element lets you configure how the Keystore will be loaded and which keys it will use. This component is typically used in HTTPS scenarios where you don't want the keystore initialization to be counted in the response time.
To use this element, you first need to set up a Java Key Store with the client certificates you want to test. To do that:
If created by PKI, import your keys in Java Key Store by converting them to a format acceptable by JKS Then reference the keystore file through the two JVM properties (or add them in system.properties): To use PKCS11 as the source for the store, you need to set javax.net.ssl.keyStoreType to PKCS11 and javax.net.ssl.keyStore to NONE.PreloadWhether or not to preload Keystore. Setting it to true is usually the best option.Variable name holding certificate aliasVariable name that will contain the alias to use for authentication by client certificate. Variable value will be filled from CSV Data Set for example. In the screenshot, "certificat_ssl" will also be a variable in CSV Data Set. Defaults to clientCertAliasVarNameFalseAlias Start IndexThe index of the first key to use in Keystore, 0-based.Alias End IndexThe index of the last key to use in Keystore, 0-based. When using "Variable name holding certificate alias" ensure it is large enough so that all keys are loaded at startup. Default to -1 which means load all.TCPClient classnameName of the TCPClient class. Defaults to the property tcp.handler, failing that TCPClientImpl.ServerName or IPName or IP of TCP serverRe-use connectionIf selected, the connection is kept open. Otherwise it is closed when the data has been read.Close connectionIf selected, the connection will be closed after running the sampler.SO_LINGEREnable/disable SO_LINGER with the specified linger time in seconds when a socket is created. If you set "SO_LINGER" value as 0, you may prevent large numbers of sockets sitting around with a TIME_WAIT status.End of line(EOL) byte valueByte value for end of line, set this to a value outside the range -128 to +127 to skip EOL checking. You may set this in jmeter.properties file as well with the tcp.eolByte property. If you set this in TCP Sampler Config and in jmeter.properties file at the same time, the setting value in the TCP Sampler Config will be used.Connect TimeoutConnect Timeout (milliseconds, 0 disables).Response TimeoutResponse Timeout (milliseconds, 0 disables).Set NodelayShould the nodelay property be set?Note that all the UDV elements in a test plan - no matter where they are - are processed at the start. So you cannot reference variables which are defined as part of a test run, e.g. in a Post-Processor. UDVs should not be used with functions that generate different results each time they are called. Only the result of the first function call will be saved in the variable. However, UDVs can be used with functions such as __P(), for example: If a runtime element such as a User Parameters Pre-Processor or Regular Expression Extractor defines a variable with the same name as one of the UDV variables, then this will replace the initial value, and all other test elements in the thread will see the updated value. If you have more than one Thread Group, make sure you use different names for different values, as UDVs are shared between Thread Groups. Also, the variables are not available for use until after the element has been processed, so you cannot reference variables that are defined in the same element. You can reference variables defined in earlier UDVs or on the Test Plan. Variable name/value pairs. The string under the "Name" column is what you'll need to place inside the brackets in ${…} constructs to use the variables later on. The whole ${…} will then be replaced by the string in the "Value" column. The Random Variable Config Element is used to generate random numeric strings and store them in variable for use later. 
It's simpler than using User Defined Variables together with the __Random() function. The output variable is constructed by using the random number generator, and then the resulting number is formatted using the format string. The number is calculated using the formula minimum+Random.nextInt(maximum-minimum+1). Random.nextInt() requires a positive integer. This means that maximum-minimum - i.e. the range - must be less than 2147483647, however the minimum and maximum values can be any long values so long as the range is OK.As the random value is evaluated at the start of each iteration, it is probably not a good idea to use a variable other than a property as a value for the minimum or maximum. It would be zero on the first iteration.The java.text.DecimalFormat format string to be used. For example "000" which will generate numbers with at least 3 digits, or "USER_000" which will generate output of the form USER_nnn. If not specified, the default is to generate the number using Long.toString()Minimum ValueThe minimum value (long) of the generated random number.Maximum ValueThe maximum value (long) of the generated random number.Random SeedThe seed for the random number generator. If you use the same seed value with Per Thread set to true, you will get the same value for each Thread as per Random class. If no seed is set, Default constructor of Random will be used.Per Thread(User)?If False, the generator is shared between all threads in the thread group. If True, then each thread has its own random generator.Allows the user to create a counter that can be referenced anywhere in the Thread Group. The counter config lets the user configure a starting point, a maximum, and the increment. The counter will loop from the start to the max, and then start over with the start, continuing on like that until the test is ended.
The counter uses a long to store the value, so the range is from -2^63 to 2^63-1.Starting valueThe starting value for the counter. The counter will equal this value during the first iteration (defaults to 0).IncrementHow much to increment the counter by after each iteration (defaults to 0, meaning no increment).Maximum valueIf the counter exceeds the maximum, then it is reset to the Starting value. Default is Long.MAX_VALUEFormatOptional format, e.g. 000 will format as 001, 002, etc. This is passed to DecimalFormat, so any valid formats can be used. If there is a problem interpreting the format, then it is ignored. [The default format is generated using Long.toString()]Exported Variable NameThis will be the variable name under which the counter value is available. If you name it counterA, you can then access it using ${counterA} as explained in user-defined values (By default, it creates an empty string variable that can be accessed using ${} but this is highly discouraged)Track Counter Independently for each UserIn other words, is this a global counter, or does each user get their own counter? If unchecked, the counter is global (i.e., user #1 will get value "1", and user #2 will get value "2" on the first iteration). If checked, each user has an independent counter.Reset counter on each Thread Group IterationThis option is only available when counter is tracked per User, if checked, counter will be reset to Start value on each Thread Group iteration. This can be useful when Counter is inside a Loop Controller.The Simple Config Element lets you add or override arbitrary values in samplers. You can choose the name of the value and the value itself. Although some adventurous users might find a use for this element, it's here primarily for developers as a basic GUI that they can use while developing new JMeter components.
Parameter NameThe name of each parameter. These values are internal to JMeter's workings and are not generally documented. Only those familiar with the code will know these values.Parameter ValueThe value to apply to that parameter.Creates a MongoDB connection (used by MongoDB ScriptSampler) from the supplied Connection settings. Each thread gets its own connection. The connection configuration name is used by the JDBC Sampler to select the appropriate connection. You can then access com.mongodb.DB object in Beanshell or JSR223 Test Elements through the element MongoDBHolder using this code import com.mongodb.DB; import org.apache.jmeter.protocol.mongodb.config.MongoDBHolder; DB db = MongoDBHolder.getDBFromSource("value of property MongoDB Source", "value of property Database Name");Keep TryingIf true, the driver will keep trying to connect to the same server in case that the socket cannot be established. There is maximum amount of time to keep retrying, which is 15s by default. This can be useful to avoid some exceptions being thrown when a server is down temporarily by blocking the operations. It can also be useful to smooth the transition to a new primary node (so that a new primary node is elected within the retry time).for a replica set, the driver will try to connect to the old primary node for that time, instead of failing over to the new one right away this does not prevent exception from being thrown in read/write operations on the socket, which must be handled by application. Even if this flag is false, the driver already has mechanisms to automatically recreate broken connections and retry the read operations. Default is false.Maximum connections per hostConnection timeoutThe connection timeout in milliseconds. It is used solely when establishing a new connection Socket.connect(java.net.SocketAddress, int) Default is 0 and means no timeout.Maximum retry timeThe maximum amount of time in milliseconds to spend retrying to open connection to the same server. Default is 0, which means to use the default 15s if autoConnectRetry is on.Maximum wait timeThe maximum wait time in milliseconds that a thread may wait for a connection to become available. Default is 120,000.Socket timeoutThe socket timeout in milliseconds It is used for I/O socket read and write operations Socket.setSoTimeout(int) Default is 0 and means no timeout.Socket keep aliveThis flag controls the socket keep alive feature that keeps a connection alive through firewalls Socket.setKeepAlive(boolean) Default is false.ThreadsAllowedToBlockForConnectionMultiplierThis multiplier, multiplied with the connectionsPerHost setting, gives the maximum number of threads that may be waiting for a connection to become available from the pool. All further threads will get an exception right away. For example if connectionsPerHost is 10 and threadsAllowedToBlockForConnectionMultiplier is 5, then up to 50 threads can wait for a connection. Default is 5.Write Concern: SafeIf true the driver will use a WriteConcern of WriteConcern.SAFE for all operations. If w, wtimeout, fsync or j are specified, this setting is ignored. Default is false.Write Concern: FsyncThe fsync value of the global WriteConcern. Default is false.Write Concern: Wait for JournalThe j value of the global WriteConcern. Default is false.Write Concern: Wait for serversThe w value of the global WriteConcern. Default is 0.Write Concern: Wait timeoutThe wtimeout value of the global WriteConcern. 
Default is 0.Write Concern: Continue on errorIf batch inserts should continue after the first errorConnection Pool Max SizeMax size of the Neo4j driver Bolt connection pool. Raise the value if running large number of concurrent threads, so that JMeter threads are not blocked waiting for a connection to be released to the pool.Assertions are used to perform additional checks on samplers, and are processed after every sampler in the same scope. To ensure that an Assertion is applied only to a particular sampler, add it as a child of the sampler. Note: Unless documented otherwise, Assertions are not applied to sub-samples (child samples) - only to the parent sample. In the case of JSR223 and BeanShell Assertions, the script can retrieve sub-samples using the method prev.getSubResults() which returns an array of SampleResults. The array will be empty if there are none. Assertions can be applied to either the main sample, the sub-samples or both. The default is to apply the assertion to the main sample only. If the Assertion supports this option, then there will be an entry on the GUI which looks like the following: If a sub-sampler fails and the main sample is successful, then the main sample will be set to failed status and an Assertion Result will be added. If the JMeter variable option is used, it is assumed to relate to the main sample, and any failure will be applied to the main sample only.The response assertion control panel lets you add pattern strings to be compared against various fields of the request or response. The pattern strings are: You can also choose whether the strings will be expected to match the entire response, or if the response is only expected to contain the pattern. You can attach multiple assertions to any controller for additional flexibility. Note that the pattern string should not include the enclosing delimiters, i.e. use Price: \d+ not /Price: \d+/. By default, the pattern is in multi-line mode, which means that the "." meta-character does not match newline. In multi-line mode, "^" and "$" match the start or end of any line anywhere within the string - not just the start and end of the entire string. Note that \s does match new-line. Case is also significant. To override these settings, one can use the extended regular expression syntax. For example: The overall success of the sample is determined by combining the result of the assertion with the existing Response status. When the Ignore Status checkbox is selected, the Response status is forced to successful before evaluating the Assertion. HTTP Responses with statuses in the 4xx and 5xx ranges are normally regarded as unsuccessful. The "Ignore status" checkbox can be used to set the status successful before performing further checks.
Note that this will have the effect of clearing any previous assertion failures, so make sure that this is only set on the first assertion.Pattern Matching RulesIndicates how the text being tested is checked against the pattern. Equals and Substring patterns are plain strings, not regular expressions. NOT may also be selected to invert the result of the check. OR Apply each assertion in OR combination (if 1 pattern to test matches, Assertion will be ok) instead of AND (All patterns must match so that Assertion is OK).Patterns to TestA list of patterns to be tested. Each pattern is tested separately. If a pattern fails, then further patterns are not checked. There is no difference between setting up one Assertion with multiple patterns and setting up multiple Assertions with one pattern each (assuming the other options are the same). However, when the Ignore Status checkbox is selected, this has the effect of cancelling any previous assertion failures - so make sure that the Ignore Status checkbox is only used on the first Assertion.Custom failure messageLets you define the failure message that will replace the generated oneThe Duration Assertion tests that each response was received within a given amount of time. Any response that takes longer than the given number of milliseconds (specified by the user) is marked as a failed response.
Duration in MillisecondsThe maximum number of milliseconds each response is allowed before being marked as failed.The Size Assertion tests that each response contains the right number of bytes in it. You can specify that the size be equal to, greater than, less than, or not equal to a given number of bytes.
An empty response is treated as being 0 bytes rather than reported as an error.Size in bytesThe number of bytes to use in testing the size of the response (or value of the JMeter variable).Type of ComparisonWhether to test that the response is equal to, greater than, less than, or not equal to, the number of bytes specified.The XML Assertion tests that the response data consists of a formally correct XML document. It does not validate the XML based on a DTD or schema or do any further validation.
Note that a different Interpreter is used for each independent occurrence of the assertion in each thread in a test script, but the same Interpreter is used for subsequent invocations. This means that variables persist across calls to the assertion. All Assertions are called from the same thread as the sampler. If the property "beanshell.assertion.init" is defined, it is passed to the Interpreter as the name of a sourced file. This can be used to define common methods and variables. There is a sample init file in the bin directory: BeanShellAssertion.bshrc The test element supports the ThreadListener and TestListener methods. These should be defined in the initialisation file. See the file BeanShellListeners.bshrc for example definitions.Reset bsh.Interpreter before each callIf this option is selected, then the interpreter will be recreated for each sample. This may be necessary for some long running scripts. For further information, see Best Practices - BeanShell scripting.ParametersParameters to pass to the BeanShell script. The parameters are stored in the following variables: A file containing the BeanShell script to run. This overrides the script. The file name is stored in the script variable FileNameScriptThe BeanShell script to run. The return value is ignored.Yes (unless script file is provided)The XPath Assertion tests a document for well formedness, has the option of validating against a DTD, or putting the document through JTidy and testing for an XPath. If that XPath exists, the Assertion is true. Using "/" will match any well-formed document, and is the default XPath Expression. The assertion also supports boolean expressions, such as "count(//*error)=2". See http://www.w3.org/TR/xpath for more information on XPath. Some sample expressions:Invert assertion(will fail if above conditions met)True if a XPath expression is not matched or returns falseprovide a Properties file (if for example your file is named namespaces.properties) which contains mappings for the namespace prefixes: prefix1=http\://foo.apache.org prefix2=http\://toto.apache.org The XPath2 Assertion tests a document for well formedness. Using "/" will match any well-formed document, and is the default XPath2 Expression. The assertion also supports boolean expressions, such as "count(//*error)=2". Some sample expressions:Namespaces aliases listList of namespaces aliases you want to use to parse the document, one line per declaration. You must specify them as follow: prefix=namespace. This implementation makes it easier to use namespaces than with the old XPathExtractor version.XPath2 AssertionXPath to match in the document.Invert assertionWill fail if xpath expression returns true or matches, succeed otherwiseNamespace aliases listList of namespace aliases prefix=full namespace (one per line)Script fileA file containing the script to run, if a relative file path is used, then it will be relative to directory referenced by "user.dir" System propertyScript compilation cachingUnique String across Test Plan that JMeter will use to cache result of Script compilation if language used supports Compilable interface (Groovy is one of these, java, BeanShell and JavaScript are not)See note in JSR223 Sampler Java System property if you're using Groovy without checking this optionScriptThe script to run.Yes (unless script file is provided)The following variables are set up for use by the script:
log - (Logger) - can be used to write to the log file Label - the String Label Filename - the script file name (if any) Parameters - the parameters (as a String) args - the parameters as a String array (split on whitespace) ctx - (JMeterContext) - gives access to the context vars - (JMeterVariables) - gives read/write access to variables: vars.get(key); vars.put(key,val); vars.putObject("OBJ1",new Object()); vars.getObject("OBJ2"); The script can check various aspects of the SampleResult. If an error is detected, the script should use AssertionResult.setFailureMessage("message") and AssertionResult.setFailure(true).For further details of all the methods available on each of the above variables, please check the Javadoc
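As a minimal sketch of a JSR223 Assertion script using these bindings (the marker string is invented for the example; prev is the usual binding for the current SampleResult - verify the exact variable names against your JMeter version):
// fail the sample if the response body does not contain an expected marker
String body = prev.getResponseDataAsString();
if (!body.contains("expected-marker")) {
    AssertionResult.setFailure(true);
    AssertionResult.setFailureMessage("Response did not contain 'expected-marker'");
}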
The Compare Assertion must not be used during a load test as it consumes a lot of resources (memory and CPU). Use it only for functional testing or during Test Plan debugging and validation. The Compare Assertion can be used to compare sample results within its scope. Either the contents or the elapsed time can be compared, and the contents can be filtered before comparison. The assertion comparisons can be seen in the Comparison Assertion Visualizer.Compare TimeIf the value is ≥0, then check if the response time difference is no greater than the value. I.e. if the value is 0, then the response times must be exactly equal.Comparison FiltersFilters can be used to remove strings from the content comparison. For example, if the page has a time-stamp, it might be matched with: "Time: \d\d:\d\d:\d\d" and replaced with a dummy fixed time "Time: HH:MM:SS".
The SMIME Assertion can be used to evaluate the sample results from the Mail Reader Sampler. This assertion verifies whether the body of a mime message is signed or not. The signature can also be verified against a specific signer certificate. As this is functionality that is not necessarily needed by most users, additional jars need to be downloaded and added to JMETER_HOME/lib:Verify SignatureIf selected, the assertion will verify if it is a valid signature according to the parameters defined in the Signer Certificate box.Message not signedWhether or not to expect a signature in the messageSigner Certificate"No Check" means that it will not perform signature verification. "Check values" is used to verify the signature against the inputs provided. And "Certificate file" will perform the verification against a specific certificate file.Message PositionThe Mail sampler can retrieve multiple messages in a single sample. Use this field to specify which message will be checked. Messages are numbered from 0, so 0 means the first message. Negative numbers count from the LAST message; -1 means LAST, -2 means penultimate etc.
This component allows you to perform validations of JSON documents. First, it will parse the JSON and fail if the data is not JSON. Second, it will search for the specified path, using syntax from Jayway JsonPath 1.2.0. If the path is not found, it will fail. Third, if the JSON path was found in the document and validation against an expected value was requested, it will perform that validation. For the null value there is a special checkbox in the GUI. Note that if the path returns an array, it will be iterated over, and if the expected value is found, the assertion will succeed. To validate an empty array, use the [] string. Also, if the path returns a dictionary object, it will be converted to a string before comparison. When using indefinite JSON Paths you must assert the value due to the existing JSON library implementation, otherwise the assertion could always return successful. Since JMeter 5.5 the assertion will fail if an indefinite path is given, an empty list is extracted, and no assertion value is set.Invert assertion (will fail if above conditions met)Invert assertion (will fail if above conditions met)
This component allows you to perform assertions on JSON document content using JMESPath. First, it will parse the JSON and fail if the data is not JSON. Second, it will search for the specified path, using JMESPath syntax. If the path is not found, it will fail. Third, if the JMES path was found in the document and validation against an expected value was requested, it will perform this additional check.
If you want to check for nullity, use the Expect null checkbox. Note that the path cannot be null as the expression JMESPath will not be compiled and an error will occur. Even if you expect an empty or null response, you must put a valid JMESPath expression.Additionally assert valueSelect checkbox if you check the extracted JMESPath against an expected oneMatch as regular expressionSelect checkbox if you want to use a regular expression for matchingExpected ValueValue to use for exact matching or regular expression if Match as regular expression is checkedExpect nullSelect checkbox if you expect the value to be nullInvert assertion (will fail if above conditions met)Invert assertion (will fail if above conditions met)Since version 3.1, a new feature (in Beta mode as of JMeter 3.1 and subject to changes) has been implemented which provides the following feature. You can apply a multiplication factor on the sleep delays computed by Random timer by setting property timer.factor=float number where float number is a decimal positive number. JMeter will multiply this factor by the computed sleep delay. This feature can be used by: Note that timers are processed before each sampler in the scope in which they are found; if there are several timers in the same scope, all the timers will be processed before each sampler. Timers are only processed in conjunction with a sampler. A timer which is not in the same scope as a sampler will not be processed at all. To apply a timer to a single sampler, add the timer as a child element of the sampler. The timer will be applied before the sampler is executed. To apply a timer after a sampler, either add it to the next sampler, or add it as the child of a Flow Control Action Sampler. This timer pauses each thread request for a random amount of time, with most of the time intervals occurring near a particular value. The total delay is the sum of the Gaussian distributed value (with mean 0.0 and standard deviation 1.0) times the deviation value you specify, and the offset value. Another way to explain it, in Gaussian Random Timer, the variation around constant offset has a Gaussian curve distribution.This timer pauses each thread request for a random amount of time, with each time interval having the same probability of occurring. The total delay is the sum of the random value and the offset value.
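The two descriptions above boil down to simple formulas; a rough Java sketch, purely illustrative and not JMeter's actual implementation, with made-up values:
// java.util.Random assumed imported
double offset = 300;     // constant delay offset in ms
double deviation = 100;  // Gaussian Random Timer: deviation in ms
double range = 100;      // Uniform Random Timer: maximum random delay in ms
Random random = new Random();
long gaussianDelay = (long) (offset + random.nextGaussian() * deviation);
long uniformDelay  = (long) (offset + random.nextDouble() * range);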
This timer introduces variable pauses, calculated to keep the total throughput (in terms of samples per minute) as close as possible to a given figure. Of course the throughput will be lower if the server is not capable of handling it, or if other timers or time-consuming test elements prevent it.
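For example, with a target of 120 samples per minute shared by 4 active threads in the group (using the per-thread-group calculation mode described below), each thread aims at roughly 30 samples per minute, i.e. it delays so that it starts a new sample about every 2 seconds, provided the samples themselves complete faster than that.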
N.B. although the Timer is called the Constant Throughput timer, the throughput value does not need to be constant. It can be defined in terms of a variable or function call, and the value can be changed during a test. The value can be changed in various ways: all active threads in current thread group - the target throughput is divided amongst all the active threads in the group. Each thread will delay as needed, based on when it last ran. all active threads - the target throughput is divided amongst all the active threads in all Thread Groups. Each thread will delay as needed, based on when it last ran. In this case, each other Thread Group will need a Constant Throughput timer with the same settings. all active threads in current thread group (shared) - as above, but each thread is delayed based on when any thread in the group last ran. all active threads (shared) - as above; each thread is delayed based on when any thread last ran. The shared and non-shared algorithms both aim to generate the desired throughput, and will produce similar results. The shared algorithm should generate a more accurate overall transaction rate. The non-shared algorithm should generate a more even spread of transactions across threads.This timer introduces variable pauses, calculated to keep the total throughput (e.g. in terms of samples per minute) as close as possible to a given figure. The timer does not generate threads, so the resulting throughput will be lower if the server is not capable of handling it, or if other timers add too big delays, or if there's not enough threads, or time-consuming test elements prevent it.
Note: in many cases, Open Model Thread Group would be a better choice for generating the desired load profile. Note: if you alter the timer configuration on the fly, then it might take time to adapt to the new settings. For instance, if the timer was initially configured for 1 request per hour, it assigns incoming threads pauses of 3600+ seconds. If the load configuration is then altered to 1 per second, the threads are not interrupted from their current delays, so they keep waiting. Although the Timer is called Precise Throughput Timer, it does not aim to produce precisely the same number of samples over one-second intervals during the test.
The timer works best for rates under 36000 requests/hour (i.e. 10 requests per second); however, your mileage might vary (see the monitoring section below if your goals are vastly different).
Best location of a Precise Throughput Timer in a Test Plan
As you might know, timers are inherited by all the siblings and their child elements. That is why one of the best places for a Precise Throughput Timer is under the first element in a test loop. For instance, you might add a dummy sampler at the beginning and place the timer under that dummy sampler.
Produced schedule
Precise Throughput Timer models a Poisson arrival schedule. That kind of schedule often happens in real life, so it makes sense to use it for load testing. For instance, it might naturally generate samples that are close together, and thus reveal concurrency issues. Even if you manage to generate Poisson arrivals with the Poisson Random Timer, it would be susceptible to the issues listed below: true Poisson arrivals might have an indefinitely long pause, which is not practical for load testing, and "regular" Poisson arrivals with a 1 per second rate might end up with 50 samples over a 60-second-long test. Constant Throughput Timer converges to the specified rate, however it tends to produce samples at even intervals.
Ramp-up and startup spike
You might used "ramp-up" or similar approaches to avoid a spike at the test start. For instance, if you configure Thread Group to have 100 threads, and set Ramp-up Period to 0 (or to a small number), then all the threads would start at the same time, and it would produce an unwanted spike of the load. On top of that, if you set Ramp-up Period too high, it might result in "too few" threads being available at the very beginning to achieve the required load. Precise Throughput Timer schedules executions in a random way, so it can be used to generate constant load, and it is recommended to set both Ramp-up Period and Delay to 0.Multiple thread groups starting at the same time
A variation of the ramp-up issue might appear when the Test Plan includes multiple Thread Groups. To mitigate that issue one typically adds a "random" delay to each Thread Group so the threads start at different times. Precise Throughput Timer avoids that issue since it schedules executions in a random way, so you do not need to add extra random delays to mitigate the startup spike.
Number of iterations per hour
One of the basic requirements is to issue N samples per M minutes. Let it be 60 iterations per hour. Business customers would not understand if you reported load test results with 57 executions "just because the random was random". In order to generate 60 iterations per hour, you need to configure the timer as follows (other parameters can be left at their default values):
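A configuration consistent with the discussion below would look like this (the Test duration value is illustrative - set it to the length of your planned run):
Target throughput (in samples per 'throughput period'): 60
Throughput period (seconds): 3600
Test duration (seconds): 3600 for a one-hour run (or, say, 300 for a 5-minute run)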
The first two options set the throughput. Even though 60/3600, 30/1800, and 120/7200 represent exactly the same load level, pick the one that represents the business requirements better. For instance, if the requirement is to test for "60 samples per hour", then set 60/3600. If the requirement is to test "1 sample per minute", then set 1/60.
Test duration (seconds) is there so the timer can ensure the exact number of samples for a given test duration. Precise Throughput Timer creates a schedule for the samples at test startup. For instance, if you wish to perform a 5-minute test with a 60 per hour throughput, you would set Test duration (seconds) to 300. This allows throughput to be configured in a business-friendly way. Note: Test duration (seconds) does not limit the test duration. It is just a hint for the timer.
Number of threads and think times
One of the common pitfalls is to adjust the number of threads and the think times in order to end up with the desired throughput. Even though it might work, that approach results in a lot of time spent on test runs, and the threads and delays might need to be adjusted again when a new application version arrives.
Precise Throughput Timer lets you set a throughput goal and go for it no matter how well the application performs. To do that, Precise Throughput Timer creates a schedule at test startup, then uses that schedule to release threads. The main driver for the think times and the number of threads should be business requirements, not the desire to match the throughput somehow. For instance, suppose your application is used by support engineers in a call center, there are 2 engineers in the call center, the target throughput is 1 per minute, and it takes 4 minutes for an engineer to read and review the web page. In that case you should set 2 threads in the group, use 4 minutes for the think time delays, and specify 1 per minute in Precise Throughput Timer. Of course this would result in something around 2 samples / 4 minutes = 0.5 per minute, and the result of such a test means "you need more support engineers in the call center" or "you need to reduce the time it takes an engineer to fulfill a task".
Testing low rates and repeatable tests
Testing low rates and repeatable tests
Testing at low rates (e.g. 60 per hour) requires knowing the desired test profile. For instance, if you need to inject load at even intervals (e.g. 60 seconds in between), then you'd better use Constant Throughput Timer. However, if you need a randomized schedule (e.g. to model real users that execute reports), then Precise Throughput Timer is your friend.
When comparing the outcomes of multiple load tests, it is useful to be able to repeat exactly the same test profile. For instance, if action X (e.g. "Profit Report") is invoked 5 minutes after the test start, then it would be nice to replicate that pattern in subsequent test executions. Replicating the same load pattern simplifies analysis of the test results (e.g. a CPU% chart).
Random seed (change from 0 to random) controls the seed value used by Precise Throughput Timer. By default it is initialized with 0, which means a random seed is used for each test execution. If you need a repeatable load pattern, then change Random seed to some non-zero value. The general advice is to use a non-zero seed; "0 by default" is an implementation limit. Note: when using multiple thread groups with the same throughput rates and the same non-zero seed, the samples might be fired at the same time, which may be unwanted.
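To illustrate how a constant seed makes the schedule repeatable, here is a minimal, hypothetical Groovy sketch (not JMeter's actual implementation) that generates Poisson-process timings from a seeded random generator; the rate, duration and seed values are assumed for the example:

    // Assumed illustration values: 1 sample/second for 20 seconds, fixed seed 42
    double rate = 1.0
    double duration = 20.0
    Random rnd = new Random(42L)          // non-zero, constant seed => same schedule every run

    double t = 0.0
    List<Double> schedule = []
    while (true) {
        t += -Math.log(1.0 - rnd.nextDouble()) / rate   // exponential inter-arrival gap
        if (t > duration) break
        schedule << t
    }
    println "first events: ${schedule.take(5)}"

Re-running this with the same seed prints the same event times; replacing 42L with an unseeded new Random() corresponds to the "0 means truly random" behaviour described above.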
Testing high rates and/or long test durations
Precise Throughput Timer generates the schedule and keeps it in memory. In most cases this is not a problem, however, remember that you might want to keep the schedule shorter than 1'000'000 samples. It takes ~200ms to generate a schedule for 1'000'000 samples, and the schedule consumes 8 megabytes in the heap. A schedule for 10 million entries takes 1-2 seconds to build and consumes 80 megabytes in the heap. For instance, if you want to perform a 2-week long test with a rate of 5'000 per hour, then you probably want to have exactly 5'000 samples for each hour. You can set the Test duration (seconds) property of the timer to 1 hour. Then the timer would create a schedule of 5'000 samples for an hour, and when the schedule is exhausted, the timer would generate a schedule for the next hour. At the same time, you could set Test duration (seconds) to 2 weeks, and the timer would generate a schedule with 1'680'000 samples = 2 weeks * 7 days * 24 hours * 5'000 samples/hour. Based on the figures above, that schedule would take a few hundred milliseconds to generate and consume roughly 13 megabytes in the heap.
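As a rough cross-check of these figures, a back-of-the-envelope Groovy calculation, assuming the ~8 bytes per scheduled sample implied by the numbers above (an estimate, not JMeter's exact accounting):

    long samplesPerHour = 5000
    long hours = 2 * 7 * 24                       // two weeks
    long events = samplesPerHour * hours          // 1'680'000 scheduled samples
    double heapMb = events * 8 / 1_000_000d       // ~13.4 "megabytes" at 8 bytes per sample
    println "events=${events}, approx heap=${heapMb} MB"

which is consistent with the roughly-13-megabyte estimate above.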
Bursty load
There might be cases when all the samples should come in pairs, triples, and so on. Certain cases can be solved via the Synchronizing Timer, however Precise Throughput Timer has a native way to issue requests in packs. This behavior is disabled by default, and it is controlled with the "Batched departures" settings.
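For example, with a target throughput of 60 samples per hour and "Number of threads in the batch" set to 2, the timer schedules about 30 departure points per hour and releases two threads at each of them, so the average throughput still matches the configured 60 per hour.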
Variable load rate
Even though property values (e.g. throughput) can be defined via expressions, it is recommended to keep the value more or less constant throughout the test, as it takes time to recompute the new schedule when the values change.
Monitoring
As the next schedule is generated, Precise Throughput Timer logs a message to jmeter.log: 2018-01-04 17:34:03,635 INFO o.a.j.t.ConstantPoissonProcessGenerator: Generated 21 timings (... 20 required, rate 1.0, duration 20, exact lim 20000, i21) in 0 ms. First 15 events will be fired at: 1.1869653574244292 (+1.1869653574244292), 1.4691340403043207 (+0.2821686828798915), 3.638151706179226 (+2.169017665874905), 3.836357090410566 (+0.19820538423134026), 4.709330071408575 (+0.8729729809980085), 5.61330076999953 (+0.903970698590955), This shows that schedule generation took 0ms, and it shows absolute timestamps in seconds. In the case above, the rate was set to 1 per second, and the actual timestamps became 1.2 sec, 1.5 sec, 3.6 sec, 3.8 sec, 4.7 sec, and so on.
Target throughput (in samples per 'throughput period'): Maximum number of samples you want to obtain per "throughput period", including all threads in the group, from all affected samplers.
Throughput period (seconds): Throughput period. For example, if "throughput" is set to 42 and "throughput period" to 21 sec, then you'll get 2 samples per second.
Test duration (seconds): This is used to ensure you'll get throughput*duration samples during the "test duration" timeframe.
Number of threads in the batch (threads): If the value exceeds 1, then multiple threads depart from the timer simultaneously. Average throughput still meets the "throughput" value.
Delay between threads in the batch (ms): For instance, if set to 42, and the batch size is 3, then threads will depart at x, x+42ms, x+84ms.
Random seed (change from 0 to random): Note: different timers should preferably have different seed values. A constant seed ensures the timer generates the same delays at each test start. The value "0" means the timer is truly random (non-repeatable from one execution to another).
The purpose of the SyncTimer is to block threads until X number of threads have been blocked, and then they are all released at once. A SyncTimer can thus create large instant loads at various points of the test plan.
Number of Simultaneous Users to Group by: Number of threads to release at once. Setting it to 0 is equivalent to setting it to the number of threads in the Thread Group.
Timeout in milliseconds: If set to 0, the timer will wait for the number of threads to reach the value in "Number of Simultaneous Users to Group by". If greater than 0, then the timer will wait at most "Timeout in milliseconds" for that number of threads; if the number of waiting users is not reached after the timeout interval, the timer stops waiting. Defaults to 0.
If Timeout in milliseconds is set to 0 and the number of threads never reaches "Number of Simultaneous Users to Group by", then the test will pause indefinitely. Only a forced stop will stop it. Setting Timeout in milliseconds is an option to consider in this case. The Synchronizing Timer blocks only within one JVM, so if using Distributed testing ensure you never set "Number of Simultaneous Users to Group by" to a value greater than the number of users of its containing Thread Group, considering 1 injector only.
The test element supports the ThreadListener and TestListener methods. These should be defined in the initialisation file. See the file BeanShellListeners.bshrc for example definitions.
Name: Descriptive name for this element that is shown in the tree. The name is stored in the script variable Label.
Reset bsh.Interpreter before each call: If this option is selected, then the interpreter will be recreated for each sample. This may be necessary for some long running scripts.
For further information, see Best Practices - BeanShell scripting.
Parameters: Parameters to pass to the BeanShell script. The parameters are stored in the following variables:
Script file: A file containing the BeanShell script to run. The file name is stored in the script variable FileName. The return value is used as the number of milliseconds to wait.
Script: The BeanShell script. The return value is used as the number of milliseconds to wait. Required: Yes (unless script file is provided)
Before invoking the script, some variables are set up in the BeanShell interpreter:
log - (Logger) - can be used to write to the log file
ctx - (JMeterContext) - gives access to the context
vars - (JMeterVariables) - gives read/write access to variables: vars.get(key); vars.put(key,val); vars.putObject("OBJ1",new Object());
For details of all the methods available on each of the above variables, please check the Javadoc
If the property beanshell.timer.init is defined, this is used to load an initialisation file, which can be used to define methods etc. for use in the BeanShell script.
Script file: A file containing the script to run; if a relative file path is used, it will be relative to the directory referenced by the "user.dir" System property. The return value is converted to a long integer and used as the number of milliseconds to wait.
Script compilation caching: Unique String across the Test Plan that JMeter will use to cache the result of the script compilation if the language used supports the Compilable interface (Groovy is one of these; java, beanshell and javascript are not). See the note in the JSR223 Sampler about the Java System property if you're using Groovy without checking this option.
Script: The script. The return value is used as the number of milliseconds to wait. Required: Yes (unless script file is provided)
Before invoking the script, some variables are set up in the script interpreter:
log - (Logger) - can be used to write to the log file
ctx - (JMeterContext) - gives access to the context
vars - (JMeterVariables) - gives read/write access to variables: vars.get(key); vars.put(key,val); vars.putObject("OBJ1",new Object());
props - (JMeterProperties - class java.util.Properties) - e.g. props.get("START.HMS"); props.put("PROP1","1234");
sampler - (Sampler) - the current Sampler
Label - the name of the Timer
FileName - the file name (if any)
OUT - System.out
For details of all the methods available on each of the above variables, please check the Javadoc
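As an illustration of how these variables can be used, here is a small, hypothetical Groovy script for a JSR223 Timer; the delay values and the 1-second threshold are arbitrary example choices, not recommendations:

    // Return value = delay in milliseconds before the next sample
    def previous = ctx.getPreviousResult()                 // may be null on the first sample
    long base = 200
    long jitter = new Random().nextInt(500)                // 0-499 ms of randomness
    long penalty = (previous != null && previous.getTime() > 1000) ? 300 : 0
    log.info("Computed timer delay: " + (base + jitter + penalty) + " ms")
    return base + jitter + penalty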
This timer pauses each thread request for a random amount of time, with most of the time intervals occurring near a particular value. The total delay is the sum of the Poisson-distributed value and the offset value.
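The shape of that delay can be sketched as follows (a hypothetical Groovy illustration that draws a Poisson-distributed value by counting unit-rate exponential gaps; the lambda and offset values are assumptions, and this is not JMeter's exact implementation):

    double lambda = 300.0      // mean of the Poisson part, in milliseconds (assumed)
    long offset = 1000L        // constant delay offset, in milliseconds (assumed)
    Random rnd = new Random()

    long k = 0
    double t = 0.0
    while (true) {
        t += -Math.log(1.0 - rnd.nextDouble())   // unit-rate exponential gap
        if (t > lambda) break
        k++                                       // k ends up Poisson(lambda) distributed
    }
    long totalDelay = k + offset                  // Poisson value plus offset, as described above
    println "total delay = ${totalDelay} ms"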
Note: if you want to model Poisson arrivals, consider using Precise Throughput Timer instead.
This modifier parses the HTML response from the server and extracts links and forms. A URL test sample that passes through this modifier will be examined to see if it "matches" any of the links or forms extracted from the immediately previous response. It would then replace the values in the URL test sample with appropriate values from the matching link or form. Perl-type regular expressions are used to find matches.
If using distributed testing, ensure you switch mode (see jmeter.properties) so that it's not a stripping one, see 56376 Consider a simple example: let's say you wanted JMeter to "spider" through your site, hitting link after link parsed from the HTML returned from your server (this is not actually the most useful thing to do, but it serves as a good example). You would create a Simple Controller, and add the "HTML Link Parser" to it. Then, create an HTTP Request, and set the domain to ".*", and the path likewise. This will cause your test sample to match with any link found on the returned pages. If you wanted to restrict the spidering to a particular domain, then change the domain value to the one you want. Then, only links to that domain will be followed. A more useful example: given a web polling application, you might have a page with several poll options as radio buttons for the user to select. Let's say the values of the poll options are very dynamic - maybe user generated. If you wanted JMeter to test the poll, you could either create test samples with hardcoded values chosen, or you could let the HTML Link Parser parse the form, and insert a random poll option into your URL test sample. To do this, follow the above example, except, when configuring your Web Test controller's URL options, be sure to choose "POST" as the method. Put in hard-coded values for the domain, path, and any additional form parameters. Then, for the actual radio button parameter, put in the name (let's say it's called "poll_choice"), and then ".*" for the value of that parameter. When the modifier examines this URL test sample, it will find that it "matches" the poll form (and it shouldn't match any other form, given that you've specified all the other aspects of the URL test sample), and it will replace your form parameters with the matching parameters from the form. Since the regular expression ".*" will match with anything, the modifier will probably have a list of radio buttons to choose from. It will choose at random, and replace the value in your URL test sample. Each time through the test, a new random value will be chosen.One important thing to remember is that you must create a test sample immediately prior that will return an HTML page with the links and forms that are relevant to your dynamic test sample.This modifier works similarly to the HTML Link Parser, except it has a specific purpose for which it is easier to use than the HTML Link Parser, and more efficient. For web applications that use URL Re-writing to store session ids instead of cookies, this element can be attached at the ThreadGroup level, much like the HTTP Cookie Manager. Simply give it the name of the session id parameter, and it will find it on the page and add the argument to every request of that ThreadGroup. Alternatively, this modifier can be attached to select requests and it will modify only them. Clever users will even determine that this modifier can be used to grab values that elude the HTML Link Parser.Session Argument NameThe name of the parameter to grab from previous response. This modifier will find the parameter anywhere it exists on the page, and grab the value assigned to it, whether it's in an HREF or a form.Path ExtensionSome web apps rewrite URLs by appending a semi-colon plus the session id parameter. 
Check this box if that is so.Do not use equals in path extensionSome web apps rewrite URLs without using an "=" sign between the parameter name and value (such as Intershop Enfinity).Do not use questionmark in path extensionPrevents the query string to end up in the path extension (such as Intershop Enfinity).Cache Session Id?Should the value of the session Id be saved for later use when the session Id is not present?URL EncodeURL Encode value when writing parameter If using distributed testing, ensure you switch mode (see jmeter.properties) so that it's not a stripping one, see 56376.Allows the user to specify values for User Variables specific to individual threads.
User Variables can also be specified in the Test Plan but not specific to individual threads. This panel allows you to specify a series of values for any User Variable. For each thread, the variable will be assigned one of the values from the series in sequence. If there are more threads than values, the values get re-used. For example, this can be used to assign a distinct user id to be used by each thread. User variables can be referenced in any field of any JMeter Component.
The variable is specified by clicking the Add Variable button in the bottom of the panel and filling in the Variable name in the 'Name:' column. To add a new value to the series, click the 'Add User' button and fill in the desired value in the newly added column. Values can be accessed in any test component in the same thread group, using the function syntax: ${variable}. See also the CSV Data Set Config element, which is more suitable for large numbers of parameters A flag to indicate whether the User Parameters element should update its variables only once per iteration. if you embed functions into the UP, then you may need greater control over how often the values of the variables are updated. Keep this box checked to ensure the values are updated each time through the UP's parent controller. Uncheck the box, and the UP will update the parameters for every sample request made within its scope. The test element supports the ThreadListener and TestListener methods. These should be defined in the initialisation file. See the file BeanShellListeners.bshrc for example definitions. Descriptive name for this element that is shown in the tree. The name is stored in the script variable LabelReset bsh.Interpreter before each callIf this option is selected, then the interpreter will be recreated for each sample. This may be necessary for some long running scripts. For further information, see Best Practices - BeanShell scripting.ParametersParameters to pass to the BeanShell script. The parameters are stored in the following variables: A file containing the BeanShell script to run. The file name is stored in the script variable FileNameScriptThe BeanShell script. The return value is ignored.Yes (unless script file is provided)Before invoking the script, some variables are set up in the BeanShell interpreter:
log - (Logger) - can be used to write to the log file
ctx - (JMeterContext) - gives access to the context
vars - (JMeterVariables) - gives read/write access to variables: vars.get(key); vars.put(key,val); vars.putObject("OBJ1",new Object());
props - (JMeterProperties - class java.util.Properties) - e.g. props.get("START.HMS"); props.put("PROP1","1234");
prev - (SampleResult) - gives access to the previous SampleResult (if any)
sampler - (Sampler) - gives access to the current sampler
For details of all the methods available on each of the above variables, please check the Javadoc
If the property beanshell.preprocessor.init is defined, this is used to load an initialisation file, which can be used to define methods etc. for use in the BeanShell script.
Script file: A file containing the script to run; if a relative file path is used, it will be relative to the directory referenced by the "user.dir" System property.
Script compilation caching: Unique String across the Test Plan that JMeter will use to cache the result of the script compilation if the language used supports the Compilable interface (Groovy is one of these; java, beanshell and javascript are not). See the note in the JSR223 Sampler about the Java System property if you're using Groovy without checking this option.
Script: The script to run. Required: Yes (unless script file is provided)
The following JSR223 variables are set up for use by the script:
log - (Logger) - can be used to write to the log file
Label - the String Label
FileName - the script file name (if any)
Parameters - the parameters (as a String)
args - the parameters as a String array (split on whitespace)
ctx - (JMeterContext) - gives access to the context
vars - (JMeterVariables) - gives read/write access to variables: vars.get(key); vars.put(key,val); vars.putObject("OBJ1",new Object()); vars.getObject("OBJ2");
props - (JMeterProperties - class java.util.Properties) - e.g. props.get("START.HMS"); props.put("PROP1","1234");
sampler - (Sampler) - gives access to the current sampler
OUT - System.out - e.g. OUT.println("message")
For details of all the methods available on each of the above variables, please check the Javadoc
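As a usage sketch (hypothetical Groovy for a JSR223 PreProcessor; the variable name "correlationId" and the parameter name "correlation_id" are invented for the example):

    // Store a fresh correlation id for this iteration so later samplers/extractors can use it
    def correlationId = UUID.randomUUID().toString()
    vars.put("correlationId", correlationId)
    // For HTTP samplers, it could also be attached to the request that is about to run
    if (sampler instanceof org.apache.jmeter.protocol.http.sampler.HTTPSamplerBase) {
        sampler.addArgument("correlation_id", correlationId)
    }
    log.debug("Set correlationId=" + correlationId)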
The JDBC PreProcessor enables you to run an SQL statement just before a sample runs. This can be useful if your JDBC Sample requires some data to be in the database and you cannot compute this in a setUp Thread Group. For details, see JDBC Request. In the linked test plan, the "Create Price Cut-Off" JDBC PreProcessor calls a stored procedure to create a Price Cut-Off in the database, which will be used by "Calculate Price cut off".
Allows the user to specify dynamic values for HTTP parameters extracted from another HTTP Request using regular expressions. RegEx User Parameters are specific to individual threads.
This component allows you to specify the reference name of a regular expression that extracts names and values of HTTP request parameters. Regular expression group numbers must be specified for the parameter's name and also for the parameter's value. Replacement will only occur for parameters in the Sampler that uses this RegEx User Parameters element whose name matches.
Parameter names regexp group numberGroup number of regular expression used to extract parameter namesParameter values regex group numberGroup number of regular expression used to extract parameter valuesregular expression - expression that will extract input names and input values attributes Ex: input name="([^"]+?)" value="([^"]+?)" template - would be empty match nr - -1 (in order to iterate through all the possible matches) See also the Regular Expression Extractor element, which is used to extract parameters names and values This Pre-Processor schedules a timer task to interrupt a sample if it takes too long to complete. The timeout is ignored if it is zero or negative. For this to work, the sampler must implement Interruptible. The following samplers are known to do so: AJP, BeanShell, FTP, HTTP, Soap, AccessLog, MailReader, JMS Subscriber, TCPSampler, TestAction, JavaSampler The test element is intended for use where individual timeouts such as Connection Timeout or Response Timeout are insufficient, or where the Sampler does not support timeouts. The timeout should be set sufficiently long so that it is not triggered in normal tests, but short enough that it interrupts samples that are stuck. [By default, JMeter uses a Callable to interrupt the sampler. This executes in the same thread as the timer, so if the interrupt takes a long while, it may delay the processing of subsequent timeouts. This is not expected to be a problem, but if necessary the property InterruptTimer.useRunnable can be set to true to use a separate Runnable thread instead of the Callable.] As the name suggests, Post-Processors are applied after samplers. Note that they are applied to all the samplers in the same scope, so to ensure that a post-processor is applied only to a particular sampler, add it as a child of the sampler. Note: Unless documented otherwise, Post-Processors are not applied to sub-samples (child samples) - only to the parent sample. In the case of JSR223 and BeanShell post-processors, the script can retrieve sub-samples using the method prev.getSubResults() which returns an array of SampleResults. The array will be empty if there are none. Post-Processors are run before Assertions, so they do not have access to any Assertion Results, nor will the sample status reflect the results of any Assertions. If you require access to Assertion Results, try using a Listener instead. Also note that the variable JMeterThread.last_sample_ok is set to "true" or "false" after all Assertions have been run.Allows the user to extract values from a server response using a Perl-type regular expression. As a post-processor, this element will execute after each Sample request in its scope, applying the regular expression, extracting the requested values, generate the template string, and store the result into the given variable name.
Matching is applied to all qualifying samples in turn. For example if there is a main sample and 3 sub-samples, each of which contains a single match for the regex, (i.e. 4 matches in total). For match number = 3, Sub-samples only, the extractor will match the 3rd sub-sample. For match number = 3, Main sample and sub-samples, the extractor will match the 2nd sub-sample (1st match is main sample). For match number = 0 or negative, all qualifying samples will be processed. For match number > 0, matching will stop as soon as enough matches have been found.Field to checkThe following fields can be checked: Body (unescaped) - the body of the response, with all Html escape codes replaced. Note that Html escapes are processed without regard to context, so some incorrect substitutions may be made.Note that this option highly impacts performances, so use it only when absolutely necessary and be aware of its impactsBody as a Document - the extract text from various type of documents via Apache Tika (see View Results Tree Document view section).Note that the Body as a Document option can impact performances, so ensure it is OK for your testName of created variableThe name of the JMeter variable in which to store the result. Also note that each group is stored as [refname]_g#, where [refname] is the string you entered as the reference name, and # is the group number, where group 0 is the entire match, group 1 is the match from the first set of parentheses, etc.Regular ExpressionThe regular expression used to parse the response data. This must contain at least one set of parentheses "()" to capture a portion of the string, unless using the group $0$. Do not enclose the expression in / / - unless of course you want to match these characters as well.TemplateThe template used to create a string from the matches found. This is an arbitrary string with special elements to refer to groups within the regular expression. The syntax to refer to a group is: '$1$' to refer to group 1, '$2$' to refer to group 2, etc. $0$ refers to whatever the entire expression matches.Match No. (0 for Random)Indicates which match to use. The regular expression may match multiple times.Use a value of zero to indicate JMeter should choose a match at random. A positive number N means to select the nth match. Negative numbers are used in conjunction with the ForEach Controller - see below.Default ValueIf the regular expression does not match, then the reference variable will be set to the default value. This is particularly useful for debugging tests. If no default is provided, then it is difficult to tell whether the regular expression did not match, or the RE element was not processed or maybe the wrong variable is being used. However, if you have several test elements that set the same variable, you may wish to leave the variable unchanged if the expression does not match. In this case, remove the default value once debugging is complete.No, but recommendedUse empty default valueIf the checkbox is checked and Default Value is empty, then JMeter will set the variable to empty string instead of not setting it. Thus when you will for example use ${var} (if Reference Name is var) in your Test Plan, if the extracted value is not found then ${var} will be equal to empty string instead of containing ${var} which may be useful if extracted value is optional. If no match occurs, then the refName variable is set to the default (unless this is absent). 
Also, the following variables are removed: If the match number is set to a negative number, then all the possible matches in the sampler data are processed. The variables are set as follows: See also Response Assertion for some examples of how to specify modifiers, and for further information on JMeter regular expressions.Allows the user to extract values from a server HTML response using a CSS Selector syntax. As a post-processor, this element will execute after each Sample request in its scope, applying the CSS/JQuery expression, extracting the requested nodes, extracting the node as text or attribute value and store the result into the given variable name.
Matching is applied to all qualifying samples in turn. For example if there is a main sample and 3 sub-samples, each of which contains a single match for the regex, (i.e. 4 matches in total). For match number = 3, Sub-samples only, the extractor will match the 3rd sub-sample. For match number = 3, Main sample and sub-samples, the extractor will match the 2nd sub-sample (1st match is main sample). For match number = 0 or negative, all qualifying samples will be processed. For match number > 0, matching will stop as soon as enough matches have been found.CSS Selector Implementation2 Implementations for CSS/JQuery based syntax are supported:CSS/JQuery expressionThe CSS/JQuery selector used to select nodes from the response data. Selector, selectors combination and pseudo-selectors are supported, examples: :contains(text) - find elements that contain the given text. The search is case-insensitive; e.g. p:contains(jsoup) For more details on syntax, see:AttributeName of attribute (as per HTML syntax) to extract from nodes that matched the selector. If empty, then the combined text of this element and all its children will be returned. This is the equivalent Element#attr(name) function for JSoup if an attribute is set.Default ValueIf the expression does not match, then the reference variable will be set to the default value. This is particularly useful for debugging tests. If no default is provided, then it is difficult to tell whether the expression did not match, or the CSS/JQuery element was not processed or maybe the wrong variable is being used. However, if you have several test elements that set the same variable, you may wish to leave the variable unchanged if the expression does not match. In this case, remove the default value once debugging is complete.No, but recommendedUse empty default valueIf the checkbox is checked and Default Value is empty, then JMeter will set the variable to empty string instead of not setting it. Thus when you will for example use ${var} (if Reference Name is var) in your Test Plan, if the extracted value is not found then ${var} will be equal to empty string instead of containing ${var} which may be useful if extracted value is optional. If the match number is set to a negative number, then all the possible matches in the sampler data are processed. The variables are set as follows: XPath2 Extractor¶This test element allows the user to extract value(s) from structured response - XML or (X)HTML - using XPath2 query language. XPath matching is applied to all qualifying samples in turn, and all the matching results will be returned.Return entire XPath fragment instead of text content?If selected, the fragment will be returned rather than the text content. For example //title would return "<title>Apache JMeter</title>" rather than "Apache JMeter". In this case, //title/text() would return "Apache JMeter".Name of created variableThe name of the JMeter variable in which to store the result.XPath QueryElement query in XPath 2.0 language. Can return more than one match.Match No. (0 for Random)If the XPath Path query leads to many results, you can choose which one(s) to extract as Variables:Default ValueDefault value returned when no match found. It is also returned if the node has no value and the fragment option is not selected.Namespaces aliases listList of namespaces aliases you want to use to parse the document, one line per declaration. You must specify them as follow: prefix=namespace. 
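For example, with a (hypothetical) declaration ex=http://example.org/ns on one line, an XPath query can then refer to nodes as //ex:book/ex:title.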
This implementation makes it easier to use namespaces than the old XPathExtractor version did. To allow for use in a ForEach Controller, it works exactly the same as the XPath Extractor above. The XPath2 Extractor provides some interesting tools, such as an improved syntax and many more functions than its first version.
Here are some examples:
abs(/book/page[2]) extracts the absolute value of the 2nd page of a book
avg(/librarie/book/page) extracts the average number of pages of all the books in the libraries
compare(/book[1]/page[2],/book[2]/page[2]) returns an Integer equal to 0 if the 2nd page of the first book is equal to the 2nd page of the 2nd book, else returns -1
This test element allows the user to extract value(s) from a structured response - XML or (X)HTML - using the XPath query language. Since JMeter 5.0, you should use the XPath2 Extractor as it provides better and easier namespace management, better performance and support for XPath 2.0. XPath matching is applied to all qualifying samples in turn, and all the matching results will be returned.
Use Tidy (tolerant parser): If checked, use Tidy to parse the HTML response into XHTML.
Use Namespaces: If checked, then the XML parser will use namespace resolution (see note below on NAMESPACES). Note that currently only namespaces declared on the root element will be recognised. See below for user-definition of additional namespace names. Applies if Tidy is not selected.
Validate XML: Check the document against its schema. Applies if Tidy is not selected.
Ignore Whitespace: Ignore Element Whitespace. Applies if Tidy is not selected.
Fetch External DTDs: If selected, external DTDs are fetched. Applies if Tidy is not selected.
Return entire XPath fragment instead of text content?: If selected, the fragment will be returned rather than the text content. For example //title would return "<title>Apache JMeter</title>" rather than "Apache JMeter". In this case, //title/text() would return "Apache JMeter".
Name of created variable: The name of the JMeter variable in which to store the result.
XPath Query: Element query in XPath language. Can return more than one match.
Match No. (0 for Random): If the XPath query leads to many results, you can choose which one(s) to extract as Variables.
Note: The next refName_n variable is set to null - e.g. if there are 2 matches, then refName_3 is set to null, and if there are no matches, then refName_1 is set to null. XPath is a query language targeted primarily at XSLT transformations. However it is useful as a generic query language for structured data too. See the XPath Reference or the XPath specification for more information. Here are a few examples: extracts the value attribute of the option element that matches the text 'Czech Republic' inside of a select element with name attribute 'country' inside of a form with name attribute 'countryForm'. Provide a Properties file (if for example your file is named namespaces.properties) which contains mappings for the namespace prefixes: prefix1=http\://foo.apache.org prefix2=http\://toto.apache.org You can then query with either //mynamespace:tagname or //*[local-name()='tagname' and namespace-uri()='uri-for-namespace'], where uri-for-namespace is the namespace URI bound to the mynamespace prefix.
In the XPath Extractor, multiple XPaths can be extracted at the same time, but in the JMES Extractor only one JMES Expression can be entered at a time.
JMESPath expressions: Element query in the JMESPath query language. Can return the matched result.
Match No. (0 for Random): If the JMESPath query leads to many results, you can choose which one(s) to extract as Variables.
JMESPath is a query language for JSON. It is described in an ABNF grammar with a complete specification. This ensures that the language syntax is precisely defined. See the JMESPath Reference for more information. There are also some examples at JMESPath Example.
Action to be taken after a Sampler error: Determines what happens if a sampler error occurs, either because the sample itself failed or an assertion failed.
The possible choices are:
The test element supports the ThreadListener and TestListener methods. These should be defined in the initialisation file. See the file BeanShellListeners.bshrc for example definitions.
Name: Descriptive name for this element that is shown in the tree. The name is stored in the script variable Label.
Reset bsh.Interpreter before each call: If this option is selected, then the interpreter will be recreated for each sample. This may be necessary for some long running scripts. For further information, see Best Practices - BeanShell scripting.
Parameters: Parameters to pass to the BeanShell script. The parameters are stored in the following variables:
Script file: A file containing the BeanShell script to run. The file name is stored in the script variable FileName.
Script: The BeanShell script. The return value is ignored. Required: Yes (unless script file is provided)
The following BeanShell variables are set up for use by the script:
log - (Logger) - can be used to write to the log file
ctx - (JMeterContext) - gives access to the context
vars - (JMeterVariables) - gives read/write access to variables: vars.get(key); vars.put(key,val); vars.putObject("OBJ1",new Object());
props - (JMeterProperties - class java.util.Properties) - e.g. props.get("START.HMS"); props.put("PROP1","1234");
prev - (SampleResult) - gives access to the previous SampleResult
data - (byte []) - gives access to the current sample data
For details of all the methods available on each of the above variables, please check the Javadoc
If the property beanshell.postprocessor.init is defined, this is used to load an initialisation file, which can be used to define methods etc. for use in the BeanShell script.
Script file: A file containing the script to run; if a relative file path is used, it will be relative to the directory referenced by the "user.dir" System property.
Script compilation caching: Unique String across the Test Plan that JMeter will use to cache the result of the script compilation if the language used supports the Compilable interface (Groovy is one of these; java, beanshell and javascript are not). See the note in the JSR223 Sampler about the Java System property if you're using Groovy without checking this option.
Script: The script to run. Required: Yes (unless script file is provided)
vars - (JMeterVariables) - gives read/write access to variables: vars.get(key); vars.put(key,val); vars.putObject("OBJ1",new Object()); vars.getObject("OBJ2");
props - (JMeterProperties - class java.util.Properties) - e.g. props.get("START.HMS"); props.put("PROP1","1234");
prev - (SampleResult) - gives access to the previous SampleResult (if any)
sampler - (Sampler) - gives access to the current sampler
OUT - System.out - e.g. OUT.println("message")
For details of all the methods available on each of the above variables, please check the Javadoc
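For example, a hypothetical Groovy JSR223 PostProcessor that fails the sample when the response contains an application-level error marker (the "ERROR_CODE" string and the variable name are assumptions for the example):

    String body = prev.getResponseDataAsString()
    if (body.contains("ERROR_CODE")) {
        prev.setSuccessful(false)
        prev.setResponseMessage("Application error marker found in response body")
        log.warn("Marked sample '" + prev.getSampleLabel() + "' as failed")
    }
    vars.put("lastElapsedMs", String.valueOf(prev.getTime()))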
The JDBC PostProcessor enables you to run some SQL statement just after a sample has run. This can be useful if your JDBC Sample changes some data and you want to reset state to what it was before the JDBC sample run. The JSON PostProcessor enables you extract data from JSON responses using JSON-PATH syntax. This post processor is very similar to Regular expression extractor. It must be placed as a child of HTTP Sampler or any other sampler that has responses. It will allow you to extract in a very easy way text content, see JSON Path syntax.Names of created variablesSemicolon separated names of variables that will contain the results of JSON-PATH expressions (must match number of JSON-PATH expressions)JSON Path ExpressionsSemicolon separated JSON-PATH expressions (must match number of variables)Default ValuesSemicolon separated default values if JSON-PATH expressions do not return any result(must match number of variables)Match NumbersFor each JSON Path Expression, if the JSON Path query leads to many results, you can choose which one(s) to extract as Variables: The numbers have to be given as a Semicolon separated list. The number of elements in that list have to match the number of given JSON Path Expressions. If left empty, the value 0 will be used as default for every expression.Compute concatenation varIf many results are found, plugin will concatenate them using ‘,’ separator and store it in a var named <variable name>_ALLAllows the user to extract values from a server response using left and right boundaries. As a post-processor, this element will execute after each Sample request in its scope, testing the boundaries, extracting the requested values, generate the template string, and store the result into the given variable name.
Matching is applied to all qualifying samples in turn. For example if there is a main sample and 3 sub-samples, each of which contains a single match test, (i.e. 4 matches in total). For match number = 3, Sub-samples only, the extractor will match the 3rd sub-sample. For match number = 3, Main sample and sub-samples, the extractor will match the 2nd sub-sample (1st match is main sample). For match number = 0 or negative, all qualifying samples will be processed. For match number > 0, matching will stop as soon as enough matches have been found.Field to checkThe following fields can be checked: Body (unescaped) - the body of the response, with all Html escape codes replaced. Note that Html escapes are processed without regard to context, so some incorrect substitutions may be made.Note that this option highly impacts performances, so use it only when absolutely necessary and be aware of its impactsBody as a Document - the extract text from various type of documents via Apache Tika (see View Results Tree Document view section).Note that the Body as a Document option can impact performances, so ensure it is OK for your testName of created variableThe name of the JMeter variable in which to store the result. Also note that each group is stored as [refname]_g#, where [refname] is the string you entered as the reference name, and # is the group number, where group 0 is the entire match, group 1 is the match from the first set of parentheses, etc.Left BoundaryLeft boundary of value to findRight BoundaryRight boundary of value to findMatch No. (0 for Random)Indicates which match to use. The boundaries may match multiple times.Use a value of zero to indicate JMeter should choose a match at random. A positive number N means to select the nth match. Negative numbers are used in conjunction with the ForEach Controller - see below.Default ValueIf the boundaries do not match, then the reference variable will be set to the default value. This is particularly useful for debugging tests. If no default is provided, then it is difficult to tell whether the boundaries did not match, or maybe the wrong variable is being used. However, if you have several test elements that set the same variable, you may wish to leave the variable unchanged if the expression does not match. In this case, remove the default value once debugging is complete.No, but recommendedIf the match number is set to a negative number, then all the possible matches in the sampler data are processed. The variables are set as follows: Static variables can be defined for values that are repeated throughout a test, such as server names. For example the variable SERVER could be defined as www.example.com, and the rest of the test plan could refer to it as ${SERVER}. This simplifies changing the name later. If the same variable name is reused on one of more User Defined Variables Configuration elements, the value is set to the last definition in the test plan (reading from top to bottom). Such variables should be used for items that may change between test runs, but which remain the same during a test run.Note that the Test Plan cannot refer to variables it defines.If you need to construct other variables from the Test Plan variables, use a User Defined Variables test element. Selecting Functional Testing instructs JMeter to save the additional sample information - Response Data and Sampler Data - to all result files. This increases the resources needed to run a test, and may adversely impact JMeter performance. 
If more data is required for a particular sampler only, then add a Listener to it, and configure the fields as required.The option does not affect CSV result files, which cannot currently store such information.Also, an option exists here to instruct JMeter to run the Thread Group serially rather than in parallel.Run tearDown Thread Groups after shutdown of main threads: if selected, the tearDown groups (if any) will be run after graceful shutdown of the main threads. The tearDown threads won't be run if the test is forcibly stopped. Test plan now provides an easy way to add classpath setting to a specific test plan. The feature is additive, meaning that you can add jar files or directories, but removing an entry requires restarting JMeter.
Note that this cannot be used to add JMeter GUI plugins, because they are processed earlier.However it can be useful for utility jars such as JDBC drivers. The jars are only added to the search path for the JMeter loader, not for the system class loader. JMeter properties also provides an entry for loading additional classpaths. In jmeter.properties, edit "user.classpath" or "plugin_dependency_paths" to include additional libraries. See JMeter's Classpath and Configuring JMeter for details.This thread group is experimental, and it might change in the future releases. Please provide your feedback on what works and what could be improved.Open Model Thread Group defines a pool of users that will execute a particular test case against the server. The users are generated according to the schedule. The load profile consists of a sequence of constant, increasing or decreasing load. The basic configuration is rate(1/sec) random_arrivals(2 min) rate(3/sec) which means the load will increase linearly from one request per second to three requests per second during a period of two-minutes. If you omit rate at the end, then it will be set to the same value as that from the start. For example, rate(1/sec) random_arrivals(2 min) is exactly the same as rate(1/sec) random_arrivals(2 min) rate(1/sec). That is why rate(1/sec) random_arrivals(2 min) random_arrivals(3 min) rate(4/sec) is exactly the same as rate(1/sec) random_arrivals(2 min) rate(1/sec) random_arrivals(3 min) rate(4/sec), so the load is one request per second during the first two minutes, after which it increases linearly from one request per second to four requests per second during three minutes.
rate(0) random_arrivals(1 min) rate(10/sec): linearly increase the load from zero requests per second to ten requests per second during one minute
rate(0) random_arrivals(1 min) rate(10/sec) random_arrivals(1 min) rate(10/sec) random_arrivals(1 min) rate(0): linearly increase the load from zero requests per second to ten requests per second during one minute, then hold the load during one minute, then linearly decrease the load from ten requests per second to zero during one minute
rate(10/sec) random_arrivals(1 min) pause(2 sec) random_arrivals(1 min): run with a constant load of ten requests per second during one minute, then make a two second pause, then resume the load of ten requests per second for one minute
rate: configures the target load rate. The following time units are supported: ms, sec, min, hour, day. You can omit the time unit in case the rate is 0: rate(0)
random_arrivals: configures a random arrivals schedule with the given duration. The starting load rate is configured before random_arrivals, and the finish load rate is configured after random_arrivals. For example, a 10 minute test from five requests per second at the beginning to fifteen requests per second at the end could be configured as rate(5/sec) random_arrivals(10 min) rate(15/sec). The implicit rate at the beginning of the test is 0. If the finish rate is not provided (or if several random_arrivals steps go one after another), then the load is constant. For instance, rate(3/sec) random_arrivals(1 min) random_arrivals(2 min) rate(6/sec) configures a constant rate of three requests per second for the first minute, and then the load increases from three requests per second to six requests per second during the next two minutes. The time units are the same as in rate.
pause: configures a pause for the given duration. The rate is restored after the pause, so rate(2/sec) random_arrivals(5 sec) pause(5 sec) random_arrivals(5 sec) generates random arrivals at a rate of two requests per second, then a pause of five seconds (no new arrivals), then five more seconds at a rate of two requests per second. Note: the pause duration is always honoured, even if all the scenarios are complete and no new ones will be scheduled. For instance, if you use rate(1/sec) random_arrivals(1 min) pause(1 hour), the thread group would always last for sixty-one minutes no matter how much time individual scenarios take.
The thread group terminates threads as soon as the schedule ends. In other words, the threads are interrupted after all arrivals and pause intervals. If you want to let the threads complete safely, consider adding pause(5 min) at the end of the schedule. That will give the threads five more minutes to finish.
There are no special functions for generating the load profile in a loop, however, the default JMeter templating functions can be helpful for generating the schedule. For example, the following pattern would generate a sequence of 10 steps where each step lasts 10 seconds: 10/sec, 20/sec, 30/sec, ... ${__groovy((1..10).collect { "rate(" + it*10 + "/sec) random_arrivals(10 sec) pause(1 sec)" }.join(" "))} You can get variables from properties as follows: rate(${__P(beginRate,40)}) random_arrivals(${__P(testDuration, 10)} sec) rate(${__P(endRate,40)})
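For reference, the __groovy expression above expands to a schedule of the form rate(10/sec) random_arrivals(10 sec) pause(1 sec) rate(20/sec) random_arrivals(10 sec) pause(1 sec) ... rate(100/sec) random_arrivals(10 sec) pause(1 sec), i.e. ten 10-second steps at 10, 20, ... 100 requests per second, each followed by a one-second pause.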
Currently, the load profile is evaluated at the beginning of the test only, so if you use dynamic functions, then only the first result will be used.
Schedule: The expression that configures the schedule. For example: rate(5/sec) random_arrivals(1 min) pause(5 sec)
Random Seed (change from 0 to random): Note: different thread groups should preferably have different seed values. A constant seed ensures the thread group generates the same delays at each test start. The value "0" means the schedule is truly random (non-repeatable from one execution to another).
A Thread Group defines a pool of users that will execute a particular test case against your server. In the Thread Group GUI, you can control the number of users simulated (number of threads), the ramp-up time (how long it takes to start all the threads), the number of times to perform the test, and optionally, a start and stop time for the test.
See also tearDown Thread Group and setUp Thread Group. When using the scheduler, JMeter runs the thread group until either the number of loops is reached or the duration/end-time is reached - whichever occurs first. Note that the condition is only checked between samples; when the end condition is reached, that thread will stop. JMeter does not interrupt samplers which are waiting for a response, so the end time may be delayed arbitrarily. Validation Mode: This mode enables rapid validation of a Thread Group by running it with one thread, one iteration, no timers and no Startup delay set to 0. Behaviour can be modified with some properties by setting in user.properties: testplan_validation.number_iterationsNumber of iterations to use to validate a Thread Group testplan_validation.tpc_force_100_pct Whether to force Throughput Controller in percentage mode to run as if percentage was 100 %. Defaults to falseAction to be taken after a Sampler errorDetermines what happens if a sampler error occurs, either because the sample itself failed or an assertion failed. The possible choices are: How long JMeter should take to get all the threads started. If there are 10 threads and a ramp-up time of 100 seconds, then each thread will begin 10 seconds after the previous thread started, for a total time of 100 seconds to get the test fully up to speed. The first thread will always start directly, so if you configured one thread, the ramp-up time is effectively zero. For the same reason, the tenth thread in the above example will actually be started after 90 seconds and not 100 seconds.Same user on each iterationIf selected, cookie and cache data from the first sampler response are used in subsequent requests (requires a global Cookie and Cache Manager respectively). If not selected, cookie and cache data from the first sampler response are not used in subsequent requests.If not selected, a new connection will be opened between iterations which will result in increased response times and consume more resources (memory and cpu).Loop CountNumber of times to perform the test case. Alternatively, "infinite" can be selected causing the test to run until manually stopped or end of the thread lifetime is reached.Yes, unless Infinite is selectedSame user on each iterationIf selected, cookie and cache data from the first sampler response are used in subsequent requests (requires a global Cookie and Cache Manager respectively). If not selected, cookie and cache data from the first sampler response are not used in subsequent requests.If not selected, a new connection will be opened between iterations which will result in increased response times and consume more resources (memory and cpu).Delay Thread creation until neededIf selected, threads are created only when the appropriate proportion of the ramp-up time has elapsed. This is most appropriate for tests with a ramp-up time that is significantly longer than the time to execute a single thread. I.e. where earlier threads finish before later ones start. If not selected, all threads are created when the test starts (they then pause for the appropriate proportion of the ramp-up time). This is the original default, and is appropriate for tests where threads are active throughout most of the test.Specify Thread lifetimeIf selected, confines Thread operation time to the given boundsDuration (seconds)If the scheduler checkbox is selected, one can choose a relative end time. 
JMeter will use this to calculate the End Time.Startup delay (seconds)If the scheduler checkbox is selected, one can choose a relative startup delay. JMeter will use this to calculate the Start Time. The SSL Manager is a way to select a client certificate so that you can test applications that use Public Key Infrastructure (PKI). It is only needed if you have not set up the appropriate System properties. Choosing a Client Certificate You may either use a Java Key Store (JKS) format key store, or a Public Key Certificate Standard #12 (PKCS12) file for your client certificates. There is a feature of the JSSE libraries that require you to have at least a six character password on your key (at least for the keytool utility that comes with your JDK). To select the client certificate, choose Options → SSL Manager from the menu bar. You will be presented with a file finder that looks for PKCS12 files by default. Your PKCS12 file must have the extension '.p12' for SSL Manager to recognize it as a PKCS12 file. Any other file will be treated like an average JKS key store. If JSSE is correctly installed, you will be prompted for the password. The text box does not hide the characters you type at this point -- so make sure no one is looking over your shoulder. The current implementation assumes that the password for the keystore is also the password for the private key of the client you want to authenticate as. Or you can set the appropriate System properties - see the system.properties file. The next time you run your test, the SSL Manager will examine your key store to see if it has at least one key available to it. If there is only one key, SSL Manager will select it for you. If there is more than one key, it currently selects the first key. There is currently no way to select other entries in the keystore, so the desired key must be the first. Things to Look Out For You must have your Certificate Authority (CA) certificate installed properly if it is not signed by one of the five CA certificates that ships with your JDK. One method to install it is to import your CA certificate into a JKS file, and name the JKS file "jssecacerts". Place the file in your JRE's lib/security folder. This file will be read before the "cacerts" file in the same directory. Keep in mind that as long as the "jssecacerts" file exists, the certificates installed in "cacerts" will not be used. This may cause problems for you. If you don't mind importing your CA certificate into the "cacerts" file, then you can authenticate against all of the CA certificates installed. The HTTP(S) Test Script Recorder allows JMeter to intercept and record your actions while you browse your web application with your normal browser. JMeter will create test sample objects and store them directly into your test plan as you go (so you can view samples interactively while you make them). Ensure you read this wiki page to setup correctly JMeter. To use the recorder, add the HTTP(S) Test Script Recorder element. Right-click on the Test Plan element to get the Add menu: (Add → Non-Test Elements → HTTP(S) Test Script Recorder The recorder is implemented as an HTTP(S) proxy server. You need to set up your browser use the proxy for all HTTP and HTTPS requests.Do not use JMeter as the proxy for any other request types - FTP, etc. - as JMeter cannot handle them.Ideally use private browsing mode when recording the session. This should ensure that the browser starts with no stored cookies, and prevents certain changes from being saved. 
For example, Firefox does not allow certificate overrides to be saved permanently.
HTTPS recording and certificates
HTTPS connections use certificates to authenticate the connection between the browser and the web server. When connecting via HTTPS, the server presents the certificate to the browser. To authenticate the certificate, the browser checks that the server certificate is signed by a Certificate Authority (CA) that is linked to one of its in-built root CAs.Browsers also check that the certificate is for the correct host or domain, and that it is valid and not expired.If any of the browser checks fail, it will prompt the user who can then decide whether to allow the connection to proceed. JMeter needs to use its own certificate to enable it to intercept the HTTPS connection from the browser. Effectively JMeter has to pretend to be the target server. JMeter will generate its own certificate(s). These are generated with a validity period defined by the property proxy.cert.validity, default 7 days, and random passwords. If JMeter detects that it is running under Java 8 or later, it will generate certificates for each target server as necessary (dynamic mode) unless the following property is defined: proxy.cert.dynamic_keys=false. When using dynamic mode, the certificate will be for the correct host name, and will be signed by a JMeter-generated CA certificate. By default, this CA certificate won't be trusted by the browser, however it can be installed as a trusted certificate. Once this is done, the generated server certificates will be accepted by the browser. This has the advantage that even embedded HTTPS resources can be intercepted, and there is no need to override the browser checks for each new server.Browsers don't prompt for embedded resources. So with earlier versions, embedded resources would only be downloaded for servers that were already 'known' to the browserUnless a keystore is provided (and you define the property proxy.cert.alias), JMeter needs to use the keytool application to create the keystore entries. JMeter includes code to check that keytool is available by looking in various standard places. If JMeter is unable to find the keytool application, it will report an error. If necessary, the system property keytool.directory can be used to tell JMeter where to find keytool. This should be defined in the file system.properties.Certificate generation can take some while, during which time the GUI will be unresponsive.The cursor is changed to an hour-glass whilst this is happening. When certificate generation is complete, the GUI will display a pop-up dialogue containing the details of the certificate for the root CA. This certificate needs to be installed by the browser in order for it to accept the host certificates generated by JMeter; see below for details. This certificate is not one of the certificates that browsers normally trust, and will not be for the correct host. As a consequence: The browser should display a dialogue asking if you want to accept the certificate or not. For example: 1) The server's name "www.example.com" does not match the certificate's name "_ JMeter Root CA for recording (INSTALL ONLY IF IT S YOURS)". Somebody may be trying to eavesdrop on you. 2) The certificate for "_ JMeter Root CA for recording (INSTALL ONLY IF IT S YOURS)" is signed by the unknown Certificate Authority "_ JMeter Root CA for recording (INSTALL ONLY IF IT S YOURS)". It is not possible to verify that this is a valid certificate. You will need to accept the certificate in order to allow the JMeter Proxy to intercept the SSL traffic in order to record it. 
However, do not accept this certificate permanently; it should only be accepted temporarily. Browsers only prompt this dialogue for the certificate of the main URL, not for the resources loaded in the page, such as images, CSS or JavaScript files hosted on a secured external CDN. If you have such resources (gmail has, for example), you'll have to browse manually to these other domains first in order to accept JMeter's certificate for them. Check jmeter.log for the secure domains for which you need to register the certificate. If the browser has already registered a validated certificate for one of these domains, the browser will detect JMeter as a security breach and will refuse to load the page. If so, you have to remove the trusted certificate from your browser's keystore.

Versions of JMeter from 2.10 onwards still support this method, and will continue to do so if you define the following property: proxy.cert.alias. Additional JMeter properties can be used to change the certificate that is used.

If your browser currently uses a proxy (e.g. a company intranet may route all external requests via a proxy), then you need to tell JMeter to use that proxy before starting JMeter, using the command-line options -H and -P. This setting will also be needed when running the generated test plan.

As mentioned above, when run under Java 8, JMeter can generate certificates for each server. For this to work smoothly, the root CA signing certificate used by JMeter needs to be trusted by the browser. The first time that the recorder is started, it will generate the certificates if necessary. The root CA certificate is exported into a file with the name ApacheJMeterTemporaryRootCA in the current launch directory. When the certificates have been set up, JMeter will show a dialog with the current certificate details. At this point, the certificate can be imported into the browser, as per the instructions below.

Note that once the root CA certificate has been installed as a trusted CA, the browser will trust any certificates signed by it. Until such time as the certificate expires or is removed from the browser, it will not warn the user that the certificate is being relied upon. So anyone that can get hold of the keystore and password can use the certificate to generate certificates which will be accepted by any browsers that trust the JMeter root CA certificate. For this reason, the password for the keystore and private keys are randomly generated and a short validity period is used. The passwords are stored in the local preferences area. Please ensure that only trusted users have access to the host with the keystore.

The HTTP(S) Test Script Recorder has the following parameters:

Port: The port that the HTTP(S) Test Script Recorder listens to. 8888 is the default, but you can change it.
HTTPS Domains: List of domain (or host) names for HTTPS. Use this to pre-generate certificates for all servers you wish to record. For example, *.example.com,*.subdomain.example.com. Note that wildcard domains only apply to one level, i.e. abc.subdomain.example.com matches *.subdomain.example.com but not *.example.com.
Target Controller: The controller where the proxy will store the generated samples. By default, it will look for a Recording Controller and store them there, wherever it is.
Grouping: Whether to group samplers for requests from a single "click" (requests received without significant time separation), and how to represent that grouping in the recording. The property proxy.pause determines the minimum gap that JMeter needs between requests to treat them as separate "clicks".
The default is 5000 (milliseconds), i.e. 5 seconds. If you are using grouping, please ensure that you leave the required gap between clicks.
Capture HTTP Headers: Should headers be added to the plan? If specified, a Header Manager will be added to each HTTP Sampler. The Proxy server always removes Cookie and Authorization headers from the generated Header Managers. By default it also removes If-Modified-Since and If-None-Match headers. These are used to determine if the browser cache items are up to date; when recording, one normally wants to download all the content. To change which additional headers are removed, define the JMeter property proxy.headers.remove as a comma-separated list of headers.
Add Assertions: Add a blank assertion to each sampler?
Regex Matching: Use Regex Matching when replacing variables? If checked, replacement will use word boundaries, i.e. it will only replace variable values that match whole words, not parts of a word. A word boundary follows the Perl5 definition and is equivalent to \b. More information below in the paragraph about "User Defined Variable replacement".
Prefix/Transaction name: Add a prefix to the sampler name during recording (Prefix mode), or replace the sampler name with a user-chosen name (Transaction name).
Naming scheme: Select the naming scheme for sampler names during recording. Default is Transaction name.
Naming format: If Use format string is selected as the naming scheme, a freestyle format can be given. Placeholders for the transaction name, scheme, host, port, path, url and counter can be given by #{name}, #{scheme}, #{host}, #{port}, #{path}, #{url} and #{counter}. A simple format could be "#{name}-#{counter}", which would be equivalent to the numbered default naming scheme. For more complex formatting, Java's MessageFormat formatting can be used, as in "#{counter,number,000}: #{name}-#{path}", which would print the counter padded with leading zeroes to three digits (see the sketch after this list of parameters). Note that scheme is called protocol in the sampler GUI and host is called domain. Default is an empty string.
Counter start value: Can be used to reset the counter to a given value. Note that the next sample will first increment and then use the value. If the first sampler should start with 1, reset the counter to 0.
Create new transaction after request (ms): Inactivity time between two requests needed to consider them as two separate groups.
Retrieve all Embedded Resources: Set Retrieve all Embedded Resources in the generated samplers?
Content Type filter: Filter the requests based on the content-type, e.g. "text/html [;charset=utf-8 ]". The fields are regular expressions which are checked to see if they are contained in the content-type (they do not have to match the entire field). The include filter is checked first, then the exclude filter. Samples which are filtered out will not be stored. Note: this filtering is applied to the content type of the response.
Patterns to Include: Regular expressions that are matched against the full URL that is sampled. Allows filtering of requests that are recorded. All requests pass through, but only those that meet the requirements of the Include/Exclude fields are recorded. If both Include and Exclude are left empty, then everything is recorded (which can result in dozens of samples recorded for each page, as images, stylesheets, etc. are recorded). If there is at least one entry in the Include field, then only requests that match one or more Include patterns are recorded.
Patterns to Exclude: Regular expressions that are matched against the URL that is sampled.
Any requests that match one or more Exclude patterns are not recorded.
Notify Child Listeners of filtered samplers: Any response that matches one or more Exclude patterns is not delivered to Child Listeners (View Results Tree).
Start Button: Start the proxy server. JMeter writes the following message to the console once the proxy server has started up and is ready to take requests: "Proxy up and running!".
Stop Button: Stop the proxy server.
Restart Button: Stops and restarts the proxy server. This is useful when you change/add/delete an include/exclude filter expression.
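To illustrate the MessageFormat-style sub-format mentioned under Naming format, here is a minimal Java sketch using java.text.MessageFormat directly. The counter value 7, the transaction name "Login" and the path "/login" are made-up example values, and the sketch assumes the recorder's #{…} placeholders behave like the corresponding MessageFormat arguments shown here.

import java.text.MessageFormat;

public class NamingFormatDemo {
    public static void main(String[] args) {
        // Analogue of the recorder format "#{counter,number,000}: #{name}-#{path}",
        // with {0} = counter, {1} = transaction name, {2} = path.
        String pattern = "{0,number,000}: {1}-{2}";
        System.out.println(MessageFormat.format(pattern, 7, "Login", "/login")); // prints "007: Login-/login"
    }
}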
Recording and redirects

During recording, the browser will follow a redirect response and generate an additional request. The Proxy will record both the original request and the redirected request (subject to whatever exclusions are configured). The generated samples have "Follow Redirects" selected by default, because that is generally better. Redirects may depend on the original request, so repeating the originally recorded sample may not always work.

Now, if JMeter is set to follow the redirect during replay, it will issue the original request, and then replay the redirect request that was recorded. To avoid this duplicate replay, JMeter tries to detect when a sample is the result of a previous redirect. If the current response is a redirect, JMeter will save the redirect URL. When the next request is received, it is compared with the saved redirect URL and, if there is a match, JMeter will disable the generated sample. It also adds comments to the redirect chain. This assumes that all the requests in a redirect chain will follow each other without any intervening requests. To disable the redirect detection, set the property proxy.redirect.disabling=false.

Includes and Excludes
The include and exclude patterns are treated as regular expressions (using Jakarta ORO). They will be matched against the host name, port (actual or implied), path and query (if any) of each browser request. If the URL you are browsing is "http://localhost/jmeter/index.html?username=xxxx", then the regular expression will be tested against the string "localhost:80/jmeter/index.html?username=xxxx". Thus, if you want to include all .html files, your regular expression might look like ".*\.html(\?.*)?", or ".*\.html" if you know that there is no query string or you only want html pages without query strings. If there are any include patterns, then the URL must match at least one of the patterns, otherwise it will not be recorded. If there are any exclude patterns, then the URL must not match any of the patterns, otherwise it will not be recorded. Using a combination of includes and excludes, you should be able to record what you are interested in and skip what you are not. N.B. the string that is matched by the regular expression must be the same as the whole host+path string. Thus "\.html" will not match localhost:80/index.html.
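As a rough illustration of the full-string matching described above, the following Java sketch tests the example include pattern against the example candidate string. It uses java.util.regex rather than Jakarta ORO (which JMeter itself uses), so treat it as an approximation of the matching behaviour, not the recorder's actual implementation.

import java.util.regex.Pattern;

public class IncludePatternDemo {
    public static void main(String[] args) {
        // The string JMeter builds from the browser request: host:port/path?query
        String candidate = "localhost:80/jmeter/index.html?username=xxxx";

        // An include/exclude pattern must match the whole candidate string.
        System.out.println(Pattern.matches(".*\\.html(\\?.*)?", candidate)); // true
        System.out.println(Pattern.matches("\\.html", candidate));           // false: does not cover the whole host+path string
    }
}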
Capturing binary POST data

JMeter is able to capture binary POST data. To configure which content-types are treated as binary, update the JMeter property proxy.binary.types. The default settings are as follows:

# These content-types will be handled by saving the request in a file:
proxy.binary.types=application/x-amf,application/x-java-serialized-object
# The files will be saved in this directory:
proxy.binary.directory=user.dir
# The files will be created with this file suffix:
proxy.binary.filesuffix=.binary

Adding timers
It is also possible to have the proxy add timers to the recorded script. To do this, create a timer directly within the HTTP(S) Test Script Recorder component. The proxy will place a copy of this timer into each sample it records, or into the first sample of each group if you're using grouping. This copy will then be scanned for occurrences of the variable ${T} in its properties, and any such occurrences will be replaced by the time gap from the previous sampler recorded (in milliseconds).

When you are ready to begin, hit "start". You will need to edit the proxy settings of your browser to point at the appropriate server and port, where the server is the machine JMeter is running on, and the port # is from the Proxy Control Panel shown above.

Where Do Samples Get Recorded?
JMeter places the recorded samples in the Target Controller you choose. If you choose the default option "Use Recording Controller", they will be stored in the first Recording Controller found in the test object tree (so be sure to add a Recording Controller before you start recording).

If the Proxy does not seem to record any samples, this could be because the browser is not actually using the proxy. To check whether this is the case, try stopping the proxy. If the browser still downloads pages, then it was not sending requests via the proxy. Double-check the browser options. If you are trying to record from a server running on the same host, then check that the browser is not set to "Bypass proxy server for local addresses" (this example is from IE7, but there will be similar options for other browsers). If JMeter does not record browser URLs such as http://localhost/ or http://127.0.0.1/, try using the non-loopback hostname or IP address, e.g. http://myhost/ or http://192.168.0.2/.

Handling of HTTP Request Defaults
If the HTTP(S) Test Script Recorder finds enabled HTTP Request Defaults directly within the controller where samples are being stored, or directly within any of its parent controllers, the recorded samples will have empty fields for the default values you specified. You may further control this behaviour by placing an HTTP Request Defaults element directly within the HTTP(S) Test Script Recorder, whose non-blank values will override those in the other HTTP Request Defaults. See Best Practices with the HTTP(S) Test Script Recorder for more info.

User Defined Variable replacement
Similarly, if the HTTP(S) Test Script Recorder finds User Defined Variables (UDV) directly within the controller where samples are being stored, or directly within any of its parent controllers, the recorded samples will have any occurrences of the values of those variables replaced by the corresponding variable. Again, you can place User Defined Variables directly within the HTTP(S) Test Script Recorder to override the values to be replaced. See Best Practices with the Test Script Recorder for more info. Please note that matching is case-sensitive.

Replacement by Variables: by default, the Proxy server looks for all occurrences of UDV values. If you define the variable WEB with the value www, for example, the string www will be replaced by ${WEB} wherever it is found. To avoid this happening everywhere, set the "Regex Matching" check-box. This tells the proxy server to treat values as regexes (using the perl5-compatible regex matchers provided by ORO).

If "Regex Matching" is selected, every variable will be compiled into a perl-compatible regex enclosed in \b( and )\b. That way each match will start and end at a word boundary. Note that the boundary characters are not part of the matching group, e.g. use n.* to match name out of You can call me 'name'. If you don't want your regex to be enclosed with those boundary matchers, you have to enclose your regex within parens, e.g. ('.*?') to match 'name' out of You can call me 'name'. (A sketch of this boundary behaviour follows at the end of this section.)

The variables will be checked in random order, so ensure that the potential matches don't overlap. Overlapping matchers would be .* (which matches anything) and www (which matches www only). Non-overlapping matchers would be a+ (matches a sequence of a's) and b+ (matches a sequence of b's). If you want to match a whole string only, enclose it in (^ and $), e.g. (^thus$). The parens are necessary, since the normally added boundary characters would prevent ^ and $ from matching. If you want to match /images at the start of a string only, use the value (^/images). Jakarta ORO also supports zero-width look-ahead, so one can match /images/… but retain the trailing / in the output by using (^/images(?=/)). Note that the current version of Jakarta ORO does not support look-behind, i.e. (?<=…) or (?<!…).

Look out for overlapping matchers. For example, the value .* as a regex in a variable named regex will partly match a previously replaced variable, which will result in something like ${{regex}, which is most probably not the desired result. If there are any problems interpreting any variables as patterns, these are reported in jmeter.log, so be sure to check this if UDVs are not working as expected.

When you are done recording your test samples, stop the proxy server (hit the "stop" button). Remember to reset your browser's proxy settings. Now, you may want to sort and re-order the test script, add timers, listeners, a cookie manager, etc.
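The following minimal sketch illustrates the word-boundary wrapping and the parenthesised, as-is regex form described above. It uses java.util.regex instead of the Jakarta ORO matcher that the proxy actually uses, and the sample strings are invented, so it is only an approximation of the recorder's behaviour.

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class RegexMatchingDemo {
    public static void main(String[] args) {
        // With "Regex Matching" enabled, a plain value such as "name" is wrapped as \b(name)\b,
        // so it only matches on word boundaries.
        Matcher m1 = Pattern.compile("\\b(name)\\b").matcher("You can call me 'name'.");
        if (m1.find()) System.out.println("matched: " + m1.group(1)); // matched: name

        // The same wrapping prevents partial-word matches: "ages" is not found inside "images".
        System.out.println(Pattern.compile("\\b(ages)\\b").matcher("/images/logo.png").find()); // false

        // A value already enclosed in parens is used without the \b wrappers, so anchors and
        // look-aheads work: (^/images(?=/)) matches "/images" but leaves the trailing "/" intact.
        Matcher m2 = Pattern.compile("(^/images(?=/))").matcher("/images/logo.png");
        if (m2.find()) System.out.println("matched: " + m2.group(1)); // matched: /images
    }
}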
How can I record the server's responses too?

Just place a View Results Tree listener as a child of the HTTP(S) Test Script Recorder and the responses will be displayed. You can also add a Save Responses to a file Post-Processor, which will save the responses to files.

Associating requests with responses
If you define the property proxy.number.requests=true, JMeter will add a number to each sampler and each response. Note that there may be more responses than samplers if excludes or includes have been used. Responses that have been excluded will have labels enclosed in [ and ], for example: [23 /favicon.ico]

Cookie Manager
If the server you are testing against uses cookies, remember to add an HTTP Cookie Manager to the test plan when you have finished recording it. During recording, the browser handles any cookies, but JMeter needs a Cookie Manager to do the cookie handling during a test run. The JMeter Proxy server passes on all cookies sent by the browser during recording, but does not save them to the test plan, because they are likely to change between runs.

Authorization Manager
The HTTP(S) Test Script Recorder grabs the "Authentication" header and tries to compute the Auth Policy. If an Authorization Manager was added to the target controller manually, the HTTP(S) Test Script Recorder will find it and add the authorization to it (matching entries will be removed). Otherwise, an Authorization Manager will be added to the target controller together with the authorization object. You may have to fix the automatically computed values after recording.

Uploading files
Some browsers (e.g. Firefox and Opera) don't include the full name of a file when uploading files. This can cause the JMeter proxy server to fail. One solution is to ensure that any files to be uploaded are in the JMeter working directory, either by copying the files there or by starting JMeter in the directory containing the files.

Recording HTTP Based Non Textual Protocols not natively available in JMeter
You may have to record an HTTP protocol that is not handled by default by JMeter (a custom binary protocol, Adobe Flex, Microsoft Silverlight, …). Although JMeter does not provide a native proxy implementation to record these protocols, you can record them by implementing a custom SamplerCreator. This Sampler Creator will translate the binary format into an HTTPSamplerBase subclass that can be added to the JMeter Test Case. For more details see "Extending JMeter".

The HTTP Mirror Server is a very simple HTTP server: it simply mirrors the data sent to it. This is useful for checking the content of HTTP requests. It uses port 8081 by default. Its parameters are:

Max Number of threads: If set to a value > 0, the number of threads serving requests will be limited to the configured number; if set to a value ≤ 0, a new thread will be created to serve each incoming request. Defaults to 0.
Max Queue size: Size of the queue used for holding tasks before they are executed by the thread pool. When the thread pool is full, incoming requests will be held in this queue and discarded when the queue is full. This parameter is only used if Max Number of Threads is greater than 0. Defaults to 25.
X-ResponseStatus: Response status, see HTTP status responses, for example 200 OK, 500 Internal Server Error, ….
X-ResponseLength: Size of the response; this trims the response to the requested size if that is less than the total size.
X-SetHeaders: Pipe-separated list of headers; for example, headerA: valueA|headerB: valueB would set headerA to valueA and headerB to valueB.
Verbose: Flag that writes some details to standard output, e.g. the first line and the redirect location if specified.

The Debug PostProcessor creates a subSample with the details of the previous Sampler properties, JMeter variables, properties and/or System Properties. The values can be seen in the View Results Tree Listener Response Data pane.

When using a Test Fragment with a Module Controller, ensure you disable the Test Fragment to avoid the execution of the Test Fragment itself. This is done by default since JMeter 2.13.

The setUp Thread Group is a special type of ThreadGroup that can be utilized to perform Pre-Test Actions. The behaviour of these threads is exactly like a normal Thread Group element. The difference is that these threads execute before the test proceeds to executing the regular Thread Groups.

The tearDown Thread Group is a special type of ThreadGroup that can be utilized to perform Post-Test Actions. The behaviour of these threads is exactly like a normal Thread Group element. The difference is that these threads execute after the test has finished executing its regular Thread Groups. Note that by default it won't run if the test is gracefully shut down; if you want to make it run in this case, ensure you check the option "Run tearDown Thread Groups after shutdown of main threads" on the Test Plan element. If the Test Plan is stopped, tearDown will not run even if the option is checked.