Glossary for Performance Testing

  • Capacity Capacity is the ability to handle workload at any given time while meeting the key performance requirements.
  • Capacity testing Capacity testing is complementary to load testing, and determines your server's ultimate failure point, whereas load testing monitors results at various levels of load and traffic patterns. You perform capacity testing in conjunction with capacity planning. You use capacity planning to plan for future growth, such as an increased user base or increased volume of data. For example, to accommodate future loads you need to know how many additional resources (such as processor capacity, memory, disk space, or network bandwidth) are necessary to support future usage levels. Capacity testing helps you identify a scaling strategy and determine whether you should scale up or scale out.
  • Component Testing Any performance test that targets an architectural component of the application. Commonly tested components include servers, databases, networks, firewalls and storage devices.
  • Endurance Testing A performance test focused on determining or validating performance characteristics of the product under test when subjected to workload models and load volumes anticipated during production operations over an extended period of time. Endurance testing is a subset of load testing.
  • Investigation An activity based on collecting information related to speed, scalability and/or stability characteristics about the product under test that may have value in determining or improving the quality of the product. Investigation is frequently employed to prove or disprove hypotheses regarding the root cause of one or more observed performance issues.
  • Latency Latency is a measure of responsiveness. Latency represents the time it takes to complete the execution of a request. Latency may also represent the sum of the latencies of several subtasks.
  • Load testing Use load testing to verify application behavior under normal and peak load conditions. This allows you to verify that your application can meet your desired performance objectives; these performance objectives are often specified in a service level agreement. It enables you to measure response times, throughput rates, and resource utilization levels, and to identify your application's breaking point, assuming that breaking point occurs below the peak load condition.
  • Metrics Metrics are the actual measurements obtained by running performance tests. These include system-related metrics such as processor, memory, disk I/O, and network I/O utilization levels, as well as application-specific metrics such as performance counters and timing data.
  • Performance budgets Performance budgets are constraints given to developers regarding allowable resource consumption for their components.
  • Performance goal Performance goals are those criteria that are desired for release, but may be negotiable under certain circumstances. For instance, if a response time goal of three seconds is set for a particular transaction, but the actual response time is determined to be 3.3 seconds, it is likely that the stakeholders will choose to release the application and defer performance tuning of that transaction for a future release.
  • Performance Performance refers to information regarding response times, throughput, and resource utilization levels of your application.
  • Performance requirement Performance requirements are those criteria which are absolutely non-negotiable due to contractual obligations, service level agreements, or non-negotiable business needs. Any performance criterion whose failure would not unquestionably lead to a decision to delay the release is not absolutely required, and therefore not a requirement.
  • Performance objectives Performance objectives are usually specified in terms of response times, throughput (transactions per second), and resource utilization levels and typically focus on metrics that can be directly related to user satisfaction.
  • Performance targets Performance targets are the desired value for the metrics identified for your project under a particular set of conditions, usually specified in terms of response times, throughput, and resource utilization levels. Resource utilization levels include the amount of processor capacity, memory, disk I/O, and network I/O that your application consumes. These typically equate to goals.
  • Performance testing A technical investigation done to determine or validate speed, scalability and/or stability characteristics of the product under test. Performance testing is the superset containing all of the other sub-categories of performance testing in this chapter.
  • Performance testing objectives Performance testing objectives refer to data that is collected through the process of performance testing and that is anticipated to have value in determining or improving the quality of the product. However, these objectives are not necessarily quantitative or directly related to a performance requirement, goal, or stated quality of service (QoS) specification.
  • Performance thresholds Performance thresholds are the maximum acceptable values for the metrics identified for your project, usually specified in terms of response times, throughput (transactions per second), and resource utilization levels. Resource utilization levels include the amount of processor capacity, memory, disk I/O, and network I/O that your application consumes. These typically equate to requirements.
  • Response time Response time is a measure of how responsive an application or subsystem is to a client request.
  • Resource utilization Resource utilization is the cost in terms of system resources. The primary resources are processor, memory, disk I/O, and network I/O.
  • Saturation When a resource has reached full utilization.
  • Scalability Scalability refers to the ability to handle additional workload, without adversely affecting performance, by adding resources such as processor, memory, and storage capacity.
  • Scenarios A scenario is a sequence of steps in your application. It can represent a use case or a business function such as searching a product catalog, adding an item to a shopping cart, or placing an order.
  • Smoke testing A performance test designed to determine if your application can successfully perform all of its operations under a normal load condition for a short time.
  • Spike testing A performance test focused on determining or validating performance characteristics of the product under test when subjected to workload models and load volumes that repeatedly increase beyond anticipated production operations for short periods of time. Spike testing is a subset of stress testing.
  • Stability Stability is the overall reliability, robustness, functional and data integrity, availability, and/or consistency of responsiveness for your system under a variety of conditions.
  • Stress testing Use stress testing to evaluate your application's behavior when it is pushed beyond the normal or peak load conditions. The goal of stress testing is to unearth application bugs that surface only under high load conditions. These can include such things as synchronization issues, race conditions, and memory leaks. Stress testing enables you to identify your application's weak points, and how it behaves under extreme load conditions.
  • Throughput Throughput is the number of units of work that can be handled per unit of time. For instance: requests per second, calls per day, hits per second, reports per year, etc.
  • Unit testing Any performance test that targets a module of code where that module is any logical sub-set of the entire existing code base of the application. Commonly tested modules include functions, procedures, routines, objects, methods and classes. Performance Unit Tests are frequently created and conducted by the developer who wrote the module of code being tested.
  • Utilization Percentage of time a resource is busy servicing user requests. The remaining percentage of time is idle time.
  • Validation testing An activity that compares speed, scalability and/or stability characteristics of the product under test to the expectations that have been set or presumed for that product.
  • Workload Workload is the stimulus applied to a system, application, or component to simulate a usage pattern with regard to concurrency and/or data inputs. The workload includes total numbers of users, concurrent active users, data volumes, and transaction volumes, along with the transaction mix. For performance modeling, you associate a workload with an individual scenario.
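Several of the metrics defined above (latency, response time, throughput) can be illustrated with a minimal load-driver sketch. This is not a real load-testing tool; the `do_request` function is a hypothetical stand-in for a call against the system under test, and the user counts are arbitrary:

```python
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

def do_request():
    """Hypothetical unit of work; in a real test this would call the product under test."""
    time.sleep(0.01)  # simulated service time

def run_load_test(num_users, requests_per_user):
    """Drive concurrent virtual users and collect per-request latencies."""
    latencies = []  # list.append is atomic in CPython, so threads may share it

    def user_session():
        for _ in range(requests_per_user):
            start = time.perf_counter()
            do_request()
            latencies.append(time.perf_counter() - start)  # latency of one request

    wall_start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=num_users) as pool:
        for _ in range(num_users):
            pool.submit(user_session)
    wall_time = time.perf_counter() - wall_start

    total_requests = num_users * requests_per_user
    return {
        "avg_response_time": statistics.mean(latencies),   # responsiveness to a client request
        "throughput_rps": total_requests / wall_time,      # units of work per unit of time
    }

results = run_load_test(num_users=5, requests_per_user=10)
print(results)
```

Running the same harness at increasing `num_users` values is the essence of load testing; pushing `num_users` past the anticipated peak turns it into a crude stress test.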
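The utilization and saturation entries above reduce to a simple ratio. A sketch, with hypothetical observation numbers:

```python
# Utilization: percentage of time a resource is busy servicing user requests.
busy_seconds = 42.0      # hypothetical time the resource spent servicing requests
interval_seconds = 60.0  # observation window

utilization_pct = 100.0 * busy_seconds / interval_seconds
idle_pct = 100.0 - utilization_pct  # the remaining percentage is idle time

print(f"utilization: {utilization_pct:.1f}%, idle: {idle_pct:.1f}%")
# At 100% utilization the resource has reached saturation.
```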

Cem Kaner

  • Software Testing A technical investigation done to expose quality-related information about the product under test. - Cem Kaner

James Bach

  • Bug Anything that threatens the value of the product. Something that bugs someone whose opinion matters. - James Bach
  • Exploratory Testing An interactive process of simultaneous learning, test design, and test execution. - James Bach
  • Heuristic Testing An approach to test design that employs heuristics to enable rapid development of test cases. - James Bach
  • Risk-Based Testing Any testing organized to explore specific product risks. - James Bach
  • Test Design The process of creating tests. - James Bach
  • Test Execution The process of configuring, operating, and observing a product for the purpose of evaluating it. - James Bach
  • Test Logistics The set of ideas that guide how resources are applied to fulfill the test strategy. - James Bach
  • Test Plan The set of ideas that guide or represent the intended test process. - James Bach
  • Test Strategy The way tests will be designed and executed to support an effective quality assessment. - James Bach

Jerry Weinberg

  • Quality Something of value to some person. - Jerry Weinberg

Scott Barber

  • Application Scalability Characteristics of the product under test related to the number of users the product can support. These characteristics, or qualities of service, may be related to user load, network or data capacity, and/or product failure modes related to the product's inability to scale beyond a particular level. - Scott Barber
  • Application Speed Characteristics of the product under test related to the product's overall speed of response, or a sub-system's speed of response, to a user-initiated activity. - Scott Barber
  • Application Stability Characteristics of the product under test related to the product's overall reliability, robustness, functional and data integrity, availability and/or consistency of responsiveness under a variety of expected and unexpected conditions. - Scott Barber
  • Application Usage Profile One or more descriptions of how the product under test is, or is anticipated to be, used during production operations. Usage profiles are typically expressed in terms of business activities and usage scenarios. - Scott Barber
  • Component Performance Test Any performance test that targets an architectural component of the application. Commonly tested components include servers, databases, networks, firewalls and storage devices. - Scott Barber
  • Endurance Test A performance test focused on determining or validating performance characteristics of the product under test when subjected to workload models and load volumes anticipated during production operations over an extended period of time. Endurance testing is a subset of load testing. - Scott Barber
  • Load Test A performance test focused on determining or validating performance characteristics of the product under test when subjected to workload models and load volumes anticipated during production operations. - Scott Barber
  • Performance Targets The desired value for a resource of interest under a particular set of conditions, usually specified in terms of response times, throughput (transactions per second), and resource utilization levels. Resource utilization levels include the amount of CPU capacity, memory, disk I/O, and network I/O that your application consumes. - Scott Barber
  • Performance Testing Objective Information to be collected through the process of performance testing that is anticipated to have value in determining or improving the quality of the product, but are not necessarily quantitative or directly related to a performance requirement, goal or stated Quality of Service. - Scott Barber
  • Performance Thresholds The maximum acceptable value for a resource of interest, usually specified in terms of response times, throughput (transactions per second), and resource utilization levels. Resource utilization levels include the amount of CPU capacity, memory, disk I/O, and network I/O that your application consumes. - Scott Barber
  • Performance Unit Test Any performance test that targets a module of code where that module is any logical sub-set of the entire existing code base of the application. Commonly tested modules include functions, procedures, routines, objects, methods and classes. Performance Unit Tests are frequently created and conducted by the developer who wrote the module of code being tested. - Scott Barber
  • Software Performance Goals Performance related characteristics of the product under test that are desired to be met prior to product release, but which are not strictly mandatory. - Scott Barber
  • Software Performance Investigation An activity based on collecting information related to speed, scalability and/or stability characteristics about the product under test that may have value in determining or improving the quality of the product. - Scott Barber
  • Software Performance Requirements Performance related characteristics of the product under test that must be met in order for the product to be released. Performance requirements are mandated via legal contract or service level agreement. - Scott Barber
  • Software Performance Testing A technical investigation done to determine or validate speed, scalability and/or stability characteristics of the product under test. - Scott Barber
  • Software Performance Validation An activity that compares speed, scalability and/or stability characteristics of the product under test to the expectations that have been set or presumed for that product. - Scott Barber
  • Spike Test A performance test focused on determining or validating performance characteristics of the product under test when subjected to workload models and load volumes that repeatedly increase beyond anticipated production operations for short periods of time. Spike testing is a subset of stress testing. - Scott Barber
  • Stress Test A performance test focused on determining or validating performance characteristics of the product under test when subjected to workload models and load volumes beyond those anticipated during production operations. Stress tests may also subject the product to other stressful conditions, such as limited memory, insufficient disk space, or server failure. - Scott Barber
  • User Community Model Models that enhance the application usage profile(s) by adding distribution of activities, hourly usage volume and other necessary variables to design realistic performance tests. - Scott Barber

Last edited Jul 10, 2007 at 6:16 PM by mycodeplexuser, version 10
