Apdex

Apdex (Application Performance Index) is an open standard developed by an alliance of companies for measuring performance of software applications in computing. Its purpose is to convert measurements into insights about user satisfaction, by specifying a uniform way to analyze and report on the degree to which measured performance meets user expectations. It is based on counts of "satisfied", "tolerating", and "frustrated" users, given a maximum satisfactory response time of t, a maximum tolerable response time of 4t, and where users are assumed to be frustrated above 4t. The score is equivalent to a weighted average of these user counts with weights 1, 0.5, and 0, respectively.

Problems addressed


When engaging in application performance management, for example in the course of website monitoring, enterprises collect many measurements of the performance of information technology applications. However, this measurement data may not provide a clear and simple picture of how well those applications are performing from a business point of view, a characteristic desired in metrics that are used as key performance indicators. Reporting several different kinds of data can cause confusion. Reducing measurement data to a single well understood metric is a convenient way to track and report on quality of experience.

Measurements of application response times, in particular, may be difficult to evaluate because:

  • Viewed alone, they do not reveal whether people using the application consider its behavior to be highly responsive to their particular needs, merely tolerable, or frustratingly slow.
  • Using averages to summarize many measurement samples washes out important details in the measurement distribution, and may obscure evidence that many users were frustrated by application response times significantly slower than the average value.
  • The objectives (or goals or targets) set for response time values are not uniform across different applications. This makes it difficult to view comparable data for several applications side-by-side (such as in a digital dashboard), and see quickly which are meeting their objectives and which are not.

The Apdex method seeks to address these problems.

Apdex method


Proponents of the Apdex standard believe that it offers a better way to "measure what matters". The Apdex method converts many measurements into one number on a uniform scale of 0 to 1 (0 = no users satisfied, 1 = all users satisfied). The resulting Apdex score is a numerical measure of user satisfaction with the performance of enterprise applications. This metric can be used to report on any source of end-user performance measurements for which a performance objective has been defined.

The Apdex formula is the number of satisfied samples plus half of the tolerating samples plus none of the frustrated samples, divided by the total number of samples:

    Apdex_t = (Satisfied count + Tolerating count / 2) / Total samples

where the subscript t is the target time, and the maximum tolerable time is assumed to be 4 times the target time. The ratio is therefore directly related to users' perception of satisfactory application responsiveness.

Example: assuming a performance objective of 3 seconds or better, and a tolerable standard of 12 seconds or better, given a dataset with 100 samples where 60 are below 3 seconds, 30 are between 3 and 12 seconds, and the remaining 10 are above 12 seconds, the Apdex score is:

    Apdex_3 = (60 + 30/2) / 100 = 0.75

The Apdex formula is equivalent to a weighted average, where a satisfied user is given a score of 1, a tolerating user is given a score of 0.5, and a frustrated user is given a score of 0.
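
As an illustrative sketch only, and not part of the Apdex specification, the following Python function computes an Apdex score from a list of response-time samples; the function name apdex_score and its parameters are hypothetical, chosen here to mirror the formula above.

    def apdex_score(response_times, t):
        """Return the Apdex score for response_times (in seconds), given a
        target time t; the tolerable threshold is assumed to be 4 * t."""
        if not response_times:
            raise ValueError("at least one sample is required")
        satisfied = sum(1 for r in response_times if r <= t)
        tolerating = sum(1 for r in response_times if t < r <= 4 * t)
        # Frustrated samples (slower than 4t) add nothing to the numerator.
        return (satisfied + tolerating / 2) / len(response_times)

    # The worked example above: 60 satisfied, 30 tolerating, and 10
    # frustrated samples against a 3-second target yield a score of 0.75.
    samples = [2.0] * 60 + [8.0] * 30 + [15.0] * 10
    print(apdex_score(samples, t=3.0))  # 0.75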

Apdex Alliance


The Apdex Alliance, headquartered in Charlottesville, Virginia, was founded in 2004 by Peter Sevcik, President of NetForecast, Inc. The Alliance is a group of companies that are collaborating to establish the Apdex standard. These companies have perceived the need for a simple and uniform way to report on application performance, are adopting the Apdex method in their internal operations or software products, and are participating in the work of refining and extending the definition of the Apdex specifications. Alliance contributing members who incorporate the standard into their products may use the Apdex name or logo where the Alliance has certified them as compliant.

In January 2007, the Alliance comprised 11 contributing member companies, and over 200 individual members. While the number of contributing companies has remained relatively stable, individual membership grew to over 800 by December 2008, and reached 2000 in 2010. In 2008 the Alliance began publishing a blog, the Apdex Exchange, and in 2010, began offering educational Webinars. These activities address performance management topics, with an emphasis on how to apply the Apdex methodology.
