XWiki Performance Evaluation

Last modified by mflorea on 2011/04/04 11:11

Performance Tests

Before we start making any performance improvements, we need to define a set of tests, automated as much as possible, to:

  • evaluate the current state
  • prove that we're making improvements
  • catch regressions

These tests should be grouped based on what they measure:

Response time

Response time is a user concern and it must be addressed both on the server and on the client side. It's not enough to have a highly responsive server if the browser spends a lot of time rendering complex CSS.

Server response time

The first step is to define the test cases. Here's a list of things we need to test:

  • XWiki actions (e.g. login, view, edit, save)
  • REST resources
  • Services (e.g. livetable results)
  • Standard wiki pages (e.g. Main.WebHome)

We also need to test if XWiki behaves well when there are many concurrent requests.
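
A concurrent-load check can be scripted with a plain thread pool; here is a minimal Java sketch (the class name is hypothetical, and in a real test each task would issue an HTTP request against the wiki — the request logic is left to the caller):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Fires many requests concurrently and reports how many completed.
// The Runnable stands in for the actual HTTP request to the wiki.
public class ConcurrencyHarness {
    public static int run(int threads, int requests, Runnable request) {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        AtomicInteger completed = new AtomicInteger();
        for (int i = 0; i < requests; i++) {
            pool.execute(() -> {
                request.run();
                completed.incrementAndGet();
            });
        }
        pool.shutdown();
        try {
            pool.awaitTermination(1, TimeUnit.MINUTES);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return completed.get();
    }
}
```

A test would then assert that all requests completed (and, with timing added, that they did so within an acceptable bound).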

We can use JMeter to record such tests and run them on our CI server. The article Automated performance testing using JMeter and Maven covers this subject.
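
Whatever tool records the samples, a CI assertion would typically compare a response-time percentile against a threshold rather than the mean, since a few slow outliers matter to users. A sketch of the computation (class and method names are illustrative, using the nearest-rank method):

```java
import java.util.Arrays;

// Computes a response-time percentile from recorded samples -- the kind of
// figure (e.g. the 90th percentile) a CI job could assert stays under a threshold.
public class ResponseTimes {
    public static long percentile(long[] samplesMillis, double p) {
        long[] sorted = samplesMillis.clone();
        Arrays.sort(sorted);
        // Nearest-rank method: the smallest sample such that at least
        // p percent of all samples are less than or equal to it.
        int rank = (int) Math.ceil(p / 100.0 * sorted.length);
        return sorted[Math.max(0, rank - 1)];
    }
}
```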

Note that these tests are not meant to show us what part of the code is wrong but to signal performance problems as soon as they appear. We can use profiling tools like YourKit to further investigate the problems and fix them, as shown in the following sections.

Browser render time

Writing automated tests for browser render time is not easy, first because there are many browsers (and many versions of each) to test, and second because few tools are available. At first I thought about using Selenium, but it seems it's not reliable (see Performance testing using Selenium).

Test cases for browser render time should cover standard wiki pages.

Network throughput

The goal is to have fewer requests and smaller responses.

Number of requests

Usually a request for a wiki page is followed by many requests to style sheets, external JavaScript code, images, data (AJAX requests) or even other pages (loaded in frames). In order to measure the number of these requests we need a tool that integrates with the browser.
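
A browser-integrated tool gives exact numbers, but a rough server-side approximation is possible by counting the resource references in the returned HTML. A sketch (class name and regular expression are my own, and the count ignores resources loaded dynamically by JavaScript):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Rough count of the follow-up requests a page triggers: stylesheets,
// scripts, images and framed pages referenced from the HTML markup.
public class SubRequestCounter {
    private static final Pattern RESOURCE = Pattern.compile(
        "<(?:link|script|img|iframe)\\b[^>]*(?:href|src)\\s*=",
        Pattern.CASE_INSENSITIVE);

    public static int count(String html) {
        Matcher matcher = RESOURCE.matcher(html);
        int count = 0;
        while (matcher.find()) {
            count++;
        }
        return count;
    }
}
```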

Response size

These tests are similar to those that measure server response time, except that they record response size instead.
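
Since compression strongly affects what actually travels over the network, a response-size test would usefully record both the raw and the gzipped size of each body. A sketch (class name is illustrative):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

// Measures a response body before and after gzip compression -- the two
// numbers a response-size test would record for each request.
public class ResponseSize {
    public static int rawSize(String body) {
        return body.getBytes(StandardCharsets.UTF_8).length;
    }

    public static int gzippedSize(String body) {
        try {
            ByteArrayOutputStream buffer = new ByteArrayOutputStream();
            try (GZIPOutputStream gzip = new GZIPOutputStream(buffer)) {
                gzip.write(body.getBytes(StandardCharsets.UTF_8));
            }
            return buffer.size();
        } catch (IOException e) {
            // In-memory streams should not fail.
            throw new RuntimeException(e);
        }
    }
}
```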

Resource usage

Resource usage should be addressed both on the server and on the client side.

Server CPU and memory usage

JMeter can be used to monitor a Tomcat server. Maybe we can port Tomcat's status servlet (which uses JMX) to Jetty so that we can write automated tests for the standard XWiki Enterprise distribution. 
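
The figures Tomcat's status servlet displays come from the standard platform MBeans, which are available in any JVM through java.lang.management, so a Jetty port could read them the same way. A minimal sketch (class name is mine):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

// Reads heap usage through the standard platform MBeans -- the same JMX
// data Tomcat's status servlet shows; a Jetty equivalent could expose
// these numbers for automated resource-usage tests.
public class MemoryStatus {
    public static long usedHeapBytes() {
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = memory.getHeapMemoryUsage();
        return heap.getUsed();
    }
}
```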

Browser CPU and memory usage

We need to evaluate how efficient our JavaScript code is in terms of CPU and memory usage. We also need to catch memory leaks in our JavaScript code. For this we need a tool that monitors the browser.

Profiling Tools

Response time

Server response time

We can use YourKit to profile the server-side code. YourKit has a Probe API which can be used to instrument specific methods. The advantage over writing aspects is that we don't need the source code or byte code of the instrumented method in order to compile the probe. I've already written two simple probes that show:

  • which Velocity templates are evaluated and the amount of time spent
  • which Velocity macros (those defined using the #macro directive) are executed and the amount of time spent.
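
Leaving aside YourKit's Probe API itself (whose exact class and method signatures aren't reproduced here), the bookkeeping these probes perform amounts to accumulating the time spent per template or macro name. A conceptual sketch (the class is hypothetical; the actual probe would hook method entry and exit and report the measured duration):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

// Conceptual sketch of what such a probe records: total time spent
// evaluating each named Velocity template or macro. The probe's exit
// hook would call record() with the measured duration.
public class EvaluationTimes {
    private final Map<String, AtomicLong> totals = new ConcurrentHashMap<>();

    public void record(String name, long elapsedNanos) {
        totals.computeIfAbsent(name, k -> new AtomicLong()).addAndGet(elapsedNanos);
    }

    public long totalNanos(String name) {
        AtomicLong total = totals.get(name);
        return total == null ? 0 : total.get();
    }
}
```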