Analysis Services Testing Best Practices

An important but often forgotten part of building Analysis Services cubes is performance testing. We usually verify that queries return the correct results, but rarely do enough performance and regression testing.

In this post I will briefly outline some good practices for this.


  1. A real set of production queries
    1. Identify whether these queries come from different applications that query the cube; split them up accordingly, and also create a “mix” set which includes a sample of all of them.
    2. Make sure that when you run these queries they will actually “hit” data in your cube. There is no point having lots of useful queries if they are for a date which isn’t in your cube.
  2. What concurrency, i.e. how many concurrent users, do you want to test with?
    1. Pick REALISTIC numbers. Don’t say 1000 if the most you will ever have running on the cube is 20; testing for 1000 would likely be a waste of your time.
    2. If you do think you will have higher concurrency, then choose intermediate numbers. So if the maximum concurrency will be 1000, try testing at 10, 100, 400 and 1000. This will give you an indication of how your cube performs as concurrency increases for your different application queries.
  3. A query test tool
    1. This could be something like the ASCMD_StressTestingScripts from the Analysis Services codeplex project (Stacia Misner has also written some more info on ASCMD usage)
    2. Build your own. This is not a simple undertaking, but if it’s for a long-term project then it may well be worth it in the long run.
  4. A Perfmon collector set
    1. This is in case you see any surprises in the test results. Here is a minimal set of counters that should be included:

    \MSOLAP$MSAS11:Storage Engine Query\*
    \Process(msmdsrv)\% Privileged Time
    \Process(msmdsrv)\% Processor Time
    \Process(msmdsrv)\% User Time
    \Process(*)\% Privileged Time
    \Process(*)\% Processor Time
    \Process(*)\% User Time
    \Processor(_Total)\% Privileged Time
    \Processor(_Total)\% Processor Time
    \Processor(_Total)\% User Time
    \System\Context Switches/sec

    You may also want to monitor the network, just to check you aren’t getting slow network response times or maxing out the network connection (on both the query server and the SSAS server).
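Rather than clicking through the Perfmon UI, the collector set above can be registered from the command line with logman. Below is a minimal sketch: the collector name `SSASTest`, the output directory, and the 5-second sample interval are all placeholders, the `MSAS11` instance name assumes a SQL Server 2012 install (adjust for yours), and the logman call is guarded so the counter-file generation still works where the tool is absent.

```python
import shutil
import subprocess
from pathlib import Path

# The counters from the list above. MSOLAP$MSAS11 assumes a SQL Server 2012
# instance named MSAS11 -- change the object name to match your install.
COUNTERS = [
    r"\MSOLAP$MSAS11:Storage Engine Query\*",
    r"\Process(msmdsrv)\% Privileged Time",
    r"\Process(msmdsrv)\% Processor Time",
    r"\Process(msmdsrv)\% User Time",
    r"\Process(*)\% Privileged Time",
    r"\Process(*)\% Processor Time",
    r"\Process(*)\% User Time",
    r"\Processor(_Total)\% Privileged Time",
    r"\Processor(_Total)\% Processor Time",
    r"\Processor(_Total)\% User Time",
    r"\System\Context Switches/sec",
]

def create_collector(name: str, out_dir: str) -> Path:
    """Write a counter file and register a Perfmon collector set via logman."""
    cf = Path(out_dir) / "ssas_counters.txt"
    cf.parent.mkdir(parents=True, exist_ok=True)
    cf.write_text("\n".join(COUNTERS) + "\n")
    if shutil.which("logman"):  # logman is a Windows tool; skip elsewhere
        subprocess.run(
            ["logman", "create", "counter", name,
             "-cf", str(cf),          # counter file written above
             "-si", "5",              # sample every 5 seconds
             "-o", str(Path(out_dir) / name)],
            check=True)
    return cf
```

Once created, `logman start SSASTest` / `logman stop SSASTest` control the collector set from the test-run script.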

  5. Caching
    1. Test on a cold cache and a warm cache. I would recommend a couple of runs of each; I prefer cold, warm, cold, warm for practical reasons. To clear the cache you can use a couple of methods:
      1. The ASSP codeplex project, which contains assp.ClearAllCaches(); this clears all the SSAS caches as well as the file system cache
      2. RAMMap. To use this:
        1. Open RAMMap. It can be left open during testing, but note it can take quite a long time (around 5 minutes) to open on a system with a lot of memory (1/2 TB)
        2. Stop the SSAS service
        3. In RAMMap, open the Empty menu and choose “Empty System Working Set”, then “Empty Modified Page List”, then “Empty Standby List”
        4. Start the SSAS service
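The SSAS caches (though not the file system cache, which is why the methods above exist) can also be cleared with a plain XMLA ClearCache command, executed through ascmd or an SSMS XMLA window. A sketch, where the server name, database ID and output directory are placeholders, and the ascmd invocation is guarded since the tool may not be on the path:

```python
import shutil
import subprocess
from pathlib import Path

# Standard XMLA ClearCache command; {database_id} is filled in below.
CLEAR_CACHE_XMLA = """\
<ClearCache xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>{database_id}</DatabaseID>
  </Object>
</ClearCache>
"""

def clear_ssas_cache(server: str, database_id: str, out_dir: str) -> Path:
    """Write a ClearCache command to a file and run it via ascmd if present.

    Note: this clears only the SSAS caches; for a truly cold run you still
    need assp.ClearAllCaches() or RAMMap to empty the file system cache.
    """
    xmla = Path(out_dir) / "clearcache.xmla"
    xmla.parent.mkdir(parents=True, exist_ok=True)
    xmla.write_text(CLEAR_CACHE_XMLA.format(database_id=database_id))
    if shutil.which("ascmd"):  # ascmd.exe from the codeplex samples
        subprocess.run(["ascmd", "-S", server, "-i", str(xmla)], check=True)
    return xmla
```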
  6. Plan your tests before starting
    1. I find Excel pretty good for logging all the tests, and you can generate the test names as you go. For example, include the application the queries are from, the number of queries, concurrency, warm/cold cache, run number, etc.
    2. Testing can actually be quite boring (surprise!), so it can be easy to lose concentration, especially as it can take quite a long time depending on how many tests you have, the number of queries, the cube size, etc.
    3. Name your tests and make sure to keep the output from the tests organised. It is quite annoying to complete your tests and then realise you’ve lost track of where you were and have to start again.
  7. Test results
    1. Consider excluding the top and bottom 1% of your query timings. You may find that there is one query that just keeps running at the end and skews the test time.
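The 1% trim mentioned above takes only a few lines once you have a list of query durations. A sketch (the timings list in the example is made up, with one deliberate straggler):

```python
def trimmed(durations, pct=0.01):
    """Drop the top and bottom pct of query durations before summarising."""
    s = sorted(durations)
    k = int(len(s) * pct)   # number of samples to drop at each end
    return s[k: len(s) - k] if k else s

# Example: 100 well-behaved queries plus one straggler that runs on at the end
times = [0.4, 0.5, 0.5, 0.6, 0.7] * 20 + [95.0]
core = trimmed(times)       # the 95-second outlier no longer skews the totals
```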


When running your tests, make sure you have a strict pattern to follow. For example:

  • Update test log to indicate current test in progress
  • Create folder for current test
  • Clear SSAS cache
  • Start Perfmon
  • Start the test and wait for it to complete
  • Stop Perfmon and rename the output to the test name
  • Copy the perfmon and query results to the folder for the current test
  • Update the log to say the test is completed
  • Go back to the start of the pattern
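The pattern above is exactly the kind of bookkeeping worth scripting. A sketch of one pass, where the cache clear and perfmon steps are left as hedged hooks (the `SSASTest` collector name assumes a logman collector set exists, and `run_queries` is whatever callable drives your query tool):

```python
import datetime
import shutil
import subprocess
from pathlib import Path

def run_test(test_name: str, results_root: str, run_queries) -> Path:
    """One pass of the test pattern: log, clear cache, collect, run, file away."""
    test_dir = Path(results_root) / test_name
    test_dir.mkdir(parents=True, exist_ok=True)
    log = Path(results_root) / "test_log.txt"

    def note(msg):
        # Update the test log so you always know which test is in progress
        with log.open("a") as f:
            f.write(f"{datetime.datetime.now().isoformat()} {test_name}: {msg}\n")

    note("started")
    # clear_ssas_cache(...)            # placeholder: clear the SSAS cache here
    if shutil.which("logman"):         # Windows-only: start the collector set
        subprocess.run(["logman", "start", "SSASTest"], check=True)
    run_queries(test_dir)              # run the query set; write output to test_dir
    if shutil.which("logman"):
        subprocess.run(["logman", "stop", "SSASTest"], check=True)
    note("completed")
    return test_dir
```

Because every run writes into its own named folder and appends to the log, the output stays organised even when you step away mid-session.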

As you can see, having a pattern like this can really help reduce the number of mistakes during testing, and there will be some regardless of how well you keep track.

The testing pattern above can be used along with the SSAS performance and operations guides to really tune SSAS for your environment, or just to give you confidence that the changes you’ve made aren’t going to fall off a cliff as soon as people start querying the cube after release.

Last but certainly not least:

BACKUP your test results!

Make sure that no one else is running queries on the server at the same time!


  1. Weelin
    November 21, 2012 at 10:54 am

    Nice article, Chris!
