Archive for the ‘Analysis Services’ Category

SQL Agent and SSAS Commands

May 30, 2011

SQL Agent can be used to execute SSAS commands, such as processing a dimension, and I can see that it is useful when you want to quickly schedule an ad hoc processing task, but there is a major drawback you need to be aware of.
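For context, a SQL Agent "SQL Server Analysis Services Command" job step just takes raw XMLA. Below is a minimal sketch of a dimension processing command (the object IDs are hypothetical):

    <Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <Object>
        <!-- the target database is identified inside the command body -->
        <DatabaseID>MyOlapDB</DatabaseID>
        <DimensionID>Dim Customer</DimensionID>
      </Object>
      <Type>ProcessFull</Type>
    </Process>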

You are not able to specify the database to connect to, and whilst you may think that does not matter because the database is identified in the command itself, it really does matter. Below are extracts from a Profiler trace of a processing task which was executed using a SQL Agent job step.

[Image: Profiler trace extracts from the SQL Agent processing task]

So, what is happening here? Read more…

SSAS Exposes a W2K3 bug

April 6, 2011

I recently came across a bug in Windows 2003 SP2 which causes a blue screen of death, and SSAS was exposing it. In my scenario SSAS was the match, a file system cache function was the fuse and a BSOD the payload… If you’re an all-Windows 2008 shop then there is no need to read on, and I am envious.

One of the configurations I always set on an SSAS server is LimitFileSystemCache, and one reason is that if you do not, the file system cache can grow to the point where it has a negative impact on SSAS. When you set this configuration SSAS will use the API functions “GetSystemFileCacheSize” and “SetSystemFileCacheSize”, and this is where the problem lies.
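As a rough sketch, this is the kind of entry involved in msmdsrv.ini; the property names and values below are an assumption on my part, so verify them against your own instance’s ini file:

    <!-- hypothetical value: cap the Windows system file cache at 4GB -->
    <LimitSystemFileCacheSizeMB>4096</LimitSystemFileCacheSizeMB>
    <LimitSystemFileCachePeriod>1000</LimitSystemFileCachePeriod>

SSAS applies the limit by calling SetSystemFileCacheSize at service startup, which is exactly the code path that trips the bug described below.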

When SSAS starts for the first time it will set the file system cache, and your server will be fine until you have to stop and then start the SSAS service without rebooting the server. The problem lies in the fact that when SSAS stops, the operating system cleans up the memory relating to the file system cache, but a structure is not reset, and as soon as it is accessed it causes the blue screen. The article and hotfix for this problem is KB979128.

So, if you have preallocated your SSAS memory, which is something I often do, then you’re quite likely to get the blue screen when SSAS starts, but you might not, as it’s very hit and miss. I found that if I did not get it at startup I would definitely get it when I pushed the server to its limits.

Now, should you rush out and apply this patch? Well, it’s always good to ensure stability, but I have been supporting SSAS servers for years and the first time I saw this blue screen was a few months ago in test, so you can at least take your time to test the patch before rolling it out…

It’s also worth noting that users of other tools that set the file system cache may well experience this problem, and I mention it because it’s not uncommon to limit the file system cache for SQL Server database deployments…

Below is the OS bug check we got when we hit the problem.

Reason Code: 0x805000f
Bugcheck String: 0x0000000a (0x0000000000000028, 0x0000000000000002, 0x0000000000000000, 0xfffff800010e9a8b)

SSAS Deployment essentials

March 30, 2011

Over recent years I have enjoyed the privilege of working on a number of different SSAS deployments. Some are huge, some are complex, and some are huge and complex, and most interestingly they all behave differently.

What I want to share today is what I consider to be essential for an SSAS installation. This covers what I expect to see installed to complement SSAS, the configuration settings that should be changed, and the literature that should be at your fingertips.

Read more…

SSAS 2008 R2 – Little Gems

December 3, 2010

I have spent the last few days working with SSAS 2008 R2 and noticed a few small enhancements which many people probably won’t notice, but I will list them here and explain why they are important to me.

New profiler events

Commit: This is a new subclass event for “Progress Report End”. It represents the elapsed time taken for the server to commit your data. It is important because for the duration of this event a server-level lock is in place, blocking all incoming connections and causing timeouts. You can read about the server-level lock here. Prior to 2008 R2 the only way to calculate the duration of the commit was to subtract the time of the event immediately preceding the batch completion event from the end time of the batch completion event. You want to be looking out for this event if people are complaining about connectivity…

File Load and Save: This is a whole new category with its own events, but you must remember to tick the box to show all event categories, otherwise it stays hidden. As soon as I saw it I thought: awesome! As the name describes, it exposes file activity and gives a whole new level of insight into the black box known as SSAS. You may be wondering how it is useful to know which files are being accessed, but when you’re trying to understand what’s going on under the hood and where a bottleneck might be, it’s invaluable, and I have been using it these past two days whilst working on a problem which I will discuss in a future post.

Vertipaq SE Query: Not much to say here other than I would expect them to be fast…

Other enhancements

In August 2009 I posted about The cost of SSAS Metadata management and discussed a hidden performance penalty. Well, I am pleased to say that Microsoft have fixed this particular problem, and when you process a partition it will no longer check for dependencies across every partition in the database… Now, before you get excited and decide to create thousands of partitions and throw away your partition merging scripts, you should wait for the post I alluded to earlier in this post, as there are other significant costs and penalties for having too much metadata…

Last but not least, a big thank you to MS for improving the time it takes to expand an object in SSMS! With SSAS 2005 and 2008 I could be waiting anywhere from 30 seconds to a couple of minutes to expand a node in SSMS, which is very frustrating when you’re in a rush, but with SSAS 2008 R2 node expansion is now instantaneous! So thank you again; it may be a small fix but it’s a big time saver!

SSAS Native v .net Provider

November 30, 2010

Recently I was investigating why a new server, which is in its parallel-running phase, was taking significantly longer to process the daily data than the server it is due to replace.

The server has SQL Server and SSAS installed, so the problem was not likely to be in the network transfer, as the connection uses shared memory. As I dug around the SQL DMVs I noticed in sys.dm_exec_connections that the SSAS connection had a packet size of 8000 bytes instead of the usual 4096 bytes, and from there I found that the data source had been configured with the wrong provider. What was really interesting, and the point of this blog, is the performance difference, which I have shown below.

  Provider                       Rows per second
  .Net SqlClient Data Provider            30,000
  SQL Server Native Client 10            240,000

That’s right! For a single partition, the native client was able to process 240,000 rows per second, whereas the .net client maxed out at 30,000. That means the SQL Server Native Client delivered eight times the throughput! I knew that the .net providers were slower, but I had never gathered any metrics before. If you’re looking after an SSAS server I would definitely recommend taking a few minutes to check which provider is configured in the data source.
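If you want to check, the difference shows up in the data source definition. A hedged sketch of the two variants (the IDs, server and catalog names are hypothetical; the provider part is the point):

    <!-- native client: OLE DB provider named in the connection string -->
    <DataSource>
      <ID>DW</ID>
      <Name>DW</Name>
      <ConnectionString>Provider=SQLNCLI10.1;Data Source=MYSERVER;Initial Catalog=DW;Integrated Security=SSPI</ConnectionString>
    </DataSource>

    <!-- .net SqlClient: managed provider named separately, no Provider= keyword -->
    <DataSource>
      <ID>DW</ID>
      <Name>DW</Name>
      <ManagedProvider>System.Data.SqlClient</ManagedProvider>
      <ConnectionString>Data Source=MYSERVER;Initial Catalog=DW;Integrated Security=True</ConnectionString>
    </DataSource>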

Another point to consider is that you may have a custom solution doing your ETL and utilising the .net providers. This would also be constrained by the .net provider’s throughput, and a switch over to SSIS could dramatically improve your ETL.

SSAS <PreAllocate>: What you should know

July 18, 2010

Preallocating memory for SSAS running on Windows 2003 is a good thing, but as with all good things it is essential to know the behavioural changes you will experience, some of which may not be so obvious.

My observations relate to SSAS 2008 SP1 CU8 running on Windows 2003 SP2.

Why PreAllocate?

In my opinion there are two reasons, which I detail below.

  • The first is the widely stated performance reason surrounding Windows 2003 memory management. In a nutshell, Windows 2003 did not scale well with many small memory allocations due to fragmentation and the like, so you allocate it all up front. Life gets better in Windows 2008, as detailed by the SQLCAT team.
  • The second reason is to ensure SSAS is going to get its slice of the memory, and this is very important if you’re not running on a dedicated SSAS box.

So, what should I know?

  • When the service starts (don’t forget server startup), if you have assigned "Lock Pages in Memory" to your service account, expect your server to be totally unresponsive for a period of time. Do not panic: the duration of the freeze depends on the amount of memory preallocated, and once it’s done the server becomes responsive again. Make sure the people working with the server know this…
  • Never ever set <PreAllocate> equal to or greater than <LowMemoryLimit>, because if you do, the memory cleaner thread is going to spin up and remove data pretty much as soon as it gets into memory. This will seriously hurt performance, as you’re effectively disabling any caching (see the sketch after this list).
  • The shrinkable and non-shrinkable Perfmon memory counters are no longer accurate. The counters still have “value” when troubleshooting, but you must factor in that at least their starting points are wrong.
  • When a full memory dump occurs, that dump will be at least the size of the preallocated memory. So if you preallocate 40GB but SSAS has only written to 2GB of memory, it is still going to be a 40GB dump, so make sure you have the disk space! Hopefully this is not a situation you find yourself in very often.
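To make the second bullet concrete, here is a minimal sketch of the relevant msmdsrv.ini section (the numbers are hypothetical; values under 100 are interpreted as percentages of physical memory):

    <Memory>
      <!-- hypothetical: preallocate 40% of RAM, kept well below LowMemoryLimit -->
      <PreAllocate>40</PreAllocate>
      <!-- the cleaner thread starts evicting above this threshold -->
      <LowMemoryLimit>65</LowMemoryLimit>
      <TotalMemoryLimit>80</TotalMemoryLimit>
    </Memory>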

I hope you find this information useful!

SSAS – Synchronisation performance

April 8, 2010

I’ve always thought of SSAS synchronisation as a clever file-mirroring utility built into SSAS, and I have never considered the technology as bringing any performance gains to the table. So it’s a good job I like to revisit areas… 🙂
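For reference, synchronisation is driven by a single XMLA command; a minimal sketch of the kind used in these tests (server and database names hypothetical):

    <Synchronize xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <Source>
        <!-- pull the database from the source server -->
        <ConnectionString>Provider=MSOLAP;Data Source=SOURCESERVER;Integrated Security=SSPI</ConnectionString>
        <Object>
          <DatabaseID>MyOlapDB</DatabaseID>
        </Object>
      </Source>
      <SynchronizeSecurity>SkipMembership</SynchronizeSecurity>
      <ApplyCompression>true</ApplyCompression>
    </Synchronize>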

I decided to compare the performance of Robocopy and SSAS synchronisation between two Windows 2003 servers running SSAS 2008 SP1 CU7 with 1Gb network links. For the Robocopy of the data directory I used the SQLCAT Robocopy script. The results are shown below.

                            SSAS Sync    Robocopy
  Full DB (138GB)           58 min       96 min
  Single Partition (10GB)   3 min 37s    7 min 46s

The Full DB copy is as it says: no data is present on the target, so the whole lot is synchronised.

Single Partition is where, after the full DB is processed, I then process a single partition (and yes, the partition is around 10GB) and then run the synchronisation.

The results left me gobsmacked! At first I simply could not believe that the SSAS sync could copy the data that much faster than Robocopy, but I repeated my tests a number of times and it does!

Yes, going by the numbers above, SSAS synchronisation is between roughly 65% faster (full database) and 115% faster (single partition) than Robocopy… I’ve got to say that when Microsoft did the rewrite for 2008 they certainly did a fine job!

Before I finish up with this blog entry I must say there are always considerations as to whether synchronisation is appropriate for you, and I will be posting soon about all the different options and the considerations.