
Archive for the ‘Configuration’ Category

SSAS Deployment essentials

March 30, 2011

Over recent years I have enjoyed the privilege of working on a number of different SSAS deployments. Some are huge, some are complex, some are both huge and complex, and, most interestingly, they all behave differently.

What I want to share today is what I consider to be essential for an SSAS installation. This covers what I expect to see installed to complement SSAS, the configuration settings that should be changed, and the literature that should be at your fingertips.

Read more…

SSAS <PreAllocate>: What you should know

July 18, 2010

Preallocating memory for SSAS running on Windows 2003 is a good thing, but as with all good things it is essential to know the behavioural changes you will experience, some of which may not be so obvious.

My observations relate to SSAS 2008 SP1 CU8 running on Windows 2003 SP2.

Why PreAllocate?

In my opinion there are two reasons, which I detail below.

  • The first is the widely stated performance reason surrounding Windows 2003 memory management. In a nutshell, Windows 2003 did not scale well with many small memory allocations due to fragmentation and similar overheads, so you allocate the memory up front. Life gets better in Windows 2008, as detailed by the SQLCAT team.
  • The second reason is to ensure SSAS is going to get its slice of the memory, which is very important if you're not running on a dedicated SSAS box.

So, what should I know?

  • When the service starts (don't forget server startup), if you have assigned "Lock Pages in Memory" to your service account, expect your server to be totally unresponsive for a period of time. Do not panic: the duration of the freeze depends on the amount of memory preallocated, and once it's done the server becomes responsive again. Make sure the people working with the server know this…
  • Never ever set <PreAllocate> equal to or greater than <LowMemoryLimit>, because if you do, the memory cleaner thread is going to spin up and remove data pretty much as soon as it gets into memory. This will seriously hurt performance, as you're effectively disabling any caching.
  • The shrinkable and nonshrinkable perfmon memory counters are no longer accurate. The counters still have value when troubleshooting, but you must factor in that at least their starting points are wrong.
  • When a full memory dump occurs, that dump will be at least the size of the preallocated memory. So, if you preallocate 40 GB but SSAS has only written to 2 GB of memory, it's still going to be a 40 GB dump, so make sure you have the disk space! Hopefully this is not a situation you find yourself in very often.
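To make the PreAllocate/LowMemoryLimit relationship concrete, here is an illustrative msmdsrv.ini fragment. The percentage values are hypothetical examples, not recommendations; the point is simply that PreAllocate sits under the Memory element and must stay below LowMemoryLimit:

```xml
<ConfigurationSettings>
  <Memory>
    <!-- Values of 100 or less are interpreted as a percentage of total
         physical memory. Keeping PreAllocate well below LowMemoryLimit
         stops the cleaner thread evicting data as soon as it arrives. -->
    <PreAllocate>20</PreAllocate>
    <LowMemoryLimit>65</LowMemoryLimit>
    <TotalMemoryLimit>80</TotalMemoryLimit>
  </Memory>
</ConfigurationSettings>
```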

I hope you find this information useful!

SSAS 2008 – INI Files and in place upgrades

September 24, 2009

Being the suspicious person I am, I wondered if there would be any differences in the MSMDSRV.ini of an instance upgraded from 2005 as opposed to a clean install.

Now, obviously I expect an in-place upgrade to preserve my settings and add any new ones; it should not overwrite anything, since I might have changed from the defaults for a good reason…

Below is what I found, followed by my thoughts.

Setting                    In-Place Upgrade (Effectively 2005)   2008 Clean Install
-------------------------  ------------------------------------  ------------------
<ServerSendTimeout>        -1                                    60000
<ServerReceiveTimeout>     -1                                    60000
<AggregationPerfLog>       1                                     DELETED
<DataStoreStringPageSize>  8192                                  65536
<MinCalculationLRUSize>    20                                    DELETED
<MdxSubqueries>            3                                     15

Looking at what has changed, these appear to be settings which may well have been tuned as a result of lessons learnt at Microsoft. The removal of AggregationPerfLog is, I suspect, cosmetic, and the setting probably does nothing, since there is another one called AggregationPerfLog2 which I suspect replaces it. It's also quite likely the same is the case with MinCalculationLRUSize.

An important thing to take away here is that an in-place upgrade may not perform or behave the same way as a clean install, because by default its INI file is going to be different. In my case I'm checking out the impact of the settings, with a view to adding a step to our upgrade path that changes them to the clean install values.
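If you want to find differences like these on your own servers, msmdsrv.ini is plain XML, so a short script can diff two copies of it. This is a minimal sketch using only the Python standard library; the function names and file paths are my own invention, not part of any SSAS tooling:

```python
# Sketch: diff two msmdsrv.ini files (plain XML) to spot settings that
# differ between, say, an in-place upgrade and a clean install.
import xml.etree.ElementTree as ET

def flatten(elem, prefix=""):
    """Flatten an XML tree into {path: text} for leaf elements."""
    items = {}
    for child in elem:
        path = f"{prefix}/{child.tag}" if prefix else child.tag
        if len(child):                       # container element: recurse
            items.update(flatten(child, path))
        else:                                # leaf element: record its value
            items[path] = (child.text or "").strip()
    return items

def diff_ini(old_xml, new_xml):
    """Return {setting: (old_value, new_value)} for every setting that
    changed, was added, or was removed ('DELETED' marks a missing side)."""
    old = flatten(ET.fromstring(old_xml))
    new = flatten(ET.fromstring(new_xml))
    changed = {}
    for key in sorted(set(old) | set(new)):
        if old.get(key) != new.get(key):
            changed[key] = (old.get(key, "DELETED"), new.get(key, "DELETED"))
    return changed
```

Pointed at real files, usage would look like `diff_ini(open("upgraded.ini").read(), open("clean.ini").read())`, comparing a copy of the upgraded instance's INI against one taken from a clean install.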

One more thing to get you thinking: if the settings did change based on lessons learnt, maybe it's worth porting them back to 2005 and taking them for a spin. Test, test and test some more!

Enter the SSAS server level lock……

September 23, 2009

Ok, so your reaction to the title is probably the same as mine when I found out about SSAS server level locks! I will give you the scripts to reproduce the server level lock, but first let's get down to business…  🙂

Server locks were introduced in one of the 2005 SP2 cumulative updates; at the moment all I can say is that it was pre CU12. I'm not sure why it was introduced, but it is likely to be in response to a "feature" 🙂

Fortunately the lock only appears at the end of processing, when SSAS commits its data, and commits are usually quick, so depending on when you do your processing you might never see it. So why am I so horrified by the existence of this lock, other than that it is simply wrong to prevent connections to the server? Below are my concerns.

  • If a query is running when processing comes to commit, the commit must queue behind the query for a default of 30 seconds, but processing still gets the server level lock granted, meaning no one gets to connect for up to 30 seconds plus commit time, and users get connection errors!
  • ForceCommitTimeout is the setting that controls how long a commit job waits before killing the queries ahead of it. People should now think of this setting not only as the time you're allowing queries to complete before being killed, but also as the additional duration for which you're prepared to deny users access to the server.
  • The real kick in the pants comes when you find out that there are scenarios where a query will not respond to the query cancel invoked by ForceCommitTimeout. The obvious one is when there is a bug, but there are others. This means the commit can't kill the query, your server is effectively hung and the users are screaming. What's worse, the SYSADMIN can't connect to the server to diagnose the problem, because the server lock blocks them!
  • I have seen connection errors when connecting to the server due to the server level lock, which is even worse. Unfortunately I have not managed to identify the repro (yet).
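For reference, ForceCommitTimeout lives in msmdsrv.ini and is expressed in milliseconds; 30000 is the default, shown here purely as an illustration of where the setting sits:

```xml
<ConfigurationSettings>
  <!-- How long (in ms) a processing commit waits before cancelling the
       queries ahead of it. Remember this is also extra time during which
       the server level lock can keep new connections out. -->
  <ForceCommitTimeout>30000</ForceCommitTimeout>
</ConfigurationSettings>
```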

Read more…

Changing the Data Files Location after Installation

August 10, 2008

The other day I wanted to change the "Data Files" location for a 2005 database engine installation and a 2005 Analysis Services installation, which you can specify under the advanced options during installation. I quickly found out that there appears to be no documented way to do this other than to uninstall SQL Server and install again, specifying a new location for the data files. It's also not as simple as moving your system databases, as "Data Files" covers things like server error logs, SQL Agent logs, the replication default directory etc. So, as the uninstall route was not one I was prepared to go down, I sat down and worked out how to do it, and below are the results.

Read more…