
Archive for the ‘Analysis Services’ Category

SSAS 2008 – INI Files and in place upgrades

September 24, 2009

Being the suspicious person I am, I wondered whether there would be any differences between the MSMDSRV.ini of an instance upgraded in place from 2005 and that of a clean 2008 install.

Now obviously I expect an in-place upgrade to preserve my settings and add any new ones; it should not overwrite anything, since I might have changed from the defaults for a good reason.

Below is what I found, followed by my thoughts.

Setting: In-Place Upgrade Value (Effectively 2005) → 2008 Clean Install Value

  • <ServerSendTimeout>: -1 → 60000
  • <ServerReceiveTimeout>: -1 → 60000
  • <AggregationPerfLog>: 1 → deleted
  • <DataStoreStringPageSize>: 8192 → 65536
  • <MinCalculationLRUSize>: 20 → deleted
  • <MdxSubqueries>: 3 → 15

Looking at what has changed, these appear to be settings that may well have been tuned as a result of lessons learnt at Microsoft. The removal of AggregationPerfLog is, I suspect, cosmetic, and the setting probably does nothing, since there is another one called AggregationPerfLog2 which I suspect replaces it. It's quite likely the same is true of MinCalculationLRUSize.

An important thing to take away here is that an in-place upgrade may not perform or behave the same way as a clean install, because by default its ini file is going to be different. In my case I'm checking out the impact of the settings with a view to adding a step to our upgrade path that changes them to the clean-install values.
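For reference, such a step would amount to editing a fragment of MSMDSRV.ini like the one below. This is a sketch only: stop the SSAS service before editing the file, and the two elements that 2008 deletes can simply be left alone if you prefer.

```xml
<!-- MSMDSRV.ini fragment: values aligned with a 2008 clean install -->
<ServerSendTimeout>60000</ServerSendTimeout>
<ServerReceiveTimeout>60000</ServerReceiveTimeout>
<DataStoreStringPageSize>65536</DataStoreStringPageSize>
<MdxSubqueries>15</MdxSubqueries>
```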

One more thing to get you thinking. If the settings did change based on lessons learnt, maybe it's worth porting them back to 2005 and taking them for a spin. Test, test and test some more!


Enter the SSAS server level lock……

September 23, 2009

Ok, so your reaction to the title is probably the same as mine when I found out about SSAS server-level locks! I will give you the scripts to reproduce the server-level lock, but first let's get down to business…  🙂

Server locks were introduced in one of the 2005 SP2 cumulative updates. At the moment all I can say is that it was pre-CU12. I'm not sure why it was introduced, but it is likely to be in response to a “feature” 🙂

Fortunately the lock only appears at the end of processing, when SSAS commits its data, and commits are usually quick, so depending on when you do your processing you might never see it. So why am I so horrified by the existence of this lock, other than that it is simply wrong to prevent connections to the server? Below are my concerns.

  • If a query is running when processing comes to commit, the commit must queue behind the query for a default of 30 seconds, but processing is still granted the server-level lock. That means no one can connect for up to 30 seconds plus the commit time, and users get connection errors!
  • ForceCommitTimeout is the setting that controls how long a commit job waits before killing the queries ahead of it. People should now think of this setting not only as the time you allow queries to complete before they are killed, but also as the additional length of time you are prepared to deny users access to the server.
  • The real kick in the pants comes when you find out there are scenarios where a query will not respond to the query cancel invoked by ForceCommitTimeout. The obvious one is when there is a bug, but there are others. This means the commit can’t kill the query, your server is effectively hung, and the users are screaming. What’s worse, the SYSADMIN can’t connect to the server to diagnose the problem, because the server lock blocks them too!
  • I have also seen connections error when connecting to the server due to the server-level lock, which is even worse. Unfortunately I have not managed to identify the repro (yet).

Read more…

Analysis Server appears to hang…..

April 5, 2009

We had an ongoing problem whereby users would suddenly start complaining that the Analysis Server was hanging. When investigating the box we could see no physical constraints: disks were a bit above average, CPU was very low and there was plenty of memory.

When looking at the server through the SSAS activity viewer (part of the community samples), we could see that users were definitely queuing up, and many of the queries that should have returned in less than a second were hanging around forever (30 minutes or more). It was as if we were experiencing some form of blocking.

To complement our monitoring we use ASTrace (also part of the community samples) to do real-time logging of what’s happening on the SSAS server to a SQL Server. Among the capabilities it gives us is the ability to run a little procedure showing what MDX/XMLA is running, with durations etc. (it’s much friendlier than the activity viewer). So, when we were experiencing the trouble, our script showed that an MDX query touching multiple partitions totalling hundreds of gigabytes appeared to be at the head of the chain every time.

Read more…

Changing the Data Files Location after Installation

August 10, 2008

The other day I wanted to change the “Data Files” location for a 2005 database engine installation and a 2005 Analysis Services installation, something you can specify under the advanced options during installation. I quickly found out that there appears to be no documented way to do this other than uninstalling SQL Server and installing again, specifying a new location for the data files. It’s also not as simple as moving your system databases, as “Data Files” covers things like server error logs, SQL Agent logs, the replication default directory etc. So, as the uninstall route was not one I was prepared to go down, I sat down and worked out how to do it, and below are the results.

Read more…

SSAS 2005 – Server side tracing starter kit

April 7, 2008

Analysis Services 2005 (SSAS) added the ability to trace server-side events, and I have used this feature a number of times. To date I had always used the Profiler GUI to do SSAS tracing, but today I found myself needing to initiate and manage a trace with scripts.

The good news is that it can be done! It did take a while to piece together how, though, and I found some of the information quite a challenge to track down, so I am sharing the results with you and have attached a zip file with the necessary scripts.

So, what did I want to achieve?

  • A script that would create a trace on the server and log to a specified directory similar to the way you can with a SQL Server trace.
  • A script that would list all running traces on an analysis server.
  • A script that would destroy a named running trace, in my case the one i created.

Now, the script that creates the trace is likely to require editing each time to add new events, as the script I am attaching only captures command events. The easiest way to define your events is detailed below.

  1. Open SQL Server profiler and define the SSAS trace you require.
  2. Next script the trace by going to “File – Export – Script Trace Definition – For Analysis Services 2005”.
  3. Open the script file and copy the Events & Filter elements into my attached script, ensuring you replace the existing Events & Filter elements.

Some people might be wondering why I needed to create the script file if I can script it from Profiler. Well, Profiler only scripts the events and filters and excludes options such as LogFileName, AutoRestart etc.

So, with the events in place, you should now update the LogFileName element with your filename & path and check that the LogFileSize element is appropriate. Finally, there is a StopTime element that you can uncomment and set, which gives the trace a time at which to close automatically, but do not forget it is the time at the server you are setting, not the time where you are.
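For illustration, the skeleton of a creation script looks roughly like this. The element names follow the profiler-exported script and the whitepaper example; the trace ID, file path and values here are placeholders, not the ones in the attached zip:

```xml
<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <ObjectDefinition>
    <Trace>
      <ID>MyServerSideTrace</ID>
      <Name>MyServerSideTrace</Name>
      <LogFileName>C:\Traces\MyServerSideTrace.trc</LogFileName>
      <LogFileSize>100</LogFileSize>   <!-- check this is appropriate -->
      <AutoRestart>1</AutoRestart>     <!-- restart the trace after a service restart -->
      <!-- <StopTime>2008-04-08T02:00:00</StopTime>  server local time, not yours -->
      <Events>
        <!-- paste the Events element scripted from Profiler here -->
      </Events>
      <Filter>
        <!-- paste the Filter element scripted from Profiler here -->
      </Filter>
    </Trace>
  </ObjectDefinition>
</Create>
```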

With all the updating done, just run the script to create your own server-side SSAS trace. It does not end there, though, because you will need to stop the trace manually if you have not enabled a StopTime. This is where “Delete Named Trace.xmla” comes in: simply update the name element and run the script to delete the trace. Unlike SQL Server, you do not need to stop and then close the trace. If you are not sure of the name of the trace, you can run the script “List all server side traces.xmla”, which is also useful for validating that you have removed the trace or that it auto-closed. The list of traces also gives useful information such as where the traces are outputting their results.
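Both of those operations are small. Sketches of the two requests (sent separately, assuming the placeholder trace ID MyServerSideTrace; the attached scripts may differ in detail):

```xml
<!-- Request 1: delete a named trace -->
<Delete xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <TraceID>MyServerSideTrace</TraceID>
  </Object>
</Delete>

<!-- Request 2: list all running traces via the DISCOVER_TRACES rowset -->
<Discover xmlns="urn:schemas-microsoft-com:xml-analysis">
  <RequestType>DISCOVER_TRACES</RequestType>
  <Restrictions><RestrictionList/></Restrictions>
  <Properties><PropertyList/></Properties>
</Discover>
```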

The trace script was adapted from an example in Analysis Services Processing Best Practices, and I would definitely recommend reading the article. The other scripts I hacked together, and they are very simple, as I am an XMLA novice.

I hope you find this information useful; you can locate the scripts here.