Archive

Archive for the ‘Tips’ Category

Automating process monitor data collection

July 10, 2014 Leave a comment

Recently I needed to investigate why SSAS synchronisation commands were intermittently failing; the errors pointed to a file lock on an SSAS data file. To identify the file lock I needed to run Process Monitor for as long as it took to capture the problem, even if that took weeks or months. This presents the following problems:

  • How to prevent the process monitor capture from consuming all available disk space.
  • How to ensure we do not overwrite the data we are looking for.
  • How to ensure the capture stops once we have the data we are looking for.
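A hedged sketch of how such a capture can be bounded from the command line, using Process Monitor's documented unattended-capture switches (the file path and chunk length below are illustrative):

```python
def procmon_capture_cmd(backing_file, runtime_seconds):
    """Build a Process Monitor command line that logs to a backing file
    and stops itself after a fixed runtime, bounding disk usage."""
    return [
        "Procmon.exe",
        "/AcceptEula",                     # suppress the EULA dialog for unattended runs
        "/Quiet",                          # do not prompt to confirm filters
        "/Minimized",                      # keep the UI out of the way
        "/BackingFile", backing_file,      # log to a file on disk, not virtual memory
        "/Runtime", str(runtime_seconds),  # stop capturing after this many seconds
    ]

# Capture in one-hour chunks; an outer loop can delete the oldest chunk files
# to cap total disk usage and stop scheduling new chunks once the error occurs.
cmd = procmon_capture_cmd(r"C:\Traces\ssas-lock.pml", 3600)
```

Running the capture in fixed-length chunks like this addresses all three problems at once: each chunk is a bounded file, old chunks can be pruned, and the loop simply stops when the error has been observed.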

Read more…

Exposing the SSAS file system with the Metadata Sniffer

April 19, 2014 Leave a comment

Very recently I encountered a production issue, and to assist me in the root cause analysis I needed information and clues that are not exposed by any SSAS interface, so I created the Metadata Sniffer.

Read more…

SSAS Free drive space command

October 19, 2012 3 comments

Today I was reviewing some traces and saw two commands, issued by SSMS, that I had never noticed before; these are shown below.

SystemGetLogicalDrives

SystemGetAllowedFolders

The commands can be executed in an MDX query session.

SystemGetLogicalDrives returns the free space in MB for the disk drives contained within the server property “AllowedBrowsingFolders”; see below for an example.

[Screenshot: example SystemGetLogicalDrives output showing free space per drive]

I have checked that the commands work in all versions of SSAS back to 2005, and anyone can run them.

Personally I think this is really useful, and here are a few reasons why:

  • It cuts sysadmins out of the loop when maintaining space (this is a good thing).
  • If you’re managing a remote dedicated SSAS server and have no access to the OS, it’s a quick way to check space, just like SQL Server’s xp_fixeddrives & fn_fixeddrives().
  • You might use it when collecting space stats or for alerting.

SystemGetAllowedFolders gives you the directories configured in “AllowedBrowsingFolders”.

I hope you find this useful! I passed the info on to one of the dev teams I work with and they will certainly make good use of it.

Windows 2008 file transfer performance improvements

October 9, 2012 Leave a comment

I have been reading about Server Message Block (SMB) 3 and thought it would be good to remind people about SMB 2, as many systems have not yet upgraded to Windows 2008 to take advantage of it.

For more background on SMB, please see this article.

When Microsoft introduced SMB 2 they pretty much started again. During one of my recent migrations I found the file copy improvements in Windows 2008 R2 to be jaw-dropping; the highlights follow. Read more…

Working with SSAS traces and the dreaded Textdata in SSMS

February 4, 2012 Leave a comment

For good or bad, I regularly work with SSAS traces that I have saved to a table, and many years ago I was regularly frustrated by the fact that the contents of the TextData column containing the query are always truncated at 65,535 characters.

You may think 65,535 characters should cover most queries, but once Excel gets to work it generates some huge queries!

Fortunately I came across a solution by MVP Adam Machanic that uses a very nifty trick to convert the output to XML, and this has the benefit that XML column data is presented as hyperlinks in grid view!
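A common form of the trick (the table and column names below are illustrative, not the exact code from that post) wraps TextData as an XML processing instruction, so the value is not entitized and SSMS renders each row as a clickable link that opens the full, untruncated text:

```sql
-- One clickable XML link per trace row; dbo.SSASTrace is a hypothetical table
SELECT
    t.RowNumber,
    (
        SELECT CONVERT(nvarchar(max), t.TextData) AS [processing-instruction(query)]
        FOR XML PATH(''), TYPE
    ) AS QueryText
FROM dbo.SSASTrace AS t;
```

The processing-instruction column name is what stops characters such as < and > in MDX from being escaped in the output.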

Read more…

SSAS Crash Dump behavior

January 17, 2012 Leave a comment

In SSAS 2005, when an exception occurs you always get a crash dump generated in the log directory of the instance, which can potentially lead to a lot of crash dumps consuming resources and a lot of work for whoever has to analyse the contents of each dump. One benefit of this approach is that if a user reported a serious error, a quick check of the log directory was a good indicator that there might be a problem.

In SSAS 2008 this behaviour changed: by default SSAS will generate one unique dump and then suppress all duplicates until the service is restarted, although the number of duplicates to be generated can be controlled by <MaxDuplicateDumps>.

This is a good enhancement: it reduces the need to analyse a dump just to confirm it’s a duplicate, and it introduces a mechanism to reduce the resource impact of dumps. But it’s not perfect.

The biggest problem I have with this implementation is that there is no logging anywhere that a dump has been suppressed.

In an ideal world there would only be one exception occurring in your system at any one time and monitoring would always catch that first exception, but this is not an ideal world, and IMHO there is no substitute for good logging.

I have raised a Connect item to improve the crash dump behaviour and made the suggestions below. If you think the enhancements should be made, then vote now!

  • When a duplicate occurs, log to the event log the dump signature, the location and filename of the last known dump generated, and the total duplicates to date.
  • Introduce a mechanism to reset the duplicate dump count without restarting the service.
  • If an exception leads to a service failure, a dump should always be generated.
  • Consider writing a stub to the log directory.

Categories: Analysis Services, Tips

Extreme load and the “operation has been cancelled”

September 18, 2011 Leave a comment

There are many reasons why SSAS may generate an “operation has been cancelled” error, but I want to take you through a new, although rare, one that I recently discovered.

To set the scene I had been tuning the Process Index phase on one of our servers so we could fit a full reindex of a 1.8TB cube comfortably into a maintenance slot over the weekend. I will write another post about the actual performance and how we achieved it.

My tuning was going very well and we were pushing the box to its limits, which is exactly what you want to do in a maintenance window. I was running the index build server side when suddenly I received the “The operation has been cancelled” message. I immediately went to look at my server trace and found that it had stopped recording at exactly the same time. How annoying.

With the current index build configuration, every time I ran the build it failed with a cancelled error, yet if I changed the configuration and eased off the throttle it would complete. This was unacceptable, as I wanted every last drop out of the server. It was clear to me that “something” was causing the cancel when I pushed the server’s CPUs to 100%, so I dug out my friend Process Monitor; below is a screenshot of my “moment of discovery”.

[Screenshot: Process Monitor output at the moment of discovery]

Read more…

Two new Analysis Services Blogs

September 18, 2011 Leave a comment

Please welcome two new bloggers Pete Adshead and Christian Bracchi to the blogosphere!

These guys truly know what it means to build enterprise-class, highly scalable Analysis Services solutions, and you now have the opportunity to read about their experiences.

Don’t miss out!

Categories: Analysis Services, Tips

SSAS 2008 R2 Operations Guide

June 2, 2011 Leave a comment

What can I say but WOO HOO! The SSAS 2008 R2 Operations Guide has been published, so download your copy now!

This document should be considered essential reading and contains over 100 pages of high quality information for configuring, testing and operating an Analysis Services server.

Finally, I consider this document so important that I have made a rare exception and gone back and added it to my Deployment Essentials post.

Categories: Analysis Services, Tips

SQL Agent and SSAS Commands

May 30, 2011 Leave a comment

SQL Agent can be used to execute SSAS commands, such as processing a dimension, and I can see that it is useful when you want to quickly schedule an ad hoc processing task, but there is a major drawback you need to be aware of.

You are not able to specify the database to connect to, and whilst you may think that this does not matter because the database is specified in the connection, it really does matter. Below are extracts of a processing task from Profiler which was executed using a SQL Agent task.

[Screenshot: Profiler extracts of the processing task executed from SQL Agent]

So, what is happening here? Read more…
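For reference, a full XMLA Process command identifies its target database inside the command itself, via DatabaseID, rather than relying on the connection. A minimal example (the object IDs here are hypothetical):

```xml
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>SalesDB</DatabaseID>
    <DimensionID>Dim Customer</DimensionID>
  </Object>
  <Type>ProcessUpdate</Type>
</Process>
```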

SSAS Deployment essentials

March 30, 2011 Leave a comment

Over recent years I have enjoyed the privilege of working on a number of different SSAS deployments. Some are huge, some are complex, some are both huge and complex, and, most interestingly, they all behave differently.

What I want to share today is what I consider to be essential for an SSAS installation. This covers what I expect to see installed to complement SSAS, configuration settings to be changed, and the literature that should be at your fingertips.

Read more…

SSAS 2008 R2– Little Gems

December 3, 2010 Leave a comment

I have spent the last few days working with SSAS 2008 R2 and noticed a few small enhancements which many people probably won’t notice, but I will list them here and explain why they are important to me.

New profiler events

Commit: This is a new sub-class event for “Progress Report End”. It represents the elapsed time taken for the server to commit your data. It is important because, for the duration of this event, a server-level lock is in place, blocking all incoming connections and causing timeouts. You can read about the server-level lock here. Prior to 2008 R2, the only way to calculate the duration of the commit was to subtract the end time of the event immediately preceding the batch completion event from the batch completion event’s end time. Look out for this event if people are complaining about connectivity…
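That pre-R2 subtraction is simple enough to script when working through saved trace tables; a minimal sketch (the timestamps are illustrative):

```python
from datetime import datetime

def commit_seconds(prev_event_end, batch_end):
    """Approximate commit duration on pre-2008 R2 traces: the batch
    completion event's end time minus the end time of the event
    immediately preceding it."""
    return (batch_end - prev_event_end).total_seconds()

# Illustrative trace timestamps
dur = commit_seconds(
    datetime(2010, 12, 3, 2, 15, 30),  # end of the last preceding event
    datetime(2010, 12, 3, 2, 16, 45),  # end of the batch completion event
)
# dur is 75.0 seconds of commit time
```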

File Load and Save: This is a whole new category with its own events, but you must remember to tick the box to show all the categories, otherwise it’s hidden. As soon as I saw it I thought: awesome! As the name suggests, it exposes file activity and gives a whole new level of insight into the black box known as SSAS. You may wonder how it is useful to know which files are being accessed, but when you’re trying to understand what’s going on under the hood and where a bottleneck might be, it’s invaluable; I have been using it these past two days whilst working on a problem which I will discuss in a future post.

Vertipaq SE Query: Not much to say here other than I would expect them to be fast…

Other enhancements

In August 2009 I posted about the cost of SSAS metadata management and discussed a hidden performance penalty. Well, I am pleased to say that Microsoft has fixed this particular problem: when you process a partition it no longer checks for dependencies across every partition in the database. Now, before you get excited and decide to create thousands of partitions and throw away your partition merging scripts, you should wait for the post I alluded to earlier, as there are other significant costs and penalties for having too much metadata.

Last but not least, a big thank you to Microsoft for improving the time it takes to expand an object in SSMS! With SSAS 2005 & 2008 I could be waiting 30 seconds to a couple of minutes to expand a node in SSMS, which is very frustrating when you’re in a rush, but with SSAS 2008 R2 node expansion is now instantaneous! So thank you again; it may be a small fix, but it’s a big time saver!

SSAS Native v .net Provider

November 30, 2010 Leave a comment

Recently I was investigating why a new server, in its parallel-running phase, was taking significantly longer to process the daily data than the server it is due to replace.

The server has SQL Server & SSAS installed, so the problem was not likely to be in the network transfer, as it uses shared memory. As I dug around the SQL DMVs I noticed in sys.dm_exec_connections that the SSAS connection had a packet size of 8000 bytes instead of the usual 4096 bytes, and from there I found that the data source had been configured with the wrong provider. What was really interesting, and the point of this blog, is the performance difference, which I have shown below.

Provider                        Rows per second
.Net SqlClient Data Provider    30,000
SQL Server Native Client 10     240,000

That’s right! For a single partition, the native client was able to process 240,000 rows per second, whereas the .net client maxed out at 30,000. That makes the SQL Server Native Client eight times faster! I knew that the .net providers were slower, but I had never gathered any metrics before. If you’re looking after an SSAS server I would definitely recommend taking a few minutes to check which provider is configured in the data source.

Another point to consider: you may have a custom solution doing your ETL and utilising the .net providers. This would also be limited by the .net provider’s throughput, and a switch to SSIS could dramatically improve your ETL.
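The DMV check described above is quick to repeat on your own servers; net_transport and net_packet_size are columns of sys.dm_exec_connections:

```sql
-- Shared-memory connections with unusually large packet sizes are worth a look
SELECT session_id, net_transport, net_packet_size
FROM sys.dm_exec_connections
ORDER BY net_packet_size DESC;
```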

SSAS 2008 – INI Files and in place upgrades

September 24, 2009 Leave a comment

Being the suspicious person I am, I wondered whether there would be any differences between the MSMDSRV.ini of an instance upgraded from 2005 and that of a clean install.

Now, obviously I expect an in-place upgrade to preserve my settings and add any new ones, because it should not overwrite anything; I might have changed the defaults for a good reason.

Below is what I found, followed by my thoughts.

Setting                   In-place upgrade value (effectively 2005)   2008 clean install value
ServerSendTimeout         -1                                          60000
ServerReceiveTimeout      -1                                          60000
AggregationPerfLog        1                                           DELETED
DataStoreStringPageSize   8192                                        65536
MinCalculationLRUSize     20                                          DELETED
MdxSubqueries             3                                           15

Looking at what has changed, these appear to be settings which may well have been tuned as a result of lessons learnt at Microsoft. The removal of AggregationPerfLog is, I suspect, cosmetic, and the setting probably does nothing, since there is another one called AggregationPerfLog2 which I suspect replaces it. It is quite likely the same is true of MinCalculationLRUSize.

An important thing to take away here is that an in-place upgrade may not perform or behave the same way as a clean install, because by default its ini file is going to be different. In my case I’m checking out the impact of the settings with a view to adding a step to our upgrade path to change them to the clean-install values.

One more thing to get you thinking: if the settings did change based on lessons learnt, maybe it’s worth porting them back to 2005 and taking them for a spin… Test, test and test some more!

Analysis Server appears to hang…..

April 5, 2009 4 comments

We had an ongoing problem whereby users would suddenly start complaining that the Analysis Server was hanging. When investigating the box we could see no physical constraints: disk activity was a bit above average, CPU was very low, and there was plenty of memory.

When looking at the server through the SSAS Activity Viewer (part of the community samples) we could see that users were definitely queuing up, and many queries that should have returned in less than a second were hanging around forever (30 minutes or more). It was as if we were experiencing some form of blocking…

To complement our monitoring we use ASTrace (also part of the community samples) to log, in real time, what’s happening on the SSAS server to a SQL Server. Amongst the capabilities this gives us is a little procedure that shows what MDX/XMLA is running, with durations etc. (it’s much friendlier than the Activity Viewer). When we were experiencing the trouble, our script showed that an MDX query touching multiple partitions totalling 100s of GBs appeared to be at the head of the chain every time.

Read more…

Changing the Data Files Location after Installation

August 10, 2008 Leave a comment

The other day I wanted to change the “Data Files” location for a 2005 database engine installation and a 2005 Analysis Services installation, which you can specify under the advanced options during installation. I quickly found that there appears to be no documented way to do this other than uninstalling SQL Server and installing again, specifying a new location for the data files. It’s also not as simple as moving your system databases, as “Data Files” covers things like server error logs, SQL Agent logs, the replication default directory, etc. As the uninstall route was not one I was prepared to go down, I sat down and worked out how to do it; below are the results.

Read more…