SSAS Maestro and the first challenge

January 24, 2013 1 comment

It's official: 2012 ended on a high for me. I received confirmation that I have passed the Microsoft SSAS Maestro certification! I won't repeat the details of what the Maestro certification is, but you can find details here and here.

Before I go on, I want to stop and say I loved the whole Maestro experience; I found it inspirational and invigorating! Thank you for the opportunity.

Now, what is the challenge I'm referring to? Well, I announced to the team of highly skilled BI developers I regularly work with that I was now a Maestro, so they set me my first Maestro task: to further optimise their already highly optimised process update. The results are really interesting, and I will shortly be posting a one- or two-part blog entitled "The Anatomy of a process update", so stay tuned!

As a teaser, here are some of the moments of discovery I had while pulling process update apart:

“That explains why x appeared to yield no benefit….”

“Really!?!?! I never saw that coming”

“Wow! That’s one for the blog!”

“Maestro Moment — Realising the true potential and implications of the discovery…”


Stay tuned!

Categories: Analysis Services

SSAS Free drive space command

October 19, 2012 3 comments

Today I was reviewing some traces and saw two commands issued by SSMS that I had never noticed before; they are shown below.

SystemGetLogicalDrives

SystemGetAllowedFolders

The commands can be executed in an MDX query session.
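If you want to try them yourself, the quickest way is an MDX query window in SSMS connected to the SSAS instance; each command is executed on its own, with no FROM clause or other syntax. A minimal example:

    -- Run in an MDX query window connected to the SSAS instance
    -- (execute each command on its own)
    SystemGetLogicalDrives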

SystemGetLogicalDrives returns the free space in MB for the disk drives referenced by the server property "AllowedBrowsingFolders"; see below for an example.

[Screenshot: example SystemGetLogicalDrives output]

I have checked that the commands work in all versions of SSAS back to 2005, and anyone can run them.

Personally, I think this is really useful, and here are a few reasons why:

  • Cut sysadmins out of the loop when maintaining space (this is a good thing).
  • If you're managing a remote dedicated SSAS server and have no access to the OS, it's a quick way to check space, just like SQL Server's xp_fixeddrives & fn_fixeddrives().
  • You might use it when collecting space stats or alerting; see the sketch below for one way to do this.

SystemGetAllowedFolders gives you the directories configured in "AllowedBrowsingFolders".

I hope you find this useful! I passed the info on to one of the dev teams I work with and they will certainly make good use of it.
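For the space stats and alerting idea above, here is a minimal sketch of one way to pull the figures into the relational engine so they can be logged or checked on a schedule. It assumes a linked server to the SSAS instance already exists; the name SSAS_LINKED is purely illustrative, and the command text is simply passed through to SSAS.

    -- Hypothetical example: SSAS_LINKED is a linked server created against the
    -- SSAS instance using the MSOLAP provider. The pass-through query is just
    -- the command itself.
    SELECT *
    FROM OPENQUERY(SSAS_LINKED, 'SystemGetLogicalDrives');

    -- SystemGetAllowedFolders works the same way.
    SELECT *
    FROM OPENQUERY(SSAS_LINKED, 'SystemGetAllowedFolders');

From there it is straightforward to insert the results into a history table or wrap them in a free-space threshold check for alerting.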

Windows 2008 file transfer performance improvements

October 9, 2012 Leave a comment

I have been reading about Server Message Block (SMB) 3 and thought it would be good to remind people about SMB 2, as many systems have not yet upgraded to Windows 2008 to take advantage of it.

For more background on SMB, please see this article.

When Microsoft introduced SMB 2 they pretty much started again. During one of my recent migrations I found the file copy improvements from Windows 2008 R2 to be jaw-dropping; here are the highlights. Read more…

SSAS Multithreaded sync with Windows 2008 R2

July 7, 2012 10 comments

We have been happily running some of our systems on Windows 2003 and have had an upgrade to W2K8 R2 on the list for quite some time. The upgrade has now completed, and we can start taking advantage of some of the new features, which is the reason for this post.

For a long time we have used the sample Robocopy script from the SQLCat team to synchronize some of our larger SSAS databases. If you're wondering what I mean by large: around 5 TB with a good few thousand partitions.

The script works like a dream, but at this scale you soon find that the copy process starts to chug when it hits the fact data: the fantastic multi-threaded script effectively becomes single-threaded again, because the bulk of the data being transferred is fact data.

Read more…

SQL Server 2008 uses half the CPUs

July 6, 2012 Leave a comment

I recently got my hands on a couple of 4-socket servers with Intel E7-4870s (10 cores per CPU), and with hyper-threading enabled that gave me 80 logical CPUs.

The server has Windows 2008 R2 SP1 along with SQL Server 2008 (currently we cannot deploy SQL Server 2008 R2 for the application being hosted).

When SQL Server started, I noticed only 2 NUMA nodes and 40 logical cores were configured, where there should have been 4 NUMA nodes and 80 logical cores (see below).

[Screenshot: SQL Server showing only 2 NUMA nodes and 40 logical cores]
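As a side note, a quick way to confirm what the instance can actually see (rather than what the OS reports) is to query the scheduler DMVs. This is just a generic check, not specific to this hardware:

    -- Logical CPUs visible to SQL Server
    SELECT cpu_count, hyperthread_ratio
    FROM sys.dm_os_sys_info;

    -- Schedulers grouped by NUMA node; only VISIBLE ONLINE schedulers run user queries
    SELECT parent_node_id,
           COUNT(*) AS scheduler_count,
           SUM(CASE WHEN status = 'VISIBLE ONLINE' THEN 1 ELSE 0 END) AS visible_online
    FROM sys.dm_os_schedulers
    GROUP BY parent_node_id
    ORDER BY parent_node_id;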

Read more…

Categories: Configuration, Numa, SQL Server

Working with SSAS traces and the dreaded Textdata in SSMS

February 4, 2012 Leave a comment

For good or bad, I regularly work with SSAS traces that I have saved to a table, and many years ago I was regularly frustrated by the fact that the contents of the TextData column containing the query are always truncated at 65535 characters.

You may think 65535 characters should cover most queries, but once Excel gets to work it generates some huge queries!

Fortunately, I came across a solution by MVP Adam Machanic that uses a very nifty trick to convert the output to XML, which has the benefit that XML column data is presented as hyperlinks in grid view!
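To give a flavour of the idea, here is my own minimal sketch of the general technique (not necessarily Adam's exact code, and the trace table name is hypothetical): wrapping the text in an XML processing instruction means the angle brackets and ampersands that appear in MDX don't need escaping, and SSMS renders the XML column as a clickable link that opens the full text.

    -- Sketch: view the full query text from a trace saved to a table (table name is illustrative).
    -- Assumes the query text never contains the character sequence '?>'.
    SELECT RowNumber,
           EventClass,
           CONVERT(XML, '<?query --' + NCHAR(13)
                        + CAST(TextData AS NVARCHAR(MAX))
                        + NCHAR(13) + '--?>') AS QueryText
    FROM dbo.SSASTrace;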

Read more…

SSAS Crash Dump behavior

January 17, 2012 Leave a comment

In SSAS 2005, when an exception occurs you always get a crash dump generated in the log directory of the instance, which can potentially lead to a lot of crash dumps consuming resources and a lot of work for the person who has to analyse the contents of each dump. One benefit of this approach is that if a user reported a serious error, a quick check of the log directory was a good indicator that there might be a problem.

In SSAS 2008 this behaviour changes: by default SSAS will generate one unique dump and then suppress all duplicates until the service is restarted, although the number of duplicates to be generated can be controlled by <MaxDuplicateDumps>.

This is a good enhancement: it reduces the need to analyse a dump just to confirm it's a duplicate and introduces a mechanism to reduce the resource impact of dumps. But it's not perfect.

The biggest problem I have with this implementation is that there is no logging anywhere that a dump has been suppressed.

In an ideal world there would only be one exception occurring in your system at any one time and monitoring would always catch that first exception, but this is not an ideal world, and IMHO there is no substitute for good logging.

I have raised a Connect item to improve the crash dump behaviour and made the suggestions below. If you think these enhancements should be made, then vote now.

  • When a duplicate occurs, log to the event log the dump signature, the location and filename of the last known dump generated, and the total duplicates to date.
  • Introduce a mechanism to reset the duplicate dump count without restarting the service.
  • If an exception leads to a service failure, a dump should always be generated.
  • Consider writing a stub file to the log directory.

Categories: Analysis Services, Tips