Archive

Posts Tagged ‘Configuration’

Targeting CHECKDB SQL Agent jobs for resource governing

March 6, 2014

Some time ago I was looking into the performance of DBCC CHECKDB on one of our servers and came to the conclusion that the best Degree of Parallelism (DOP) for CHECKDB was not the same as our server-level setting. Since CHECKDB does not accept a DOP hint, the only option to limit DOP when the server-level setting is not appropriate is to use Resource Governor.
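
For context, the cap itself is applied on a workload group. A minimal sketch of that part is below; the pool and group names and the MAX_DOP value are examples only, not necessarily what we used.

-- Example only: a dedicated pool and workload group whose MAX_DOP caps parallelism for CHECKDB.
CREATE RESOURCE POOL CheckDbPool;
GO
CREATE WORKLOAD GROUP CheckDbGroup
WITH (MAX_DOP = 4)   -- whatever DOP your own testing shows works best for CHECKDB
USING CheckDbPool;
GO
ALTER RESOURCE GOVERNOR RECONFIGURE;
GO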

In our environment CHECKDB is scheduled using SQL Agent. I could have chosen to simply target all SQL Agent jobs and assign them to a single pool, but this was not appropriate for all of my SQL Agent jobs, so I needed a solution that targets individual jobs. I thought this would be a relatively straightforward task, but there were some challenges, so I wanted to share them and my solution.
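
The solution itself is behind the link below, but to give a flavour of the kind of approach involved: SQL Agent job steps identify themselves through APP_NAME(), which embeds the job_id as a hex string, and a classifier function must be schema-bound and live in master, so it cannot simply query msdb. The sketch below is illustrative only; the table, function, job and group names are invented for the example and this is not necessarily the exact code I ended up with.

-- Illustrative sketch: keep a lookup table in master of the job_ids to govern
-- (refreshed by a maintenance step whenever the governed jobs change).
USE master;
GO
CREATE TABLE dbo.GovernedAgentJobs (job_id VARBINARY(16) NOT NULL PRIMARY KEY);
GO
INSERT INTO dbo.GovernedAgentJobs (job_id)
SELECT CONVERT(VARBINARY(16), job_id)          -- matches the hex form used in APP_NAME()
FROM msdb.dbo.sysjobs
WHERE name = N'DBCC CHECKDB - All Databases';  -- hypothetical job name
GO
CREATE FUNCTION dbo.fnResourceGovernorClassifier ()
RETURNS SYSNAME
WITH SCHEMABINDING
AS
BEGIN
    DECLARE @app NVARCHAR(128) = APP_NAME();

    -- SQL Agent job steps connect with an application name of the form
    -- 'SQLAgent - TSQL JobStep (Job 0x<32 hex chars> : Step n)'
    IF @app LIKE N'SQLAgent - TSQL JobStep (Job 0x%'
    BEGIN
        DECLARE @jobid VARBINARY(16) =
            CONVERT(VARBINARY(16), CAST(SUBSTRING(@app, CHARINDEX(N'0x', @app), 34) AS VARCHAR(34)), 1);

        IF EXISTS (SELECT 1 FROM dbo.GovernedAgentJobs WHERE job_id = @jobid)
            RETURN N'CheckDbGroup';   -- the workload group created earlier
    END

    RETURN N'default';
END;
GO
ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.fnResourceGovernorClassifier);
ALTER RESOURCE GOVERNOR RECONFIGURE;
GO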

Read more…

SSAS Native v .net Provider

November 30, 2010

Recently I was investigating why a new server, which is in its parallel running phase, was taking significantly longer to process the daily data than the server it's due to replace.

The server has SQL & SSAS installed, so the problem was not likely to be in the network transfer as it's using shared memory. As I dug around the SQL DMVs I noticed in sys.dm_exec_connections that the SSAS connection had a packet size of 8000 bytes instead of the usual 4096 bytes, and from there I found that the data source had been configured with the wrong provider. What was really interesting, and the point of this post, is the performance difference, which I have shown below.
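
For anyone wanting to run the same check, a query along these lines (joined to sys.dm_exec_sessions so you can see which application owns each connection) shows the negotiated packet size:

-- Show each connection's negotiated packet size alongside the application that opened it.
SELECT s.session_id,
       s.program_name,
       c.net_transport,
       c.net_packet_size
FROM sys.dm_exec_connections AS c
JOIN sys.dm_exec_sessions AS s
    ON s.session_id = c.session_id;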

Provider                        Rows per second
.Net SqlClient Data Provider    30,000
SQL Server Native Client 10     240,000

That’s right! For a single partition, the native client was able to process 240,000 rows per second, whereas the .net client maxed out at 30,000. That makes the SQL Server Native Client eight times faster! I knew that the .net providers were slower, but I had never gathered any metrics before. If you're looking after an SSAS server I would definitely recommend taking a few minutes to check which provider is configured in the data source.

Another point to consider: you may have a custom solution doing your ETL that utilises the .net providers. This would also be impacted by the .net provider throughput limits, and a switch over to SSIS could dramatically improve your ETL.

Server Side Aliases

November 16, 2010

Over the years I have come across a few situations where server-side connections to SQL Server fail when you use a DNS alias that points back to the server you're initiating the connection from, even though you can connect remotely.

It's an annoying problem with a very unhelpful error message that changed between versions of SQL Server. In SQL 2000 you are presented with:

"Login failed for user '(null)'. Reason: Not associated with a trusted SQL Server connection."

In SQL 2005 and SQL 2008 it's:

"Login failed. The login is from an untrusted domain and cannot be used with Windows authentication."

You will also see event ID 537 in the security logs.

One of the most common reasons a system is set up with an alias pointing back at itself is that a consolidation has taken place and you don't want to change the connection strings. However, some people simply got burnt when Microsoft first released the security patch which introduced this change, and I still find people being burnt by it today.

Cause

NTLM reflection protection was introduced as part of security fix MS08-068. This causes a local authentication failure when using a DNS alias, which bubbles up and becomes the error described above.

The relevant MS articles are MS08-068 and http://support.microsoft.com/kb/926642; the extract describing the cause is:

This problem occurs because of the way that NT LAN Manager (NTLM) treats different naming conventions as remote entities instead of as local entities. A local authentication failure might occur when the client calculates and caches the correct response to the NTLM challenge that is sent by the server in local "lsass" memory before the response is sent back to the server. When the server code for NTLM finds the received response in the local "lsass" cache, the code does not honour the authentication request and treats it as a replay attack. This behaviour leads to a local authentication failure.

Solution

You either need to use the local name rather than the DNS alias, or follow the steps described in the resolution section of the articles to disable the protection entirely or just for a specific alias.

SSAS <PreAllocate>: What you should know

July 18, 2010

Preallocating memory for SSAS running on Windows 2003 is a good thing, but as with all good things it is essential to know the behavioural changes you will experience, some of which may not be so obvious.

My observations relate to SSAS 2008 SP1 CU8 running on Windows 2003 SP2.

Why PreAllocate?

In my opinion there are two reasons, which I detail below.

  • The first is the widely stated performance reason surrounding Windows 2003 memory performance. In a nutshell, Windows 2003 did not scale well with many small memory allocations due to fragmentation and the like, so allocate it up front. Life gets better in Windows 2008, as detailed by the SQLCAT team.
  • The second reason is to ensure SSAS is going to get a slice of the memory, which is very important if you're not running on a dedicated SSAS box.

So, what should I know?

  • When the service starts (don't forget server startup), if you have assigned "Lock Pages in Memory" to your service account, expect your server to be totally unresponsive for a period of time. Do not panic: the duration of the freeze depends on the amount of memory preallocated, but once it's done the server becomes responsive again. Make sure the people working with the server know this.
  • Never ever set <PreAllocate> equal to or greater than <LowMemoryLimit>, because if you do the memory cleaner thread will spin up and remove data pretty much as soon as it gets into memory. This will seriously hurt performance, as you're effectively disabling any caching.
  • The shrinkable and non-shrinkable perfmon memory counters are no longer accurate. The counters still have "value" when troubleshooting, but you must factor in the fact that at least their starting points are wrong.
  • When a full memory dump occurs, that dump will be at least the size of the preallocated memory. So, if you preallocate 40GB but SSAS has only written to 2GB of memory, it's still going to be a 40GB dump, so make sure you have the disk space! Hopefully this is not a situation you will find yourself in very often.

I hope you find this information useful!

SSAS 2008 – INI Files and in place upgrades

September 24, 2009

Being the suspicious person I am, I wondered if there would be any differences in the MSMDSRV.ini of an instance upgraded from 2005 as opposed to a clean install.

Now, obviously I expect an in-place upgrade to preserve my settings and add any new ones; it should not overwrite anything, since I might have changed from the defaults for a good reason.

Below is what I found, followed by my thoughts.

Setting                      In-Place Upgrade Value (Effectively 2005)    2008 Clean Install Value
<ServerSendTimeout>          -1                                           60000
<ServerReceiveTimeout>       -1                                           60000
<AggregationPerfLog>         1                                            DELETED
<DataStoreStringPageSize>    8192                                         65536
<MinCalculationLRUSize>      20                                           DELETED
<MdxSubqueries>              3                                            15

Looking at what has changed, these appear to be settings which may well have been tuned as a result of lessons learnt at Microsoft. The removal of AggregationPerfLog I suspect is cosmetic and the setting probably does nothing, since there is another one called AggregationPerfLog2 which I suspect replaces it. It's quite likely the same is the case with MinCalculationLRUSize.

An important thing to take away here is that an in-place upgrade may not perform or behave the same way as a clean install, because by default its ini file is going to be different. In my case I'm checking out the impact of the settings with a view to adding a step to our upgrade path to change them to the clean install values.

One more thing to get you thinking: if the settings did change based on lessons learnt, maybe it's worth porting them back to 2005 and taking them for a spin. Test, test and test some more!

Changing the Data Files Location after Installation

August 10, 2008

The other day I wanted to change the "Data Files" location for a 2005 database engine installation and a 2005 Analysis Services installation, which you can specify under the advanced options during installation. I quickly found out that there appear to be no documented ways to do this other than to uninstall SQL Server and install it again specifying a new location for the data files. It's also not as simple as moving your system databases, as "Data Files" covers things like server error logs, SQL Agent logs, the replication default directory and so on. So, as the uninstall route was not one I was prepared to go down, I sat down and worked out how to do it, and below are the results.

Read more…