SSMS Restore backup error

September 14, 2007 Leave a comment

We had a requirement to allow someone to create and restore databases on a test server today and I thought to myself "That's easy, I'll just grant the 'Create Any Database' right to the appropriate user". That's when the pain began!

The user was connecting with SSMS using a SQL login to restore a database, and when they went to specify the backup location they got an error to the effect of "Cannot access the specified path or file on the server". After clicking OK the tree view in the locate backup dialog was empty, and if you typed in the path and filename manually you still received an error.
So I dug out Profiler and found that xp_fixeddrives was being called, and decided to check it out. It turns out that when executed under a SQL login xp_fixeddrives returns no results! Because of this the error is generated in SSMS and the tree view is not populated.
I spent some time investigating xp_fixeddrives and came to the following conclusions regarding its behaviour.

  • Currently the only way to get xp_fixeddrives to return results when using SQL authentication is to add the login to the sysadmin role.
  • I extensively explored proxy account configurations to get xp_fixeddrives to work under SQL authentication and could find no way to get it working.
  • If you use Windows authentication xp_fixeddrives works as expected. You do not need sysadmin privileges, nor do you have to do any proxy configuration.
  • Some poking around with Process Monitor showed that when using Windows authentication, SQLSERVR.exe spawns processes to gather the details under the service account. When using SQL authentication with sysadmin, SQLSERVR.exe does not spawn anything or generate any security errors, so whatever is going on is beyond what I can work out (my skills with a debugger are too limited :) ).
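For reference, the call that Profiler showed SSMS making is just this, so it is easy to test under each type of login:

```sql
-- Under a Windows login (or a sysadmin SQL login) this returns one row per
-- fixed drive with the free space in MB; under an ordinary SQL login it
-- returns an empty result set, which is what trips SSMS up.
EXEC master..xp_fixeddrives
```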

At the end of all this I went down the Windows authentication route to work around the issue. For some reason SSMS still generates the error, but the tree view is then populated and you can select the file, so I can live with that, as I suspect exploring why it still generates an error could be a bottomless pit of time.

Finally, this only affects SSMS. If the user had done the restore using T-SQL it would have worked (despite a warning about updating the restore history in msdb).
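As a minimal sketch of the T-SQL alternative (database name and paths here are invented for illustration, not from the original incident):

```sql
-- A plain T-SQL restore works fine under a SQL login with the
-- appropriate rights; only the SSMS dialog has the problem.
RESTORE DATABASE TestDB
FROM DISK = N'D:\Backups\TestDB.bak'
WITH MOVE N'TestDB_Data' TO N'D:\Data\TestDB.mdf',
     MOVE N'TestDB_Log'  TO N'D:\Logs\TestDB.ldf',
     RECOVERY
```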

Categories: Configuration, SQL Server

Server level VLF report

July 25, 2007 Leave a comment

I read Tony Rogerson’s blog on Virtual Log Files today and it reminded me that I really should knock up a little report to list all the databases on a server and the number of VLFs per database. Since I had been busy writing some other operational reports I was in the right frame of mind, so I knocked up what you see below.

The Report code

CREATE TABLE #VLFS (fileid INT, filesize BIGINT, startoffset BIGINT, fseqno BIGINT, status INT, parity INT, createlsn VARCHAR(1000))

CREATE TABLE #Results (SRV_Name NVARCHAR(500), Database_Name NVARCHAR(500), VLFS INT)

EXEC master.dbo.sp_MSforeachdb
    @command1 = 'USE ? INSERT INTO #VLFS EXEC(''DBCC LOGINFO WITH TABLERESULTS'')',
    @command2 = 'INSERT INTO #Results SELECT @@SERVERNAME, ''?'', COUNT(*) FROM #VLFS',
    @command3 = 'TRUNCATE TABLE #VLFS'

--For this example I've just returned the results but you can
--just as easily write the results to a local or central server.
--I write mine to a central server for reporting on.
SELECT * FROM #Results

DROP TABLE #VLFS
DROP TABLE #Results

Example output

(screenshot: the report results grid)

For me the beauty of this little report is that I have set it up so that it runs on all 150 of my servers distributed around the world and logs to my central logging server. From there I get one report that tells me which databases, and the servers they are hosted on, have excessive VLFs. Having said that, in a well managed environment I should never find any databases with lots of VLFs……..

You may also be wondering "How many VLFs are too many?". Tony did not cover this, but Kimberly Tripp did in a post she put up a few years ago (see point 8), which was when I first learned about VLFs. I don’t, however, think this is a hard and fast number, and obviously the performance gains will be smaller if you’re not far off that number.

The last thing I am going to mention in this post is part of the code I used in the report: the undocumented procedure sp_MSforeachdb. This is a great little procedure that will cycle through your databases executing up to 3 commands. To use it, where you would have put a database name you put a '?', and it's as simple as that! Incidentally, there is also an sp_MSforeachtable.
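In the same spirit, a quick sp_MSforeachtable sketch — the '?' placeholder works the same way, standing in for each table name in turn:

```sql
-- Report the space used by every table in the current database
EXEC sp_MSforeachtable 'EXEC sp_spaceused ''?'''
```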

The joy of template parameters

July 15, 2007 Leave a comment

I thought I would do a quick blog about template parameters, since I have been writing a lot of standard deployment scripts for our 2005 builds and have used them extensively.

So what are they? Well, they are placeholders, and you would use them in the same places that you would probably put a variable. The key difference comes when you assign the values. With variables you work through the code setting the values as appropriate; with template parameters all you do is press CTRL-SHIFT-M and you get the dialogue box shown below. Simply fill in the values, click OK and you’re ready to run your script.

The format of a parameter is <Parameter Name, SQL Data type, default value>, so to get the Mirrored DB name shown in the screenshot you would enter <Mirrored_DB_Name,nvarchar(128),'My_Mirror'> where you wanted the value to appear.
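As a small sketch (the statement itself is invented for illustration), a template line looks like this before and after substitution:

```sql
-- Before CTRL-SHIFT-M: the placeholder sits wherever a literal would go
ALTER DATABASE <Mirrored_DB_Name,nvarchar(128),'My_Mirror'> SET PARTNER OFF

-- After filling in the dialogue, SSMS substitutes the value you typed:
-- ALTER DATABASE My_Mirror SET PARTNER OFF
```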

This feature is available in Management Studio and Query Analyser; if you have ever used one of the standard Microsoft query templates you may have noticed that parameters are used in them as well.

I only realised template parameters existed a few years back because someone pointed them out to me. If this is the first time you have come across them (or you had forgotten about them) I hope you find them as useful as I do.

Categories: SQL Server, SSMS

The x64 experience

July 10, 2007 Leave a comment

The 64 bit beast has been out there for a while now and new servers are generally 64 bit compatible. The first thing that comes into my mind when I think 64 bit is performance and memory, with "compatibility" hot on its heels (and the recollection of a good article by Linchi Shea).

My fears around compatibility have pretty much been put to rest and we now recommend SQL 2005 x64 as our base build. However, it does seem that 64 bit builds need a bit more attention. I say this because recently I’ve been having to register 64 bit DLLs again, and having mentioned this to a few DBAs the conversations went a bit like "Really? So was it x.dll or y.dll?". Fortunately the problems seem to only be affecting the tools (touch wood), but I am curious as to what experiences others are having with 64 bit servers, so please do leave some comments.

Now, to help others who run head first into the very misleading errors I experienced, I'll detail the errors and the fixes below.

The first problem

This one occurred when we were doing a server build. One of our final tasks was to deploy an SSIS package, so we went to open the package in BIDS and were greeted with the following error when loading it.

"The package failed to load due to error 0xC0010014 "One or more error occurred. There should be more specific errors preceding this one that explains the details of the errors. This message is used as a return value from functions that encounter errors.". This occurs when CPackage::LoadFromXML fails."

Some research threw up a few red herrings, but the next clue came when I decided to connect to the SSIS instance using SSMS. At that point I got an error stating

"Error loading type library/DLL"

Further research threw up the following document, which gave me a few options, none of which I liked; more importantly, the top solution was to use the latest SP, which I was already running. So I decided to go hunting with the Sysinternals Process Monitor to see if I could find the issue. 15 minutes later the culprit was found: Process Monitor identified DTS.dll as throwing some unusual errors, so I located the 64 bit version, used REGSVR32 to register DTS.dll again, and this resolved my problems.
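In case it helps, the re-registration looks like this (the path is an assumption based on a default x64 SQL 2005 install; check your own Binn directory first):

```shell
REM Register the 64 bit copy of DTS.dll (path assumed; adjust to your install)
regsvr32 "C:\Program Files\Microsoft SQL Server\90\DTS\Binn\DTS.dll"
```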

The second problem

This one came a few weeks down the line on the same server. I was working in SSMS and connected to SSIS; when expanding the tree view I got the following error.

"Library not registered. (Exception from HRESULT: 0x8002801D (TYPE_E_LIBNOTREGISTERED))"

This time I went straight to my trusty Process Monitor and found that the server was trying to use the 32 bit version of msdtssrvrutil.dll. I located the 64 bit version and registered it, and everything worked again. This time I also took the opportunity to check, as best I could, that everything else was working.

It’s been a few weeks since the issues now and all’s well, so hopefully the server will behave, until the next time………

Categories: SQL Server

Dotty about maintenance plans

June 22, 2007 Leave a comment

I went to remotely edit a maintenance plan the other day and found it took a long time to open any of the objects in the plan. I also found that I was getting errors when clicking the drop down to select a database.

The reason for the problem was that the local server connection embedded in the maintenance plan had the server name defined as "." rather than an actual server name. Because of this the objects were trying to connect to a default instance of SQL on my PC rather than the server. The delay in opening was down to the timeout and the subsequent errors because there was no server to talk to.

The reason this occurred was that when the maintenance plan was created the server was registered as "." rather than <server_name> in SQL Server Management Studio (SSMS), and the maintenance plan uses the registered name in SSMS to populate the server name in the connection details. Unfortunately you cannot edit the local server connection details, so probably the quickest way to fix the problem is to recreate the plan with the full server name registered in SSMS. I say probably the quickest because there may be a way to fix it using the Business Intelligence Development Studio to update the package, but getting the package back in is awkward.

The real killer

In many ways I think I got off lightly with this "feature". Why? Well, let’s say I was editing the maintenance plan from another location that had a default instance of SQL installed…… Yep! You guessed it: the maintenance plan loads quickly and without error because it’s connected to the local instance. When you select databases from the drop-down list you are seeing the databases from your local instance, not the remote server, and you could then select to run tasks against a database that does not exist on the remote server.

I’m going to flag this on Connect and add a check to our install documents to ensure that instances are not registered with a "." in SSMS on the server.

Categories: SQL Server

SQL 2005: Procedure cache cleared

June 5, 2006 Leave a comment

I don’t normally write about a TechNet article, but I must say that 917828 caught my eye and I felt it was worth highlighting.

The reason it caught my eye is that it describes a number of scenarios whereby the whole procedure cache is flushed, which will clearly lead to a decrease in query performance whilst the cache builds up again. The most alarming point was that there are at least 2 scenarios that a DBA may carry out, and even schedule, during core production hours without realising the impact until the users start complaining about mysterious slow-downs…..

These are

  • Deleting a database snapshot.
  • Restoring a database backup.

I’d recommend reading the article as there are 5 other situations in which this occurs that may affect you.
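If you want to see the effect for yourself, a quick sketch of a check is to count cached plans before and after one of these operations; on SQL 2005 the DMV below does the job:

```sql
-- Number of plans currently in the procedure cache (SQL 2005 onwards).
-- Run before and after, say, dropping a snapshot to see the flush.
SELECT COUNT(*) AS cached_plans
FROM sys.dm_exec_cached_plans
```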

Categories: Performance, SQL Server

SQL 2005 – Database Snapshots

April 3, 2006 Leave a comment

Today was a big day that made me proud, as my first article was published on the Simple Talk website! It’s about database snapshots in SQL 2005 and you can find it here.

I really enjoyed writing the article, and must say that although I thought I knew how much effort it took to write an article, it’s only now that I truly understand how much work people put into their articles!

In the meantime I intend to continue submitting articles, and I look forward to any comments you may have about my first one.

Categories: SQL Server

What is that SPID doing?

March 27, 2006 Leave a comment

This is certainly not a new topic, but it’s something I felt was worth revisiting and reminding people about, especially because I find many people are not aware of the ability to extract the statement within a stored procedure that is being executed.

So, prior to SQL 2000 SP3, when we want to know what code a SPID is executing we have to use DBCC INPUTBUFFER (<spid>). The problem with this command is that it’s restricted to the first 256 characters and only returns the outermost statement executing. That means that if we have stored procedure 1 calling stored procedure 2, the input buffer will only tell us about stored procedure 1, and it may even get truncated!

I clearly remember reading the SP3 readme and thinking Christmas had come early when I found the entry about a new function being introduced called ::fn_get_sql. I’ve got to say, right now this is the best present yet! It has made troubleshooting so much easier. Why? Well, read on.

Running the code below will now show you the innermost statement executing, and we now get up to 4000 characters. In the case of a stored procedure you see the whole procedure.

DECLARE @Handle BINARY(20)
SELECT @Handle = sql_handle FROM master..sysprocesses WHERE spid = 109 --CHANGE SPID NUMBER
SELECT * FROM ::fn_get_sql(@Handle)

The part that really got me? When it returns the code of a stored procedure you could be left wondering which statement within the stored procedure is executing. Fear not! They have that covered too. All you have to do is run the modified code below and the exact statement causing you a problem is identified.

DECLARE @Handle BINARY(20), @start INT, @end INT

SELECT @Handle = sql_handle, @start = stmt_start, @end = stmt_end
FROM master..sysprocesses
WHERE sql_handle <> 0x00 AND spid = 226 --CHANGE SPID NUMBER

IF NOT EXISTS (SELECT * FROM ::fn_get_sql(@Handle))
    PRINT 'Handle not found in cache'
ELSE
    SELECT 'Current statement' =
        SUBSTRING(TEXT, (@start + 2)/2,
            CASE @end
                WHEN -1 THEN DATALENGTH(TEXT)
                ELSE (@end - @start + 2)/2
            END)
    FROM ::fn_get_sql(@Handle)

So, now I’ve reminded you of this really useful tool, it’s time for a few "Battle Tips".

1) Make sure that in Query Analyser your "maximum characters per column" setting is at 4000. Otherwise you are only going to get 256….
2) If SQL generates a "zero cost plan" it will not put it in the cache, because it’s not worth it. As it’s not in cache, ::fn_get_sql won’t return any information.
3) This one comes with a health warning….. If for some reason you need SQL to cache zero cost plans so that ::fn_get_sql always works, you can enable trace flag 2861. I would recommend you do not use this in production, as the overhead is substantial and Profiler should be used instead, but some people might find it beneficial on a dev box.
4) I’ve not adjusted the script above to deal with parallel SPIDs, where you get multiple handles for one SPID. Just a note; I keep on forgetting.  😀

There are many uses for this function and I encourage you to explore them. SQL 2005 makes life even easier through DMVs, but that’s another blog……
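For completeness, a sketch of the SQL 2005 DMV route (treat it as a starting point rather than gospel):

```sql
-- SQL 2005: current statement for a session via DMVs, using the same
-- start/end offset arithmetic as the fn_get_sql script above
SELECT SUBSTRING(st.text, (r.statement_start_offset/2) + 1,
        CASE r.statement_end_offset
            WHEN -1 THEN DATALENGTH(st.text)
            ELSE (r.statement_end_offset - r.statement_start_offset)/2
        END) AS current_statement
FROM sys.dm_exec_requests r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) st
WHERE r.session_id = 109 --CHANGE SESSION ID
```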

SQL 2005 – Log Shipping Monitoring?

March 12, 2006 Leave a comment

I took a look at log shipping in 2005 for the first time the other day, and as I was curious to see how the monitor functionality had been implemented in SQL Server Management Studio, I went hunting for it. I’ve got to say I was greeted with nothing but disappointment, and left feeling that I now have a huge mountain to climb before it can be used in production!

We currently have 15 databases shipped to multiple destinations, and in SQL 2000 we monitor and manage them from a central monitor server. In Enterprise Manager there is a log shipping monitor node under Management; any databases that have failed are marked with a red cross, we can right-click to suppress the alerts whilst we investigate, and the investigation often starts with a right-click to view the relevant history for the database in question. Amongst other things we can also automatically suppress alerts for periods of time.

Now, what did I find in 2005? After a good hunt around I found a report that you get to by going to the summary page at the server level and choosing "Transaction Log Shipping Status". I then found a single job on the monitor server responsible for generating the alerts for all the databases. As for viewing the histories, it’s a case of finding the jobs or running a query against the relevant monitor tables!

In SQL 2005, log shipping has been wrapped up into its own executable and divorced from maintenance plans. I was expecting to see a number of improvements over SQL 2000 or, as a minimum, to be on the same level, but I was not prepared to find that out of the box it was going to be far more difficult to use! Why? Because the monitor tool in 2000 was simple, and with reasonable instruction anyone could use it. I now feel I need to write at least one more report, and I need to come up with a way of suppressing individual alerts, and introducing suppressions for certain time periods, to meet our business needs and ensure it’s maintainable by people who are not DBAs.

I just don’t understand why Microsoft have done this! :-(

Categories: SQL Server

2005 – Alter failed for server XXX

February 13, 2006 Leave a comment

That’s exactly what I thought!! What the hell does that mean?

Well, for me this occurred in the context of running the maintenance plan integrity check task. It had been running fine for weeks, but suddenly, boom: every time it ran it would fail with

"Alter failed for server XXX".

There was no obvious reason for this error, so I had to dig out Profiler to see what was going on. It turns out that one of the first statements executed by the plan is "sp_configure 'User Options', xxxx" followed by "RECONFIGURE". The RECONFIGURE statement was then generating an error, which caused my problem. But why was it generating an error? Because a script had been run on the server that changed the "Allow Updates" value from 0 to 1. Changing the value back to 0 resolved the problem.
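The fix boils down to this (run as sysadmin):

```sql
-- Revert 'allow updates' to its default so RECONFIGURE stops erroring
EXEC sp_configure 'allow updates', 0
RECONFIGURE
```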

BOL states that changing the Allow Updates option has no effect, since you cannot update system tables in 2005, but it clearly does have an effect, since the behaviour of RECONFIGURE changes.

There are a few points to be taken from this.

1) There actually is no benefit to changing "Allow Updates", so make sure you remove it from your scripts; I have.
2) Why was "sp_configure 'User Options', xxxx" followed by "RECONFIGURE" being executed anyway? It’s not relevant to integrity checks, and why is it updating server-wide settings? Fortunately, I noted the setting was simply being set to what it already was.
3) Bear this little gotcha in mind, because who knows where you will first meet the problem, and what message you will get?

Categories: Configuration, SQL Server

Configuring a secure domain account for ASP.NET on Windows 2000

January 26, 2006 Leave a comment

I recently had to enable the ASP.NET account on a Windows 2000 machine to use a domain account so we could use NT authentication to the SQL backend, and the implementation of the domain account had to be secure. The hardest bit was securing the account: I wanted to give the account as few privileges as possible and make sure the account details could not be easily compromised.

It took me a little while to gather the information I needed, so I’ve pulled it together below to help anyone else faced with this task. The end result is that the ASP.NET domain account has minimal privileges on the IIS server and SQL server, and the account details are encrypted and stored in the registry.

The encryption of the account details is handled by aspnet_setreg.exe, and the tool is described here. It’s worth mentioning that the utility can be used for encrypting data in other configuration sections.

The steps to a secure world are:

Read more…

A performance analysis example

January 15, 2006 2 comments

Last week I was dragged kicking and screaming away from my allocated SQL 2005 days to spend 2 days diagnosing performance problems on one of the firm’s systems and writing a report.
Due to the sensitive nature of the data on the system it has been managed by another team, and this was the first time I had looked at it. I thought I’d share my thought process in identifying the problem on the server, and reference the tools I used. It’s the first iteration of the review; once my recommendations are implemented there will be another review to see what else may need dealing with. Please note, however, that I have not included every single step I took, otherwise I’d be writing a novel!

Scenario

The users of the system have been reporting intermittent timeouts in the front-end application. The current system administrators have noted periods of high CPU usage. The server is Windows 2000 SP4 running SQL 2000 SP3a, with 2 physical processors, hyperthreading enabled, 2 GB of memory, and SAN attached storage (EVA 5000).

Analysis

The first port of call was to review the event logs and SQL Server logs for any errors that might indicate what was happening. These logs revealed nothing. The next port of call was to connect via Query Analyser and see what was going on. A quick look at current activity showed nothing much happening, so I did the standard SELECT * FROM sysprocesses. This revealed that for 80% of the connected SPIDs the last wait had been for a latch, and specifically the waitresource was 2:3:69 or 2:1:24.
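A narrowed version of that query (the column choice is mine, not from the original session) makes the latch waiters easy to spot:

```sql
-- SPIDs whose last wait was a page latch, with the page they waited on
-- (waitresource format is db_id:file_id:page_id)
SELECT spid, lastwaittype, waitresource, waittime
FROM master..sysprocesses
WHERE lastwaittype LIKE 'PAGELATCH%'
```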

Read more…

Categories: Performance, SQL Server

SQL 2005 – Maintenance Plans

January 3, 2006 Leave a comment

I made a decision that to really get things going and identify “trouble spots” I would continue my quick-win approach.

So basically I’m working through installing and configuring our test SQL 2005 box following our SQL 2000 install instructions. Throughout this approach I’m documenting and making the necessary changes so I can have first drafts of install docs, which I’m sure will evolve into excellent documents by the end of the project :-D

So, having done the basic install, which I talked about previously, I decided to tackle maintenance plans, as there should not be too many surprises there………

I have got to say that the new maintenance plans are far more flexible than ever before, with a lot of added functionality, and overall I’d say it’s a job well done. However, the new maintenance plans have inflicted much pain on me.

Read more…

Categories: SQL Server