Archive

Archive for the ‘SQL Server’ Category

The Job whose owner kept coming back……

December 30, 2007 Leave a comment

I thought I would share this little quirk about the SQL Agent jobs for maintenance plans.

One of our members of staff had left and we had the usual case of a few jobs failing with:

"Unable to determine if the owner (DOMAIN\xxx) of job <JOB_Name> has server access (reason: error code 0x534. [SQLSTATE 42000] (Error 15404))."

So, we went around and updated the job owners to one of our appropriate generic admin accounts. A few days later some of the jobs started to fail again with the same error. Since we knew we had performed the update previously, it was time to investigate how the jobs had been set back to the old user account.

It was quickly determined that the only jobs that had reverted to the old owner were the ones created by maintenance plans, so we focused our attention there. It turns out that when you save a change to a maintenance plan, the job owners are reset to the owner of the maintenance plan. The owner of the maintenance plan will be the account used to connect to the server in SSMS when creating the plan.

With this determined, a slight variation of our fix was deployed. First we changed the job owners, and then we updated the owner of the maintenance plan using the script at the end of the post. The script is in two parts: the first shows you who owns what, and the second updates the owner to an account you specify.

Agent jobs created under a personal user account have always been a procedural problem. This is simply another variation on the problem that we need to take into consideration and put a process in place to deal with. The most likely processes are either to only create a maintenance plan when logged on with a generic account, or to run the script after creating the maintenance plan.

I am, however, curious why Microsoft have implemented updating the jobs in this manner. I see it as having the potential to cause significant problems in environments that may not be monitoring their jobs as closely as is required, ending up with maintenance tasks not running for some time. How to get around this? Well, given the nature of maintenance plans and the fact you must be a sysadmin to see or create them, surely it makes sense to have the owner be the SQL service account or a user created by SQL for maintenance plans? Someone has posted this suggestion to Connect here, and I've added my two pennies' worth, so if you feel it should change then have your say too!

--See who owns which packages
SELECT name, description, SUSER_SNAME(ownersid)
FROM msdb.dbo.sysdtspackages90

--Now we update the owner to an appropriate domain account.
--Either the service account or a generic admin account is good.
UPDATE msdb.dbo.sysdtspackages90
SET ownersid = SUSER_SID('YOUR_DOMAIN\APPROPRIATE_ACCOUNT')
WHERE ownersid = SUSER_SID('YOUR_DOMAIN\OLD_ACCOUNT')
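For completeness, the first half of the fix (repointing the agent jobs themselves) can be done with msdb's sp_update_job. A minimal sketch, with placeholder job and account names:

```sql
--Repoint an individual agent job at the generic admin account.
--The job and account names below are placeholders for illustration.
EXEC msdb.dbo.sp_update_job
    @job_name = N'YOUR_MAINTENANCE_JOB',
    @owner_login_name = N'YOUR_DOMAIN\APPROPRIATE_ACCOUNT'
```

Remember that this alone is not enough for maintenance plan jobs: the next save of the plan will set them back, which is why the package owner needs updating too.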


My old mate sp_recompile

October 12, 2007 Leave a comment

As soon as I saw the error messages in the logs I thought to myself "Oh my, that did not happen in testing" (OK, maybe it was more colourful than that).

We were creating a clustered index on a tiny little table and the index went through fine. However, the application started to generate the message "Could not complete cursor operation because the table schema changed after the cursor was declared". My gut reaction was to restart each application server in the cluster, but having restarted the first one it made no difference. It suddenly clicked that SQL Server must be dishing out the cursor plan from cache.

Now, I did not want to restart the SQL servers because only a small part of the application was affected and I did not want a more significant outage. So, how do we get the plan out of cache? The table below details your options with the corresponding impact.

Action: EXEC sp_recompile 'object'
Pros: Minimal impact. When passing a table name, all procedures referencing it will be recompiled. Plans in cache not referencing the table stay in cache.
Cons: You have to know the name of the object(s) needing to be recompiled.

Action: DBCC FREEPROCCACHE
Pros: Quick and dirty.
Cons: The procedure cache for the whole server is cleared, so the server will slow down whilst the cache populates again.

Action: Restart SQL
Pros: I suppose you could say you are 100% sure you have nailed the sucker.
Cons: You have a system outage and you have to wait for your procedure and buffer cache to repopulate.

The lesson to take away here is to always use sp_recompile when making any kind of DDL change; I also tend to use it on stored procs and views. I normally have it in my scripts, so believe you me, I gave myself a good talking to about forgetting to put it in this time :D
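As a sketch of the habit I am describing, a DDL change script would end with the recompile call (the table and index names here are placeholders, not the real ones from the incident):

```sql
--Example DDL change followed by a recompile of dependent plans.
--dbo.MyTable and IX_MyTable_Id are placeholder names for illustration.
CREATE CLUSTERED INDEX IX_MyTable_Id ON dbo.MyTable (Id)

--Marks all procedures and triggers that reference the table
--for recompilation on their next execution.
EXEC sp_recompile 'dbo.MyTable'
```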

And on a related note, have you come across sp_refreshview? No? Well, it's worth knowing about.
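Briefly, it refreshes the cached metadata of a non-schema-bound view after the underlying objects change. Usage is a one-liner (the view name is a placeholder):

```sql
--Refresh the cached metadata for a view whose base tables have changed.
--dbo.MyView is a placeholder name for illustration.
EXEC sp_refreshview 'dbo.MyView'
```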

Categories: Performance, SQL Server

SSMS Restore backup error

September 14, 2007 Leave a comment

We had a requirement to allow someone to create and restore databases on a test server today and I thought to myself "That's easy, I'll just grant the 'Create Any Database' right to the appropriate user". That's when the pain began!

The user was using SSMS, connecting with a SQL login, to restore a database, and when they went to specify the backup location they got an error to the effect of "Cannot access the specified path or file on the server". After clicking OK, the tree view in the locate backup dialog was empty, and if you typed in the path and filename manually you still received an error.
So, I dug out the profiler and found that xp_fixeddrives was being called and decided to check it out. It turns out that when executing xp_fixeddrives using a SQL login it returns no results! Because of this the error is generated in SSMS and the tree view is not populated.
I spent some time investigating xp_fixeddrives and came to the following conclusions regarding its behaviour.

  • Currently the only way to get xp_fixeddrives to return results when using SQL authentication is to add the login to the sysadmin role.
  • I extensively explored proxy account configurations to get xp_fixeddrives to work under SQL authentication and could find no way to get it working.
  • If you use Windows authentication, xp_fixeddrives works as expected. You do not need sysadmin privileges, nor do you have to do any proxy configuration.
  • Some poking around with Process Monitor showed that when using Windows authentication, SQLSERVR.exe spawns processes to gather the details using the service account. When using SQL authentication with sysadmin, SQLSERVR.exe does not spawn anything or generate any security errors, so whatever is going on is beyond what I can work out (my skills with a debugger are too limited :)  ).
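If you want to reproduce the behaviour yourself, run the call under both connection types and compare the results:

```sql
--Returns one row per fixed drive with the free space in MB.
--Under Windows authentication (or a sysadmin SQL login) this
--returns the drive list; under an ordinary SQL login it
--returns an empty result set, with no error raised.
EXEC master.dbo.xp_fixeddrives
```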

At the end of all this I went down the Windows authentication route to work around the issue. For some reason SSMS still generates the error, but the tree view is then populated and you can select the file, so I can live with that; I suspect exploring why it still generates an error may end up being a bottomless pit of time.

Finally, this only affects SSMS. If the user had done the restore using T-SQL it would have worked (despite a warning about updating the restore history in msdb).
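For reference, the T-SQL equivalent that works fine under the same login would look something like this (the database name and paths are placeholders):

```sql
--Restore from a backup file, relocating the data and log files.
--Database name, logical file names and paths are placeholders.
RESTORE DATABASE TestDB
FROM DISK = N'D:\Backups\TestDB.bak'
WITH MOVE N'TestDB_Data' TO N'D:\Data\TestDB.mdf',
     MOVE N'TestDB_Log' TO N'D:\Logs\TestDB.ldf'
```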

Categories: Configuration, SQL Server

Server level VLF report

July 25, 2007 Leave a comment

I read Tony Rogerson's blog on Virtual Log Files today and it reminded me that I really should knock up a little report to list all the databases on a server and the number of VLFs per database. Since I had been busy writing some other operational reports I was in the right frame of mind, so I knocked up what you see below.

The Report code

CREATE TABLE #VLFS (fileid INT, filesize BIGINT, startoffset BIGINT, fseqno BIGINT, status INT, parity INT, createlsn VARCHAR(1000))
CREATE TABLE #Results (SRV_Name NVARCHAR(500), Database_Name NVARCHAR(500), VLFS INT)

EXEC master.dbo.sp_msforeachdb
    @command1 = 'USE ? INSERT INTO #VLFS EXEC(''DBCC LOGINFO WITH TABLERESULTS'')',
    @command2 = 'INSERT INTO #Results SELECT @@SERVERNAME, ''?'', COUNT(*) FROM #VLFS',
    @command3 = 'TRUNCATE TABLE #VLFS'

--For this example I've just returned the results, but you can
--just as easily write the results to a local or central server.
--I write mine to a central server for reporting on.
SELECT * FROM #Results

DROP TABLE #VLFS
DROP TABLE #Results

Example output

[Screenshot: report results]

For me the beauty of this little report is that I have set it up to run on all 150 of my servers distributed around the world, logging to my central logging server. From there I get one report that tells me which databases have excessive VLFs and the servers they are hosted on. Having said that, in a well managed environment I should never find any databases with lots of VLFs……

You may also be wondering "How many VLFs are too many?". Tony did not cover this, but Kimberly Tripp did in a post she put up a few years ago (see point 8), which was when I first learned about VLFs. I don't think this is a hard and fast number though, and obviously the performance gains will be smaller if you're not far off it.

The last thing I am going to mention in this post is part of the code I used in the report: the undocumented procedure sp_msforeachdb. This is a great little procedure that will cycle through your databases executing up to three commands. To use it, put a '?' where you would have put a database name, and it's as simple as that! Incidentally, there is also an sp_msforeachtable.
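To give a flavour of the table variant, here is a small sketch (the command shown is just an example, not from the report above):

```sql
--Report the row count of every table in the current database.
--The '?' placeholder is replaced with each table name in turn.
EXEC sp_msforeachtable
    @command1 = 'SELECT ''?'' AS table_name, COUNT(*) AS row_count FROM ?'
```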

The joy of template parameters

July 15, 2007 Leave a comment

I thought I would do a quick blog about template parameters, since I have been writing a lot of standard deployment scripts for our 2005 builds and have used them extensively.

So what are they? Well, they are placeholders, and you use them in the same places you would probably put a variable. The key difference comes when you assign the values. With variables you work through the code setting the values as appropriate; with template parameters all you do is press CTRL-SHIFT-M and you get the dialogue box shown below. Simply fill in the values, click OK and you're ready to run your script.

The format of a parameter is <Parameter Name, SQL Data type, default value>, so to get the mirrored DB name shown in the screenshot you would enter <Mirrored_DB_Name,nvarchar(128),'My_Mirror'> where you wanted the value to appear.
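Put together, a small deployment-style script using a parameter might look like this sketch (the parameter name and default are just examples):

```sql
--Template parameters are replaced via CTRL-SHIFT-M before running.
--The parameter name and default value below are example placeholders.
CREATE DATABASE <Mirrored_DB_Name,nvarchar(128),My_Mirror>

ALTER DATABASE <Mirrored_DB_Name,nvarchar(128),My_Mirror>
SET RECOVERY FULL
```

The same parameter can appear as many times as you like; one press of CTRL-SHIFT-M fills in every occurrence.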

This feature is available in Management Studio and Query Analyser, if you have ever used one of the standard Microsoft query templates you may have noticed that parameters are used in them as well.

I only realised template parameters existed a few years back because someone pointed it out to me. If this is the first time you have come across them (or had forgotten about them) I hope you find them as useful as I do.

Categories: SQL Server, SSMS

The x64 experience

July 10, 2007 Leave a comment

The 64 bit beast has been out there for a while now and new servers are generally 64 bit compatible. The first thing that comes into my mind when I think 64 bit is performance and memory, with "compatibility" hot on its heels (and recollection of a good article by Linchi Shea).

My fears around compatibility have been pretty much put to rest and we now recommend SQL 2005 x64 as our base build. However, it does seem that 64 bit builds need a bit more attention. I say this because recently I have had to register 64 bit DLLs again, and having mentioned this to a few DBAs, the conversations went a bit like "Really? So was it x.dll or y.dll". Fortunately the problems seem to only affect the tools (touch wood), but I am curious what experiences others are having with 64 bit servers, so please do leave some comments.

Now, to help others who run head first into the very misleading errors I experienced, I'll detail the errors and the fixes below.

The first problem

This one occurred when we were doing a server build. One of our final tasks was to deploy an SSIS package, so we went to open the package in BIDS and were greeted with the following error when loading it.

"The package failed to load due to error 0xC0010014 "One or more error occurred. There should be more specific errors preceding this one that explains the details of the errors. This message is used as a return value from functions that encounter errors.". This occurs when CPackage::LoadFromXML fails."

Some research threw up a few red herrings, but the next clue came when I decided to connect to the SSIS instance using SSMS. At this point I got an error stating

"Error loading type library/DLL"

Further research threw up the following document, which gave me a few options, none of which I liked. More importantly, the top solution was to use the latest SP, which I was already running, so I decided to go hunting with the Sysinternals Process Monitor to see if I could find the issue. Fifteen minutes later the culprit was found: Process Monitor identified DTS.dll as throwing some unusual errors, so I located the 64 bit version, used REGSVR32 to register it again, and this resolved my problems.

The second problem

This one came a few weeks down the line on the same server. I was working in SSMS connected to SSIS, and when expanding the tree view I got the following error.

"Library not registered. (Exception from HRESULT: 0x8002801D (TYPE_E_LIBNOTREGISTERED))"

This time I went straight to my trusty Process Monitor and found that the server was trying to use the 32 bit version of msdtssrvrutil.dll. I located the 64 bit version, registered it, and everything worked again. This time I also took the opportunity to check, as best I could, that everything else was working.

It's been a few weeks since the issues now and all's well, so hopefully the server will behave, until the next time………

Categories: SQL Server

Dotty about maintenance plans

June 22, 2007 Leave a comment

I went to remotely edit a maintenance plan the other day and found it took a long time to open any of the objects in the plan. I also found that I was getting errors when clicking the drop down to select a database.

The reason for the problem was that the local server connection embedded in the maintenance plan had the server name defined as "." rather than an actual server name. Because of this the objects were trying to connect to a default instance of SQL on my PC rather than the server. The delay in opening was down to the timeout, and the subsequent errors were because there was no server to talk to.

The reason this occurred was that when the maintenance plan was created, the server was registered as "." rather than <server_name> in SQL Server Management Studio (SSMS), and the maintenance plan uses the name registered in SSMS to populate the server name in the connection details. Unfortunately you cannot edit the local server connection details, so probably the quickest way to fix the problem is to recreate the plan with the full server name registered in SSMS. I say probably the quickest because there may be a way to fix it using the Business Intelligence Development Studio to update the package, but getting the package back in is awkward.

The real killer

In many ways I think I got off lightly with this "feature". Why? Well, let's say I was editing the maintenance plan from another location that had a default instance of SQL installed…… Yep! You guessed it: the maintenance plan loads quickly and without error because it's connected to the local instance. When you select databases from the drop down list you are seeing the databases from your local instance, not the remote server, and you could then select to run tasks on a database that does not exist on the remote server.

I'm going to flag this on Connect and add a check to our install documents to ensure that instances are not registered with a "." in SSMS on the server.

Categories: SQL Server