Ajarn Mark Caldwell Blog

Bringing Business Sense to the IT World…

More Performance Notes from PASS

As I was reviewing my notes from the PASS Community Summit, a couple more things from Greg Low's talk on Avoiding Recompiles stood out as worth repeating here.

First was his comment that using single-part names when referencing your objects causes a performance hit, because they cannot be resolved until execution time.  SQL Server does not know, when you refer to the table MyTable, which one you mean.  You may know that there is only one, for example, if all your objects are owned by DBO (in SQL 2000 and earlier) or are in the DBO schema (SQL 2005 and later).  But SQL Server has to check, and the first check is to look for YourName.MyTable; then, if it doesn't find it, SQL Server falls back to checking for dbo.MyTable.  This is similar to how SQL Server always looks in the Master database first for stored procedures whose names start with "sp_".  So to get better performance, you should always use at least 2-part naming of your objects.  (There might be an argument for 3-part naming for performance, but I have not heard it yet.)  I have to admit to having gotten sloppy in my code because all of our objects are owned by DBO.  Thankfully, I have been using SQLPrompt lately, and it has an option to always qualify your object names, so that will cover me if I slip up.
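To make that concrete, here is a minimal sketch of the difference (the table dbo.Orders and its columns are just placeholders for illustration):

    -- Unqualified: SQL Server must first check the caller's default schema
    -- (e.g. YourName.Orders) before falling back to dbo.Orders.
    SELECT OrderID, OrderDate
    FROM Orders;

    -- Schema-qualified: resolves directly to dbo.Orders with no fallback lookup,
    -- and the cached plan can be matched more reliably across callers.
    SELECT OrderID, OrderDate
    FROM dbo.Orders;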

Second, and more intriguing in my opinion, is that when using ADO (for example, in your web application) you should be sure to define the actual size of your parameters when calling SQL Server.  I do this as a matter of style and habit, but it is not required.  If you do not, then apparently ADO sets the size information based on the actual value that you are passing.  The next time your program calls SQL Server, passing different values (in particular strings/VarChars), the size part of the parameter definition will be different, so SQL Server will not be able to match it against the cached query plan from the previous call, and it will have to generate the plan all over again.  If I understood what Greg was saying, this is inherent with using LINQ, too.
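You can see the effect in the batches that actually arrive at the server.  This is a rough sketch (the table dbo.Customers and column LastName are made up for illustration): when the client infers the parameter size from the value, each distinct length produces a different parameter declaration, and therefore a different plan cache entry.

    -- Size inferred from the value 'Caldwell' (8 characters):
    EXEC sp_executesql
        N'SELECT CustomerID FROM dbo.Customers WHERE LastName = @name',
        N'@name varchar(8)', @name = 'Caldwell';

    -- The same query with 'Low' arrives as varchar(3), so the batch text differs
    -- and the plan above cannot be reused:
    EXEC sp_executesql
        N'SELECT CustomerID FROM dbo.Customers WHERE LastName = @name',
        N'@name varchar(3)', @name = 'Low';

    -- Declaring the parameter at the column's actual size on the client keeps
    -- the declaration identical across calls, so one cached plan serves them all:
    EXEC sp_executesql
        N'SELECT CustomerID FROM dbo.Customers WHERE LastName = @name',
        N'@name varchar(50)', @name = 'Caldwell';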

It just goes to show that SQL Server is very forgiving in terms of what it allows you to get away with while still returning accurate results; but if you are writing for very high performance, just because you can do something does not mean that you should do it.

Legacy Comments


Adam Machanic
2008-11-26
re: More Performance Notes from PASS
Yep, huge problem with LINQ:

https://connect.microsoft.com/VisualStudio/feedback/ViewFeedback.aspx?FeedbackID=363290


balaji
2008-12-08
re: More Performance Notes from PASS
hi