More Performance Notes from PASS

As I was reviewing my notes from the PASS Community Summit, a couple more things from Greg Low's talk on Avoiding Recompiles stood out as worth repeating here.

First was his comment that using single-part names when referencing your objects causes a performance hit, because they cannot be resolved until execution time.  When you refer to the table MyTable, SQL Server does not know which one you mean.  You may know that there is only one, for example, if all your objects are owned by DBO (in SQL 2000 and earlier) or are in the DBO schema (SQL 2005 and later).  But SQL Server has to check, and the first check is to look for YourName.MyTable; only if it doesn't find that will SQL Server fall back to checking for dbo.MyTable.  This is similar to how SQL Server always looks in the Master database first for stored procedures whose names start with "sp_".  So to get better performance, you should always use at least 2-part naming for your objects.  (There might be an argument for 3-part naming for performance, but I have not heard it yet.)  I have to admit to having gotten sloppy in my code because all of our objects are owned by DBO.  Thankfully, I have been using SQLPrompt lately, and there is an option to have it always qualify your object names, so that will cover me if I slip up.
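As a quick sketch of the difference (the table name here is made up):

```sql
-- Single-part name: SQL Server must first check the caller's
-- schema (e.g. YourName.MyTable) before falling back to dbo.MyTable.
SELECT OrderID, OrderDate
FROM MyTable;

-- Two-part name: resolved directly, with no fallback lookup.
SELECT OrderID, OrderDate
FROM dbo.MyTable;
```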

Second, and more intriguing in my opinion, is that when using ADO (for example, in your web application) you should be sure to define the actual size of your parameters when calling SQL Server.  I do this as a matter of style and habit, but it is not required.  If you do not, then apparently ADO sets the size information based on the actual value that you are passing.  The next time your program calls SQL Server passing different values (in particular Strings/VarChars), the size portion of the parameter definition will be different, so SQL Server will not be able to match it against the cached query plan from the previous call, and it will have to generate the plan all over again.  If I understood what Greg was saying, this is inherent in using LINQ, too.
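A hedged illustration of what this looks like in the plan cache (the query and parameter names are hypothetical): when the size floats with the value, each distinct length produces a different parameterized text, and thus a separate plan.

```sql
-- Size derived from the value: two calls, two cache entries, two compiles.
-- (@name varchar(4))  SELECT * FROM dbo.Customers WHERE LastName = @name   -- passed 'Lowe'
-- (@name varchar(7))  SELECT * FROM dbo.Customers WHERE LastName = @name   -- passed 'Johnson'

-- Size declared to match the column: identical text every call,
-- so the cached plan is reused.
-- (@name varchar(50)) SELECT * FROM dbo.Customers WHERE LastName = @name
```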

It just goes to show you that SQL Server is very forgiving in terms of what it allows you to get away with while still getting accurate results; but if you are writing for very high performance, just because you can do something does not mean that you should.

PASS 2008: Days 2 & 3

Thursday and Friday of the PASS Community Summit blurred together a bit for me.  Not because of the parties Thursday night, which apparently left many people bleary-eyed, though the cold medicine I was taking sure didn't help things.  Primarily it was because I bounced in and out of a few sessions and had some work stuff mixed in.  And on top of that, my 5-month-old waking up every couple of hours throughout the night did not make for very restful nights.  Oh well, the things I endure for personal development.  ;-)

Here are a few of the miscellaneous lessons I learned or was reminded of:

  • When attending a conference, be sure to read the detailed description in the program guide of the sessions you are planning to attend.  There were a couple of times when I went to a session based primarily on its title, only to find out that I had misinterpreted it, and the session was not at all about the topic I was expecting.  Unfortunately, by the time I figured that out, it was a toss-up whether I could get into another session that interested me.  Several sessions were Standing Room Only, so if you weren't there at the beginning, you might be out of luck.
  • XML is not necessarily evil.  OK, I didn't really think it was evil, but I have been very skeptical of its value in the database.  I am not one to abandon my Relational designs and thought patterns on a whim.  But Denise gave a great presentation on her experience implementing XML for a large project, and I completely understand and agree with her reasoning for doing so.  There were specific conditions in her project which made using XML palatable, and even attractive.  In particular, she had collections of element and attribute data that were hierarchical, had to be portable, and whose required structure would change (in fact, she already had multiple "schemas" she needed to represent for different records).  And best of all, in this particular system, the data being stored as XML would not be queried in segments; primarily she would want all of the XML returned based on other data elements in the record.  Sounds perfect to me.
  • In SQL 2005, the Execution Plan when shown in XML will display information on indexes that SQL Server thinks would have been helpful, but are missing.  In SQL 2008, this information is also visible in the graphical execution plan.
  • If your Stored Procedure is only going to return one row (e.g. GetRecordByPK) then you should consider using Output Parameters instead.  There is some overhead to producing a recordset and sending it down the line; and a handful of Output Parameters may be a lot more efficient.
  • Beware of Implicit Data Conversions in your JOINs and WHERE clauses (e.g. comparing a VARCHAR column to an NVARCHAR variable, or a DATETIME column to a string literal date).  In SQL 2008 the impact of this was greatly lessened, but in SQL 2005 and earlier the impact is similar to placing a function in the statement, which is to say that it can seriously hurt performance.
  • Remember that in SQL 2005 you can build covering indexes by "INCLUDE"ing columns in the index that are not actually key parts of the index.  For example, you may build an index on LastName, FirstName and INCLUDE MiddleName, so that whenever you need to search on and display the full name, you don't have to do a Key Lookup to get the middle name; but you also will not have as many splits occurring, because your index is keyed on just LastName, FirstName.
  • The new DATETIMEOFFSET data type in SQL 2008, while Time Zone aware, is NOT intrinsically aware of Daylight Saving Time.
  • Microsoft is considering a FOREACH construct in T-SQL for some future release of SQL Server.  On the one hand, I like how much simpler it is to understand than the current CURSOR constructs.  On the other hand, given the frequency with which they perform poorly, I don't really want to make creating cursors any easier for developers, lest we encourage them to do it more often.
  • ENUM is another feature under consideration for some future release of SQL Server (T-SQL).  This one I like.  I can definitely see a few places where I would like to use an ENUM in some procedure code.
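To sketch the single-row Output Parameter idea from the list above (the table, procedure, and column names are invented for illustration):

```sql
-- Hypothetical single-row lookup returning OUTPUT parameters
-- instead of a one-row recordset:
CREATE PROCEDURE dbo.GetCustomerByPK
    @CustomerID INT,
    @LastName   VARCHAR(50) OUTPUT,
    @FirstName  VARCHAR(50) OUTPUT
AS
BEGIN
    SELECT @LastName  = LastName,
           @FirstName = FirstName
    FROM dbo.Customers
    WHERE CustomerID = @CustomerID;
END;
```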
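A sketch of the implicit-conversion pitfall, using invented table and column names: on SQL 2005 and earlier, the mismatched first form can force the VARCHAR column to be converted row by row, defeating an index seek on LastName.

```sql
-- Mismatched: NVARCHAR variable compared to a VARCHAR column.
DECLARE @name NVARCHAR(50);
SET @name = N'Lowe';
SELECT CustomerID FROM dbo.Customers WHERE LastName = @name;

-- Matched: the variable's type agrees with the column, so an
-- index on LastName can be used with a seek.
DECLARE @name2 VARCHAR(50);
SET @name2 = 'Lowe';
SELECT CustomerID FROM dbo.Customers WHERE LastName = @name2;
```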
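The covering-index bullet above might look something like this (again, the table and index names are made up):

```sql
-- Index keyed on LastName, FirstName only; MiddleName is carried
-- at the leaf level via INCLUDE, so it adds no key overhead.
CREATE INDEX IX_Customers_Name
    ON dbo.Customers (LastName, FirstName)
    INCLUDE (MiddleName);

-- This query can be answered entirely from the index -- no Key Lookup.
SELECT LastName, FirstName, MiddleName
FROM dbo.Customers
WHERE LastName = 'Lowe';
```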
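And a small sketch of the DATETIMEOFFSET point, assuming US Pacific time as the example zone:

```sql
-- DATETIMEOFFSET stores an explicit UTC offset with each value,
-- but SQL Server will not adjust that offset for Daylight Saving Time.
DECLARE @winter DATETIMEOFFSET;
DECLARE @summer DATETIMEOFFSET;
SET @winter = '2008-01-15 12:00:00 -08:00';  -- Pacific Standard Time
SET @summer = '2008-07-15 12:00:00 -08:00';  -- still -08:00; it is up to
                                             -- you to supply -07:00 for PDT
```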

There was so much more to the conference than this, but here are a few things to whet your appetite and help you see the benefit of attending next year.  Remember that on top of this, there was a huge track on BI stuff, several presentations by Microsoft folks, and easy access to the SQL CAT group.

I hope to see you all there next year!
