TRUNCATE TABLE master..sysdatabases

...and other neat SQL Server tricks

Tuesday, November 10, 2015

T-SQL Tuesday #72: Data Modeling

It's that time again: T-SQL Tuesday! This month the lovely and amazing Mickey Stuewe (b | t) is hosting and asks us to write about the trials and tribulations of data modeling.

Below is a tale that occurs somewhat frequently (in my humble experience anyway), I’ve changed some of the details so as not to embarrass anyone besides myself. Any resemblance to actual data models, living or dead, is really unfortunate.

(Yes, the excessive code is deliberate)

I really love ice cream!

I decided to open my own ice cream parlor and design the database I'd need to support it:

CREATE TABLE IceCream(IceCreamID int NOT NULL IDENTITY(1,1) PRIMARY KEY, IceCreamName nvarchar(255) NOT NULL);

Naturally we need to support some accessories:

CREATE TABLE IceCreamToppings(IceCreamToppingID int NOT NULL IDENTITY(1,1) PRIMARY KEY, IceCreamToppingName nvarchar(255) NOT NULL);

And of course we need data:

INSERT IceCream(IceCreamName) VALUES ('vanilla'),('chocolate'),('strawberry'),('Neopolitano'),('pistachio'),('mint chocolate chip');
INSERT IceCreamToppings(IceCreamToppingName) VALUES ('chocolate sprinkles'),('candy sprinkles'),('chocolate chips'),
('blueberry sauce'),('whipped cream'),('butterscotch'),('m&ms'),('oreo crumbles');

Business took off into the stratosphere!

My customers also love putting ice cream on breakfast foods, so I offered pancakes and waffles!

CREATE TABLE Pancakes(PancakeID int NOT NULL IDENTITY(1,1) PRIMARY KEY, PancakeName nvarchar(255) NOT NULL);
CREATE TABLE PancakeToppings(PancakeToppingID int NOT NULL IDENTITY(1,1) PRIMARY KEY, PancakeToppingName nvarchar(255) NOT NULL);
CREATE TABLE Waffles(WaffleID int NOT NULL IDENTITY(1,1) PRIMARY KEY, WaffleName nvarchar(255) NOT NULL);
CREATE TABLE WaffleToppings(WaffleToppingID int NOT NULL IDENTITY(1,1) PRIMARY KEY, WaffleToppingName nvarchar(255) NOT NULL);

INSERT Pancakes(PancakeName) VALUES ('Homemade'),('frozen');
INSERT Waffles(WaffleName) VALUES ('Belgian'),('Eggo round'),('Eggo Square');
INSERT PancakeToppings(PancakeToppingName) VALUES ('chocolate chips'),('blueberries'),('blueberry sauce'),('maple syrup'),('apples'),('cinnamon sugar'),('powdered sugar');
INSERT WaffleToppings(WaffleToppingName) VALUES ('chocolate chips'),('blueberries'),('blueberry sauce'),('maple syrup'),('apples'),('cinnamon sugar'),('powdered sugar');


They loved this so much I started offering other breakfast options:

CREATE TABLE Omelettes(OmeletteID int NOT NULL IDENTITY(1,1) PRIMARY KEY, OmeletteName nvarchar(255) NOT NULL);
CREATE TABLE OmeletteStuffings(OmeletteStuffingID int NOT NULL IDENTITY(1,1) PRIMARY KEY, OmeletteStuffingName nvarchar(255) NOT NULL);

INSERT Omelettes(OmeletteName) VALUES ('whole eggs'),('egg whites');
INSERT OmeletteStuffings(OmeletteStuffingName) VALUES ('sausage'),('mushrooms'),('green peppers'),('ham'),('spinach'),('feta cheese'),('onion'),('red peppers');

And then lunch offerings soon followed:

CREATE TABLE Pizza(PizzaID int NOT NULL IDENTITY(1,1) PRIMARY KEY, PizzaName nvarchar(255) NOT NULL);
CREATE TABLE PizzaToppings(PizzaToppingID int NOT NULL IDENTITY(1,1) PRIMARY KEY, PizzaToppingName nvarchar(255) NOT NULL);

CREATE TABLE Burgers(BurgerID int NOT NULL IDENTITY(1,1) PRIMARY KEY, BurgerName nvarchar(255) NOT NULL);
CREATE TABLE BurgerToppings(BurgerToppingID int NOT NULL IDENTITY(1,1) PRIMARY KEY, BurgerToppingName nvarchar(255) NOT NULL);

CREATE TABLE Sandwiches(SandwichID int NOT NULL IDENTITY(1,1) PRIMARY KEY, SandwichName nvarchar(255) NOT NULL);
CREATE TABLE SandwichToppings(SandwichToppingID int NOT NULL IDENTITY(1,1) PRIMARY KEY, SandwichToppingName nvarchar(255) NOT NULL);

INSERT Pizza(PizzaName) VALUES ('deep dish'),('sicilian'),('regular');
INSERT PizzaToppings(PizzaToppingName) VALUES ('pepperoni'),('sausage'),('mushrooms'),('green peppers'),('ham'),('spinach'),('feta cheese'),('onion'),('pineapple');
INSERT Burgers(BurgerName) VALUES ('beef'),('kobe'),('angus'),('turkey'),('chicken');
INSERT BurgerToppings(BurgerToppingName) VALUES ('bacon'),('mayo'),('mustard'),('lettuce'),('tomato'),('red onion'),('white onion'),('sauteed onion'),('mushrooms'),('sauteed mushrooms'),('pickles');
INSERT Sandwiches(SandwichName) VALUES ('roast beef'),('ham'),('turkey'),('pastrami'),('baloney');
INSERT SandwichToppings(SandwichToppingName) VALUES ('bacon'),('mayo'),('mustard'),('lettuce'),('tomato'),('red onion'),('white onion'),('sauteed onion'),('mushrooms'),('sauteed mushrooms'),('pickles');

And then, my business went international!

People wanted my food, but didn't speak English, so I offered translations for everything:

CREATE TABLE IceCreamToppingsTranslation(LanguageID int NOT NULL, IceCreamToppingID int NOT NULL, TranslatedText nvarchar(4000) NOT NULL);
CREATE TABLE WaffleToppingsTranslation(LanguageID int NOT NULL, WaffleToppingID int NOT NULL, TranslatedText nvarchar(4000) NOT NULL);
CREATE TABLE PancakeToppingsTranslation(LanguageID int NOT NULL, PancakeToppingID int NOT NULL, TranslatedText nvarchar(4000) NOT NULL);
CREATE TABLE PizzaToppingsTranslation(LanguageID int NOT NULL, PizzaToppingID int NOT NULL, TranslatedText nvarchar(4000) NOT NULL);
CREATE TABLE BurgerToppingsTranslation(LanguageID int NOT NULL, BurgerToppingID int NOT NULL, TranslatedText nvarchar(4000) NOT NULL);
-- repeat ad nauseum

And then...

At some point, I was too busy serving food to design databases anymore, and couldn't deal with the performance issues that were cropping up, so I hired ace database designer Portia N. Carrera, and she promptly replied:

"This design is ridiculous! And it will only get worse as time goes on! No one in their right mind 
would ever do such a thing!" (edited to remove excessive but totally justified profanity) 

Well, they would. They did. **

"Bull----! How did it get like this?" 

Well, that's kinda the thing. I started small, and added stuff. It's easy and natural, and fairly sensible as long as things stay small. Unfortunately, as things grew I never went back to review what was there and ask whether it was the "right" way or if there was a better one. It became technical debt, and the effort needed to change it was insurmountable.

"Here's a good take on technical debt:" 

It describes the reasoning/mentality around these decisions, with much of it focused on time constraints. Portia calls this "5 minute/5 second design" and curses everyone's lack of foresight (when she's in a bad mood). ***

Well Portia, how would you fix it?

"Like this:" 

CREATE TABLE FoodTypes(FoodTypeID int NOT NULL IDENTITY(1,1) PRIMARY KEY, FoodType nvarchar(64) NOT NULL);
INSERT FoodTypes(FoodType) VALUES(N'Ice Cream'),(N'Pancakes'),(N'Waffles'),(N'Omelettes'),(N'Pizza'),(N'Burgers'),(N'Sandwiches');

CREATE TABLE Foods(FoodID int NOT NULL IDENTITY(1,1) PRIMARY KEY, FoodTypeID int NOT NULL, Food nvarchar(64) NOT NULL);
INSERT Foods(FoodTypeID,Food) 
VALUES(1,N'Vanilla'),(1,N'Chocolate'),(1,N'Strawberry') -- Ice cream
,(2,N'Homemade'),(2,N'Frozen') -- Pancakes
,(3,N'Belgian'),(3,N'Eggo Round'),(3,N'Eggo Square') -- Waffles
,(4,N'Whole Eggs'),(4,N'Egg Whites') -- Omelettes
,(5,N'Deep Dish'),(5,N'Sicilian'),(5,N'Regular');  -- Pizza

CREATE TABLE FoodExtras(FoodExtraID int NOT NULL IDENTITY(1,1) PRIMARY KEY, FoodTypeID int NOT NULL, FoodExtraName nvarchar(64) NOT NULL);
INSERT FoodExtras(FoodTypeID,FoodExtraName) 
VALUES(1,N'Chocolate sprinkles'),(1,N'Candy sprinkles'),(1,N'Chocolate chips'),(1,N'Blueberry sauce') -- Ice cream toppings
,(2,N'Chocolate chips'),(2,N'Blueberries'),(2,N'Blueberry sauce'),(2,N'Maple syrup') -- Pancake toppings
,(3,N'Chocolate chips'),(3,N'Blueberries'),(3,N'Blueberry sauce'),(3,N'Maple syrup') -- Waffle toppings
,(4,N'Sausage'),(4,N'Mushrooms'),(4,N'Green Peppers'),(4,N'Ham') -- Omelette stuffings
,(5,N'Pepperoni'),(5,N'Sausage'),(5,N'Mushrooms'),(5,N'Green Peppers'),(5,N'Ham'); -- Pizza toppings

CREATE TABLE FoodExtraTranslations(FoodExtraID int NOT NULL, LanguageID int NOT NULL, TranslatedText nvarchar(4000) NOT NULL);
INSERT FoodExtraTranslations(FoodExtraID,LanguageID,TranslatedText)
VALUES(1,1031,N'German Chocolate Sprinkles'),(1,1036,N'French Chocolate Sprinkles')
,(2,1031,N'German Candy Sprinkles'),(2,1036,N'French Candy Sprinkles')
,(13,1031,N'Sausage auf Deutsch'),(13,1036,N'Sausage au français');
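One payoff of the consolidated model: a single query can serve a localized menu for every food type at once, something the table-per-food design needed a UNION or dynamic SQL for. A sketch against the tables above (the language ID matches the sample translation rows):

```sql
DECLARE @languageID int = 1031;  -- German, per the sample translations

-- Extras for every food type, falling back to English when untranslated
SELECT ft.FoodType, COALESCE(t.TranslatedText, fe.FoodExtraName) AS DisplayName
FROM FoodExtras fe
JOIN FoodTypes ft ON ft.FoodTypeID = fe.FoodTypeID
LEFT JOIN FoodExtraTranslations t
    ON t.FoodExtraID = fe.FoodExtraID AND t.LanguageID = @languageID;
```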

She continued:

"If this looks suspiciously like the dreaded Entity-Attribute-Value model, that's because it is. 
There are definite disadvantages to EAV, but for this particular case it's a reasonable option to support 
this data. The biggest advantage is that no new structure is required; your data becomes row-based rather than 
table-based. If you get a new type of food, you just add a row to the FoodTypes table, then the associated rows 
for extras and their translations. You'll probably wonder why Food and FoodExtras are separate, and they really 
don't have to be. I'll leave that redesign for you to figure out, you're not paying me enough for that." 

But I can't rewrite all my application code! They're expecting tables named Waffles, Pancakes, etc. and it's too much work to change it!

"Oh give me a ------- break! Just create views with the same name that only SELECT those rows:" 

CREATE VIEW Pancakes(PancakeID,PancakeName) AS SELECT FoodID, Food FROM Foods WHERE FoodTypeID=2 WITH CHECK OPTION; -- Pancakes
GO
CREATE VIEW Waffles(WaffleID,WaffleName) AS SELECT FoodID, Food FROM Foods WHERE FoodTypeID=3 WITH CHECK OPTION; -- Waffles
GO
CREATE VIEW Pizza(PizzaID,PizzaName) AS SELECT FoodID, Food FROM Foods WHERE FoodTypeID=5 WITH CHECK OPTION; -- Pizza

"And if you're doing things correctly, you're calling a stored procedure to do all your INSERT/UPDATE/DELETE operations, 
those can be modified easily:" 

CREATE PROCEDURE AddFood @foodType int, @foodName nvarchar(64) AS
INSERT Foods(FoodTypeID,Food) VALUES(@foodType,@foodName);

ALTER PROCEDURE AddPizza @pizzaName nvarchar(64) AS
EXEC AddFood 5, @pizzaName;
-- original AddPizza code commented out now...

Portia continued:

"BTW, you shouldn't hard-code your IDs either, but I'll let you fix that. And if you're doing it with ad-hoc SQL, 
or an ORM of some kind, you can still use the views if you add an INSTEAD OF trigger to handle INSERTs and UPDATEs. You're 
not paying me enough to do that either, but it's easy enough to write a code generator to do all that." 
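For what it's worth, one of those generated INSTEAD OF triggers might look like this. A sketch only; the hard-coded 2 is exactly the kind of thing Portia just warned about:

```sql
CREATE TRIGGER Pancakes_Insert ON Pancakes INSTEAD OF INSERT
AS
BEGIN
    -- Redirect inserts against the view into the consolidated base table
    INSERT Foods(FoodTypeID, Food)
    SELECT 2, PancakeName FROM inserted;  -- 2 = Pancakes
END;
```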

Lessons Learned

If you can't make time to design tables up front, take time every few months to review your table designs. If you see a lot of similar or identical tables, look to consolidate them, and see whether a row-based approach would work. Also make sure to test performance; it doesn't always improve with a new design.
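One way to spot those candidates is to compare column signatures in the catalog views. A rough sketch (the similarity test here is deliberately crude; it only matches tables whose ordered column types are identical):

```sql
-- Tables sharing an identical ordered column-type signature are consolidation candidates
SELECT ts.signature, COUNT(*) AS table_count
FROM (
    SELECT t.name AS table_name,
           CONVERT(nvarchar(4000),
               (SELECT ty.name + N','
                FROM sys.columns c
                JOIN sys.types ty ON ty.user_type_id = c.user_type_id
                WHERE c.object_id = t.object_id
                ORDER BY c.column_id
                FOR XML PATH(''))) AS signature
    FROM sys.tables t
) AS ts
GROUP BY ts.signature
HAVING COUNT(*) > 1;
```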

But more importantly, TAKE MORE THAN 5 MINUTES to consider your database designs! It's too easy to "get it done right now" and leave it to someone else to fix later, if ever. Technical debt compounds faster than credit card debt.

** This is based on a model I've seen in my career, with about 130 "types" of things that all have an associated translation table attached.

*** And if you think that technical debt is unavoidable, or not that bad, consider working at Facebook:

posted @ Tuesday, November 10, 2015 10:45 AM | Feedback (0) |

Saturday, October 24, 2015

Notes from Speaker Idol 2014

When last I blogged, I recommended everyone who was eligible to sign up for PASS Summit Speaker Idol 2015. The contestants have been announced:

They also announced the judges in case you want to try the bribery route:

To help this year's participants I will be blogging my notes from when I presented for Speaker Idol 2014, and I'll also add some specifics about my experience in a separate post. I also recommend that you read up on the previous contest and some follow up from a few of the judges:

The notes that follow are transcribed from handwritten notes. I'm listing anonymized comments from the judges, primarily because I haven't gotten their permission, but also to avoid bias on either their side or the contestants'. In no particular order (red items were things to improve, green items were done well; my personal comments are in parentheses):
  • Need a story about how (your presentation topic) saved you time or improved your job
  • Mentioned great benefits (of presentation topic) without providing evidence or comparison
  • Too much information for a 5 minute presentation
  • Don't condense a 1 hour presentation to 5 minutes
  • The faster you need to go, the slower you should talk (don't rush through explanation)
  • Voice cut out when turning head away from microphone, turn whole body instead
  • Test demos, make sure they don't fail
  • Use scripts only, DO NOT TYPE CODE (typing is not the same as a demo)
  • No bullets on slide template (plain formatting)
  • Too many bullet points on slides
  • Avoid using periods at end of bullet points (they aren't sentences)
  • Small graphics on slides
  • Good to paste code on slides to avoid needing SSMS, also good font and color for code
  • Great time management with 23 seconds left (should finish between 4:30 and 5:00 minutes)
  • Not sure what topic was about, need to make clear in beginning; not much meat in first 2 minutes
  • Good to engage audience at beginning of 5 minute presentation ("ballsy" according to 1 judge, but not generally recommended)
  • If asking question, be sure to answer/respond, repeat question for audio/audience, and repeat show of hands to audience
  • Make sure images on your slides have proper attribution (author/source and license type), also for quotations/citations (several contestants were missing these)
  • Good recovery from technical problems (projector, mouse/remote, audio, demo issues; 3 speakers were commended for this)
  • Presentation felt like "Books Online"
  • Presented on floor rather than podium
  • ZoomIt use mostly poor (only 2 speakers had favorable comments, 3 had negative)
  • Never really explained/defined the topic, rushed to demo after long intro, could have explained better within time limit
  • Font and background color on slides hard to read (it was noted that slides were templates and presenter may not be able to change)
  • Allow enough time for audience to read slides
  • Don't repeat slides verbatim
  • Good eye contact and gestures (not too big, not too small), good energy and enthusiasm
  • Make sure to move with purpose, avoid moving without purpose
  • Careful of posture and foot placement, don't rock on feet, don't lean against objects; don't put hands in pockets
  • Presentation agenda was disjointed (should follow logical flow)
  • Too much code on one slide, hard to read
  • Wait for answers to questions (asked audience question and moved on too quickly)
  • Laser pointer discipline, do not move it excessively, use as little as possible (more comments from me in future post)
  • Engage audience before presentation if possible (say hello, ask questions)
  • Inconsistent capitalization on slides
  • Set up Top 5/Countdown but didn't emphasize during presentation
  • Use SmartArt in PowerPoint for arrows and symbols on slides, rather than raster graphics
  • Do not grip lectern, do not hold hands behind back, keep hands loose, use gestures, don't stand in one place
  • Include quotations in quotation marks (make sure to include attribution)
  • Do not wear anything that covers your face; a hat can cast shadows
  • If you do an "About Me" slide, keep it short, include at end and have some contact info (email, Twitter)
  • For a 5 minute presentation, don't have as much breadth, go into more depth on fewer items/topics (e.g. mention 3 utilities rather than 7 utilities)
  • Including cost/benefit analysis was good touch
  • Do not look at slides on projection screen, maintain eye contact with audience (know your slide sequence)
  • Spell out term before using its acronym (TDE, PLE, etc.)
  • Check text visibility especially when using predetermined slide template
  • Do not repeat slides verbatim
  • Be sure of technical details/facts (one slide had incorrect information) and level (including regedit was too deep for 5 mins)
  • Command prompt font and background colors (default B&W not visible)
Final round:
  • Improved timing for several contestants
  • Improved body movement & gestures
  • ZoomIt still needs work for some folks
  • Still some ums/uhs
  • Improved slides (fewer words, fewer bullets)
Here's feedback I received during my first presentation:
  • Ran out of time, had several slides of material I didn't cover
  • Um, uh in speech
  • Stood in one spot
  • Needs more humor
  • Stood in front of PA speaker, caused muddy audio
  • Wore microphone on lanyard, picked up fabric/jingly sounds
  • Looked back at slides on projection screen too often
I don't have notes on my specific feedback during the final round, except that my timing improved dramatically (due to rehearsal). I'll go into more detail on that in my other post.

I hope this helps all the contestants for 2015, and good luck to you all!

posted @ Saturday, October 24, 2015 7:09 PM | Feedback (0) | Filed Under [ Coolness ]

Monday, September 28, 2015

PASS Speaker Idol 2015

In 2014, PASS offered a new type of session at their Global Summit event: Speaker Idol.

Initiated by Denny Cherry (b | t), the concept is:
  • To provide an opportunity for community speakers
  • Who haven't yet presented a regular session at PASS Summit
  • A chance to present to a panel of judges - like American Idol, but without singing. (Sadly. Or not.)
  • For a guaranteed regular session at the next PASS Summit!
This is a FANTASTIC idea, and I was fortunate enough to participate that year. And I encourage EVERYONE who is eligible to participate this year and every year they can.

  • You get a chance to present at Summit, even if you don't get to the final round
  • You are getting free feedback from PASS Summit speakers
  • Who know exactly how you feel up on the stage
  • Who WANT to see you deliver a full session next year
You can find out more about the 2015 Speaker Idol here:

There is absolutely no downside.

Even if you don't get chosen, I recommend that you attend as many Speaker Idol sessions as you can. The feedback from the judges will help you become a better speaker, and would cost you hundreds of dollars from a professional speaking coach (and may not be as useful since they're probably not a PASS speaker).

What are you waiting for? You can enter using this form:


posted @ Monday, September 28, 2015 2:47 PM | Feedback (0) | Filed Under [ Coolness ]

Tuesday, April 08, 2014

T-SQL Tuesday #53-Matt's Making Me Do This!

Hello everyone! It's that time again, time for T-SQL Tuesday, the wonderful blog series started by Adam Machanic (b|t).

This month we are hosted by Matt Velic (b|t) who asks the question, "Why So Serious?", in celebration of April Fool's Day. He asks the contributors for their dirty tricks. And for some reason that escapes me, he and Jeff Verheul (b|t) seem to think I might be able to write about those. Shocked, I am!

Nah, not really. They're absolutely right, this one is gonna be fun!

I took some inspiration from Matt's suggestions, namely Resource Governor and Login Triggers.  I've done some interesting login trigger stuff for a presentation, but nothing yet with Resource Governor. Best way to learn it!

One of my oldest pet peeves is abuse of the sa login. Don't get me wrong, I use it too, but typically only as SQL Agent job owner. It's been a while since I've been stuck with it, but back when I started using SQL Server, EVERY application needed sa to function. It was hard-coded and couldn't be changed. (welllllll, that is if you didn't use a hex editor on the EXE file, but who would do such a thing?)

My standard warning applies: don't run anything on this page in production. In fact, back up whatever server you're testing this on, including the master database. Snapshotting a VM is a good idea. Also make sure you have other sysadmin level logins on that server.

So here's a standard template for a logon trigger to address those pesky sa users (the trigger name is just a placeholder):

CREATE TRIGGER LogonTricks ON ALL SERVER
FOR LOGON
AS
BEGIN
    IF ORIGINAL_LOGIN() = N'sa'
    BEGIN
        -- interesting stuff goes here
    END
END;


What can you do for "interesting stuff"? Books Online limits itself to merely rolling back the logon, which will throw an error (and alert the person that the logon trigger fired).  That's a good use for logon triggers, but really not tricky enough for this blog.  Some of my suggestions are below:

WAITFOR DELAY '23:59:59';



EXEC sp_MSforeach_db 'EXEC sp_detach_db ''?'';'



EXEC msdb.dbo.sp_add_job @job_name=N'`', @enabled=1, @start_step_id=1, @notify_level_eventlog=0, @delete_level=3;
EXEC msdb.dbo.sp_add_jobserver @job_name=N'`', @server_name=@@SERVERNAME;
EXEC msdb.dbo.sp_add_jobstep @job_name=N'`', @step_id=1, @step_name=N'`', @command=N'SHUTDOWN;';
EXEC msdb.dbo.sp_start_job @job_name=N'`';


Really, I don't want to spoil your own exploration, try it yourself!  The thing I really like about these is it lets me promote the idea that "sa is SLOW, sa is BUGGY, don't use sa!".  Before we get into Resource Governor, make sure to drop or disable that logon trigger. They don't work well in combination. (Had to redo all the following code when SSMS locked up)

Resource Governor is a feature that lets you control how many resources a single session can consume. The main goal is to limit the damage from a runaway query. But we're not here to read about its main goal or normal usage! I'm trying to make people stop using sa BECAUSE IT'S SLOW! Here's how RG can do that:

USE master;
GO
CREATE FUNCTION dbo.SA_LOGIN_PRIORITY()
RETURNS sysname
WITH SCHEMABINDING
AS
BEGIN
    RETURN CASE WHEN ORIGINAL_LOGIN()=N'sa' AND APP_NAME() NOT LIKE N'SQL Agent%'
        THEN N'SA_LOGIN_PRIORITY'
        ELSE N'default' END
END;
GO
CREATE RESOURCE POOL SA_LOGIN_PRIORITY
WITH ( MIN_CPU_PERCENT = 0 ,MAX_CPU_PERCENT = 1
      ,MIN_MEMORY_PERCENT = 0 ,MAX_MEMORY_PERCENT = 1
--    ,MIN_IOPS_PER_VOLUME = 1 ,MAX_IOPS_PER_VOLUME = 1  -- uncomment for SQL Server 2014
     );

CREATE WORKLOAD GROUP SA_LOGIN_PRIORITY
WITH ( IMPORTANCE = LOW
      ,REQUEST_MAX_MEMORY_GRANT_PERCENT = 1
      ,MAX_DOP = 1
     ) USING SA_LOGIN_PRIORITY;

ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.SA_LOGIN_PRIORITY);
ALTER RESOURCE GOVERNOR RECONFIGURE;

From top to bottom:

  1. Create a classifier function to determine which pool the session should go to. More info on classifier functions.
  2. Create the pool and provide a generous helping of resources for the sa login.
  3. Create the workload group and further prioritize those resources for the sa login.
  4. Apply the classifier function and reconfigure RG to use it.

I have to say this one is a bit sneakier than the logon trigger, not least because you don't get any error messages.  I heartily recommend testing it in Management Studio, and click around the UI a lot, there's some fun behavior there. And DEFINITELY try it on SQL 2014 with the IO settings included!  You'll notice I made allowances for SQL Agent jobs owned by sa: they'll go into the default workload group.  You can add your own overrides to the classifier function if needed.
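A quick way to verify the classifier is working is to join the standard session and workload group DMVs (the workload group name will be whatever you created above):

```sql
-- Which workload group did each session land in?
SELECT s.session_id, s.login_name, g.name AS workload_group
FROM sys.dm_exec_sessions s
JOIN sys.dm_resource_governor_workload_groups g ON g.group_id = s.group_id
WHERE s.is_user_process = 1;
```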

Some interesting ideas I didn't have time for but expect you to get to before me:

  • Set up different pools/workgroups with different settings and randomize which one the classifier chooses
  • Do the same but base it on time of day (Books Online example covers this)...
  • Or, which workstation it connects from. This can be modified for certain special people in your office who either don't listen, or are attracted (and attractive) to you.

And if things go wrong you can always use the following from another sysadmin or Dedicated Admin connection:

ALTER RESOURCE GOVERNOR DISABLE;

That will let you go in and either fix (or drop) the pools, workgroups and classifier function.

So now that you know these types of things are possible, and if you are tired of your team using sa when they shouldn't, I expect you'll enjoy playing with these quite a bit!

Unfortunately, the aforementioned Dedicated Admin Connection kinda poops on the party here.  Books Online for both topics will tell you that the DAC will not fire either feature. So if you have a crafty user who does their research, they can still sneak in with sa and do their bidding without being hampered.

Of course, you can still detect their login via various methods, like a server trace, SQL Server Audit, extended events, and enabling "Audit Successful Logins" on the server.  These all have their downsides: traces take resources, extended events and SQL Audit can't fire off actions, and enabling successful logins will bloat your error log very quickly.  SQL Audit is also limited unless you have Enterprise Edition, and Resource Governor is Enterprise-only.  And WORST OF ALL, these features are all available and visible through the SSMS UI, so even a doofus developer or manager could find them.

Fortunately there are Event Notifications!

Event notifications are becoming one of my favorite features of SQL Server (keep an eye out for more blogs from me about them). They are practically unknown and heinously underutilized.  They are also a great gateway drug to using Service Broker, another great but underutilized feature. Hopefully this will get you to start using them, or at least your enemies in the office will once they read this, and then you'll have to learn them in order to fix things.

So here's the setup:

USE msdb;
GO
CREATE PROCEDURE dbo.SA_LOGIN_PRIORITY_p
AS
BEGIN
    DECLARE @x XML, @message nvarchar(max);
    RECEIVE TOP(1) @x=CAST(message_body AS XML) FROM SA_LOGIN_PRIORITY_q;
    IF @x.value('(//LoginName)[1]','sysname')=N'sa' AND @x.value('(//ApplicationName)[1]','sysname') NOT LIKE N'SQL Agent%'
    BEGIN
        -- interesting activation procedure stuff goes here
    END
END;
GO
CREATE QUEUE SA_LOGIN_PRIORITY_q
    WITH STATUS=ON,
    ACTIVATION (STATUS=ON, PROCEDURE_NAME=dbo.SA_LOGIN_PRIORITY_p, MAX_QUEUE_READERS=1, EXECUTE AS OWNER);

CREATE SERVICE SA_LOGIN_PRIORITY_s ON QUEUE SA_LOGIN_PRIORITY_q
    ([http://schemas.microsoft.com/SQL/Notifications/PostEventNotification]);

CREATE EVENT NOTIFICATION SA_LOGIN_PRIORITY_en ON SERVER
FOR AUDIT_LOGIN
TO SERVICE N'SA_LOGIN_PRIORITY_s', N'current database'


From top to bottom:

  1. Create activation procedure for event notification queue.
  2. Create queue to accept messages from event notification, and activate the procedure to process those messages when received.
  3. Create service to send messages to that queue.
  4. Create event notification on AUDIT_LOGIN events that fire the service.

I placed this in msdb as it is an available system database and already has Service Broker enabled by default. You should change this to another database if you can guarantee it won't get dropped.
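If the messages don't seem to arrive, a couple of sanity checks against the standard catalog views can help:

```sql
-- The event notification should list our service as its target
SELECT name, service_name FROM sys.server_event_notifications;

-- Undeliverable notifications pile up here; empty is good
SELECT to_service_name, enqueue_time, transmission_status
FROM msdb.sys.transmission_queue;
```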

So what to put in place for "interesting activation procedure code"?  Hmmm, so far I haven't addressed Matt's suggestion of writing a lengthy script to send an annoying message:

SET @message=@x.value('(//HostName)[1]','sysname') + 
N' tried to log in to server ' + @x.value('(//ServerName)[1]','sysname') + 
N' as SA at ' + @x.value('(//StartTime)[1]','sysname') + 
N' using the ' + @x.value('(//ApplicationName)[1]','sysname') + 
N' program. That''s why you''re getting this message and the attached pornography which' + 
N' is bloating your inbox and violating company policy, among other things. If you know' + 
N' this person you can go to their desk and hit them, or use the following SQL to end their session:

KILL ' + @x.value('(//SPID)[1]','sysname') + N'

Hopefully they''re in the middle of a huge query that they need to finish right away.';

EXEC msdb.dbo.sp_send_dbmail @recipients=N'', @subject=N'SA Login Alert', @query_result_width=32767, 
@body=@message, @query=N'EXEC sp_readerrorlog;', @attach_query_result_as_file=1, @query_attachment_filename=N'UtterlyGrossPorn_SeriouslyDontOpenIt.jpg'

I'm not sure I'd call that a lengthy script, but the attachment should get pretty big, and I'm sure the email admins will love storing multiple copies of it.  The nice thing is that this also fires on Dedicated Admin connections! You can even identify DAC connections from the event data returned, I leave that as an exercise for you. You can use that info to change the action taken by the activation procedure, and since it's a stored procedure, it can pretty much do anything!

Except KILL the SPID, or SHUTDOWN the server directly.  I'm still working on those.

posted @ Tuesday, April 08, 2014 5:18 PM | Feedback (1) | Filed Under [ Coolness ]

Tuesday, September 17, 2013

Full-text Indexing Books Online

While preparing for a recent SQL Saturday presentation, I was struck by a crazy idea (shocking, I know): Could someone import the content of SQL Server Books Online into a database and apply full-text indexing to it?  The answer is yes, and it's really quite easy to do.

The first step is finding the installed help files.  If you have SQL Server 2012, BOL is installed under the Microsoft Help Library.  You can find the install location by opening SQL Server Books Online and clicking the gear icon for the Help Library Manager.  When the new window pops up click the Settings link, you'll get the following:

You'll see the path under Library Location.

Once you navigate to that path you'll have to drill down a little further, to C:\ProgramData\Microsoft\HelpLibrary\content\Microsoft\store.  This is where the help file content is kept if you downloaded it for offline use.

Depending on which products you've downloaded help for, you may see a few hundred files.  Fortunately they're named well and you can easily find the "SQL_Server_Denali_Books_Online_" files.  We are interested in the .MSHC files only, and can skip the Installation and Developer Reference files.

Despite the .MSHC extension, these files are compressed with the standard Zip format, so your favorite archive utility (WinZip, 7-Zip, WinRAR, etc.) can open them.  When you do, you'll see a few thousand files in the archive.  We are only interested in the .htm files, but there's no harm in extracting all of them to a folder.  7-Zip provides a command-line utility and the following will extract to a D:\SQLHelp folder previously created:

7z e -oD:\SQLHelp "C:\ProgramData\Microsoft\HelpLibrary\content\Microsoft\store\SQL_Server_Denali_Books_Online_B780_SQL_110_en-us_1.2.mshc" *.htm

Well that's great Rob, but how do I put all those files into a full-text index?

I'll tell you in a second, but first we have to set up a few things on the database side.  I'll be using a database named Explore (you can certainly change that) and the following setup is a fragment of the script I used in my presentation:

USE Explore;
GO
CREATE SCHEMA help AUTHORIZATION dbo;
GO
-- Create default fulltext catalog for later FT indexes
CREATE FULLTEXT CATALOG FTC AS DEFAULT;

CREATE TABLE help.files(file_id int not null IDENTITY(1,1) 
    CONSTRAINT PK_help_files PRIMARY KEY,
    path varchar(256) not null CONSTRAINT UNQ_help_files_path UNIQUE,
    doc_type varchar(6) DEFAULT('.xml'),
    content varbinary(max) not null);

CREATE FULLTEXT INDEX ON help.files(content TYPE COLUMN doc_type LANGUAGE 1033) 
    KEY INDEX PK_help_files;

This will give you a table, default full-text catalog, and full-text index on that table for the content you're going to insert.  I'll be using the command line again for this, it's the easiest method I know:

for %a in (D:\SQLHelp\*.htm) do sqlcmd -S. -E -d Explore -Q"set nocount on;insert help.files(path,content) select '%a', cast(c as varbinary(max)) from openrowset(bulk '%a', SINGLE_CLOB) as c(c)"

You'll need to copy and run that as one line in a command prompt.  I'll explain what this does while you run it and watch several thousand files get imported:

The "for" command allows you to loop over a collection of items.  In this case we want all the .htm files in the D:\SQLHelp folder.  For each file it finds, it will assign the full path and file name to the %a variable.  In the "do" clause, we'll specify another command to be run for each iteration of the loop.  I make a call to "sqlcmd" in order to run a SQL statement.  I pass in the name of the server (-S.), where "." represents the local default instance. I specify -d Explore as the database, and -E for trusted connection.  I then use -Q to run a query that I enclose in double quotes.

The query uses OPENROWSET(BULK…SINGLE_CLOB) to open the file as a data source, and to treat it as a single character large object.  In order for full-text indexing to work properly, I have to convert the text content to varbinary. I then INSERT these contents along with the full path of the file into the help.files table created earlier.  This process continues for each file in the folder, creating one new row in the table.

And that's it! 5 SQL Statements and 2 command line statements to unzip and import SQL Server Books Online!  In case you're wondering why I didn't use FILESTREAM or FILETABLE, it's simply because I haven't learned them…yet. I may return to this blog after I figure that out and update it with the steps to do so.  I believe that will make it even easier.

In the spirit of exploration, I'll leave you to work on some fulltext queries of this content.  I also recommend playing around with the sys.dm_fts_xxxx DMVs (I particularly like sys.dm_fts_index_keywords, it's pretty interesting).  There are additional example queries in the download material for my presentation linked above.
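As a starting point, something like this works against the help.files table above (the search terms are arbitrary):

```sql
-- Pages mentioning both terms
SELECT path FROM help.files
WHERE CONTAINS(content, N'"backup" AND "certificate"');

-- Most common indexed keywords across all the help pages
SELECT TOP(20) display_term, document_count
FROM sys.dm_fts_index_keywords(DB_ID(), OBJECT_ID(N'help.files'))
ORDER BY document_count DESC;
```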

Many thanks to Kevin Boles (t) for his advice on (re)checking the content of the help files.  Don't let that .htm extension fool you! The 2012 help files are actually XML, and you'd need to specify '.xml' in your document type column in order to extract the full-text keywords.  (You probably noticed this in the default definition for the doc_type column.)  You can query sys.fulltext_document_types to get a complete list of the types that can be full-text indexed.

I also need to thank Hilary Cotter for giving me the original idea. I believe he used MSDN content in a full-text index for an article from waaaaaaaaaaay back, which I can't find now and had forgotten about until just a few days ago.  He is also co-author of Pro Full-Text Search in SQL Server 2008, which I highly recommend.  He also has some FTS articles on Simple Talk.

posted @ Tuesday, September 17, 2013 6:33 PM | Feedback (1) | Filed Under [ Coolness ]

Wednesday, June 19, 2013

Criminals and Other Illegal Characters

SQLTeam's favorite Slovenian blogger Mladen (b | t) had an interesting question on Twitter:

I liked Kendal Van Dyke's (b | t) reply:

And he was right!  This is one of those pretty-useless-but-sounds-interesting propositions that I've based all my presentations on, and most of my blog posts.

If you read all the replies you'll see a lot of good suggestions.  I particularly like Aaron Bertrand's (b | t) idea of going into the Unicode character set, since there are over 65,000 characters available.  But how to find an illegal character?  Detective work?

I'm working on the premise that if SQL Server won't accept a character in a name, it will throw an error.  So all we have to do is generate every Unicode character, try to rename a database to that character, and catch any errors.

It turns out that dynamic SQL can lend a hand here:


DECLARE @c int=1, @sql nvarchar(max), @err nvarchar(max);
IF DB_ID(N'a') IS NULL CREATE DATABASE a;
WHILE @c<65536 BEGIN
    SET @sql=N'alter database ' +
        QUOTENAME(CASE WHEN @c=1 THEN N'a' ELSE NCHAR(@c-1) END) +
        N' modify name=' + QUOTENAME(NCHAR(@c));
    RAISERROR(N'*** Trying %d',10,1,@c) WITH NOWAIT;
    BEGIN TRY
        EXEC(@sql);
    END TRY
    BEGIN CATCH
        SET @err=ERROR_MESSAGE();
        RAISERROR(N'Ooops - %d - %s',10,1,@c,@err) WITH NOWAIT;
    END CATCH
    SET @c+=1;
END
-- put the name back to "a" when we're done
SET @sql=N'alter database ' + QUOTENAME(NCHAR(@c-1)) + N' modify name=[a]';
EXEC(@sql);

The script creates a dummy database "a" if it doesn't already exist, and only tests single characters as a database name.  If you have databases with single character names then you shouldn't run this on that server.

It takes a few minutes to run, but if you do you'll see that no errors are thrown for any of the characters.  It seems that SQL Server will accept any character, no matter where it's from.  (Well, there's one it won't, but I won't tell you which. Actually there are two, but one of them requires some deep existential thinking.)

The output is also interesting, as quite a few codes do some weird things there.  I'm pretty sure that's due to the font used in SSMS's messages output window: not all characters are available in it.  If you run the script with the SQLCMD utility, using the -o switch to output to a file and -u for Unicode output, you can open the file in Notepad or another text editor and see the whole thing.

I'm not sure which character I'd recommend to answer Mladen's question.  I think the standard tab (ASCII 9) is fine.  There are also several dedicated separator characters in the original ASCII character set (decimal 28 through 31).

But of all the choices available in Unicode whitespace, I think my favorite would be the Mongolian Vowel Separator.  Or maybe the zero-width space. (that'll be fun to print!)  And since this is Mladen we're talking about, here's a good selection of "intriguing" characters he could use.
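If you want to see (or rather, not see) a few of these candidates for yourself, NCHAR and CHAR will produce them; a quick sketch:

```sql
SELECT CHAR(9)       AS horizontal_tab,
       CHAR(28)      AS file_separator,       -- ASCII 28-31 are the separator range
       NCHAR(0x180E) AS mongolian_vowel_sep,  -- U+180E
       NCHAR(0x200B) AS zero_width_space;     -- U+200B
```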

posted @ Wednesday, June 19, 2013 4:50 PM | Feedback (0) | Filed Under [ Coolness Hotness ]

Tuesday, May 07, 2013

SQL Saturday #220 - Atlanta - Pre-Con Scholarship Winners!

A few weeks ago, AtlantaMDF offered scholarships for each of our upcoming Pre-conference sessions at SQL Saturday #220. We would like to congratulate the winners!

David Thomas: SQL Server Security
Vince Bible: Surfing the Multicore Wave: Processors, Parallelism, and Performance
Mostafa Maged: Languages of BI
Daphne Adams: Practical Self-Service BI with PowerPivot for Excel
Tim Lawrence: The DBA Skills Upgrade Toolkit

Thanks to everyone who applied! And once again we must thank Idera for its generous sponsorship, and Bobby Dimmick (w|t) and Brian Kelley (w|t) of Midlands PASS for the time and effort they put into judging all the applicants.

Don't forget, there's still time to attend the Pre-Cons on May 17, 2013! Click on the EventBrite links for more details and to register!


posted @ Tuesday, May 07, 2013 3:18 PM | Filed Under [ Coolness Hotness ]

Monday, April 22, 2013

SQL Saturday #220 - Atlanta - Pre-Conference Scholarships!


We Want YOU…To Learn!

AtlantaMDF and Idera are teaming up to find a few good people. If you are:

  • A student looking to work in the database or business intelligence fields
  • A database professional who is between jobs or wants a better one
  • A developer looking to step up to something new
  • On a limited budget and can’t afford professional SQL Server training
  • Able to attend training from 9 to 5 on May 17, 2013

AtlantaMDF is presenting 5 Pre-Conference Sessions (pre-cons) for SQL Saturday #220! And thanks to Idera’s sponsorship, we can offer one free ticket to each of these sessions to eligible candidates! That means one scholarship per Pre-Con!

One Recipient Each will Attend:

Denny Cherry: SQL Server Security
Adam Machanic: Surfing the Multicore Wave: Processors, Parallelism, and Performance
Stacia Misner: Languages of BI
Bill Pearson: Practical Self-Service BI with PowerPivot for Excel
Eddie Wuerch: The DBA Skills Upgrade Toolkit

If you are interested in attending these pre-cons send an email by April 30, 2013 to and tell us:

  • Why you are a good candidate to receive this scholarship
  • Which sessions you’d like to attend, and why (list multiple sessions in order of preference)
  • What the session will teach you and how it will help you achieve your goals

The emails will be evaluated by the good folks at Midlands PASS in Columbia, SC. The recipients will be notified by email and announcements made on May 6, 2013.


P.S. - Don't forget that SQLSaturday #220 offers free* training in addition to the pre-cons!
You can find more information about SQL Saturday #220 at
View the scheduled sessions at and register for them at

* Registration charges a $10 fee to cover lunch expenses.

posted @ Monday, April 22, 2013 3:59 PM | Filed Under [ Coolness Hotness ]

Monday, April 01, 2013

Database Mirroring on SQL Server Express Edition

Like most SQL Server users I'm rather frustrated by Microsoft's insistence on making the really cool features only available in Enterprise Edition.  And it really doesn't help that they changed the licensing for SQL 2012 to be core-based, so now it's like 4 times as expensive!  It almost makes you want to go with Oracle.  That, and a desire to have Larry Ellison do things to your orifices.

And since they've introduced Availability Groups and marked database mirroring as deprecated, you'd think they'd make mirroring available in all editions.  Alas…they don't…officially, anyway.  Thanks to my constant poking around in places I'm not "supposed" to, I've discovered the low-level code that implements database mirroring, and found that it's available in all editions!

It turns out that the query processor in all SQL Server editions prepends a simple check before every edition-specific DDL statement:

IF CAST(SERVERPROPERTY('Edition') as nvarchar(max)) NOT LIKE '%e%e%e% Edition%'
    print 'Lame'
ELSE
    print 'Cool'

If that check matches (i.e. you're not on an edition that supports the feature), the statement fails. (The print statements are just placeholders.)  Go ahead and test it on Standard, Workgroup, and Express editions, and compare with an Enterprise or Developer edition instance (which support everything).

Once again thanks to Argenis Fernandez (b | t) and his awesome sessions on using Sysinternals, I was able to watch the exact process SQL Server performs when setting up a mirror.  Surprisingly, it's not actually implemented in SQL Server!  Some of it is, but that's something of a smokescreen, the real meat of it is simple filesystem primitives.

The NTFS filesystem supports links, both hard links and symbolic, so that you can create two entries for the same file in different directories and/or different names.  You can create them using the MKLINK command in a command prompt:

mklink /D D:\SkyDrive\Data D:\Data
mklink /D D:\SkyDrive\Log D:\Log

This creates a symbolic link from my data and log folders to my Skydrive folder.  Any file saved in either location will instantly appear in the other.  And since my Skydrive will be automatically synchronized with the cloud, any changes I make will be copied instantly (depending on my internet bandwidth of course).

So what does this have to do with database mirroring?  Well, it seems that the mirroring endpoint that you have to create between mirror and principal servers is really nothing more than a Skydrive link.  Although it doesn't actually use Skydrive, it performs the same function.  So in effect, the following statement:


Is turned into:

mklink /D "D:\Data" "\\\5022$"

The 5022$ "port" is actually a hidden system directory on the principal and mirror servers. I haven't quite figured out how the log files are included in this, or why you have to SET PARTNER on both principal and mirror servers, except maybe that mklink has to do something special when linking across servers.  I couldn't get the above statement to work correctly, but found that doing mklink to a local Skydrive folder gave me similar functionality.

To wrap this up, all you have to do is the following:

  1. Install Skydrive on both SQL Servers (principal and mirror) and set the local Skydrive folder (D:\SkyDrive in these examples)
  2. On the principal server, run mklink /D on the data and log folders to point to SkyDrive: mklink /D D:\SkyDrive\Data D:\Data
  3. On the mirror server, run the complementary linking: mklink /D D:\Data D:\SkyDrive\Data
  4. Create your database and make sure the files map to the principal data and log folders (D:\Data and D:\Log)
  5. Voila! Your databases are kept in sync on multiple servers!

One wrinkle you will encounter is that the mirror server will show the data and log files, but you won't be able to attach them to the mirror SQL instance while they are attached to the principal. I think this is a bug in Skydrive, but as it turns out that's fine: you can't access a mirror while it's hosted on the principal either.  So you don't quite get automatic failover, but you can attach the files to the mirror if the principal goes offline.  It's also not exactly synchronous, but it's better than nothing, and easier than either replication or log shipping, with a lot less latency.

I will end this with the obvious "not supported by Microsoft" and "Don't do this in production without an updated resume" spiel that you should by now assume with every one of my blog posts, especially considering the date.

posted @ Monday, April 01, 2013 7:00 PM | Feedback (0) | Filed Under [ Coolness Stupidness Oddball ]

Wednesday, March 20, 2013

SQL Saturday #220 Atlanta May 2013!

If you love SQL Server training and are near the Atlanta area, or just love us so much you're willing to travel here, please come join us for:


The main event is Saturday, May 18.  The event is free, with a $10.00 lunch fee.  The main page has more details here:

We are also offering pre-conference sessions on Friday, May 17, by 5 world-renowned presenters:





We have an early bird registration price of $119 until noon EST Friday, March 22.  After that the price goes to $149, a STEAL when you compare it to the PASS Summit price. :)

Please click on the links to register and for more information.  You can also follow the hash tag #SQLSatATL on Twitter for more news about this event.

Can't wait to see you all there!

posted @ Wednesday, March 20, 2013 6:52 PM | Filed Under [ Coolness Hotness ]
