
Database Mirroring in Is It SQL

posted @ 11/10/2016 3:40 PM by Bill Graziano (1 comment)

Release 1.0.22 is out and available at the Is It SQL home page.  The biggest feature in this release is support for database mirroring.

There are two ways to see database mirroring.  The first is on a server’s page, which shows the databases mirrored on that server.  There is also a global database mirroring page that shows all mirrored databases.  An added column named “priority” highlights databases that aren’t connected and synchronized.  It also shows databases that have send or receive queues.  Since the column is sortable, databases with “stuff” going on should always sort to the top.
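
Under the covers, a page like this is presumably driven by sys.database_mirroring.  Here’s a minimal sketch of that sort of query; the column choices are mine, not necessarily what the tool actually runs:

SELECT DB_NAME(database_id) AS DatabaseName,
    mirroring_role_desc,
    mirroring_state_desc
FROM sys.database_mirroring
WHERE mirroring_guid IS NOT NULL;  -- only databases that are actually mirrored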

Please note that you’ll need to sign up for the newsletter for instructions on how to enable the beta features.  You can find the link on the Is It SQL page.

Data that is pulled from servers when the page is refreshed is now identified with a little lightning bolt.  Most data is cached every minute but some data refreshes with each page load.  This helps identify which is which.  The basic rule of thumb is that everything is cached except what has the lightning bolt.

Tagging Servers in Is It SQL

posted @ 9/20/2016 8:40 AM by Bill Graziano (0 comments)

Release 1.0.20 is out.  You can download it from my corporate site at http://www.scalesql.com/isitsql/.  This release adds a beta feature that allows you to tag servers and then filter the list by tag.

Instructions for enabling beta features are available when you sign up for the newsletter.  The newsletter is limited to Is It SQL features and releases.

The current release automatically tags servers based on their version.  This release also tags servers based on their domain.  It allows you to enter user-defined tags for each server.  Common tags I’ve been using include development, test, production, disaster recovery, physical location, virtual vs. physical, application support, reporting, SSAS, SSRS, etc.

When I first started writing this tool and using it, I mostly focused on the key production servers.  While testing this feature I’ve gradually added more servers.  It’s been very helpful for keeping an eye on servers I rarely look at but that seem to have weird issues (cough, dev boxes, cough).  Tagging allows me to filter those out of my main list but still find them when I need them.

IsItSql Shows Databases

posted @ 8/30/2016 10:58 AM by Bill Graziano (0 comments)

You can download the latest build of Is It SQL from my consulting site.  You can now view the databases on a server.  And you can click on a column header to sort that column.

[Screenshot: database page]

And no, my fantasy baseball team in 2015 didn’t do very well.  Unfortunately this year is much the same.   

Here are a few other improvements:

  • The Active Task page for a server no longer displays tasks waiting on BROKER_RECEIVE_WAITFOR.
  • The menu bar stays on top when scrolling down.  I’m really surprised how much more usable this little change makes the whole application.
  • There’s better error reporting if you launch the application and the port is already in use.  I mainly see this when I launch the application and it’s already running as a service.
  • It prioritizes ODBC 13 over ODBC 11.  And it will gracefully fall back to earlier drivers until it finds one it can use.

If you’re running as a service, just stop the service, copy over IsItSql.EXE, and restart the service.  There’s never any installation needed.  And my email is at the bottom of the README if you have any questions or concerns.

IsItSql–1.0.18

posted @ 8/4/2016 4:44 PM by Bill Graziano (0 comments)

This release includes a few minor bug fixes and smaller enhancements.  Thanks to everyone who tried it out and reported back.  At the bottom of the README.html I include my email address if you’d like to provide feedback.  I’d love to hear what you like or what needs work.

One of my goals for this application was to know if a server was unreachable or it returned an error running any of the polling queries.  These servers show up in red in the list on the main page.  They are also displayed at the top of every page in a nice red box.  Previously some pages didn’t display the servers with errors at the top.  One page even displayed them twice.  I guess it really, really wanted you to fix whatever the issue was.

I also excluded sessions running a WAITFOR command from the active sessions that are displayed when you look at the detail for a server.  A number of system SPIDs seem to use it.  And it seems a bit of my code does too.  I’m finding I don’t want to see these sessions more often than I do want to see them.  I’m not sure this is the right answer and I’ll revisit it one of these days.

The database size was a simple sum of sys.master_files, which included snapshots.  That ballooned the reported size if you had a lot of snapshots.  For now I’ve excluded snapshots from the size total.  I’m hoping in the next release to add back the actual size on disk of the snapshot.  That’s what I want to see anyway.
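
For reference, here’s a minimal sketch of a size query that skips snapshots.  This is my guess at the approach, not the tool’s actual code:

SELECT SUM(CAST(mf.size AS bigint)) * 8 / 1024 AS TotalSizeMB  -- size is in 8KB pages
FROM sys.master_files mf
JOIN sys.databases d ON d.database_id = mf.database_id
WHERE d.source_database_id IS NULL;  -- database snapshots have a non-NULL source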

You might not have noticed but the columns in the main page are sortable.  Just click on a header and they’ll sort.  They also remember their sort order across refreshes.  I changed a few of the columns to sort in descending order by default.  So CPU percentage and database size will show the largest values first.  I also need some type of icon indicating that these columns sort and what the order is but that’s going to be a future enhancement.

That’s it for this release.  The install should be as simple as stopping the service, copying over the executable and restarting the service.  If you have any issues, email me at the address at the bottom of the README.  You can download the latest build at www.scalesql.com/downloads/isitsql.1.0.18.zip

Is It SQL?

posted @ 8/2/2016 8:22 AM by Bill Graziano (7 comments)

I’ve been working on a simple monitoring tool for SQL Server over the last few months.  I often find myself asking “What just happened?” and “Was it SQL Server?”.  I wanted a small utility I could leave running at clients that would answer that question.  It needed to be simple enough that a non-SQL Server person could look at it and get an idea if there was an issue with SQL Server.  But also sophisticated enough that it would point me in the right direction.

[Screenshot: main index page]

It will capture the following information for each server every minute:

  • CPU Usage for SQL Server and non-SQL Server tasks
  • Batch requests per second
  • Disk Read and Write volume and Page Life Expectancy
  • The largest waits, grouped together.  For example, the PAGEIO* waits are all grouped together as Disk IO waits (see the sketch after this list).  You can customize this mapping for your environment.
  • It also captures some basic information about the server including the version, restart date and aggregate database size.
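
To illustrate the wait grouping, here’s a minimal sketch against sys.dm_os_wait_stats.  The group names and LIKE patterns are my own examples, not the tool’s actual mapping:

SELECT CASE WHEN wait_type LIKE 'PAGEIOLATCH%' THEN 'Disk IO'
            WHEN wait_type LIKE 'LCK_%' THEN 'Locks'
            ELSE 'Other' END AS WaitGroup,
    SUM(wait_time_ms) AS WaitMs
FROM sys.dm_os_wait_stats
GROUP BY CASE WHEN wait_type LIKE 'PAGEIOLATCH%' THEN 'Disk IO'
            WHEN wait_type LIKE 'LCK_%' THEN 'Locks'
            ELSE 'Other' END
ORDER BY WaitMs DESC;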

[Screenshot: server detail page]

When you bring up a server page it will show the currently running queries and the last 60 minutes of the metrics above.  When someone reports an issue I can look at the last hour for any server with just two clicks.  And quickly answer: “Is It SQL Server?”

I also wanted this to be very simple to use and install.  It can run as a simple command line executable.  It doesn’t require .NET or Java or even SQL Server.  The only requirement is some flavor of ODBC driver for SQL Server which nearly every server has.  It hosts a self-contained web server that displays the information you’re looking for.  It includes an option to install itself as a service so it runs in the background and is always available.  Adding servers is as simple as putting their names into a text file.

All the big servers I monitor already have a fancy monitoring solution.  But I’ve never been happy with the solutions for the second tier of servers or the smaller clients that won’t buy a dedicated monitoring solution.  So this is what I’ve been using. 

You can download this at http://www.scalesql.com/downloads/isitsql.1.0.17.zip.  There’s a README document that will help you get started.

ClearTrace Build 51 Available

posted @ 7/12/2016 5:44 PM by Bill Graziano (0 comments)

I’m pleased to announce that ClearTrace Build 51 is now available for download.  If you’re using a previous build (especially build 50) I encourage you to download this version.

This build should load the latest version of the SMO libraries installed on your machine.  If you’re using SQL Server 2016, you need the SQL Server engine AND SQL Server Management Studio installed.  I believe this is the only way to get the complete SMO library.  This also means that when SQL Server 2036 is released it will automatically support it – and hopefully all the releases between now and then.

This version has a new sort named “CPU & Reads”.  It sorts based on the rank of each statement for CPU and for Reads.  If you choose to display the top 15 statements and choose this sort, you will get the top 7 by CPU and the top 7 by Reads.  Since there will usually be some overlap, it then fills up to 15 using the highest-ranking statements.  Or to put it in terms you might better understand, the ORDER BY looks like this:

CASE
    WHEN [CpuRank] < [ReadsRank] THEN [CpuRank]
    ELSE [ReadsRank]
END , [CpuRank], [ReadsRank]

I’ve found this brings the most interesting statements to the top.

The build also fixes a small parsing bug for statements that look like this: EXEC sp_executesql @statement=N'SELECT 1'.  The combination of EXEC and sp_executesql was making the actual statement disappear.  That’s fixed.

If you find any issues, please post them in the ClearTrace Support Forums.

Thanks and Happy Tracing!

Installing SQL Server 2014 yields Unauthorized Operation

posted @ 12/10/2015 12:32 PM by Bill Graziano (3 comments)

We were experiencing trouble installing SQL Server 2014 onto a Windows 2012 R2 VM.  These are new production boxes so I was being very careful.  The problem manifested itself in two ways:

  • A clean installation using our service account got almost all the way through and then failed with the Unauthorized Operation message.  It was painful to get out of the installer.  It would just hang on a cancel and then eventually exit.
  • Installing using Local System worked but then we couldn’t switch to use our service account.  We received the same error.  This error included a 0x80070005 error code.

I have a number of other SQL Server 2014 installations on Windows 2012 R2 so this surprised me.  The detailed error log included this:

Slp: Sco: Attempting to open service handle for service MSSQLSERVER
Slp: Prompting user if they want to retry this action due to the following failure:
Slp: ----------------------------------------
Slp: The following is an exception stack listing the exceptions in outermost to innermost order
Slp: Inner exceptions are being indented
Slp: 
Slp: Exception type: Microsoft.SqlServer.Configuration.Sco.ScoException
Slp:     Message: 
Slp:         Attempted to perform an unauthorized operation.
Slp:     HResult : 0x84bb0001
Slp:         FacilityCode : 1211 (4bb)
Slp:         ErrorCode : 1 (0001)
Slp:     Data: 
Slp:       DisableRetry = true
Slp:     Inner exception type: System.UnauthorizedAccessException
Slp:         Message: 
Slp:                 Attempted to perform an unauthorized operation.
Slp:         HResult : 0x80070005
Slp:         Stack: 
Slp:                 at Microsoft.SqlServer.Configuration.Sco.Service.StartService(String[] startParams)
Slp: ----------------------------------------
Slp: User has chosen to retry this action


A search through the Internet revealed a number of possibilities.  I tried running the installation as Administrator.  I wasn’t very hopeful about that since the first thing the installer does is pop up the UAC prompt.  Someone also suggested disabling the UAC functionality.  I didn’t try that.  We also copied the installation media off the mounted DVD into a regular directory.  That didn’t help either.

We finally discovered that the team that preps the VMs was now removing both “ABC\Domain Users” and “Authenticated Users” from the local “Users” group.  And that was the problem.  Putting the service account back in that group fixed it.

T-SQL Tuesday #72: Data Modeling

posted @ 11/10/2015 10:45 AM by Most Valuable Yak (Rob Volk) (0 comments)

It's that time again: T-SQL Tuesday! This month the lovely and amazing Mickey Stuewe (b | t) is hosting and asks us to write about the trials and tribulations of data modeling.

Below is a tale that occurs somewhat frequently (in my humble experience, anyway). I’ve changed some of the details so as not to embarrass anyone besides myself. Any resemblance to actual data models, living or dead, is really unfortunate.

(Yes, the excessive code is deliberate)

I really love ice cream!

I decided to open my own ice cream parlor and design the database I'd need to support it:

CREATE TABLE IceCream(IceCreamID int NOT NULL IDENTITY(1,1) PRIMARY KEY, IceCreamName nvarchar(255) NOT NULL);

Naturally we need to support some accessories:

CREATE TABLE IceCreamToppings(IceCreamToppingID int NOT NULL IDENTITY(1,1) PRIMARY KEY, IceCreamToppingName nvarchar(255) NOT NULL);

And of course we need data:

INSERT IceCream(IceCreamName) VALUES ('vanilla'),('chocolate'),('strawberry'),('Neopolitano'),('pistachio'),('mint chocolate chip');
INSERT IceCreamToppings(IceCreamToppingName) VALUES ('chocolate sprinkles'),('candy sprinkles'),('chocolate chips'),
('blueberry sauce'),('whipped cream'),('butterscotch'),('m&ms'),('oreo crumbles');

Business took off into the stratosphere!

My customers also love putting ice cream on breakfast foods, so I offered pancakes and waffles!

CREATE TABLE Pancakes(PancakeID int NOT NULL IDENTITY(1,1) PRIMARY KEY, PancakeName nvarchar(255) NOT NULL);
CREATE TABLE PancakeToppings(PancakeToppingID int NOT NULL IDENTITY(1,1) PRIMARY KEY, PancakeToppingName nvarchar(255) NOT NULL);
CREATE TABLE Waffles(WaffleID int NOT NULL IDENTITY(1,1) PRIMARY KEY, WaffleName nvarchar(255) NOT NULL);
CREATE TABLE WaffleToppings(WaffleToppingID int NOT NULL IDENTITY(1,1) PRIMARY KEY, WaffleToppingName nvarchar(255) NOT NULL);

INSERT Pancakes(PancakeName) VALUES ('Homemade'),('frozen');
INSERT Waffles(WaffleName) VALUES ('Belgian'),('Eggo round'),('Eggo Square');
INSERT PancakeToppings(PancakeToppingName) VALUES ('chocolate chips'),('blueberries'),('blueberry sauce'),('maple syrup'),('apples'),('cinnamon sugar'),('powdered sugar');
INSERT WaffleToppings(WaffleToppingName) VALUES ('chocolate chips'),('blueberries'),('blueberry sauce'),('maple syrup'),('apples'),('cinnamon sugar'),('powdered sugar');

WOW!

They loved this so much I started offering other breakfast options:

CREATE TABLE Omelettes(OmeletteID int NOT NULL IDENTITY(1,1) PRIMARY KEY, OmeletteName nvarchar(255) NOT NULL);
CREATE TABLE OmeletteStuffings(OmeletteStuffingID int NOT NULL IDENTITY(1,1) PRIMARY KEY, OmeletteStuffingName nvarchar(255) NOT NULL);

INSERT Omelettes(OmeletteName) VALUES ('whole eggs'),('egg whites');
INSERT OmeletteStuffings(OmeletteStuffingName) VALUES ('sausage'),('mushrooms'),('green peppers'),('ham'),('spinach'),('feta cheese'),('onion'),('red peppers');

And then lunch offerings soon followed:

CREATE TABLE Pizza(PizzaID int NOT NULL IDENTITY(1,1) PRIMARY KEY, PizzaName nvarchar(255) NOT NULL);
CREATE TABLE PizzaToppings(PizzaToppingID int NOT NULL IDENTITY(1,1) PRIMARY KEY, PizzaToppingName nvarchar(255) NOT NULL);

CREATE TABLE Burgers(BurgerID int NOT NULL IDENTITY(1,1) PRIMARY KEY, BurgerName nvarchar(255) NOT NULL);
CREATE TABLE BurgerToppings(BurgerToppingID int NOT NULL IDENTITY(1,1) PRIMARY KEY, BurgerToppingName nvarchar(255) NOT NULL);

CREATE TABLE Sandwiches(SandwichID int NOT NULL IDENTITY(1,1) PRIMARY KEY, SandwichName nvarchar(255) NOT NULL);
CREATE TABLE SandwichToppings(SandwichToppingID int NOT NULL IDENTITY(1,1) PRIMARY KEY, SandwichToppingName nvarchar(255) NOT NULL);

INSERT Pizza(PizzaName) VALUES ('deep dish'),('sicilian'),('regular');
INSERT PizzaToppings(PizzaToppingName) VALUES ('pepperoni'),('sausage'),('mushrooms'),('green peppers'),('ham'),('spinach'),('feta cheese'),('onion'),('pineapple');
INSERT Burgers(BurgerName) VALUES ('beef'),('kobe'),('angus'),('turkey'),('chicken');
INSERT BurgerToppings(BurgerToppingName) VALUES ('bacon'),('mayo'),('mustard'),('lettuce'),('tomato'),('red onion'),('white onion'),('sauteed onion'),('mushrooms'),('sauteed mushrooms'),('pickles');
INSERT Sandwiches(SandwichName) VALUES ('roast beef'),('ham'),('turkey'),('pastrami'),('baloney');
INSERT SandwichToppings(SandwichToppingName) VALUES ('bacon'),('mayo'),('mustard'),('lettuce'),('tomato'),('red onion'),('white onion'),('sauteed onion'),('mushrooms'),('sauteed mushrooms'),('pickles');

And then, my business went international!

People wanted my food, but didn't speak English, so I offered translations for everything:

CREATE TABLE IceCreamToppingsTranslation(LanguageID int NOT NULL, IceCreamToppingID int NOT NULL, TranslatedText nvarchar(4000) NOT NULL);
CREATE TABLE WaffleToppingsTranslation(LanguageID int NOT NULL, WaffleToppingID int NOT NULL, TranslatedText nvarchar(4000) NOT NULL);
CREATE TABLE PancakeToppingsTranslation(LanguageID int NOT NULL, PancakeToppingID int NOT NULL, TranslatedText nvarchar(4000) NOT NULL);
CREATE TABLE PizzaToppingsTranslation(LanguageID int NOT NULL, PizzaToppingID int NOT NULL, TranslatedText nvarchar(4000) NOT NULL);
CREATE TABLE BurgerToppingsTranslation(LanguageID int NOT NULL, BurgerToppingID int NOT NULL, TranslatedText nvarchar(4000) NOT NULL);
-- repeat ad nauseam

And then...

At some point, I was too busy serving food to design databases anymore, and couldn't deal with the performance issues that were cropping up, so I hired ace database designer Portia N. Carrera, and she promptly replied:

"This design is ridiculous! And it will only get worse as time goes on! No one in their right mind 
would ever do such a thing!" (edited to remove excessive but totally justified profanity) 

Well, they would. They did. **

"Bull----! How did it get like this?" 

Well, that's kinda the thing. I started small, and added stuff. It's easy and natural, and fairly sensible as long as things stay small. Unfortunately, as things grew, I never went back to look at what was there and question whether it was the "right" way or whether there was a better way. It became technical debt, and the effort needed to change it was insurmountable.

"Here's a good take on technical debt:" https://medium.com/@MostlyHarmlessD/on-technical-debt-7bac65edf349 

It describes the reasoning/mentality around these decisions, with much of it focused on time constraints. Portia calls this "5 minute/5 second design" and curses everyone's lack of foresight (when she's in a bad mood). ***

Well Portia, how would you fix it?

"Like this:" 

CREATE TABLE FoodTypes(FoodTypeID int NOT NULL IDENTITY(1,1) PRIMARY KEY, FoodType nvarchar(64) NOT NULL);
INSERT FoodTypes(FoodType) VALUES(N'Ice Cream'),(N'Pancakes'),(N'Waffles'),(N'Omelettes'),(N'Pizza'),(N'Burgers'),(N'Sandwiches');

CREATE TABLE Foods(FoodID int NOT NULL IDENTITY(1,1) PRIMARY KEY, FoodTypeID int NOT NULL, Food nvarchar(64) NOT NULL);
INSERT Foods(FoodTypeID,Food) 
VALUES(1,N'Vanilla'),(1,N'Chocolate'),(1,N'Strawberry') -- Ice cream
,(2,N'Homemade'),(2,N'Frozen') -- Pancakes
,(3,N'Belgian'),(3,N'Eggo Round'),(3,N'Eggo Square') -- Waffles
,(4,N'Whole Eggs'),(4,N'Egg Whites') -- Omelettes
,(5,N'Deep Dish'),(5,N'Sicilian'),(5,N'Regular');  -- Pizza

CREATE TABLE FoodExtras(FoodExtraID int NOT NULL IDENTITY(1,1) PRIMARY KEY, FoodTypeID int NOT NULL, FoodExtraName nvarchar(64) NOT NULL);
INSERT FoodExtras(FoodTypeID,FoodExtraName) 
VALUES(1,N'Chocolate sprinkles'),(1,N'Candy sprinkles'),(1,N'Chocolate chips'),(1,N'Blueberry sauce') -- Ice cream toppings
,(2,N'Chocolate chips'),(2,N'Blueberries'),(2,N'Blueberry sauce'),(2,N'Maple syrup') -- Pancake toppings
,(3,N'Chocolate chips'),(3,N'Blueberries'),(3,N'Blueberry sauce'),(3,N'Maple syrup') -- Waffle toppings
,(4,N'Sausage'),(4,N'Mushrooms'),(4,N'Green Peppers'),(4,N'Ham') -- Omelette stuffings
,(5,N'Pepperoni'),(5,N'Sausage'),(5,N'Mushrooms'),(5,N'Green Peppers'),(5,N'Ham'); -- Pizza toppings

CREATE TABLE FoodExtraTranslations(FoodExtraID int NOT NULL, LanguageID int NOT NULL, TranslatedText nvarchar(4000) NOT NULL);
INSERT FoodExtraTranslations(FoodExtraID,LanguageID,TranslatedText)
VALUES(1,1031,N'German Chocolate Sprinkles'),(1,1036,N'French Chocolate Sprinkles')
,(2,1031,N'German Candy Sprinkles'),(2,1036,N'French Candy Sprinkles')
,(13,1031,N'Sausage auf Deutsch'),(13,1036,N'Sausage au français');

She continued:

"If this looks suspiciously like the dreaded Entity-Attribute-Value model, that's because it is. 
There are definite disadvantages to EAV, but for this particular case it's a reasonable option to support 
this data. The biggest advantage is that no new structure is required; your data becomes row-based rather than 
table-based. If you get a new type of food, you just add a row to the FoodTypes table, then the associated rows 
for extras and their translations. You'll probably wonder why Food and FoodExtras are separate, and they really 
don't have to be. I'll leave that redesign for you to figure out, you're not paying me enough for that." 
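
For instance, adding a new type of food is now just rows, not new tables.  A quick sketch (the IDs assume the insert order above, and hard-coding them is exactly what Portia warns about below):

INSERT FoodTypes(FoodType) VALUES(N'Crepes');  -- becomes FoodTypeID 8
INSERT FoodExtras(FoodTypeID,FoodExtraName) VALUES(8,N'Nutella'),(8,N'Lemon sugar');
INSERT FoodExtraTranslations(FoodExtraID,LanguageID,TranslatedText)
VALUES(22,1031,N'Nutella auf Deutsch');  -- this FoodExtraID value is illustrative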

But I can't rewrite all my application code! They're expecting tables named Waffles, Pancakes, etc. and it's too much work to change it!

"Oh give me a ------- break! Just create views with the same name that only SELECT those rows:" 

CREATE VIEW Pancakes(PancakeID,PancakeName) AS SELECT FoodID, Food FROM Foods WHERE FoodTypeID=2 WITH CHECK OPTION; -- Pancakes
GO
CREATE VIEW Waffles(WaffleID,WaffleName) AS SELECT FoodID, Food FROM Foods WHERE FoodTypeID=3 WITH CHECK OPTION; -- Waffles
GO
CREATE VIEW Pizza(PizzaID,PizzaName) AS SELECT FoodID, Food FROM Foods WHERE FoodTypeID=5 WITH CHECK OPTION; -- Pizza
GO

"And if you're doing things correctly, you're calling a stored procedure to do all your INSERT/UPDATE/DELETE operations, 
those can be modified easily:" 

CREATE PROCEDURE AddFood @foodType int, @foodName nvarchar(64) AS
INSERT Foods(FoodTypeID,Food) VALUES(@foodType,@foodName);
GO

ALTER PROCEDURE AddPizza @pizzaName nvarchar(64) AS
EXEC AddFood 5, @pizzaName;
-- original AddPizza code commented out now...
GO

Portia continued:

"BTW, you shouldn't hard-code your IDs either, but I'll let you fix that. And if you're doing it with ad-hoc SQL, 
or an ORM of some kind, you can still use the views if you add an INSTEAD OF trigger to handle INSERTs and UPDATEs. You're 
not paying me enough to do that either, but it's easy enough to write a code generator to do all that." 
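
A minimal sketch of what such an INSTEAD OF trigger might look like for the Pancakes view.  One caveat: SQL Server won't create an INSTEAD OF trigger on a view defined WITH CHECK OPTION, so the view would need to be created without it first.  And the hard-coded FoodTypeID is exactly the kind of thing Portia just complained about:

CREATE TRIGGER Pancakes_InsteadOfInsert ON Pancakes
INSTEAD OF INSERT AS
-- route INSERTs against the view into the underlying Foods table
INSERT Foods(FoodTypeID,Food)
SELECT 2, PancakeName FROM inserted;  -- FoodTypeID 2 = Pancakes
GO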

Lessons Learned

If you can’t make or take the time to design your tables up front, take time every few months to review your table designs. If you see a lot of similar or identical tables, look to consolidate them. See if a row-based approach can be used. Also make sure to test performance; sometimes it may not improve with a new design.

But more importantly, TAKE MORE THAN 5 MINUTES to consider your database designs! It's too easy to "get it done right now" and leave it to someone else to fix later, if ever. Technical debt compounds faster than credit card debt.

** This is based on a model I've seen in my career, with about 130 "types" of things that all have an associated translation table attached.

*** And if you think that technical debt is unavoidable, or not that bad, consider working at Facebook: http://www.darkcoding.net/software/facebooks-code-quality-problem/

Notes from Speaker Idol 2014

posted @ 10/24/2015 7:08 PM by Most Valuable Yak (Rob Volk) (0 comments)

When last I blogged, I recommended everyone who was eligible to sign up for PASS Summit Speaker Idol 2015. The contestants have been announced: http://www.dcac.co/your-pass-speaker-idol-2015-top-12-are

They also announced the judges in case you want to try the bribery route: http://www.dcac.co/your-pass-speaker-idol-2015-judges

To help this year's participants I will be blogging my notes from when I presented for Speaker Idol 2014, and I'll also add some specifics about my experience in a separate post. I also recommend that you read up on the previous contest and some follow up from a few of the judges:

http://itknowledgeexchange.techtarget.com/sql-server/your-sql-pass-2014-speaker-idol-contestants-are/
http://itknowledgeexchange.techtarget.com/sql-server/pass-speaker-idol-2014-review-and-notes/
http://www.dcac.co/pass-summit-2014-speaker-idol-a-judges-perspective
http://www.datamodel.com/index.php/2014/11/13/pass-summit-2014-speaker-idoljudging-you-part-i/

The notes that follow are transcribed from handwritten notes. I'm listing anonymized comments from the judges, primarily because I haven't gotten their permission, but also to avoid bias on either the judges' or the contestants' side.  In no particular order (in my original notes, red marked items to improve and green marked items done well; my personal comments are in parentheses):
  • Need a story about how (your presentation topic) saved you time or improved your job
  • Mentioned great benefits (of presentation topic) without providing evidence or comparison
  • Too much information for a 5 minute presentation
  • Don't condense a 1 hour presentation to 5 minutes
  • The faster you need to go, the slower you should talk (don't rush through explanation)
  • Voice cut out when turning head away from microphone, turn whole body instead
  • Test demos, make sure they don't fail
  • Use scripts only, DO NOT TYPE CODE (typing is not the same as a demo)
  • No bullets on slide template (plain formatting)
  • Too many bullet points on slides
  • Avoid using periods at end of bullet points (they aren't sentences)
  • Small graphics on slides
  • Good to paste code on slides to avoid needing SSMS, also good font and color for code
  • Great time management with 23 seconds left (should finish between 4:30 and 5:00 minutes)
  • Not sure what topic was about, need to make clear in beginning; not much meat in first 2 minutes
  • Good to engage audience at beginning of 5 minute presentation ("ballsy" according to 1 judge, but not generally recommended)
  • If asking question, be sure to answer/respond, repeat question for audio/audience, and repeat show of hands to audience
  • Make sure images on your slides have proper attribution (author/source and license type), also for quotations/citations (several contestants were missing these)
  • Good recovery from technical problems (projector, mouse/remote, audio, demo issues; 3 speakers were commended for this)
  • Presentation felt like "Books Online"
  • Presented on floor rather than podium
  • ZoomIt use was mostly poor (only 2 speakers had favorable comments, 3 had negative)
  • Never really explained/defined the topic, rushed to demo after long intro, could have explained better within time limit
  • Font and background color on slides hard to read (it was noted that slides were templates and presenter may not be able to change)
  • Allow enough time for audience to read slides
  • Don't repeat slides verbatim
  • Good eye contact and gestures (not too big, not too small), good energy and enthusiasm
  • Make sure to move with purpose, avoid moving without purpose
  • Careful of posture and foot placement, don't rock on feet, don't lean against objects; don't put hands in pockets
  • Presentation agenda was disjointed (should follow logical flow)
  • Too much code on one slide, hard to read
  • Wait for answers to questions (asked audience question and moved on too quickly)
  • Laser pointer discipline, do not move it excessively, use as little as possible (more comments from me in future post)
  • Engage audience before presentation if possible (say hello, ask questions)
  • Inconsistent capitalization on slides
  • Set up Top 5/Countdown but didn't emphasize during presentation
  • Use SmartArt in Powerpoint for arrows and symbols on slides, rather than raster graphics
  • Do not grip lectern, do not hold hands behind back, keep hands loose, use gestures, don't stand in one place
  • Include quotations in quotation marks (make sure to include attribution)
  • Do not wear anything that covers your face; a hat can cast shadows
  • If you do an "About Me" slide, keep it short, include at end and have some contact info (email, Twitter)
  • For a 5 minute presentation, don't have as much breadth, go into more depth on fewer items/topics (e.g. mention 3 utilities rather than 7 utilities)
  • Including cost/benefit analysis was good touch
  • Do not look at slides on projection screen, maintain eye contact with audience (know your slide sequence)
  • Spell out term before using its acronym (TDE, PLE, etc.)
  • Check text visibility especially when using predetermined slide template
  • Do not repeat slides verbatim
  • Be sure of technical details/facts (one slide had incorrect information) and level (including regedit was too deep for 5 mins)
  • Command prompt font and background colors (default B&W not visible)
Final round:
  • Improved timing for several contestants
  • Improved body movement & gestures
  • ZoomIt still needs work for some folks
  • Still some ums/uhs
  • Improved slides (fewer words, fewer bullets)
Here's feedback I received during my first presentation:
  • Ran out of time, had several slides of material I didn't cover
  • Um, uh in speech
  • Stood in one spot
  • Needs more humor
  • Stood in front of PA speaker, caused muddy audio
  • Wore microphone on lanyard, picked up fabric/jingly sounds
  • Looked back at slides on projection screen too often
I don't have notes on my specific feedback during the final round, except that my timing improved dramatically (due to rehearsal). I'll go into more detail on that in my other post.

I hope this helps all the contestants for 2015, and good luck to you all!

PASS Speaker Idol 2015

posted @ 9/28/2015 2:27 PM by Most Valuable Yak (Rob Volk) (0 comments)

In 2014, PASS offered a new type of session at their Global Summit event: Speaker Idol.

Initiated by Denny Cherry (b | t), the concept is:
  • To provide an opportunity for community speakers
  • Who haven't yet presented a regular session at PASS Summit
  • A chance to present to a panel of judges - like American Idol, but without singing. (Sadly. Or not.)
  • For a guaranteed regular session at the next PASS Summit!
This is a FANTASTIC idea, and I was fortunate enough to participate that year. And I encourage EVERYONE who is eligible to participate this year and every year they can.

Why?
  • You get a chance to present at Summit, even if you don't get to the final round
  • You are getting free feedback from PASS Summit speakers
  • Who know exactly how you feel up on the stage
  • Who WANT to see you deliver a full session next year
You can find out more about the 2015 Speaker Idol here:

http://www.dcac.co/pass-summit-2015-speaker-idol-is-back
http://www.dcac.co/want-to-speak-at-the-pass-summit-but-havent-been-selected

There is absolutely no downside.

Even if you don't get chosen, I recommend that you attend as many Speaker Idol sessions as you can. The feedback from the judges will help you become a better speaker, and would cost you hundreds of dollars from a professional speaking coach (and may not be as useful since they're probably not a PASS speaker).

What are you waiting for? You can enter using this form:

https://docs.google.com/forms/d/1fjcI98guXfH5ESlmMZ0CygGtGTVqqRFH-12vJZMP0A4/viewform

STOP READING THIS BLOG AND ENTER NOW!

ClearTrace Build 47 Available

posted @ 7/5/2015 11:06 PM by Bill Graziano (0 comments)

You can download the latest version of ClearTrace from my consulting site: scaleSQL Consulting

I’ve decided to stop using the year as part of the version.  I did that to match SQL Server branding but it doesn’t work for ClearTrace.  The current build supports SQL Server 2008, SQL Server 2008 R2, SQL Server 2012, and SQL Server 2014.  It tries to load the most recent SMO and goes backwards until it successfully loads an SMO library.  Always use the most recent version of ClearTrace and it should support any current SQL Server.

If you’re having issues with ClearTrace you can visit the new ClearTrace Support Forums.  That’s the best place to ask questions or report bugs.

The latest build adds a number of features.  The first is an option that doesn’t reprocess trace files.  I regularly generate large traces that have many files and I’m too impatient to wait for them all to finish.  The script I use generates a unique name for each trace file.  This option allows me to skip the trace files that have already been loaded.

I also fixed a pseudo-memory leak that had been hanging around for a LONG time.  One of my early uses of ClearTrace was for a client that had lots of prepared SQL.  In the trace file I would find lots of sp_execute 32423445 or some such.  That number identifies the prepared SQL statement.  ClearTrace captures the statement that prepares that SQL and reports back the actual SQL executed.  This was all being stored in memory.  Either I wasn’t clearing it properly or I didn’t understand how it got cleared but this structure was ballooning up.  I’d regularly fill up memory to just under 2GB and watch the application crash.  Now the application clears this structure after each trace file.  If you want better handling of prepared SQL just increase the trace file size.  That will capture more of the prepared SQL statements.

Trace Groups allow you to group traces together.  They are especially helpful to view changes over time.  I typically have one trace group per server.  I’ve seen users that have dozens and dozens of trace groups with all kinds of fancy names.  So I made the Trace Group Name drop down wider and hopefully easier to work with.

If you capture statement-level traces and you have lots of nested stored procedures, you’ll see lots of EXEC statements in your list of statements.  If you end up a few layers deep you can see LOTS of EXEC statements.  So I added an entry to the Events drop-down filter to exclude statements that start with EXEC.  All it does is eliminate anything that starts with “EXEC”.  So far it’s been helpful.

I also fixed the history tab to work better with statements.  And by “better” I mean it actually works now.  You should get a statements history without the application blowing up.

(And if you downloaded build 46 please update to build 47.  It’s just minor bug fixes around how statements are displayed in the grid.  Or not displayed as was happening in some situations.)

And that’s it.  Give it a try and let me know any issues in the ClearTrace forums.

SSMS Tools Pack 3.5.0.0 is out with a NEW feature, a few improved features and various bug fixes.

posted @ 5/3/2015 9:04 PM by Mladen Prajdić (4 comments)

The new feature is called Global Replacement Texts.
It gives you the ability to add custom replacement texts to your scripts in four different features.

Licensing now gives you the ability to upgrade your existing license from the previous major version.
This upgrade path is around 30% cheaper than buying a completely new license.

SQL Editor Plus has been extended with 2 new functionalities:
- Rename Window Tab by right clicking on your window's tab.
- Data Destruction Protector now protects against accidental DROP TABLE.

SQL History has a bug fix: it didn't save on every interval when using the Save to Database functionality.

Search Database Data now lets you choose which column types to search, such as all textual or all numeric columns.

Other minor features have been improved:
- Easier distribution of pre-configured add-in settings to new users with Default Settings Deploy.
- Full compatibility with built-in accessibility color schemes for the visually impaired.
- Increased stability.

Hope you enjoy it.

Error: 18456, Severity: 14, State: 11

posted @ 10/8/2014 2:20 PM by Bill Graziano (0 comments)

I’ve seen this error off and on in our environment for a while.  The full error looks like this:

Login failed for user 'DOMAIN\login'. Reason: Token-based server access validation failed with an infrastructure error. Check for previous errors. [CLIENT: X.X.X.X]
Error: 18456, Severity: 14, State: 11.

Aaron Bertrand wrote the post I always start with on troubleshooting 18456 errors.  I’ve always struggled tracking down the root cause of state 11 though.  I’ve seen it intermittently occur and then clean itself up.  One of the reported causes is difficulty connecting to a domain controller, so that kind of fits.

But this time it was occurring and not going away.  People have suggested deleting and recreating the login, running as Administrator, disabling UAC, etc.

In our case, the login had been given permission through a Windows group.  That group had been removed, so the user no longer had permission.  But rather than giving a standard login failure, it gave the state 11 error.  That left me heading down a rabbit hole until xp_logininfo cleared up that they just didn’t have access.  My only guess is that because they’d logged in before, there was some artifact left on the system, and the result was a slightly different login failure error message.
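
If you run into this, xp_logininfo is a quick way to check whether, and through which group, a Windows login actually has access (the login name below is a placeholder):

EXEC xp_logininfo 'DOMAIN\login', 'all';  -- lists the permission path(s), if any

If the account has no access you’ll get an error instead of a permission path, which is an answer in itself.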

I’m hoping this might help the next person tracking down the dreaded state 11.

Refactoring for Performance

posted @ 10/7/2014 11:48 AM by Tara Kizer (5 comments)

For the past few days, I've been working on a stored procedure that was the top offender for CPU usage. I tried various index changes, played around with index hints, etc. Everything I tried either made very little improvement or no improvement.

I was ready to throw in the towel, but then I decided to give the code one last look. I don't have the business knowledge for this particular system, so I had been avoiding the complex stored procedure that has over 200 lines of code. One of the queries has 8 joins, and 2 of those are to views that also contain joins. My head about exploded when I first looked at it.

But then I took a step back and narrowed down where the performance problem was. It was a view that was using SUM, MIN, MAX and ROW_NUMBER functions. It was performing those functions on a large data set, and then the stored procedure filtered it using the input parameters.

My code change was to remove the functions from the view and instead calculate those in the stored procedure.

This is a very simple example:

Original

CREATE VIEW View1
AS 
SELECT 
    Column1, Column2,  -- pass-through columns that Proc1 filters on
    SUM(Column3) OVER(PARTITION BY Column6, Column7) AS SumColumn3, 
    ROW_NUMBER() OVER(PARTITION BY Column6, Column7 ORDER BY Column3) AS ViewRowNumber,  -- ROW_NUMBER requires an ORDER BY
    MIN(Column4) OVER(PARTITION BY Column8) AS MinColumn4, 
    MAX(Column4) OVER(PARTITION BY Column8) AS MaxColumn4 
FROM Table1 
WHERE Column5 = 'SomeValue' AND Column9 = 0

CREATE PROC Proc1 (@var1 varchar(10), @var2 varchar(10)) 
AS 
SELECT SumColumn3, ViewRowNumber, MinColumn4, MaxColumn4
FROM View1
WHERE Column1 = @var1 AND Column2 = @var2

Refactored

CREATE VIEW View1
AS 
SELECT Column1, Column2, Column3, Column4, Column6, Column7, Column8
FROM Table1  
WHERE Column5 = 'SomeValue' AND Column9 = 0

CREATE PROC Proc1 (@var1 varchar(10), @var2 varchar(10)) 
AS
SELECT 
    SUM(Column3) OVER(PARTITION BY Column6, Column7) AS SumColumn3, 
    ROW_NUMBER() OVER(PARTITION BY Column6, Column7 ORDER BY Column3) AS ViewRowNumber,
    MIN(Column4) OVER(PARTITION BY Column8) AS MinColumn4, 
    MAX(Column4) OVER(PARTITION BY Column8) AS MaxColumn4
FROM View1
WHERE Column1 = @var1 AND Column2 = @var2

Here are my actual results:

             Duration (ms)   Reads     CPU
Original     1009            430127    2730
Refactored   24              523       16


BOOM!

SQL Saturday 300 BBQ Crawl

posted @ 7/14/2014 6:52 PM by Bill Graziano (1 comment)

SQL Saturday #300 is coming up right here in Kansas City on September 13th, 2014.  This is our fifth SQL Saturday which means it's the fifth anniversary of our now infamous BBQ Crawl.  We get together on Friday afternoon before the event and visit a few local joints.  We've done nice places and we've done dives.  We haven’t picked the venues yet but I promise you’ll be well fed!

And if you’re thinking about the BBQ crawl you should think about submitting a session.  Our call for speakers closes Tuesday, July 15th so you just have time! 

If you’re going to be at the event, contact me and I’ll get you added to the list.

SSMS Tools Pack 3.0 is out. Full SSMS 2014 support and improved features.

posted @ 5/20/2014 12:04 AM by Mladen Prajdić (6 comments)

With version 3.0, SSMS 2014 is fully supported.
Since this is a new major version you'll eventually need a new license. Please check the EULA to see when.

As a thank you for your patience with this release, everyone who bought
the SSMS Tools Pack after April 1st, the release date of SQL Server 2014, will receive a free upgrade.
You won't have to do anything for this to take effect.

The first thing you'll notice is that the UI has been completely changed.
It's more in line with SSMS and looks less web-like.
The core has also been updated and rewritten in some places to be better suited for future features.

Major improvements for this release are:

Window Connection Coloring
Something a lot of people have asked me for over the last 2 years is a way
to color the tab of the window itself. I'm very glad to say that now there is.
In SSMS 2012 and higher, the actual query window tab is also colored at the top border
with the same color as the already existing strip, making it much easier to see which
server your query window is connected to, even when the window is not focused.
To make it even better, you can now also specify the desired color based on the database name
and not just the server name. This makes it useful for production environments where you
need to be careful which database you run your queries in.

Format SQL
The Format SQL core was rewritten so it'll be easier to improve in future versions.
A new improvement is the ability to terminate SQL statements with semicolons.
This is available only in SSMS 2012 and up.

Execution Plan Analyzer
A big request was to implement the Problems and Solutions tooltip as a window
that you can copy the text from. This is now available. You can move the window around
and copy text from it. It's a small improvement but better stuff will come.

SQL History
Current Window History has been improved with faster search, and it now also shows
the color of the server/database a query was run against. This is very helpful if you change
your connection in the same query window, making it clear which server/database you ran a query on.
The option to Force Save the history has been added. This is a menu item that flushes
the execution and tab content history save buffers to disk.

SQL Snippets
Added an option to the right-click menu to generate a snippet from the selected SQL text.

Run script on multiple databases
Configurable database groups that you can save and reuse have been added.
You can create groups of preselected databases to choose from for each server.
This makes repetitive tasks much easier.

New small team licensing option
A lot of requests came in for a 1 computer, unlimited VMs option, so now it's here.
Hope it serves you well.

T-SQL Tuesday #53-Matt's Making Me Do This!

posted @ 4/8/2014 5:17 PM by Most Valuable Yak (Rob Volk) (1 comment)

Hello everyone! It's that time again, time for T-SQL Tuesday, the wonderful blog series started by Adam Machanic (b|t).

This month we are hosted by Matt Velic (b|t) who asks the question, "Why So Serious?", in celebration of April Fool's Day. He asks the contributors for their dirty tricks. And for some reason that escapes me, he and Jeff Verheul (b|t) seem to think I might be able to write about those. Shocked, I am!

Nah, not really. They're absolutely right, this one is gonna be fun!

I took some inspiration from Matt's suggestions, namely Resource Governor and Login Triggers.  I've done some interesting login trigger stuff for a presentation, but nothing yet with Resource Governor. Best way to learn it!

One of my oldest pet peeves is abuse of the sa login. Don't get me wrong, I use it too, but typically only as SQL Agent job owner. It's been a while since I've been stuck with it, but back when I started using SQL Server, EVERY application needed sa to function. It was hard-coded and couldn't be changed. (welllllll, that is if you didn't use a hex editor on the EXE file, but who would do such a thing?)

My standard warning applies: don't run anything on this page in production. In fact, back up whatever server you're testing this on, including the master database. Snapshotting a VM is a good idea. Also make sure you have other sysadmin level logins on that server.

So here's a standard template for a logon trigger to address those pesky sa users:

CREATE TRIGGER SA_LOGIN_PRIORITY ON ALL SERVER 
WITH ENCRYPTION, EXECUTE AS N'sa'
AFTER LOGON AS
IF ORIGINAL_LOGIN()<>N'sa' OR APP_NAME() LIKE N'SQL Agent%' RETURN;

-- interesting stuff goes here

GO


What can you do for "interesting stuff"? Books Online limits itself to merely rolling back the logon, which will throw an error (and alert the person that the logon trigger fired).  That's a good use for logon triggers, but really not tricky enough for this blog.  Some of my suggestions are below:

WAITFOR DELAY '23:59:59';


Or:

EXEC sp_MSforeachdb 'EXEC sp_detach_db ''?'';'


Or:

EXEC msdb.dbo.sp_add_job @job_name=N'`', @enabled=1, @start_step_id=1, @notify_level_eventlog=0, @delete_level=3;
EXEC msdb.dbo.sp_add_jobserver @job_name=N'`', @server_name=@@SERVERNAME;
EXEC msdb.dbo.sp_add_jobstep @job_name=N'`', @step_id=1, @step_name=N'`', @command=N'SHUTDOWN;';
EXEC msdb.dbo.sp_start_job @job_name=N'`';


Really, I don't want to spoil your own exploration, so try it yourself!  The thing I really like about these is that they let me promote the idea that "sa is SLOW, sa is BUGGY, don't use sa!".  Before we get into Resource Governor, make sure to drop or disable that logon trigger; they don't work well in combination. (I had to redo all the following code when SSMS locked up.)
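
Dropping or disabling the trigger looks like this (note the ON ALL SERVER clause, since it's a server-level trigger):

DISABLE TRIGGER SA_LOGIN_PRIORITY ON ALL SERVER;
-- or remove it entirely:
DROP TRIGGER SA_LOGIN_PRIORITY ON ALL SERVER;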

Resource Governor is a feature that lets you control how many resources a single session can consume. The main goal is to limit the damage from a runaway query. But we're not here to read about its main goal or normal usage! I'm trying to make people stop using sa BECAUSE IT'S SLOW! Here's how RG can do that:

USE master;
GO

CREATE FUNCTION dbo.SA_LOGIN_PRIORITY()
RETURNS sysname
WITH SCHEMABINDING, ENCRYPTION AS 
BEGIN
RETURN CASE 
    WHEN ORIGINAL_LOGIN()=N'sa' AND APP_NAME() NOT LIKE N'SQL Agent%'
    THEN N'SA_LOGIN_PRIORITY'
    ELSE N'default' END
END
GO

CREATE RESOURCE POOL SA_LOGIN_PRIORITY
WITH (
     MIN_CPU_PERCENT = 0 ,MAX_CPU_PERCENT = 1 
    ,CAP_CPU_PERCENT = 1 ,AFFINITY SCHEDULER = (0)
    ,MIN_MEMORY_PERCENT = 0 ,MAX_MEMORY_PERCENT = 1
--  ,MIN_IOPS_PER_VOLUME = 1 ,MAX_IOPS_PER_VOLUME = 1  -- uncomment for SQL Server 2014
);

CREATE WORKLOAD GROUP SA_LOGIN_PRIORITY
WITH
    ( IMPORTANCE = LOW
      ,REQUEST_MAX_MEMORY_GRANT_PERCENT = 1
      ,REQUEST_MAX_CPU_TIME_SEC = 1
      ,REQUEST_MEMORY_GRANT_TIMEOUT_SEC = 1
      ,MAX_DOP = 1
      ,GROUP_MAX_REQUESTS = 1 )
USING SA_LOGIN_PRIORITY;

ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION=dbo.SA_LOGIN_PRIORITY);
ALTER RESOURCE GOVERNOR RECONFIGURE;


From top to bottom:

  1. Create a classifier function to determine which pool the session should go to. More info on classifier functions.
  2. Create the pool and provide a generous helping of resources for the sa login.
  3. Create the workload group and further prioritize those resources for the sa login.
  4. Apply the classifier function and reconfigure RG to use it.

I have to say this one is a bit sneakier than the logon trigger, not least because you don't get any error messages.  I heartily recommend testing it in Management Studio, and click around the UI a lot; there's some fun behavior there. And DEFINITELY try it on SQL 2014 with the IO settings included!  You'll notice I made allowances for SQL Agent jobs owned by sa; they'll go into the default workload group.  You can add your own overrides to the classifier function if needed.

Some interesting ideas I didn't have time for but expect you to get to before me:

  • Set up different pools/workgroups with different settings and randomize which one the classifier chooses
  • Do the same but base it on time of day (Books Online example covers this)...
  • Or, which workstation it connects from. This can be modified for certain special people in your office who either don't listen, or are attracted (and attractive) to you.

And if things go wrong you can always use the following from another sysadmin or Dedicated Admin connection:

ALTER RESOURCE GOVERNOR DISABLE;


That will let you go in and either fix (or drop) the pools, workgroups and classifier function.
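
A full teardown might look something like this sketch (assuming no sessions are still classified into the workload group):

ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = NULL);
ALTER RESOURCE GOVERNOR RECONFIGURE;
DROP WORKLOAD GROUP SA_LOGIN_PRIORITY;
DROP RESOURCE POOL SA_LOGIN_PRIORITY;
DROP FUNCTION dbo.SA_LOGIN_PRIORITY;
ALTER RESOURCE GOVERNOR RECONFIGURE;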

So now that you know these types of things are possible, and if you are tired of your team using sa when they shouldn't, I expect you'll enjoy playing with these quite a bit!

Unfortunately, the aforementioned Dedicated Admin Connection kinda poops on the party here.  Books Online for both topics will tell you that the DAC will not fire either feature. So if you have a crafty user who does their research, they can still sneak in with sa and do their bidding without being hampered.

Of course, you can still detect their login via various methods, like a server trace, SQL Server Audit, extended events, and enabling "Audit Successful Logins" on the server.  These all have their downsides: traces take resources, extended events and SQL Audit can't fire off actions, and enabling successful logins will bloat your error log very quickly.  SQL Audit is also limited unless you have Enterprise Edition, and Resource Governor is Enterprise-only.  And WORST OF ALL, these features are all available and visible through the SSMS UI, so even a doofus developer or manager could find them.

Fortunately there are Event Notifications!

Event notifications are becoming one of my favorite features of SQL Server (keep an eye out for more blogs from me about them). They are practically unknown and heinously underutilized.  They are also a great gateway drug to using Service Broker, another great but underutilized feature. Hopefully this will get you to start using them, or at least your enemies in the office will once they read this, and then you'll have to learn them in order to fix things.

So here's the setup:

USE msdb;
GO

CREATE PROCEDURE dbo.SA_LOGIN_PRIORITY_act
WITH ENCRYPTION AS
DECLARE @x XML, @message nvarchar(max);
RECEIVE @x=CAST(message_body AS XML) FROM SA_LOGIN_PRIORITY_q;
IF @x.value('(//LoginName)[1]','sysname')=N'sa' AND @x.value('(//ApplicationName)[1]','sysname') NOT LIKE N'SQL Agent%'
BEGIN
    -- interesting activation procedure stuff goes here
END
GO

CREATE QUEUE SA_LOGIN_PRIORITY_q 
    WITH STATUS=ON, RETENTION=OFF,
    ACTIVATION (PROCEDURE_NAME=dbo.SA_LOGIN_PRIORITY_act, MAX_QUEUE_READERS=1, EXECUTE AS OWNER);

CREATE SERVICE SA_LOGIN_PRIORITY_s ON QUEUE SA_LOGIN_PRIORITY_q([http://schemas.microsoft.com/SQL/Notifications/PostEventNotification]);

CREATE EVENT NOTIFICATION SA_LOGIN_PRIORITY_en ON SERVER WITH FAN_IN
FOR AUDIT_LOGIN
TO SERVICE N'SA_LOGIN_PRIORITY_s', N'current database'
GO


From top to bottom:

  1. Create activation procedure for event notification queue.
  2. Create queue to accept messages from event notification, and activate the procedure to process those messages when received.
  3. Create service to send messages to that queue.
  4. Create event notification on AUDIT_LOGIN events that fire the service.

I placed this in msdb as it is an available system database and already has Service Broker enabled by default. You should change this to another database if you can guarantee it won't get dropped.

So what to put in place for "interesting activation procedure code"?  Hmmm, so far I haven't addressed Matt's suggestion of writing a lengthy script to send an annoying message:

SET @message=@x.value('(//HostName)[1]','sysname') + 
N' tried to log in to server ' + @x.value('(//ServerName)[1]','sysname') + 
N' as SA at ' + @x.value('(//StartTime)[1]','sysname') + 
N' using the ' + @x.value('(//ApplicationName)[1]','sysname') + 
N' program. That''s why you''re getting this message and the attached pornography which' + 
N' is bloating your inbox and violating company policy, among other things. If you know' + 
N' this person you can go to their desk and hit them, or use the following SQL to end their session:

KILL ' + @x.value('(//SPID)[1]','sysname') +   
N';

Hopefully they''re in the middle of a huge query that they need to finish right away.'

EXEC msdb.dbo.sp_send_dbmail @recipients=N'CompanyWideDistroList@yourcompany.com', @subject=N'SA Login Alert', @query_result_width=32767, 
@body=@message, @query=N'EXEC sp_readerrorlog;', @attach_query_result_as_file=1, @query_attachment_filename=N'UtterlyGrossPorn_SeriouslyDontOpenIt.jpg'

I'm not sure I'd call that a lengthy script, but the attachment should get pretty big, and I'm sure the email admins will love storing multiple copies of it.  The nice thing is that this also fires on Dedicated Admin connections! You can even identify DAC connections from the event data returned, I leave that as an exercise for you. You can use that info to change the action taken by the activation procedure, and since it's a stored procedure, it can pretty much do anything!

Except KILL the SPID, or SHUTDOWN the server directly.  I'm still working on those.

Simple Merging Of PDF Documents with iTextSharp 5.4.5.0

posted @ 1/10/2014 7:38 PM by Mladen Prajdić (1 comment)

As we were working on our first SQL Saturday in Slovenia, we came to a point where we had to print out the so-called SpeedPASSes for attendees. The SpeedPASS file is a PDF and contains each attendee's raffle, lunch and admission tickets.

The problem is we have to download one PDF per attendee and print that out. And printing more than 10 docs at once is a pain. So I decided to make a little console app that merges multiple PDF files into a single file that is much easier to print. I used an open source PDF manipulation library called iTextSharp, version 5.4.5.0.

This is the console program I used. It’s brilliantly named MergeSpeedPASS. It only has two methods and is really short. Don't let the name fool you: it can be used to merge any PDF files.

The first parameter is the name of the target PDF file that will be created.
The second parameter is the directory containing PDF files to be merged into a single file.

using iTextSharp.text;
using iTextSharp.text.pdf;
using System;
using System.IO;

namespace MergeSpeedPASS
{
    class Program
    {
        static void Main(string[] args)
        {
            if (args.Length == 0 || args[0] == "-h" || args[0] == "/h")
            {
                Console.WriteLine("Welcome to MergeSpeedPASS. Created by Mladen Prajdic. Uses iTextSharp 5.4.5.0.");
                Console.WriteLine("Tool to create a single SpeedPASS PDF from all downloaded generated PDFs.");
                Console.WriteLine("");
                Console.WriteLine("Example: MergeSpeedPASS.exe targetFileName sourceDir");
                Console.WriteLine("         targetFileName = name of the new merged PDF file. Must include .pdf extension.");
                Console.WriteLine("         sourceDir      = path to the dir containing downloaded attendee SpeedPASS PDFs");
                Console.WriteLine("");
                Console.WriteLine(@"Example: MergeSpeedPASS.exe MergedSpeedPASS.pdf d:\Downloads\SQLSaturdaySpeedPASSFiles");
            }
            else if (args.Length == 2)
                CreateMergedPDF(args[0], args[1]);

            Console.WriteLine("");
            Console.WriteLine("Press any key to exit...");
            Console.Read();
        }

        static void CreateMergedPDF(string targetPDF, string sourceDir)
        {
            using (FileStream stream = new FileStream(targetPDF, FileMode.Create))
            {
                Document pdfDoc = new Document(PageSize.A4);
                PdfCopy pdf = new PdfCopy(pdfDoc, stream);
                pdfDoc.Open();                
                var files = Directory.GetFiles(sourceDir);
                Console.WriteLine("Merging files count: " + files.Length);
                int i = 1;
                foreach (string file in files)
                {
                    Console.WriteLine(i + ". Adding: " + file);
                    pdf.AddDocument(new PdfReader(file));
                    i++;
                }

                if (pdfDoc != null)
                    pdfDoc.Close();

                Console.WriteLine("SpeedPASS PDF merge complete.");
            }
        }
    }
}


Hope it helps you, and have fun.

PASS Summit 2013 Review

posted @ 10/20/2013 8:21 PM by Ajarn Mark Caldwell (0 comments)

As a long-standing member of PASS who lives in the greater Seattle area and has attended about nine of these Summits, let me start out by saying how GREAT it was to go to Charlotte, North Carolina this year.  Many of the new folks I met at the Summit this year, upon hearing that I was from Seattle, commented that I must have been disappointed to have to travel to the Summit after five years in a row in Seattle.  Nothing could be further from the truth.  I cheered loudly when I first heard that the 2013 Summit would be outside Seattle.  I have many fond memories of trips to Orlando, Florida and Grapevine, Texas for past Summits (I missed out on Denver, unfortunately).  And there is a funny dynamic that takes place when the conference is local.  If you do as I have done the last several years and save your company money by skipping the hotel and commuting from home, then both family and coworkers tend to act like you’re just on a normal schedule.  For example, I have a young family, and my wife and kids really wanted to still see me come home “after work”, but there are a whole lot of after-hours activities, social events, and great food to be enjoyed at the Summit each year.  Even more so if you really capitalize on the opportunities to meet face-to-face with people you met at previous Summits or have spoken to or heard of through Twitter, blogs, and forums.  Then there is also the lovely commute through Seattle traffic from neighboring cities, rather than the convenience of just walking across the street from your hotel.  So I’m just saying: there are really nice aspects to having the conference 2,500 miles away.

Beyond that, the training was fantastic as usual.  The SQL Server community has many outstanding presenters and experts with deep knowledge of the tools who are extremely willing to share all of that with anyone who wants to listen.  The opening video with PASS President Bill Graziano in a NASCAR race turned dream sequence was very well done, and the keynotes, as usual, were great.  This year I was particularly impressed with how well attended the Professional Development sessions were.  Not too many years ago those were very sparsely attended, but this year the two that I attended were standing-room only, and these were not tiny rooms.  I would say this is a testament both to the maturity of the attendees, who realize how important these topics are to career success, and to the ever-increasing skills of the presenters and the program committee’s knack for selecting speakers and topics that resonate with people.  If, as is usually the case, you were not able to get to every session you wanted because there were just too darn many good ones, I encourage you to get the recordings.

Overall, it was a great time, as these events always are.  It was wonderful to see old friends and make new ones, and the people of Charlotte did an awesome job hosting the event and letting their hospitality shine (extra kudos to SQLSentry for all they did with the shuttle, maps, and other event sponsorships).  We’re back in Seattle next year (it is a release year, after all), but given the success of this year’s event, I strongly encourage the Board and PASS HQ to firmly reestablish the location rotation schedule.  I’ll even go so far as to suggest standardizing on an alternating Seattle – Charlotte schedule, or something like that.

If you missed the Summit this year, start saving now, and register early, so you can join us!

It’s A Team Sport: PASS Board Year 2, Q3

posted @ 10/13/2013 9:23 PM by Denise McInerney (0 comments)

As I type this I’m on an airplane en route to my 12th PASS Summit. It’s been a very busy 3.5 months since my last post on my work as a Board member. Nearing the end of my 2-year term I am struck by how much has happened, and yet how fast the time has gone.

But I’ll save the retrospective post for next time and today focus on what happened in Q3. In the last three months we made progress on several fronts, thanks to the contributions of many volunteers and HQ staff members. They deserve our appreciation for their dedication to delivering for the membership week after week.

Virtual Chapters

The Virtual Chapters continue to provide many PASS members with valuable free training.  Between July and September of 2013, the VCs hosted over 50 webinars with a total of 4,300 attendees.

This quarter also saw the launch of the Security and Global Russian VCs.  Both are off to a strong start, and I welcome these additions to the Virtual Chapter portfolio.

At the beginning of 2012 we had 14 Virtual Chapters. Today we have 22. This growth has been exciting to see. It has also created a need to have more volunteers help manage the work of the VCs year-round. We have renewed focus on having Virtual Chapter Mentors work with the VC Leaders and other volunteers. I am grateful to volunteers Julie Koesmarno, Thomas LeBlanc and Marcus Bittencourt who join original VC Mentor Steve Simon on this team. Thank you for stepping up to help.

Many improvements to the VC web sites have been rolling out over the past few weeks.  Our marketing and IT teams have been busy working on a new look-and-feel, new features, and a logo for each VC.  They have given the VCs a fresh, professional look consistent with the rest of the PASS branding, and all VCs now have a logo that connects to PASS and to the particular focus of the chapter.

24 Hours of PASS

The Summit Preview edition of 24HOP was held on July 31 and by all accounts was a success.  Our first use of the GoToWebinar platform for this event went extremely well.  Thanks to our speakers, moderators, and sponsors for making this event possible.  Special thanks to HQ staffers Vicki Van Damme and Jane Duffy for a smoothly run event.

Coming up: the 24HOP Portuguese Edition will be held November 13-14, followed by the Spanish Edition on December 12-13.  Thanks to the Portuguese- and Spanish-speaking community volunteers who are organizing these events.

July Board Meeting

The Board met July 18-19 in Kansas City.  The first order of business was the election of the Executive Committee members who will take office January 1.  I was elected Vice President of Marketing and will join incoming President Thomas LaRock, incoming Executive Vice President of Finance Adam Jorgensen, and Immediate Past President Bill Graziano on the Exec Co.  I am honored that my fellow Board members elected me to this position and look forward to serving the organization in this role.

Visit to PASS HQ

In late September I traveled to Vancouver for my first visit to PASS HQ, where I joined Tom LaRock and Adam Jorgensen to make plans for 2014.  Our visit was just a few weeks before PASS Summit and coincided with the Board election, and the office was humming with activity. I saw first-hand the enthusiasm and dedication of everyone there. In each interaction I observed a focus on what is best for PASS and our members. Our partners at HQ are key to the organization’s success. This week at PASS Summit is a great opportunity for all of us to remember that, and say “thanks.”

Next Up

PASS Summit—of course!  I’ll be around all week and look forward to connecting with many of our members over meals, at the Community Zone, and between sessions.

In the evenings you can find me at the Welcome Reception, Exhibitor’s Reception, and Community Appreciation Party.  And I will be at the Board Q&A session Friday at 12:45 p.m.

Transitions

The newly elected Exec Co and Board members take office January 1, and the Virtual Chapter portfolio is transitioning to a new director.  I’m thrilled that Jen Stirrup will be taking over.  Jen has experience as a volunteer and co-leader of the Business Intelligence Virtual Chapter and was a key contributor to the BI VC’s expansion to serve our members in the EMEA region.  I’ll be working closely with Jen over the next couple of months to ensure a smooth transition.