
T-SQL Tuesday #53-Matt's Making Me Do This!

posted @ 4/8/2014 5:17 PM by Most Valuable Yak (Rob Volk) (1 comment)

Hello everyone! It's that time again, time for T-SQL Tuesday, the wonderful blog series started by Adam Machanic (b|t).

This month we are hosted by Matt Velic (b|t) who asks the question, "Why So Serious?", in celebration of April Fool's Day. He asks the contributors for their dirty tricks. And for some reason that escapes me, he and Jeff Verheul (b|t) seem to think I might be able to write about those. Shocked, I am!

Nah, not really. They're absolutely right, this one is gonna be fun!

I took some inspiration from Matt's suggestions, namely Resource Governor and Login Triggers.  I've done some interesting login trigger stuff for a presentation, but nothing yet with Resource Governor. Best way to learn it!

One of my oldest pet peeves is abuse of the sa login. Don't get me wrong, I use it too, but typically only as SQL Agent job owner. It's been a while since I've been stuck with it, but back when I started using SQL Server, EVERY application needed sa to function. It was hard-coded and couldn't be changed. (welllllll, that is if you didn't use a hex editor on the EXE file, but who would do such a thing?)

My standard warning applies: don't run anything on this page in production. In fact, back up whatever server you're testing this on, including the master database. Snapshotting a VM is a good idea. Also make sure you have other sysadmin level logins on that server.
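If you want to double-check that last point before you start, a quick query against the server catalog views (just a sketch) will list the sysadmin members other than sa:

-- List sysadmin members other than sa, so you know you have another way in
SELECT p.name, p.type_desc
FROM sys.server_role_members AS rm
JOIN sys.server_principals AS r ON r.principal_id = rm.role_principal_id
JOIN sys.server_principals AS p ON p.principal_id = rm.member_principal_id
WHERE r.name = N'sysadmin'
AND p.name <> N'sa';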

So here's a standard template for a logon trigger to address those pesky sa users:

CREATE TRIGGER SA_LOGIN_PRIORITY ON ALL SERVER 
WITH ENCRYPTION, EXECUTE AS N'sa'
AFTER LOGON AS
IF ORIGINAL_LOGIN()<>N'sa' OR APP_NAME() LIKE N'SQL Agent%' RETURN;

-- interesting stuff goes here

GO

 

What can you do for "interesting stuff"? Books Online limits itself to merely rolling back the logon, which will throw an error (and alert the person that the logon trigger fired).  That's a good use for logon triggers, but really not tricky enough for this blog.  Some of my suggestions are below:

WAITFOR DELAY '23:59:59';

 

Or:

EXEC sp_MSforeach_db 'EXEC sp_detach_db ''?'';'

 

Or:

EXEC msdb.dbo.sp_add_job @job_name=N'`', @enabled=1, @start_step_id=1, @notify_level_eventlog=0, @delete_level=3;
EXEC msdb.dbo.sp_add_jobserver @job_name=N'`', @server_name=@@SERVERNAME;
EXEC msdb.dbo.sp_add_jobstep @job_name=N'`', @step_id=1, @step_name=N'`', @command=N'SHUTDOWN;';
EXEC msdb.dbo.sp_start_job @job_name=N'`';

 

Really, I don't want to spoil your own exploration; try these yourself!  The thing I really like about them is that they let me promote the idea that "sa is SLOW, sa is BUGGY, don't use sa!".  Before we get into Resource Governor, make sure to drop or disable that logon trigger; the two don't work well in combination. (I had to redo all of the following code when SSMS locked up.)
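For reference, disabling (or dropping) the trigger looks like this; run it from one of those other sysadmin logins:

-- Disable the logon trigger before playing with Resource Governor
DISABLE TRIGGER SA_LOGIN_PRIORITY ON ALL SERVER;
-- Or remove it entirely:
-- DROP TRIGGER SA_LOGIN_PRIORITY ON ALL SERVER;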

Resource Governor is a feature that lets you control how much of the server's resources a single session can consume. The main goal is to limit the damage from a runaway query. But we're not here to read about its main goal or normal usage! I'm trying to make people stop using sa BECAUSE IT'S SLOW! Here's how RG can do that:

USE master;
GO

CREATE FUNCTION dbo.SA_LOGIN_PRIORITY()
RETURNS sysname
WITH SCHEMABINDING, ENCRYPTION AS 
BEGIN
RETURN CASE 
    WHEN ORIGINAL_LOGIN()=N'sa' AND APP_NAME() NOT LIKE N'SQL Agent%'
    THEN N'SA_LOGIN_PRIORITY'
    ELSE N'default' END
END
GO

CREATE RESOURCE POOL SA_LOGIN_PRIORITY
WITH (
     MIN_CPU_PERCENT = 0 ,MAX_CPU_PERCENT = 1 
    ,CAP_CPU_PERCENT = 1 ,AFFINITY SCHEDULER = (0)
    ,MIN_MEMORY_PERCENT = 0 ,MAX_MEMORY_PERCENT = 1
--  ,MIN_IOPS_PER_VOLUME = 1 ,MAX_IOPS_PER_VOLUME = 1  -- uncomment for SQL Server 2014
);

CREATE WORKLOAD GROUP SA_LOGIN_PRIORITY
WITH
    ( IMPORTANCE = LOW
      ,REQUEST_MAX_MEMORY_GRANT_PERCENT = 1
      ,REQUEST_MAX_CPU_TIME_SEC = 1
      ,REQUEST_MEMORY_GRANT_TIMEOUT_SEC = 1
      ,MAX_DOP = 1
      ,GROUP_MAX_REQUESTS = 1 )
USING SA_LOGIN_PRIORITY;

ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION=dbo.SA_LOGIN_PRIORITY);
ALTER RESOURCE GOVERNOR RECONFIGURE;

 

From top to bottom:

  1. Create a classifier function to determine which pool the session should go to. More info on classifier functions.
  2. Create the pool and provide a generous helping of resources for the sa login.
  3. Create the workload group and further prioritize those resources for the sa login.
  4. Apply the classifier function and reconfigure RG to use it.

I have to say this one is a bit sneakier than the logon trigger, not least because you don't get any error messages.  I heartily recommend testing it in Management Studio and clicking around the UI a lot; there's some fun behavior there. And DEFINITELY try it on SQL 2014 with the IO settings included!  You'll notice I made allowances for SQL Agent jobs owned by sa; they'll go into the default workload group.  You can add your own overrides to the classifier function if needed.
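If you'd rather not wait for the slowness to make itself obvious, here's a quick sketch against the Resource Governor DMVs that shows which workload group the current session was classified into:

-- Which workload group did this session land in?
SELECT s.session_id, s.login_name, s.program_name, wg.name AS workload_group
FROM sys.dm_exec_sessions AS s
JOIN sys.dm_resource_governor_workload_groups AS wg ON wg.group_id = s.group_id
WHERE s.session_id = @@SPID;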

Some interesting ideas I didn't have time for but expect you to get to before me:

  • Set up different pools/workgroups with different settings and randomize which one the classifier chooses
  • Do the same but base it on time of day (Books Online example covers this)...
  • Or, which workstation it connects from. This can be modified for certain special people in your office who either don't listen, or are attracted (and attractive) to you.

And if things go wrong you can always use the following from another sysadmin or Dedicated Admin connection:

ALTER RESOURCE GOVERNOR DISABLE;

 

That will let you go in and either fix (or drop) the pools, workgroups and classifier function.
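Once it's disabled, a cleanup sketch along these lines (run in master, and assuming no sessions are still stuck in that workload group) removes the whole setup:

-- Detach the classifier, then drop the workload group, pool and function
ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = NULL);
ALTER RESOURCE GOVERNOR RECONFIGURE;
DROP WORKLOAD GROUP SA_LOGIN_PRIORITY;
DROP RESOURCE POOL SA_LOGIN_PRIORITY;
ALTER RESOURCE GOVERNOR RECONFIGURE;
DROP FUNCTION dbo.SA_LOGIN_PRIORITY;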

So now that you know these types of things are possible, and if you are tired of your team using sa when they shouldn't, I expect you'll enjoy playing with these quite a bit!

Unfortunately, the aforementioned Dedicated Admin Connection kinda poops on the party here.  Books Online for both topics will tell you that the DAC will not fire either feature. So if you have a crafty user who does their research, they can still sneak in with sa and do their bidding without being hampered.

Of course, you can still detect their login via various methods, like a server trace, SQL Server Audit, extended events, and enabling "Audit Successful Logins" on the server.  These all have their downsides: traces take resources, extended events and SQL Audit can't fire off actions, and enabling successful logins will bloat your error log very quickly.  SQL Audit is also limited unless you have Enterprise Edition, and Resource Governor is Enterprise-only.  And WORST OF ALL, these features are all available and visible through the SSMS UI, so even a doofus developer or manager could find them.

Fortunately there are Event Notifications!

Event notifications are becoming one of my favorite features of SQL Server (keep an eye out for more blogs from me about them). They are practically unknown and heinously underutilized.  They are also a great gateway drug to using Service Broker, another great but underutilized feature. Hopefully this will get you to start using them, or at least your enemies in the office will once they read this, and then you'll have to learn them in order to fix things.

So here's the setup:

USE msdb;
GO

CREATE PROCEDURE dbo.SA_LOGIN_PRIORITY_act
WITH ENCRYPTION AS
DECLARE @x XML, @message nvarchar(max);
RECEIVE @x=CAST(message_body AS XML) FROM SA_LOGIN_PRIORITY_q;
IF @x.value('(//LoginName)[1]','sysname')=N'sa' AND @x.value('(//ApplicationName)[1]','sysname') NOT LIKE N'SQL Agent%'
BEGIN
    -- interesting activation procedure stuff goes here
END
GO

CREATE QUEUE SA_LOGIN_PRIORITY_q 
    WITH STATUS=ON, RETENTION=OFF,
    ACTIVATION (PROCEDURE_NAME=dbo.SA_LOGIN_PRIORITY_act, MAX_QUEUE_READERS=1, EXECUTE AS OWNER);

CREATE SERVICE SA_LOGIN_PRIORITY_s ON QUEUE SA_LOGIN_PRIORITY_q([http://schemas.microsoft.com/SQL/Notifications/PostEventNotification]);

CREATE EVENT NOTIFICATION SA_LOGIN_PRIORITY_en ON SERVER WITH FAN_IN
FOR AUDIT_LOGIN
TO SERVICE N'SA_LOGIN_PRIORITY_s', N'current database'
GO

 

From top to bottom:

  1. Create activation procedure for event notification queue.
  2. Create queue to accept messages from event notification, and activate the procedure to process those messages when received.
  3. Create service to send messages to that queue.
  4. Create event notification on AUDIT_LOGIN events that fire the service.

I placed this in msdb as it is an available system database and already has Service Broker enabled by default. You should change this to another database if you can guarantee it won't get dropped.
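And since someone will eventually have to take it apart (possibly you), here's a cleanup sketch, run from msdb:

-- Remove the event notification and its Service Broker plumbing
DROP EVENT NOTIFICATION SA_LOGIN_PRIORITY_en ON SERVER;
DROP SERVICE SA_LOGIN_PRIORITY_s;
DROP QUEUE SA_LOGIN_PRIORITY_q;
DROP PROCEDURE dbo.SA_LOGIN_PRIORITY_act;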

So what to put in place for "interesting activation procedure code"?  Hmmm, so far I haven't addressed Matt's suggestion of writing a lengthy script to send an annoying message:

SET @message=@x.value('(//HostName)[1]','sysname') + 
N' tried to log in to server ' + @x.value('(//ServerName)[1]','sysname') + 
N' as SA at ' + @x.value('(//StartTime)[1]','sysname') + 
N' using the ' + @x.value('(//ApplicationName)[1]','sysname') + 
N' program. That''s why you''re getting this message and the attached pornography which' + 
N' is bloating your inbox and violating company policy, among other things. If you know' + 
N' this person you can go to their desk and hit them, or use the following SQL to end their session:

KILL ' + @x.value('(//SPID)[1]','sysname') +   
N';

Hopefully they''re in the middle of a huge query that they need to finish right away.'

EXEC msdb.dbo.sp_send_dbmail @recipients=N'CompanyWideDistroList@yourcompany.com', @subject=N'SA Login Alert', @query_result_width=32767, 
@body=@message, @query=N'EXEC sp_readerrorlog;', @attach_query_result_as_file=1, @query_attachment_filename=N'UtterlyGrossPorn_SeriouslyDontOpenIt.jpg'

I'm not sure I'd call that a lengthy script, but the attachment should get pretty big, and I'm sure the email admins will love storing multiple copies of it.  The nice thing is that this also fires on Dedicated Admin connections! You can even identify DAC connections from the event data returned; I leave that as an exercise for you. You can use that info to change the action taken by the activation procedure, and since it's a stored procedure, it can pretty much do anything!

Except KILL the SPID, or SHUTDOWN the server directly.  I'm still working on those.

Simple Merging Of PDF Documents with iTextSharp 5.4.5.0

posted @ 1/10/2014 7:38 PM by Mladen Prajdić (1 comment)

As we were working on our first SQL Saturday in Slovenia, we came to a point where we had to print out the so-called SpeedPASSes for attendees. This SpeedPASS file is a PDF and contains their raffle, lunch and admission tickets.

The problem is that we have to download one PDF per attendee and print that out, and printing more than 10 docs at once is a pain. So I decided to make a little console app that would merge multiple PDF files into a single file that would be much easier to print. I used an open source PDF manipulation library called iTextSharp, version 5.4.5.0.

This is the console program I used. It's brilliantly named MergeSpeedPASS. It only has two methods and is really short. Don't let the name fool you: it can be used to merge any PDF files.

The first parameter is the name of the target PDF file that will be created.
The second parameter is the directory containing PDF files to be merged into a single file.

using iTextSharp.text;
using iTextSharp.text.pdf;
using System;
using System.IO;

namespace MergeSpeedPASS
{
    class Program
    {
        static void Main(string[] args)
        {
            if (args.Length == 0 || args[0] == "-h" || args[0] == "/h")
            {
                Console.WriteLine("Welcome to MergeSpeedPASS. Created by Mladen Prajdic. Uses iTextSharp 5.4.5.0.");
                Console.WriteLine("Tool to create a single SpeedPASS PDF from all downloaded generated PDFs.");
                Console.WriteLine("");
                Console.WriteLine("Example: MergeSpeedPASS.exe targetFileName sourceDir");
                Console.WriteLine("         targetFileName = name of the new merged PDF file. Must include .pdf extension.");
                Console.WriteLine("         sourceDir      = path to the dir containing downloaded attendee SpeedPASS PDFs");
                Console.WriteLine("");
                Console.WriteLine(@"Example: MergeSpeedPASS.exe MergedSpeedPASS.pdf d:\Downloads\SQLSaturdaySpeedPASSFiles");
            }
            else if (args.Length == 2)
                CreateMergedPDF(args[0], args[1]);

            Console.WriteLine("");
            Console.WriteLine("Press any key to exit...");
            Console.Read();
        }

        static void CreateMergedPDF(string targetPDF, string sourceDir)
        {
            using (FileStream stream = new FileStream(targetPDF, FileMode.Create))
            {
                Document pdfDoc = new Document(PageSize.A4);
                PdfCopy pdf = new PdfCopy(pdfDoc, stream);
                pdfDoc.Open();                
                var files = Directory.GetFiles(sourceDir);
                Console.WriteLine("Merging files count: " + files.Length);
                int i = 1;
                foreach (string file in files)
                {
                    Console.WriteLine(i + ". Adding: " + file);
                    pdf.AddDocument(new PdfReader(file));
                    i++;
                }

                if (pdfDoc != null)
                    pdfDoc.Close();

                Console.WriteLine("SpeedPASS PDF merge complete.");
            }
        }
    }
}


Hope it helps you and have fun.

PASS Summit 2013 Review

posted @ 10/20/2013 8:21 PM by Ajarn Mark Caldwell (0 comments)

As a long-standing member of PASS who lives in the greater Seattle area and has attended about nine of these Summits, let me start out by saying how GREAT it was to go to Charlotte, North Carolina this year.  Many of the new folks that I met at the Summit this year, upon hearing that I was from Seattle, commented that I must have been disappointed to have to travel to the Summit this year after 5 years in a row in Seattle.  Well, nothing could be further from the truth.  I cheered loudly when I first heard that the 2013 Summit would be outside Seattle.  I have many fond memories of trips to Orlando, Florida and Grapevine, Texas for past Summits (missed out on Denver, unfortunately).  And there is a funny dynamic that takes place when the conference is local.  If you do as I have done the last several years and saved my company money by not getting a hotel, but rather just commuting from home, then both family and coworkers tend to act like you’re just on a normal schedule.  For example, I have a young family, and my wife and kids really wanted to still see me come home “after work”, but there are a whole lot of after-hours activities, social events, and great food to be enjoyed at the Summit each year.  Even more so if you really capitalize on the opportunities to meet face-to-face with people you either met at previous summits or have spoken to or heard of, from Twitter, blogs, and forums.  Then there is also the lovely commuting in Seattle traffic from neighboring cities rather than the convenience of just walking across the street from your hotel.  So I’m just saying, there are really nice aspects of having the conference 2500 miles away.

Beyond that, the training was fantastic as usual.  The SQL Server community has many outstanding presenters and experts with deep knowledge of the tools who are extremely willing to share all of that with anyone who wants to listen.  The opening video with PASS President Bill Graziano in a NASCAR race turned dream sequence was very well done, and the keynotes, as usual, were great.  This year I was particularly impressed with how well attended were the Professional Development sessions.  Not too many years ago, those were very sparsely attended, but this year, the two that I attended were standing-room only, and these were not tiny rooms.  I would say this is a testament to both the maturity of the attendees realizing how important these topics are to career success, as well as to the ever-increasing skills of the presenters and the program committee for selecting speakers and topics that resonated with people.  If, as is usually the case, you were not able to get to every session that you wanted to because there were just too darn many good ones, I encourage you to get the recordings.

Overall, it was a great time as these events always are.  It was wonderful to see old friends and make new ones, and the people of Charlotte did an awesome job hosting the event and letting their hospitality shine (extra kudos to SQLSentry for all they did with the shuttle, maps, and other event sponsorships).  We’re back in Seattle next year (it is a release year, after all) but I would say that with the success of this year’s event, I strongly encourage the Board and PASS HQ to firmly reestablish the location rotation schedule.  I’ll even go so far as to suggest standardizing on an alternating Seattle – Charlotte schedule, or something like that.

If you missed the Summit this year, start saving now, and register early, so you can join us!

It&rsquo;s A Team Sport: PASS Board Year 2, Q3

posted @ 10/13/2013 9:23 PM by Denise McInerney (0 comments)

As I type this I’m on an airplane en route to my 12th PASS Summit. It’s been a very busy 3.5 months since my last post on my work as a Board member. Nearing the end of my 2-year term I am struck by how much has happened, and yet how fast the time has gone.

But I’ll save the retrospective post for next time and today focus on what happened in Q3. In the last three months we made progress on several fronts, thanks to the contributions of many volunteers and HQ staff members. They deserve our appreciation for their dedication to delivering for the membership week after week.

Virtual Chapters

The Virtual Chapters continue to provide many PASS members with valuable free training. Between July and September of 2013 VCs hosted over 50 webinars with a total of 4300 attendees.

This quarter also saw the launch of the Security & Global Russian VCs. Both are off to a strong start and I welcome these additions to the Virtual Chapter portfolio.

At the beginning of 2012 we had 14 Virtual Chapters. Today we have 22. This growth has been exciting to see. It has also created a need to have more volunteers help manage the work of the VCs year-round. We have renewed focus on having Virtual Chapter Mentors work with the VC Leaders and other volunteers. I am grateful to volunteers Julie Koesmarno, Thomas LeBlanc and Marcus Bittencourt who join original VC Mentor Steve Simon on this team. Thank you for stepping up to help.

Many improvements to the VC web sites have been rolling out over the past few weeks. Our marketing and IT teams have been busy working on a new look-and-feel, features and a logo for each VC. They have given the VCs a fresh, professional look consistent with the rest of the PASS branding, and all VCs now have a logo that connects to PASS and the particular focus of the chapter.

24 Hours of PASS

The Summit Preview edition  of 24HOP was held on July 31 and by all accounts was a success. Our first use of the GoToWebinar platform for this event went extremely well. Thanks to our speakers, moderators and sponsors for making this event possible. Special thanks to HQ staffers Vicki Van Damme and Jane Duffy for a smoothly run event.

Coming up: the 24HOP Portuguese Edition will be held November 13-14, followed December 12-13 by the Spanish Edition. Thanks to the Portuguese- and Spanish-speaking community volunteers who are organizing these events.

July Board Meeting

The Board met July 18-19 in Kansas City. The first order of business was the election of the Executive Committee who will take office January 1. I was elected Vice President of Marketing and will join incoming President Thomas LaRock, incoming Executive Vice President of Finance Adam Jorgensen and Immediate Past President Bill Graziano on the Exec Co. I am honored that my fellow Board members elected me to this position and look forward to serving the organization in this role.

Visit to PASS HQ

In late September I traveled to Vancouver for my first visit to PASS HQ, where I joined Tom LaRock and Adam Jorgensen to make plans for 2014.  Our visit was just a few weeks before PASS Summit and coincided with the Board election, and the office was humming with activity. I saw first-hand the enthusiasm and dedication of everyone there. In each interaction I observed a focus on what is best for PASS and our members. Our partners at HQ are key to the organization’s success. This week at PASS Summit is a great opportunity for all of us to remember that, and say “thanks.”

Next Up

PASS Summit—of course! I’ll be around all week and look forward to connecting with many of our members over meals, at the Community Zone and between sessions.

In the evenings you can find me at the Welcome Reception, Exhibitor’s Reception and Community Appreciation Party. And I will be at the Board Q&A session  Friday at 12:45 p.m.

Transitions

The newly elected Exec Co and Board members take office January 1, and the Virtual Chapter portfolio is transitioning to a new director. I’m thrilled that Jen Stirrup will be taking over. Jen has experience as a volunteer and co-leader of the Business Intelligence Virtual Chapter and was a key contributor to the BI VC's expansion to serve our members in the EMEA region. I’ll be working closely with Jen over the next couple of months to ensure a smooth transition.

Go Big or Go Special

posted @ 9/27/2013 10:52 PM by Ajarn Mark Caldwell (0 comments)

Watching Shark Tank tonight and the first presentation was by Mango Mango Preserves and it highlighted an interesting contrast in business trends today and how to capitalize on opportunities.  <Spoiler Alert> Even though every one of the sharks was raving about the product samples they tried, with two of them going for second and third servings, none of them made a deal to invest in the company.</Spoiler>  In fact, one of the sharks, Kevin O’Leary, kept ripping into the owners with statements to the effect that he thinks they are headed over a financial cliff because he felt their costs were way out of line and would be their downfall if they didn’t take action to radically cut costs. He said that he had previously owned a jams and jellies business and knew the cost ratios that you had to have to make it work.  I don’t doubt he knows exactly what he’s talking about and is 100% accurate…for doing business his way, which I’ll call “Go Big”.  But there’s a whole other way to do business today that would be ideal for these ladies to pursue.

As I understand it, based on his level of success in various businesses and the fact that he is even in a position to be investing in other companies, Kevin’s approach is to go mass market (Go Big) and make hundreds of millions of dollars in sales (or something along that scale) while squeezing out every ounce of cost that you can to produce an acceptable margin.  But there is a very different way of making a very successful business these days, which is all about building a passionate and loyal community of customers that are rooting for your success and even actively trying to help you succeed by promoting your product or company (Go Special).  This capitalizes on the power of social media, niche marketing, and The Long Tail.  One of the most prolific writers about capitalizing on this trend is Seth Godin, and I hope that the founders of Mango Mango pick up a couple of his books (probably Purple Cow and Tribes would be good starts) or at least read his blog.  I think the adoration expressed by all of the sharks for the product is the biggest hint that they have a remarkable product and that they are perfect for this type of business approach.

Both are completely valid business models, and it may certainly be that the scale at which Kevin O’Leary wants to conduct business where he invests his money is well beyond the long tail, but that doesn’t mean that there is not still a lot of money to be made there.  I wish them the best of luck with their endeavors!

My Reference for Amy Lewis

posted @ 9/25/2013 9:44 PM by Denise McInerney (0 comments)

The 2013 election campaign for the PASS Board of Directors is underway. There are seven qualified candidates running this year. They all offer a wealth of experience volunteering for PASS and the SQL Server community.

One of these candidates, Amy Lewis, asked me to write a reference for her to include on her candidate application. I have a lot of experience working with Amy and was pleased to provide this reference:

I enthusiastically support Amy Lewis as a candidate for the PASS Board of Directors. I have known and worked with Amy in various PASS volunteer capacities for years, starting when we were both leaders of SIGs (the precursors to the Virtual Chapters). In that time I have seen Amy grow as a leader, taking on increasing responsibility and developing her leadership skills in the process. From the Program Committee to the BI Virtual Chapter to her local user group's SQL Saturday, Amy has demonstrated a capacity to organize and lead volunteers. A successful leader delivers results, and does so in a way that encourages and empowers the people she is working with; Amy embodies this leadership style.

As Director for Virtual Chapters I have most recently worked with Amy in her capacity of DW/BI VC Leader. This VC is one of our largest and most active, and Amy's leadership is a key contribution to that success. I was pleased to see that Amy was also thinking about succession and prepared other volunteers to take over the chapter leadership.

Amy has shown an understanding of PASS' strategic goals and has focused her volunteer efforts to help us reach those goals. For the past couple of years we have been trying to expand PASS' reach and relevance to SQL communities around the world. The VCs are a key vehicle for this expansion.

Amy embraced this idea and organized the VC to engage volunteers in Europe & Australia and provide content that could reach SQL professionals in those regions.

A second key strategy for PASS is expanding into the data analytics space. Again Amy rose to the occasion helping to shape the program for our first Business Analytics Conference and leveraging the BI VC to promote the event.

By all measures I think Amy is prepared to serve on the Board and contribute in a positive way.

Full-text Indexing Books Online

posted @ 9/17/2013 6:32 PM by Most Valuable Yak (Rob Volk) (1 comment)

While preparing for a recent SQL Saturday presentation, I was struck by a crazy idea (shocking, I know): Could someone import the content of SQL Server Books Online into a database and apply full-text indexing to it?  The answer is yes, and it's really quite easy to do.

The first step is finding the installed help files.  If you have SQL Server 2012, BOL is installed under the Microsoft Help Library.  You can find the install location by opening SQL Server Books Online and clicking the gear icon for the Help Library Manager.  When the new window pops up, click the Settings link and you'll see the path under Library Location.

Once you navigate to that path you'll have to drill down a little further, to C:\ProgramData\Microsoft\HelpLibrary\content\Microsoft\store.  This is where the help file content is kept if you downloaded it for offline use.

Depending on which products you've downloaded help for, you may see a few hundred files.  Fortunately they're named well and you can easily find the "SQL_Server_Denali_Books_Online_" files.  We are interested in the .MSHC files only, and can skip the Installation and Developer Reference files.

Despite the .MSHC extension, these files are compressed with the standard Zip format, so your favorite archive utility (WinZip, 7Zip, WinRar, etc.) can open them.  When you do, you'll see a few thousand files in the archive.  We are only interested in the .htm files, but there's no harm in extracting all of them to a folder.  7zip provides a command-line utility, and the following will extract them to a previously created D:\SQLHelp folder:

7z e -oD:\SQLHelp "C:\ProgramData\Microsoft\HelpLibrary\content\Microsoft\store\SQL_Server_Denali_Books_Online_B780_SQL_110_en-us_1.2.mshc" *.htm

Well that's great Rob, but how do I put all those files into a full-text index?

I'll tell you in a second, but first we have to set up a few things on the database side.  I'll be using a database named Explore (you can certainly change that) and the following setup is a fragment of the script I used in my presentation:

USE Explore;
GO
CREATE SCHEMA help AUTHORIZATION dbo;
GO

-- Create default fulltext catalog for later FT indexes
CREATE FULLTEXT CATALOG FTC AS DEFAULT;
GO

CREATE TABLE help.files(file_id int not null IDENTITY(1,1) 
    CONSTRAINT PK_help_files PRIMARY KEY,
    path varchar(256) not null CONSTRAINT UNQ_help_files_path UNIQUE,
    doc_type varchar(6) DEFAULT('.xml'),
    content varbinary(max) not null);

CREATE FULLTEXT INDEX ON help.files(content TYPE COLUMN doc_type LANGUAGE 1033) 
    KEY INDEX PK_help_files;

This will give you a table, default full-text catalog, and full-text index on that table for the content you're going to insert.  I'll be using the command line again for this, it's the easiest method I know:

for %a in (D:\SQLHelp\*.htm) do sqlcmd -S. -E -d Explore -Q"set nocount on;insert help.files(path,content) select '%a', cast(c as varbinary(max)) from openrowset(bulk '%a', SINGLE_CLOB) as c(c)"

You'll need to copy and run that as one line in a command prompt.  I'll explain what this does while you run it and watch several thousand files get imported:

The "for" command allows you to loop over a collection of items.  In this case we want all the .htm files in the D:\SQLHelp folder.  For each file it finds, it will assign the full path and file name to the %a variable.  In the "do" clause, we'll specify another command to be run for each iteration of the loop.  I make a call to "sqlcmd" in order to run a SQL statement.  I pass in the name of the server (-S.), where "." represents the local default instance. I specify -d Explore as the database, and -E for trusted connection.  I then use -Q to run a query that I enclose in double quotes.

The query uses OPENROWSET(BULK…SINGLE_CLOB) to open the file as a data source, and to treat it as a single character large object.  In order for full-text indexing to work properly, I have to convert the text content to varbinary. I then INSERT these contents along with the full path of the file into the help.files table created earlier.  This process continues for each file in the folder, creating one new row in the table.
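To make that concrete, here's what a single iteration of the loop ends up running, with %a expanded to a made-up file name:

-- One iteration of the loop (D:\SQLHelp\example_topic.htm is a hypothetical file name)
SET NOCOUNT ON;
INSERT help.files(path, content)
SELECT 'D:\SQLHelp\example_topic.htm', CAST(c AS varbinary(max))
FROM OPENROWSET(BULK 'D:\SQLHelp\example_topic.htm', SINGLE_CLOB) AS c(c);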

And that's it! 5 SQL Statements and 2 command line statements to unzip and import SQL Server Books Online!  In case you're wondering why I didn't use FILESTREAM or FILETABLE, it's simply because I haven't learned them…yet. I may return to this blog after I figure that out and update it with the steps to do so.  I believe that will make it even easier.

In the spirit of exploration, I'll leave you to work on some fulltext queries of this content.  I also recommend playing around with the sys.dm_fts_xxxx DMVs (I particularly like sys.dm_fts_index_keywords, it's pretty interesting).  There are additional example queries in the download material for my presentation linked above.
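To get you started, here are a couple of sketches to run in the Explore database against the table created above; the search phrase is just an example:

-- Which help topics mention a phrase?
SELECT TOP (10) f.path
FROM help.files AS f
WHERE CONTAINS(f.content, N'"resource governor"');

-- The most common keywords in the full-text index
SELECT TOP (20) display_term, document_count
FROM sys.dm_fts_index_keywords(DB_ID(N'Explore'), OBJECT_ID(N'help.files'))
ORDER BY document_count DESC;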

Many thanks to Kevin Boles (t) for his advice on (re)checking the content of the help files.  Don't let that .htm extension fool you! The 2012 help files are actually XML, and you'd need to specify '.xml' in your document type column in order to extract the full-text keywords.  (You probably noticed this in the default definition for the doc_type column.)  You can query sys.fulltext_document_types to get a complete list of the types that can be full-text indexed.
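For reference, the full list of indexable document types on your instance is just a catalog view away:

-- Document types the full-text engine can index on this instance
SELECT document_type, path, manufacturer, version
FROM sys.fulltext_document_types
ORDER BY document_type;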

I also need to thank Hilary Cotter for giving me the original idea. I believe he used MSDN content in a full-text index for an article from waaaaaaaaaaay back, that I can't find now, and had forgotten about until just a few days ago.  He is also co-author of Pro Full-Text Search in SQL Server 2008, which I highly recommend.  He also has some FTS articles on Simple Talk:

http://www.simple-talk.com/sql/learn-sql-server/sql-server-full-text-search-language-features/
http://www.simple-talk.com/sql/learn-sql-server/sql-server-full-text-search-language-features,-part-2/

Fetching Partition Information

posted @ 7/14/2013 5:44 PM by Mike Femenella (0 comments)

For a recent SSIS package at work I needed to determine the distinct values in a partition, the number of rows in each partition and the file group name on which each partition resided in order to come up with a grouping mechanism. Of course sys.partitions comes to mind for some of that but there are a few other tables you need to link to in order to grab the information required.

The table I’m working on contains 8.8 billion rows. Finding the distinct partition keys from this table was not a fast operation. My original solution was to create a temporary table, grab the distinct values for the partitioned column, then update it via sys.partitions for the row counts and the $partition function for the partition id, and finally look back to the sys.filegroups table for the filegroup names. It wasn’t pretty, and it could take up to 15 minutes to return the results. The primary issue is pulling distinct values from the table. Queries for distinct values against 8.8 billion rows don’t go quickly.

A few beers into a conversation with a friend we ended up talking about work, which led to a conversation about the task described above. The solution was already built into SQL Server; I just needed to pull it together.

The first table I needed was sys.partition_range_values. This contains one row for each range boundary value of a partition function. In my case I have a partition function which uses dayid values; for example, July 4th would be represented as an int, 20130704. This table lists out all of the dayid values which were defined in the function, which eliminated the need to query my source table for distinct dayid values; everything I needed was already built in here for me. The only caveat was that in my SSIS package I needed to create a bucket for any dayid values that were out of bounds for my function. For example, if my function handled 20130501 through 20130704 and I had day values of 20130401 or 20130705 in my table, these would not be listed in sys.partition_range_values. I created an “everything else” bucket in my SSIS package just in case I had any dayid values unaccounted for.

Getting the number of rows for a partition is very easy: the sys.partitions table contains a row count for each partition. It's easy enough to retrieve by querying for the table's object_id and an index_id of 1 (the clustered index).

The final piece of information was the filegroup name. There are two options available to get the filegroup name: sys.data_spaces or sys.filegroups. For my query I chose sys.filegroups, but really it’s a matter of preference and data needs. In order to bridge between the sys.partitions table and either sys.data_spaces or sys.filegroups you need to get the container_id. This can be done by joining sys.allocation_units.container_id to sys.partitions.hobt_id. sys.allocation_units contains the field data_space_id, which then lets you join in either sys.data_spaces or sys.filegroups.

The end result is the query below, which typically executes for me in under 1 second. I’ve included joins to both sys.filegroups and sys.data_spaces, with the sys.data_spaces join commented out.
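Since the query itself was originally posted as a screenshot, here is a minimal sketch reconstructed from the description above. The table name dbo.MyBigTable is a placeholder, and the boundary-to-partition mapping shown assumes a RANGE RIGHT partition function (adjust it for RANGE LEFT):

SELECT  prv.value           AS DayId,        -- boundary value from the partition function
        p.partition_number,
        p.rows              AS RowCnt,
        fg.name             AS FileGroupName
FROM sys.indexes AS i
JOIN sys.partition_schemes AS ps ON ps.data_space_id = i.data_space_id
JOIN sys.partition_functions AS pf ON pf.function_id = ps.function_id
JOIN sys.partitions AS p ON p.object_id = i.object_id AND p.index_id = i.index_id
LEFT JOIN sys.partition_range_values AS prv
       ON prv.function_id = pf.function_id
      AND prv.boundary_id = p.partition_number - 1      -- RANGE RIGHT mapping
JOIN sys.allocation_units AS au ON au.container_id = p.hobt_id AND au.type = 1  -- IN_ROW_DATA
JOIN sys.filegroups AS fg ON fg.data_space_id = au.data_space_id
--JOIN sys.data_spaces AS ds ON ds.data_space_id = au.data_space_id
WHERE i.object_id = OBJECT_ID(N'dbo.MyBigTable')
  AND i.index_id = 1;   -- the clustered index, as described above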

As I mentioned above, this shaves a good 10-15 minutes off of my original SSIS package and is a really easy tweak to get a boost in my ETL time. Enjoy.


Rolling Along: PASS Board Year 2, Q2

posted @ 7/14/2013 3:55 PM by Denise McInerney (2 comments)

Eighteen months into my time as a PASS Director I’m especially proud of what the Virtual Chapters have accomplished and want to share that progress with you. I'm also pleased that the organization has invested more resources to support the VCs.

In this quarter I got to attend two conferences and meet more members of the SQL community.

Virtual Chapters

In the first six months of 2013 VCs have hosted more than 50 webinars, offering free technical education to over 6200 attendees. This is a great benefit to PASS members; thanks to the VC leaders, volunteers and speakers who contribute their time to produce these events.

The Performance VC held their “Summer Performance Palooza”, an event featuring eight back-to-back sessions. Links to the session recordings can be found on the VCs web site.

The new webinar platform, GoToWebinar, has been rolled out to all the VCs. This is a more stable, scalable platform and represents an important investment into the future of the VCs.

A few new VCs are in the planning stages, including one focused on Security and one for Russian speakers. Visit the Virtual Chapter home page to sign up for the chapters that interest you.

Each Virtual Chapter is offering a discount code for PASS Summit 2013. Be sure to ask your VC leader for the code to save $200 on Summit registration.

24 Hours of PASS

The next 24HOP will be on July 31. This Summit Preview edition will feature 24 consecutive webcasts presented by experts who will be speaking at Summit in October. Registration for this free event is open now. And we will be using the GoToWebinar platform for 24HOP also.

Business Analytics Conference

April marked the first PASS Business Analytics Conference in Chicago. This introduced PASS to another segment of data professionals: the analysts and data scientists who work with the world’s growing collection of data. Overall the inaugural event was a success and gave us a glimpse into this increasingly important space.

After Chicago the Board had several serious discussions about the lessons learned from this event and what we should do next. We agreed to apply those lessons and continue to invest in this event; there will be a PASS Business Analytics Conference in 2014. I’m very pleased the next event will be in San Jose, CA, the heart of Silicon Valley, a place where a great deal of investment and innovation in data analytics is taking place.

Global SQL Community

Over the last couple of years PASS has been taking steps to become more relevant to SQL communities in different parts of the world. In May I had the opportunity to attend SQL Bits XI in Nottingham, England. It was enlightening to meet and talk with SQL professionals from around the U.K. as well as many other European countries. The many SQL Bits volunteers put on a great event and were gracious hosts.

Budgets

The Board passed the FY14 budget at the end of June. The  budget process can be challenging and requires the Board to make some difficult choices about where to allocate resources. Overall I’m satisfied with the decisions we made and think we are investing in the right activities and programs.

Next Up

The Board is meeting July 18-19 in Kansas City. We will be holding the Executive Committee election for the Exec Co that will take office in 2014. We will also be discussing plans for the next BA conference as well as the next steps for our Global Growth initiative.

Applications for the upcoming Board of Directors election open on July 24. If you are considering running for the Board you can visit the PASS elections site to learn more about the election process. And I encourage anyone considering running to reach out to current and past Board members to learn about what the role entails.

Plans for the next PASS Summit are in full swing. We are working on some fun new ideas to introduce attendees to the many ways to become involved in the SQL community.

SSMS Tools Pack 2.7 is released. New website, improved licensing and features.

posted @ 7/2/2013 4:52 PM by Mladen Prajdić (3 comments)

New website
Nice, isn't it? Cleaner, simpler, better looking and more modern.
If you have any suggestions for further improvements I'd be glad to hear them.

Simpler licensing
With SSMS Tools Pack 2.7 the licensing is finally where it should be.
It is now based on the activate/deactivate model.
This way you can move a license from machine to machine with simple deactivation on one and
reactivation on another machine. Much better, no?
Because of very good feedback I have added an option for 6 machines and lowered the 4 machines option to 3 machines.
This should make it much simpler for you to choose the right option for yourself.

Improved features
Version 2.5.3 was already extremely stable and 2.7 continues with that tradition.
Because of that I could fully focus on features, which is why 3.0 will rock even more than 2.7! ;)
In version 2.7 I have addressed quite a few improvements you were requesting for a while now.

SQL History
This is probably the biggest time saver out there, therefore it's only fair it gets a few important updates.
  • If you have an existing .sql file opened, the Window Content History now saves your code
    to that existing file and also makes a backup in the SQL History log default location.
    Search is still done through the SQL History log but the Tab Sessions Restore opens your existing .sql file.
    This way you don't have to remember to save your existing files by yourself anymore.
  • A bug that prevented proper searching if you copied the log files to a new location has been fixed.
    Unfortunately this removed the option to filter a search by the time component.
    The smallest search interval is now one day.
  • The SSMS Tools Pack now remembers the visibility of the Current Window History window when you exit SSMS.

SQL Snippets
You can now set the position of the cursor in your snippets by placing {C} somewhere in your snippet.
It's a small improvement but can be a huge time saver since you don't have to move through
the snippet to the desired location anymore.

Run script on multiple databases
Database choices can now be saved with a name and then loaded again next time.
You can also choose to run the script in a new window for each chosen database.

Search through grid results
You can now go to the previous/next search result with the Prev/Next control inside the search window.
This is extremely useful if you have a large result set. It saves you the scrolling.

CRUD generator
Four new variables have been added:
  • |CurrentDate| writes the current date in yyyy-MM-dd format to your script
  • |CurrentTime| writes the current time in 24-hour HH:mm:ss format to your script
  • |CurrentWinUser| writes the currently logged-on Windows user to your script
  • |CurrentSqlUser| writes the currently logged-on SQL login to your script
This was actually quite a requested feature so if you have any other ideas for extra variables, do let me know.

That's about it. I hope you're going to enjoy this version as much as the previous ones.
Have fun!

Criminals and Other Illegal Characters

posted @ 6/19/2013 4:49 PM by Most Valuable Yak (Rob Volk) (0 comments)

SQLTeam's favorite Slovenian blogger Mladen (b | t) had an interesting question on Twitter: http://www.twitter.com/MladenPrajdic/status/347057950470307841

I liked Kendal Van Dyke's (b | t) reply: http://twitter.com/SQLDBA/status/347058908801667072

And he was right!  This is one of those pretty-useless-but-sounds-interesting propositions that I've based all my presentations on, and most of my blog posts.

If you read all the replies you'll see a lot of good suggestions.  I particularly like Aaron Bertrand's (b | t) idea of going into the Unicode character set, since there are over 65,000 characters available.  But how to find an illegal character?  Detective work?

I'm working on the premise that if SQL Server rejects a character in a name, it will throw an error.  So all we have to do is generate every Unicode character, rename a database to that character, and catch any errors.

It turns out that dynamic SQL can lend a hand here:

IF DB_ID(N'a') IS NULL CREATE DATABASE [a];

DECLARE @c INT=1, @sql NVARCHAR(MAX)=N'', @err NVARCHAR(MAX)=N'';
WHILE @c<65536 BEGIN
    BEGIN TRY
        SET @sql=N'alter database ' + 
            QUOTENAME(CASE WHEN @c=1 THEN N'a' ELSE NCHAR(@c-1) END) +
            N' modify name=' + QUOTENAME(NCHAR(@c));
        RAISERROR(N'*** Trying %d',10,1,@c) WITH NOWAIT;
        EXEC(@sql);
        SET @c+=1;
    END TRY
    BEGIN CATCH
        SET @err=ERROR_MESSAGE();
        RAISERROR(N'Ooops - %d - %s',10,1,@c,@err) WITH NOWAIT;
        BREAK;
    END CATCH
END
SET @sql=N'alter database ' + QUOTENAME(NCHAR(@c-1)) + N' modify name=[a]';
EXEC(@sql);

The script creates a dummy database "a" if it doesn't already exist, and only tests single characters as a database name.  If you have databases with single character names then you shouldn't run this on that server.

It takes a few minutes to run, but if you do you'll see that no errors are thrown for any of the characters.  It seems that SQL Server will accept any character, no matter where it's from.  (Well, there's one it won't, but I won't tell you which. Actually there are two, but one of them requires some deep existential thinking.)

The output is also interesting, as quite a few codes do some weird things there.  I'm pretty sure it's due to the font used in SSMS for the messages output window; not all characters are available in it.  If you run it using the SQLCMD utility, with the -o switch to output to a file and -u for Unicode output, you can open the file in Notepad or another text editor and see the whole thing.

I'm not sure what character I'd recommend to answer Mladen's question.  I think the standard tab (ASCII 9) is fine.  There are also several dedicated separator characters in the original ASCII character set (decimal 28-31).
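For instance, sticking with the script above, renaming the test database to a single tab character would look like this (rename it back afterwards):

-- Rename the dummy database [a] to a name consisting of one tab character (ASCII 9)
DECLARE @sql nvarchar(max) =
    N'ALTER DATABASE [a] MODIFY NAME = ' + QUOTENAME(NCHAR(9)) + N';';
EXEC (@sql);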

But of all the choices available in Unicode whitespace, I think my favorite would be the Mongolian Vowel Separator.  Or maybe the zero-width space. (that'll be fun to print!)  And since this is Mladen we're talking about, here's a good selection of "intriguing" characters he could use.

SQL Saturday #220 - Atlanta - Pre-Con Scholarship Winners!

posted @ 5/7/2013 3:17 PM by Most Valuable Yak (Rob Volk) (0 comments)

A few weeks ago, AtlantaMDF offered scholarships for each of our upcoming Pre-conference sessions at SQL Saturday #220. We would like to congratulate the winners!

David Thomas SQL Server Security http://sqlsecurity.eventbrite.com/
Vince Bible Surfing the Multicore Wave: Processors, Parallelism, and Performance http://surfmulticore.eventbrite.com/
Mostafa Maged Languages of BI http://languagesofbi.eventbrite.com/
Daphne Adams Practical Self-Service BI with PowerPivot for Excel http://selfservicebi.eventbrite.com/
Tim Lawrence The DBA Skills Upgrade Toolkit http://dbatoolkit.eventbrite.com/

Thanks to everyone who applied! And once again we must thank Idera for its generous sponsorship, and Bobby Dimmick (w|t) and Brian Kelley (w|t) of Midlands PASS for the time and effort they put into judging all the applicants.

Don't forget, there's still time to attend the Pre-Cons on May 17, 2013! Click on the EventBrite links for more details and to register!

   

SQL Saturday #220 - Atlanta - Pre-Conference Scholarships!

posted @ 4/22/2013 3:59 PM by Most Valuable Yak (Rob Volk) (0 comments)

 

We Want YOU…To Learn!

AtlantaMDF and Idera are teaming up to find a few good people. If you are:

  • A student looking to work in the database or business intelligence fields
  • A database professional who is between jobs or wants a better one
  • A developer looking to step up to something new
  • On a limited budget and can’t afford professional SQL Server training
  • Able to attend training from 9 to 5 on May 17, 2013

AtlantaMDF is presenting 5 Pre-Conference Sessions (pre-cons) for SQL Saturday #220! And thanks to Idera’s sponsorship, we can offer one free ticket to each of these sessions to eligible candidates! That means one scholarship per Pre-Con!

One Recipient Each will Attend:

Denny Cherry: SQL Server Security http://sqlsecurity.eventbrite.com/
Adam Machanic: Surfing the Multicore Wave: Processors, Parallelism, and Performance http://surfmulticore.eventbrite.com/
Stacia Misner: Languages of BI http://languagesofbi.eventbrite.com/
Bill Pearson: Practical Self-Service BI with PowerPivot for Excel http://selfservicebi.eventbrite.com/
Eddie Wuerch: The DBA Skills Upgrade Toolkit http://dbatoolkit.eventbrite.com/

If you are interested in attending these pre-cons send an email by April 30, 2013 to AtlantaMDFPrecons@gmail.com and tell us:

  • Why you are a good candidate to receive this scholarship
  • Which sessions you’d like to attend, and why (list multiple sessions in order of preference)
  • What the session will teach you and how it will help you achieve your goals

The emails will be evaluated by the good folks at Midlands PASS in Columbia, SC. The recipients will be notified by email and announcements made on May 6, 2013.

GOOD LUCK!



P.S. - Don't forget that SQLSaturday #220 offers free* training in addition to the pre-cons!
You can find more information about SQL Saturday #220 at http://www.sqlsaturday.com/220/eventhome.aspx.
View the scheduled sessions at http://www.sqlsaturday.com/220/schedule.aspx and register for them at http://www.sqlsaturday.com/220/register.aspx.

* Registration charges a $10 fee to cover lunch expenses.

Database Mirroring on SQL Server Express Edition

posted @ 4/1/2013 7:00 PM by Most Valuable Yak (Rob Volk) (0 comments)

Like most SQL Server users I'm rather frustrated by Microsoft's insistence on making the really cool features only available in Enterprise Edition.  And it really doesn't help that they changed the licensing for SQL 2012 to be core-based, so now it's like 4 times as expensive!  It almost makes you want to go with Oracle.  That, and a desire to have Larry Ellison do things to your orifices.

And since they've introduced Availability Groups, and marked database mirroring as deprecated, you'd think they'd make mirroring available in all editions.  Alas…they don't…officially anyway.  Thanks to my constant poking around in places I'm not "supposed" to, I've discovered the low-level code that implements database mirroring, and found that it's available in all editions!

It turns out that the query processor in all SQL Server editions prepends a simple check before every edition-specific DDL statement:

IF CAST(SERVERPROPERTY('Edition') as nvarchar(max)) NOT LIKE '%e%e%e% Edition%'
print 'Lame'
else
print 'Cool'

If that statement returns true, it fails. (the print statements are just placeholders)  Go ahead and test it on Standard, Workgroup, and Express editions compared to an Enterprise or Developer edition instance (which support everything).

Once again thanks to Argenis Fernandez (b | t) and his awesome sessions on using Sysinternals, I was able to watch the exact process SQL Server performs when setting up a mirror.  Surprisingly, it's not actually implemented in SQL Server!  Some of it is, but that's something of a smokescreen; the real meat of it is simple filesystem primitives.

The NTFS filesystem supports links, both hard links and symbolic, so that you can create two entries for the same file in different directories and/or different names.  You can create them using the MKLINK command in a command prompt:

mklink /D D:\SkyDrive\Data D:\Data
mklink /D D:\SkyDrive\Log D:\Log

This creates a symbolic link from my data and log folders to my Skydrive folder.  Any file saved in either location will instantly appear in the other.  And since my Skydrive will be automatically synchronized with the cloud, any changes I make will be copied instantly (depending on my internet bandwidth of course).

So what does this have to do with database mirroring?  Well, it seems that the mirroring endpoint that you have to create between mirror and principal servers is really nothing more than a Skydrive link.  Although it doesn't actually use Skydrive, it performs the same function.  So in effect, the following statement:

ALTER DATABASE Mir SET PARTNER='TCP://MyOtherServer.domain.com:5022'

Is turned into:

mklink /D "D:\Data" "\\MyOtherServer.domain.com\5022$"

The 5022$ "port" is actually a hidden system directory on the principal and mirror servers. I haven't quite figured out how the log files are included in this, or why you have to SET PARTNER on both principal and mirror servers, except maybe that mklink has to do something special when linking across servers.  I couldn't get the above statement to work correctly, but found that doing mklink to a local Skydrive folder gave me similar functionality.

To wrap this up, all you have to do is the following:

  1. Install Skydrive on both SQL Servers (principal and mirror) and set the local Skydrive folder (D:\SkyDrive in these examples)
  2. On the principal server, run mklink /D on the data and log folders to point to SkyDrive: mklink /D D:\SkyDrive\Data D:\Data
  3. On the mirror server, run the complementary linking: mklink /D D:\Data D:\SkyDrive\Data
  4. Create your database and make sure the files map to the principal data and log folders (D:\Data and D:\Log)
  5. Voila! Your databases are kept in sync on multiple servers!

One wrinkle you will encounter is that the mirror server will show the data and log files, but you won't be able to attach them to the mirror SQL instance while they are attached to the principal. I think this is a bug in the Skydrive, but as it turns out that's fine: you can't access a mirror while it's hosted on the principal either.  So you don't quite get automatic failover, but you can attach the files to the mirror if the principal goes offline.  It's also not exactly synchronous, but it's better than nothing, and easier than either replication or log shipping with a lot less latency.

I will end this with the obvious "not supported by Microsoft" and "Don't do this in production without an updated resume" spiel that you should by now assume with every one of my blog posts, especially considering the date.

SQL Saturday #220 Atlanta May 2013!

posted @ 3/20/2013 6:51 PM by Most Valuable Yak (Rob Volk) (0 comments)

If you love SQL Server training and are near the Atlanta area, or just love us so much you're willing to travel here, please come join us for:

SQL SATURDAY #220!

The main event is Saturday, May 18.  The event is free, with a $10.00 lunch fee.  The main page has more details here: http://www.sqlsaturday.com/220/eventhome.aspx

We are also offering pre-conference sessions on Friday, May 17, by 5 world-renowned presenters:


We have an early bird registration price of $119 until noon EST Friday, March 22.  After that the price goes to $149, a STEAL when you compare it to the PASS Summit price. :)

Please click on the links to register and for more information.  You can also follow the hash tag #SQLSatATL on Twitter for more news about this event.

Can't wait to see you all there!

SQL Server: Writing CASE expressions properly when NULLs are involved

posted @ 3/18/2013 9:55 PM by Mladen Prajdić (7 comments)

We’ve all written a CASE expression (yes, it’s an expression and not a statement) or two every now and then. But did you know there are actually 2 formats you can write the CASE expression in? This actually bit me when I was trying to add some new functionality to an old stored procedure. In some rare cases the stored procedure just didn’t work correctly. After a quick look it turned out to be a CASE expression problem when dealing with NULLS.

In the first format we make simple “equals to” comparisons to a value:

SELECT CASE <value>
WHEN <equals this value> THEN <return this>

WHEN <equals this value> THEN <return this>
-- ... more WHEN's here
ELSE <return this>
END


Second format is much more flexible since it allows for complex conditions. USE THIS ONE!

SELECT  CASE
WHEN <value> <compared to> <value> THEN <return this>
WHEN <value> <compared to> <value> THEN <return this>
-- ... more WHEN's here
ELSE <return this>
END

Now that we know both formats and you know which to use (the second one, if that hasn't been clear enough), here's an example of how the first format WILL make your evaluation logic WRONG.

Run the following code for different values of @i. Just comment out any 2 out of 3 “SELECT @i =” statements.

DECLARE @i INT
SELECT  @i = -1   -- first result
SELECT  @i = 55   -- second result
SELECT  @i = NULL -- third result

SELECT  @i AS OriginalValue,

        -- first CASE format. DON'T USE THIS!
        CASE @i
            WHEN -1   THEN '-1'
            WHEN NULL THEN 'We have a NULL!'
            ELSE 'We landed in ELSE'
        END AS DontUseThisCaseFormatValue,

        -- second CASE format. USE THIS!
        CASE
            WHEN @i = -1    THEN '-1'
            WHEN @i IS NULL THEN 'We have a NULL!'
            ELSE 'We landed in ELSE'
        END AS UseThisCaseFormatValue


When the value of @i is -1 everything works as expected, since both formats go into the -1 WHEN branch.

(Result: both columns return '-1')

When the value of @i is 55 everything again works as expected, since both formats go into the ELSE branch.

(Result: both columns return 'We landed in ELSE')

When the value of @i is NULL the problem becomes evident. The first format doesn’t go into the WHEN NULL branch because it makes an equality comparison between two NULLs.
Because NULL is an unknown value, NULL = NULL does not evaluate to TRUE (it evaluates to UNKNOWN). That is why the first format falls through to the ELSE branch, while the second format handles it correctly with the proper IS NULL comparison.

(Result: the first column returns 'We landed in ELSE', the second returns 'We have a NULL!')
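
If you want to see the comparison semantics on their own, here’s a tiny standalone check you can run anywhere:

SELECT CASE WHEN NULL = NULL  THEN 'equal'   ELSE 'not equal (UNKNOWN)' END AS EqualsCheck, -- returns 'not equal (UNKNOWN)'
       CASE WHEN NULL IS NULL THEN 'is null' ELSE 'We landed in ELSE'   END AS IsNullCheck; -- returns 'is null'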

 

Please use the second, more explicit format. Your future self will be very grateful when they don’t have to track down these kinds of bugs.

A Case for Women in Technology

posted @ 3/7/2013 6:49 AM by Denise McInerney (0 comments)

Pragmatic Works and the PASS Women in Tech chapter are co-sponsoring a webinar series featuring women speakers.

I presented a session on “A Case for Women in Technology” explaining why we are all affected by the lack of women studying and working in tech. The recording is available here. And here are the slides from that presentation:

The presentation includes a link to a trailer for an upcoming documentary. This short video makes a good case for why we need more women creating technology.

There are many organizations doing good and important work on this issue. Here are some of them:

Here are the publications I referenced in my slides:

Women in IT: The Facts

Why Diversity Matters

Women in IT: By the Numbers

NCWIT Scorecard

From DBA to Data Analyst

posted @ 2/21/2013 3:57 PM by Denise McInerney (0 comments)

Cross posted from the PASS Blog

There is a lot changing in the data professional’s world these days. More data is being produced and stored. More enterprises are trying to use that data to improve their products and services and understand their customers better. More data platforms and tools seem to be crowding the market.

For a traditional DBA this can be a confusing and perhaps unsettling time. It’s also a time that offers great opportunity for career growth. I speak from personal experience.

We sometimes refer to the “accidental DBA”, the person who finds herself suddenly responsible for managing the database because she has some other technical skills. While it was not accidental, six months ago I was unexpectedly offered a chance to transition out of my DBA role and become a data analyst. I have since come to view this offer as a gift, though at the time I wasn’t quite sure what to do with it.

Throughout my DBA career I’ve gotten support from my PASS friends and colleagues and they were the first ones I turned to for counsel about this new situation. Everyone was encouraging and I received two pieces of valuable advice: first, leverage what I already know about data and second, work to understand the business’ needs.

Bringing the power of data to bear to solve business problems is really the heart of the job. The challenge is figuring out how to do that.

PASS had been the source of much of my technical training as a DBA, so I naturally started there to begin my Business Intelligence education. Once again the Virtual Chapter webinars, local chapter meetings and SQL Saturdays have been invaluable.

I work in a large company where we are fortunate to have some very talented data scientists and analysts. These colleagues have been generous with their time and advice.

I also took a statistics class through Coursera where I got a refresher in statistics and an introduction to the R programming language.

And that’s not the end of the free resources available to someone wanting to acquire new skills. There are many knowledgeable Business Intelligence and Analytics professionals who teach through their blogs. Every day I can learn something new from one of these experts.

Sometimes we plan our next career move and sometimes it just happens. Either way a database professional who follows industry developments and acquires new skills will be better prepared when change comes.

Take the opportunity to learn something about the changing data landscape and attend a Business Intelligence, Business Analytics or Big Data Virtual Chapter meeting. And if you are moving into this new world of data consider attending the PASS Business Analytics Conference in April where you can meet and learn from those who are already on that road.

It’s been said that “the only thing constant is change.” That’s never been more true for the data professional than it is today. But if you are someone who loves data and grasps its potential you are in the right place at the right time.

A few events I’m speaking at in early 2013

posted @ 2/13/2013 1:34 AM by Mladen Prajdić (0 comments)

2013 has started off great and the SQL community is already brimming with events. You can come say hi at some of them. I’ll be glad if you do!

These are the events with dates and locations that I know I’ll be speaking at so far.

 

February 16th: SQL Saturday #198 - Vancouver, Canada

The session I’ll present in Vancouver is

SQL Impossible: Restoring/Undeleting a table

Yes, you read the title right. No, it's not about the usual "one table per partition" and "restore full backup then copy the data over" methods. No, there are no 3rd party tools involved. Just you and your SQL Server. Yes, it's crazy. No, it's not for production purposes. And yes, that's why it's so much fun. Prepare to dive into the world of data pages, log records, deletes, truncates and backups and how it all works together to get your table back from the endless void. Want to know more? Come and see!

This is an advanced level session where we’ll dive into the internals of data pages, transaction log records and page restores.
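
To give a taste of the internals involved, here's a minimal sketch (not necessarily the session's method) of peeking at the log records a DELETE leaves behind, using the undocumented fn_dblog function; the table is created just for the demo, so run it in a scratch database:

CREATE TABLE dbo.UndeleteDemo (Id int PRIMARY KEY, Val varchar(10));
INSERT dbo.UndeleteDemo VALUES (42, 'hello');

BEGIN TRAN;
DELETE dbo.UndeleteDemo WHERE Id = 42;

SELECT [Current LSN], Operation, AllocUnitName, [Transaction ID]
FROM   fn_dblog(NULL, NULL)   -- undocumented: reads the active portion of the log
WHERE  Operation = N'LOP_DELETE_ROWS';

ROLLBACK;                     -- and the deleted row comes right back
SELECT * FROM dbo.UndeleteDemo;
DROP TABLE dbo.UndeleteDemo;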

 

March 8th-9th: SQL Saturday #194 - Exeter, UK

In Exeter I’ll be presenting twice.

On the first day I’ll have a full day precon titled:

From SQL Traces to Extended Events - The next big switch

This pre-con will give you insight into both of the current tracing technologies in SQL Server. The old SQL Trace, which has served us well over the past 10 or so years, is on its way out: its overhead is too high and the detail it produces is no longer enough to deal with today's loads. The new Extended Events are a lightweight tracing mechanism built directly into the SQLOS, giving us information SQL Trace just couldn't. They were designed and built with performance in mind and it shows. Mastering Extended Events requires learning at least one new skill: XML querying.
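
As a rough idea of what that looks like, here's a minimal sketch (the session and object names are made up, not from the pre-con) of an Extended Events session plus the XML querying needed to read its ring buffer target:

CREATE EVENT SESSION LongQueriesDemo ON SERVER
ADD EVENT sqlserver.sql_statement_completed
    (ACTION (sqlserver.sql_text)
     WHERE duration > 1000000)          -- microseconds, i.e. statements over 1 second
ADD TARGET package0.ring_buffer;
GO

ALTER EVENT SESSION LongQueriesDemo ON SERVER STATE = START;
GO

-- The ring_buffer target hands back XML, which is where the XML querying skill comes in:
SELECT  ev.node.value('(action[@name="sql_text"]/value)[1]', 'nvarchar(max)') AS sql_text,
        ev.node.value('(data[@name="duration"]/value)[1]',   'bigint')        AS duration_us
FROM    sys.dm_xe_sessions AS s
JOIN    sys.dm_xe_session_targets AS t ON t.event_session_address = s.address
CROSS APPLY (SELECT CAST(t.target_data AS xml) AS x) AS td
CROSS APPLY td.x.nodes('//RingBufferTarget/event') AS ev(node)
WHERE   s.name = N'LongQueriesDemo';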

The second session I’ll present on Saturday is titled:

SQL Injection from website to SQL Server

SQL Injection is still one of the biggest reasons various websites and applications get hacked. The solution, as everyone tells us, is simple: use SQL parameters. But is that enough? In this session we'll look at how an attacker would go about using SQL Injection to gain access to your database, see its schema and data, take over the server, upload files and do various other mischief on your domain.

This is a fun session that always brings out a few laughs in the audience, because they didn’t realize just what can be done.
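
The core of the "use SQL parameters" advice boils down to something like this minimal sketch (the table dbo.Orders is made up, and the unsafe EXEC is left commented out on purpose):

DECLARE @UserInput sysname = N'spt_monitor''; DROP TABLE dbo.Orders; --';

-- Concatenating user input straight into dynamic SQL is injectable:
DECLARE @unsafe nvarchar(max) =
        N'SELECT name FROM sys.objects WHERE name = N''' + @UserInput + N'''';
-- EXEC (@unsafe);   -- the injected DROP TABLE would run as part of the batch

-- Parameterizing with sp_executesql keeps the input as data, never as code:
EXEC sys.sp_executesql
     N'SELECT name FROM sys.objects WHERE name = @name',
     N'@name sysname',
     @name = @UserInput;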

 

April 23rd-25th: NTK conference - Bled, Slovenia (Slovenian website only)

This is a conference with history. This year marks its 18th year running.

It’s a relatively large IT conference that focuses on various Microsoft technologies like .Net, Azure, SQL Server, Exchange, Security, etc…

The main language of the sessions is Slovenian, but this is slowly changing, so it’s becoming more interesting for foreign attendees. This year it’s happening in the beautiful town of Bled in the Alps. The scenery alone is worth the visit, wouldn’t you agree?

And this year there are quite a few well known speakers present!

My session title isn’t known yet.

 

May 2nd-4th: SQL Bits XI – Nottingham, UK

SQL Bits is the largest SQL Server conference in Europe. It’s a 3-day conference with top speakers and content, all dedicated to SQL Server.

The session I’ll present here is an hour long version of the precon I’ll give in Exeter.

From SQL Traces to Extended Events - The next big switch

The session description is the same as for the Exeter precon, but we'll focus more on how Extended Events work, with only a brief overview of the old SQL Trace architecture.

Whoosh: PASS Board Year 1, Q4

posted @ 1/12/2013 6:14 PM by Denise McInerney (0 comments)

"Whoosh". That's the sound the last quarter of 2012 made as it rushed by. My first year on the PASS Board is complete, and the last three months of it were probably the busiest.

PASS Summit 2012
Much of October was devoted to preparing for Summit. Every Board member and HQ staffer, plus dozens of volunteers, were busy in the run-up to our flagship event. It takes a lot of work to put on the Summit. The community meetings, first-timers program, keynotes, sessions and that fabulous Community Appreciation party are the result of many hours of preparation.


Virtual Chapters at the Summit
With a lot of help from Karla Landrum, Michelle Nalliah, Lana Montgomery and others at HQ, the VCs had a good presence at Summit. We started the week with a VC leaders meeting, where I shared some information about the activities and growth during the first part of the year.

From January - September 2012:

  • The number of VCs increased from 14 to 20
  • VC membership  grew from 55,200 to 80,100
  • Total attendance at VC meetings increased from 1,480 to 2,198
  • Language-based VCs launched as part of PASS Global Growth, including Chinese, Spanish and Portuguese

We also heard from some VC leaders and volunteers. Ryan Adams (Performance VC) shared his tips for successful marketing of VC events. Amy Lewis (Business Intelligence VC) described how the BI chapter has expanded to support PASS' global growth by finding volunteers to organize events at times that are convenient for people in Europe and Australia. Felipe Ferreira (Portuguese language VC) described the experience of building a user group first in Brazil, then expanding to work with Portuguese-speaking data professionals around the world.

Virtual Chapter leaders and volunteers were in evidence throughout Summit, beginning with the Welcome Reception. For the past several years VCs have had an organized presence at this event, signing up new members and advertising their meetings. Many VC leaders also spent time at the Community Zone. This new addition to the Summit proved to be a vibrant spot where new members and volunteers could network with others and find out how to start a chapter or host a SQL Saturday.

Women In Technology
2012 marked the 10th WIT Luncheon held at Summit. I was honored to be asked to be on the panel to discuss the topic "Where Have We Been and Where Are We Going?" The PASS community has come a long way in our understanding of the issues facing women in tech and our support of women in the organization. It was great to hear from panelists Stefanie Higgins and Kevin Kline, who were there at the beginning, as well as Kendra Little and Jen Stirrup, who are part of the progress being made by women in our community today.

Bylaw Changes
The Board spent a good deal of time in 2012 discussing how to move our global growth initiatives forward. An important component of this is a proposed change to how the Board is elected, with some seats representing geographic regions. At the end of December we voted on these proposed bylaw changes, which have been published for review. Member review and feedback is open until February 8. I encourage all members to review these changes and send any feedback to governance@sqlpass.org. In addition to reading the bylaws, I recommend reading Bill Graziano's blog post on the subject.

Business Analytics Conference
At Summit we announced a new event: the PASS Business Analytics Conference. The inaugural event will be April 10-12, 2013 in Chicago. The world of data is changing rapidly. More and more businesses want to extract value and insight from their data. Data professionals who provide these insights or enable others to do so are in demand. The BA Conference offers expert content on predictive analytics, data exploration and visualization, content delivery strategies and more. By holding this new event PASS is participating in important discussions happening in our industry, offering our members more educational value and reaching out to data professionals who are not currently part of our organization.


New Year, New Portfolio
In addition to my work with the Virtual Chapters, I am also now responsible for the 24 Hours of PASS portfolio. Since the first 24HOP of 2013 is scheduled for January 30, we started the transition of the portfolio work from Rob Farley to me right after Summit. Work immediately started to secure speakers for the January event. We have also been evaluating webinar platforms that can be used for 24HOP as well as the Virtual Chapters.

Next Up
24 Hours of PASS: Business Analytics Edition will be held on January 30. I'll be there and will moderate one or two sessions. The 24HOP topics are a sneak peek into the type of content that will be offered at the Business Analytics Conference. I hope to see some of you there.

The Virtual Chapters have hit the ground running in 2013; many of them have events scheduled. The Application Development VC is getting restarted and a new Business Analytics VC will be starting soon. Check out the lineup and join the VCs that interest you. And watch the Events page and Connector for announcements of upcoming meetings.

At the end of January I will be attending a Board meeting in Seattle, and on February 23 I will be at SQL Saturday #177 in Silicon Valley.