TRUNCATE TABLE master..sysdatabases

...and other neat SQL Server tricks

Tuesday, April 08, 2014

T-SQL Tuesday #53-Matt's Making Me Do This!

Hello everyone! It's that time again, time for T-SQL Tuesday, the wonderful blog series started by Adam Machanic (b|t).

This month we are hosted by Matt Velic (b|t) who asks the question, "Why So Serious?", in celebration of April Fool's Day. He asks the contributors for their dirty tricks. And for some reason that escapes me, he and Jeff Verheul (b|t) seem to think I might be able to write about those. Shocked, I am!

Nah, not really. They're absolutely right, this one is gonna be fun!

I took some inspiration from Matt's suggestions, namely Resource Governor and Login Triggers.  I've done some interesting login trigger stuff for a presentation, but nothing yet with Resource Governor. Best way to learn it!

One of my oldest pet peeves is abuse of the sa login. Don't get me wrong, I use it too, but typically only as SQL Agent job owner. It's been a while since I've been stuck with it, but back when I started using SQL Server, EVERY application needed sa to function. It was hard-coded and couldn't be changed. (welllllll, that is if you didn't use a hex editor on the EXE file, but who would do such a thing?)

My standard warning applies: don't run anything on this page in production. In fact, back up whatever server you're testing this on, including the master database. Snapshotting a VM is a good idea. Also make sure you have other sysadmin level logins on that server.

So here's a standard template for a logon trigger to address those pesky sa users:

CREATE TRIGGER SA_LOGIN_PRIORITY ON ALL SERVER 
WITH ENCRYPTION, EXECUTE AS N'sa'
AFTER LOGON AS
IF ORIGINAL_LOGIN()<>N'sa' OR APP_NAME() LIKE N'SQL Agent%' RETURN;

-- interesting stuff goes here

GO


What can you do for "interesting stuff"? Books Online limits itself to merely rolling back the logon, which will throw an error (and alert the person that the logon trigger fired).  That's a good use for logon triggers, but really not tricky enough for this blog.  Some of my suggestions are below:

WAITFOR DELAY '23:59:59';


Or:

EXEC sp_MSforeach_db 'EXEC sp_detach_db ''?'';'


Or:

EXEC msdb.dbo.sp_add_job @job_name=N'`', @enabled=1, @start_step_id=1, @notify_level_eventlog=0, @delete_level=3;
EXEC msdb.dbo.sp_add_jobserver @job_name=N'`', @server_name=@@SERVERNAME;
EXEC msdb.dbo.sp_add_jobstep @job_name=N'`', @step_id=1, @step_name=N'`', @command=N'SHUTDOWN;';
EXEC msdb.dbo.sp_start_job @job_name=N'`';


Really, I don't want to spoil your own exploration; try it yourself!  The thing I really like about these is that they let me promote the idea that "sa is SLOW, sa is BUGGY, don't use sa!".  Before we get into Resource Governor, make sure to drop or disable that logon trigger; the two don't work well in combination. (I had to redo all the following code when SSMS locked up.)
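
For reference, here's a quick way to get rid of it when you're done (this assumes you kept the trigger name from the template above):

-- Drop the server-scoped logon trigger created earlier
DROP TRIGGER SA_LOGIN_PRIORITY ON ALL SERVER;
-- Or keep it around for later mischief and just switch it off:
-- DISABLE TRIGGER SA_LOGIN_PRIORITY ON ALL SERVER;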

Resource Governor is a feature that lets you control how many resources a single session can consume. The main goal is to limit the damage from a runaway query. But we're not here to read about its main goal or normal usage! I'm trying to make people stop using sa BECAUSE IT'S SLOW! Here's how RG can do that:

USE master;
GO

CREATE FUNCTION dbo.SA_LOGIN_PRIORITY()
RETURNS sysname
WITH SCHEMABINDING, ENCRYPTION AS 
BEGIN
RETURN CASE 
    WHEN ORIGINAL_LOGIN()=N'sa' AND APP_NAME() NOT LIKE N'SQL Agent%'
    THEN N'SA_LOGIN_PRIORITY'
    ELSE N'default' END
END
GO

CREATE RESOURCE POOL SA_LOGIN_PRIORITY
WITH (
     MIN_CPU_PERCENT = 0 ,MAX_CPU_PERCENT = 1 
    ,CAP_CPU_PERCENT = 1 ,AFFINITY SCHEDULER = (0)
    ,MIN_MEMORY_PERCENT = 0 ,MAX_MEMORY_PERCENT = 1
--  ,MIN_IOPS_PER_VOLUME = 1 ,MAX_IOPS_PER_VOLUME = 1  -- uncomment for SQL Server 2014
);

CREATE WORKLOAD GROUP SA_LOGIN_PRIORITY
WITH
    ( IMPORTANCE = LOW
      ,REQUEST_MAX_MEMORY_GRANT_PERCENT = 1
      ,REQUEST_MAX_CPU_TIME_SEC = 1
      ,REQUEST_MEMORY_GRANT_TIMEOUT_SEC = 1
      ,MAX_DOP = 1
      ,GROUP_MAX_REQUESTS = 1 )
USING SA_LOGIN_PRIORITY;

ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION=dbo.SA_LOGIN_PRIORITY);
ALTER RESOURCE GOVERNOR RECONFIGURE;


From top to bottom:

  1. Create a classifier function to determine which pool the session should go to. More info on classifier functions.
  2. Create the pool and provide a generous helping of resources for the sa login.
  3. Create the workload group and further prioritize those resources for the sa login.
  4. Apply the classifier function and reconfigure RG to use it.

I have to say this one is a bit sneakier than the logon trigger, not least because you don't get any error messages.  I heartily recommend testing it in Management Studio and clicking around the UI a lot; there's some fun behavior there. And DEFINITELY try it on SQL 2014 with the IO settings included!  You'll notice I made allowances for SQL Agent jobs owned by sa; they'll go into the default workload group.  You can add your own overrides to the classifier function if needed.

Some interesting ideas I didn't have time for but expect you to get to before me:

  • Set up different pools/workgroups with different settings and randomize which one the classifier chooses
  • Do the same but base it on time of day (Books Online example covers this)...
  • Or, which workstation it connects from. This can be modified for certain special people in your office who either don't listen, or are attracted (and attractive) to you. (A rough sketch combining the last two ideas follows this list.)
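
Here's that sketch, just to get you started. The host names are made up, and it assumes a second pool and workload group named SA_LOGIN_PRIORITY_2 created the same way as the ones above:

CREATE FUNCTION dbo.SA_LOGIN_PRIORITY_v2()
RETURNS sysname
WITH SCHEMABINDING, ENCRYPTION AS
BEGIN
RETURN CASE
    -- certain special people get the treatment regardless of login
    WHEN HOST_NAME() IN (N'BOSS-LAPTOP', N'THATONEGUY-PC') THEN N'SA_LOGIN_PRIORITY'
    -- sa gets one pool during business hours, the other after hours
    WHEN ORIGINAL_LOGIN()=N'sa' AND APP_NAME() NOT LIKE N'SQL Agent%'
    THEN CASE WHEN DATEPART(HOUR, GETDATE()) BETWEEN 9 AND 17
         THEN N'SA_LOGIN_PRIORITY' ELSE N'SA_LOGIN_PRIORITY_2' END
    ELSE N'default' END
END
GO

Swap it in with the same ALTER RESOURCE GOVERNOR statements shown earlier.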

And if things go wrong you can always use the following from another sysadmin or Dedicated Admin connection:

ALTER RESOURCE GOVERNOR DISABLE;


That will let you go in and either fix (or drop) the pools, workgroups and classifier function.
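
If you decide to drop them, the teardown goes something like this (order matters: the function can't be dropped while it's registered as the classifier, and the pool can't be dropped while the workload group still uses it):

ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION=NULL);
ALTER RESOURCE GOVERNOR RECONFIGURE;
DROP WORKLOAD GROUP SA_LOGIN_PRIORITY;
DROP RESOURCE POOL SA_LOGIN_PRIORITY;
DROP FUNCTION dbo.SA_LOGIN_PRIORITY;
ALTER RESOURCE GOVERNOR RECONFIGURE;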

So now that you know these types of things are possible, and if you are tired of your team using sa when they shouldn't, I expect you'll enjoy playing with these quite a bit!

Unfortunately, the aforementioned Dedicated Admin Connection kinda poops on the party here.  Books Online for both topics will tell you that the DAC will not fire either feature. So if you have a crafty user who does their research, they can still sneak in with sa and do their bidding without being hampered.

Of course, you can still detect their login via various methods, like a server trace, SQL Server Audit, extended events, and enabling "Audit Successful Logins" on the server.  These all have their downsides: traces take resources, extended events and SQL Audit can't fire off actions, and enabling successful logins will bloat your error log very quickly.  SQL Audit is also limited unless you have Enterprise Edition, and Resource Governor is Enterprise-only.  And WORST OF ALL, these features are all available and visible through the SSMS UI, so even a doofus developer or manager could find them.

Fortunately there are Event Notifications!

Event notifications are becoming one of my favorite features of SQL Server (keep an eye out for more blogs from me about them). They are practically unknown and heinously underutilized.  They are also a great gateway drug to using Service Broker, another great but underutilized feature. Hopefully this will get you to start using them, or at least your enemies in the office will once they read this, and then you'll have to learn them in order to fix things.

So here's the setup:

USE msdb;
GO

CREATE PROCEDURE dbo.SA_LOGIN_PRIORITY_act
WITH ENCRYPTION AS
DECLARE @x XML, @message nvarchar(max);
RECEIVE @x=CAST(message_body AS XML) FROM SA_LOGIN_PRIORITY_q;
IF @x.value('(//LoginName)[1]','sysname')=N'sa' AND @x.value('(//ApplicationName)[1]','sysname') NOT LIKE N'SQL Agent%'
BEGIN
    -- interesting activation procedure stuff goes here
END
GO

CREATE QUEUE SA_LOGIN_PRIORITY_q 
    WITH STATUS=ON, RETENTION=OFF,
    ACTIVATION (PROCEDURE_NAME=dbo.SA_LOGIN_PRIORITY_act, MAX_QUEUE_READERS=1, EXECUTE AS OWNER);

CREATE SERVICE SA_LOGIN_PRIORITY_s ON QUEUE SA_LOGIN_PRIORITY_q([http://schemas.microsoft.com/SQL/Notifications/PostEventNotification]);

CREATE EVENT NOTIFICATION SA_LOGIN_PRIORITY_en ON SERVER WITH FAN_IN
FOR AUDIT_LOGIN
TO SERVICE N'SA_LOGIN_PRIORITY_s', N'current database'
GO


From top to bottom:

  1. Create activation procedure for event notification queue.
  2. Create queue to accept messages from event notification, and activate the procedure to process those messages when received.
  3. Create service to send messages to that queue.
  4. Create an event notification on AUDIT_LOGIN events that sends messages to that service (a quick sanity check follows this list).
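
Before wiring in anything destructive, it's worth checking that the plumbing exists. Don't expect to see messages sitting in the queue, though; activation eats them as they arrive:

SELECT * FROM sys.server_event_notifications;
SELECT name, is_activation_enabled, activation_procedure 
FROM msdb.sys.service_queues WHERE name=N'SA_LOGIN_PRIORITY_q';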

I placed this in msdb as it is an available system database and already has Service Broker enabled by default. You should change this to another database if you can guarantee it won't get dropped.

So what to put in place for "interesting activation procedure code"?  Hmmm, so far I haven't addressed Matt's suggestion of writing a lengthy script to send an annoying message:

SET @message=@x.value('(//HostName)[1]','sysname') + 
N' tried to log in to server ' + @x.value('(//ServerName)[1]','sysname') + 
N' as SA at ' + @x.value('(//StartTime)[1]','sysname') + 
N' using the ' + @x.value('(//ApplicationName)[1]','sysname') + 
N' program. That''s why you''re getting this message and the attached pornography which' + 
N' is bloating your inbox and violating company policy, among other things. If you know' + 
N' this person you can go to their desk and hit them, or use the following SQL to end their session:

KILL ' + @x.value('(//SPID)[1]','sysname') +   
N';

Hopefully they''re in the middle of a huge query that they need to finish right away.'

EXEC msdb.dbo.sp_send_dbmail @recipients=N'CompanyWideDistroList@yourcompany.com', @subject=N'SA Login Alert', @query_result_width=32767, 
@body=@message, @query=N'EXEC sp_readerrorlog;', @attach_query_result_as_file=1, @query_attachment_filename=N'UtterlyGrossPorn_SeriouslyDontOpenIt.jpg'

I'm not sure I'd call that a lengthy script, but the attachment should get pretty big, and I'm sure the email admins will love storing multiple copies of it.  The nice thing is that this also fires on Dedicated Admin connections! You can even identify DAC connections from the event data returned, I leave that as an exercise for you. You can use that info to change the action taken by the activation procedure, and since it's a stored procedure, it can pretty much do anything!
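
OK, one spoiler for the homework: here's a sketch of DAC detection you could drop into the activation procedure, using the SPID from the event data and the DAC endpoint name in sys.endpoints (adjust to taste):

DECLARE @spid int = @x.value('(//SPID)[1]','int');
IF EXISTS (SELECT 1 FROM sys.dm_exec_connections c
    JOIN sys.endpoints e ON e.endpoint_id=c.endpoint_id
    WHERE c.session_id=@spid AND e.name=N'Dedicated Admin Connection')
    SET @message=@message + N' And they used the Dedicated Admin Connection to do it. Sneaky!';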

Except KILL the SPID, or SHUTDOWN the server directly.  I'm still working on those.

posted @ Tuesday, April 08, 2014 5:18 PM | Feedback (1) | Filed Under [ Coolness ]

Tuesday, September 17, 2013

Full-text Indexing Books Online

While preparing for a recent SQL Saturday presentation, I was struck by a crazy idea (shocking, I know): Could someone import the content of SQL Server Books Online into a database and apply full-text indexing to it?  The answer is yes, and it's really quite easy to do.

The first step is finding the installed help files.  If you have SQL Server 2012, BOL is installed under the Microsoft Help Library.  You can find the install location by opening SQL Server Books Online and clicking the gear icon for the Help Library Manager.  When the new window pops up, click the Settings link and you'll see the path under Library Location.

Once you navigate to that path you'll have to drill down a little further, to C:\ProgramData\Microsoft\HelpLibrary\content\Microsoft\store.  This is where the help file content is kept if you downloaded it for offline use.

Depending on which products you've downloaded help for, you may see a few hundred files.  Fortunately they're named well and you can easily find the "SQL_Server_Denali_Books_Online_" files.  We are interested in the .MSHC files only, and can skip the Installation and Developer Reference files.

Despite the .MSHC extension, these files are compressed with the standard Zip format, so your favorite archive utility (WinZip, 7Zip, WinRar, etc.) can open them.  When you do, you'll see a few thousand files in the archive.  We are only interested in the .htm files, but there's no harm in extracting all of them to a folder.  7zip provides a command-line utility and the following will extract to a D:\SQLHelp folder previously created:

7z e -oD:\SQLHelp "C:\ProgramData\Microsoft\HelpLibrary\content\Microsoft\store\SQL_Server_Denali_Books_Online_B780_SQL_110_en-us_1.2.mshc" *.htm

Well that's great Rob, but how do I put all those files into a full-text index?

I'll tell you in a second, but first we have to set up a few things on the database side.  I'll be using a database named Explore (you can certainly change that) and the following setup is a fragment of the script I used in my presentation:

USE Explore;
GO
CREATE SCHEMA help AUTHORIZATION dbo;
GO

-- Create default fulltext catalog for later FT indexes
CREATE FULLTEXT CATALOG FTC AS DEFAULT;
GO

CREATE TABLE help.files(file_id int not null IDENTITY(1,1) 
    CONSTRAINT PK_help_files PRIMARY KEY,
    path varchar(256) not null CONSTRAINT UNQ_help_files_path UNIQUE,
    doc_type varchar(6) DEFAULT('.xml'),
    content varbinary(max) not null);

CREATE FULLTEXT INDEX ON help.files(content TYPE COLUMN doc_type LANGUAGE 1033) 
    KEY INDEX PK_help_files;

This will give you a table, default full-text catalog, and full-text index on that table for the content you're going to insert.  I'll be using the command line again for this; it's the easiest method I know:

for %a in (D:\SQLHelp\*.htm) do sqlcmd -S. -E -d Explore -Q"set nocount on;insert help.files(path,content) select '%a', cast(c as varbinary(max)) from openrowset(bulk '%a', SINGLE_CLOB) as c(c)"

You'll need to copy and run that as one line in a command prompt.  I'll explain what this does while you run it and watch several thousand files get imported:

The "for" command allows you to loop over a collection of items.  In this case we want all the .htm files in the D:\SQLHelp folder.  For each file it finds, it will assign the full path and file name to the %a variable.  In the "do" clause, we'll specify another command to be run for each iteration of the loop.  I make a call to "sqlcmd" in order to run a SQL statement.  I pass in the name of the server (-S.), where "." represents the local default instance. I specify -d Explore as the database, and -E for trusted connection.  I then use -Q to run a query that I enclose in double quotes.

The query uses OPENROWSET(BULK…SINGLE_CLOB) to open the file as a data source, and to treat it as a single character large object.  In order for full-text indexing to work properly, I have to convert the text content to varbinary. I then INSERT these contents along with the full path of the file into the help.files table created earlier.  This process continues for each file in the folder, creating one new row in the table.

And that's it! 5 SQL Statements and 2 command line statements to unzip and import SQL Server Books Online!  In case you're wondering why I didn't use FILESTREAM or FILETABLE, it's simply because I haven't learned them…yet. I may return to this blog after I figure that out and update it with the steps to do so.  I believe that will make it even easier.

In the spirit of exploration, I'll leave you to work on some fulltext queries of this content.  I also recommend playing around with the sys.dm_fts_xxxx DMVs (I particularly like sys.dm_fts_index_keywords, it's pretty interesting).  There are additional example queries in the download material for my presentation linked above.
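
To prime the pump, here are a couple of starter queries against the table created above (the search phrase is just an example):

-- Find help topics mentioning a phrase
SELECT path FROM help.files
WHERE CONTAINS(content, N'"resource governor"');

-- Most common keywords in the full-text index
SELECT TOP (20) display_term, document_count
FROM sys.dm_fts_index_keywords(DB_ID(N'Explore'), OBJECT_ID(N'help.files'))
ORDER BY document_count DESC;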

Many thanks to Kevin Boles (t) for his advice on (re)checking the content of the help files.  Don't let that .htm extension fool you! The 2012 help files are actually XML, and you'd need to specify '.xml' in your document type column in order to extract the full-text keywords.  (You probably noticed this in the default definition for the doc_type column.)  You can query sys.fulltext_document_types to get a complete list of the types that can be full-text indexed.
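
If you did import with a different doc_type (say '.htm'), flipping the column and kicking off a new population should sort it out; this is a sketch I haven't tested against every type mapping:

UPDATE help.files SET doc_type='.xml';
ALTER FULLTEXT INDEX ON help.files START FULL POPULATION;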

I also need to thank Hilary Cotter for giving me the original idea. I believe he used MSDN content in a full-text index for an article from waaaaaaaaaaay back, that I can't find now, and had forgotten about until just a few days ago.  He is also co-author of Pro Full-Text Search in SQL Server 2008, which I highly recommend.  He also has some FTS articles on Simple Talk:

http://www.simple-talk.com/sql/learn-sql-server/sql-server-full-text-search-language-features/
http://www.simple-talk.com/sql/learn-sql-server/sql-server-full-text-search-language-features,-part-2/

posted @ Tuesday, September 17, 2013 6:33 PM | Feedback (1) | Filed Under [ Coolness ]

Wednesday, June 19, 2013

Criminals and Other Illegal Characters

SQLTeam's favorite Slovenian blogger Mladen (b | t) had an interesting question on Twitter: http://www.twitter.com/MladenPrajdic/status/347057950470307841

I liked Kendal Van Dyke's (b | t) reply: http://twitter.com/SQLDBA/status/347058908801667072

And he was right!  This is one of those pretty-useless-but-sounds-interesting propositions that I've based all my presentations on, and most of my blog posts.

If you read all the replies you'll see a lot of good suggestions.  I particularly like Aaron Bertrand's (b | t) idea of going into the Unicode character set, since there are over 65,000 characters available.  But how to find an illegal character?  Detective work?

I'm working on the premise that if SQL Server rejects a character in a name, it will throw an error.  So all we have to do is generate every Unicode character, rename a database to that character, and catch any errors.

It turns out that dynamic SQL can lend a hand here:

IF DB_ID(N'a') IS NULL CREATE DATABASE [a];

DECLARE @c INT=1, @sql NVARCHAR(MAX)=N'', @err NVARCHAR(MAX)=N'';
WHILE @c<65536 BEGIN
    BEGIN TRY
        SET @sql=N'alter database ' + 
            QUOTENAME(CASE WHEN @c=1 THEN N'a' ELSE NCHAR(@c-1) END) +
            N' modify name=' + QUOTENAME(NCHAR(@c));
        RAISERROR(N'*** Trying %d',10,1,@c) WITH NOWAIT;
        EXEC(@sql);
        SET @c+=1;
    END TRY
    BEGIN CATCH
        SET @err=ERROR_MESSAGE();
        RAISERROR(N'Ooops - %d - %s',10,1,@c,@err) WITH NOWAIT;
        BREAK;
    END CATCH
END
SET @sql=N'alter database ' + QUOTENAME(NCHAR(@c-1)) + N' modify name=[a]';
EXEC(@sql);

The script creates a dummy database "a" if it doesn't already exist, and only tests single characters as a database name.  If you have databases with single character names then you shouldn't run this on that server.

It takes a few minutes to run, but if you do you'll see that no errors are thrown for any of the characters.  It seems that SQL Server will accept any character, no matter where they're from.  (Well, there's one, but I won't tell you which. Actually there's 2, but one of them requires some deep existential thinking.)

The output is also interesting, as quite a few codes do some weird things there.  I'm pretty sure it's due to the font used in SSMS for the messages output window; not all characters are available in it.  If you run it using the SQLCMD utility, with the -o switch to output to a file and -u for Unicode output, you can open the file in Notepad or another text editor and see the whole thing.

I'm not sure what character I'd recommend to answer Mladen's question.  I think the standard tab (ASCII 9) is fine.  There are also several dedicated separator characters in the original ASCII character set (decimal 28-31).

But of all the choices available in Unicode whitespace, I think my favorite would be the Mongolian Vowel Separator.  Or maybe the zero-width space. (that'll be fun to print!)  And since this is Mladen we're talking about, here's a good selection of "intriguing" characters he could use.
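
If you'd like to see (or rather, not see) those characters for yourself, the code points below are taken from the Unicode charts:

SELECT NCHAR(9) AS [Tab],
       NCHAR(6158) AS [Mongolian Vowel Separator],  -- U+180E
       NCHAR(8203) AS [Zero-Width Space];           -- U+200B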

posted @ Wednesday, June 19, 2013 4:50 PM | Feedback (0) | Filed Under [ Coolness Hotness ]

Tuesday, May 07, 2013

SQL Saturday #220 - Atlanta - Pre-Con Scholarship Winners!

A few weeks ago, AtlantaMDF offered scholarships for each of our upcoming Pre-conference sessions at SQL Saturday #220. We would like to congratulate the winners!

David Thomas SQL Server Security http://sqlsecurity.eventbrite.com/
Vince Bible Surfing the Multicore Wave: Processors, Parallelism, and Performance http://surfmulticore.eventbrite.com/
Mostafa Maged Languages of BI http://languagesofbi.eventbrite.com/
Daphne Adams Practical Self-Service BI with PowerPivot for Excel http://selfservicebi.eventbrite.com/
Tim Lawrence The DBA Skills Upgrade Toolkit http://dbatoolkit.eventbrite.com/

Thanks to everyone who applied! And once again we must thank Idera for their generous sponsorship, and Bobby Dimmick (w|t) and Brian Kelley (w|t) of Midlands PASS for the time and effort they put into judging all the applicants.

Don't forget, there's still time to attend the Pre-Cons on May 17, 2013! Click on the EventBrite links for more details and to register!


posted @ Tuesday, May 07, 2013 3:18 PM | Filed Under [ Coolness Hotness ]

Monday, April 22, 2013

SQL Saturday #220 - Atlanta - Pre-Conference Scholarships!


We Want YOU…To Learn!

AtlantaMDF and Idera are teaming up to find a few good people. If you are:

  • A student looking to work in the database or business intelligence fields
  • A database professional who is between jobs or wants a better one
  • A developer looking to step up to something new
  • On a limited budget and can’t afford professional SQL Server training
  • Able to attend training from 9 to 5 on May 17, 2013

AtlantaMDF is presenting 5 Pre-Conference Sessions (pre-cons) for SQL Saturday #220! And thanks to Idera’s sponsorship, we can offer one free ticket to each of these sessions to eligible candidates! That means one scholarship per Pre-Con!

One Recipient Each will Attend:

Denny Cherry: SQL Server Security http://sqlsecurity.eventbrite.com/
Adam Machanic: Surfing the Multicore Wave: Processors, Parallelism, and Performance http://surfmulticore.eventbrite.com/
Stacia Misner: Languages of BI http://languagesofbi.eventbrite.com/
Bill Pearson: Practical Self-Service BI with PowerPivot for Excel http://selfservicebi.eventbrite.com/
Eddie Wuerch: The DBA Skills Upgrade Toolkit http://dbatoolkit.eventbrite.com/

If you are interested in attending these pre-cons send an email by April 30, 2013 to AtlantaMDFPrecons@gmail.com and tell us:

  • Why you are a good candidate to receive this scholarship
  • Which sessions you’d like to attend, and why (list multiple sessions in order of preference)
  • What the session will teach you and how it will help you achieve your goals

The emails will be evaluated by the good folks at Midlands PASS in Columbia, SC. The recipients will be notified by email and announcements made on May 6, 2013.

GOOD LUCK!



P.S. - Don't forget that SQLSaturday #220 offers free* training in addition to the pre-cons!
You can find more information about SQL Saturday #220 at http://www.sqlsaturday.com/220/eventhome.aspx.
View the scheduled sessions at http://www.sqlsaturday.com/220/schedule.aspx and register for them at http://www.sqlsaturday.com/220/register.aspx.

* Registration charges a $10 fee to cover lunch expenses.

posted @ Monday, April 22, 2013 3:59 PM | Filed Under [ Coolness Hotness ]

Monday, April 01, 2013

Database Mirroring on SQL Server Express Edition

Like most SQL Server users I'm rather frustrated by Microsoft's insistence on making the really cool features only available in Enterprise Edition.  And it really doesn't help that they changed the licensing for SQL 2012 to be core-based, so now it's like 4 times as expensive!  It almost makes you want to go with Oracle.  That, and a desire to have Larry Ellison do things to your orifices.

And since they've introduced Availability Groups, and marked database mirroring as deprecated, you'd think they'd make mirroring available in all editions.  Alas…they don't…officially anyway.  Thanks to my constant poking around in places I'm not "supposed" to, I've discovered the low-level code that implements database mirroring, and found that it's available in all editions!

It turns out that the query processor in all SQL Server editions prepends a simple check before every edition-specific DDL statement:

IF CAST(SERVERPROPERTY('Edition') as nvarchar(max)) NOT LIKE '%e%e%e% Edition%'
print 'Lame'
else
print 'Cool'

If that condition is true, the statement fails. (The print statements are just placeholders.)  Go ahead and test it on Standard, Workgroup, and Express editions and compare against an Enterprise or Developer edition instance (which support everything).

Once again thanks to Argenis Fernandez (b | t) and his awesome sessions on using Sysinternals, I was able to watch the exact process SQL Server performs when setting up a mirror.  Surprisingly, it's not actually implemented in SQL Server!  Some of it is, but that's something of a smokescreen, the real meat of it is simple filesystem primitives.

The NTFS filesystem supports links, both hard links and symbolic, so that you can create two entries for the same file in different directories and/or different names.  You can create them using the MKLINK command in a command prompt:

mklink /D D:\SkyDrive\Data D:\Data
mklink /D D:\SkyDrive\Log D:\Log

This creates a symbolic link from my data and log folders to my Skydrive folder.  Any file saved in either location will instantly appear in the other.  And since my Skydrive will be automatically synchronized with the cloud, any changes I make will be copied instantly (depending on my internet bandwidth of course).

So what does this have to do with database mirroring?  Well, it seems that the mirroring endpoint that you have to create between mirror and principal servers is really nothing more than a Skydrive link.  Although it doesn't actually use Skydrive, it performs the same function.  So in effect, the following statement:

ALTER DATABASE Mir SET PARTNER='TCP://MyOtherServer.domain.com:5022'

Is turned into:

mklink /D "D:\Data" "\\MyOtherServer.domain.com\5022$"

The 5022$ "port" is actually a hidden system directory on the principal and mirror servers. I haven't quite figured out how the log files are included in this, or why you have to SET PARTNER on both principal and mirror servers, except maybe that mklink has to do something special when linking across servers.  I couldn't get the above statement to work correctly, but found that doing mklink to a local Skydrive folder gave me similar functionality.

To wrap this up, all you have to do is the following:

  1. Install Skydrive on both SQL Servers (principal and mirror) and set the local Skydrive folder (D:\SkyDrive in these examples)
  2. On the principal server, run mklink /D on the data and log folders to point to SkyDrive: mklink /D D:\SkyDrive\Data D:\Data
  3. On the mirror server, run the complementary linking: mklink /D D:\Data D:\SkyDrive\Data
  4. Create your database and make sure the files map to the principal data and log folders (D:\Data and D:\Log)
  5. Voilà! Your databases are kept in sync on multiple servers!

One wrinkle you will encounter is that the mirror server will show the data and log files, but you won't be able to attach them to the mirror SQL instance while they are attached to the principal. I think this is a bug in SkyDrive, but as it turns out that's fine: you can't access a mirror while it's hosted on the principal either.  So you don't quite get automatic failover, but you can attach the files to the mirror if the principal goes offline.  It's also not exactly synchronous, but it's better than nothing, and easier than either replication or log shipping, with a lot less latency.

I will end this with the obvious "not supported by Microsoft" and "Don't do this in production without an updated resume" spiel that you should by now assume with every one of my blog posts, especially considering the date.

posted @ Monday, April 01, 2013 7:00 PM | Feedback (0) | Filed Under [ Coolness Stupidness Oddball ]

Wednesday, March 20, 2013

SQL Saturday #220 Atlanta May 2013!

If you love SQL Server training and are near the Atlanta area, or just love us so much you're willing to travel here, please come join us for:

SQL SATURDAY #220!

The main event is Saturday, May 18.  The event is free, with a $10.00 lunch fee.  The main page has more details here: http://www.sqlsaturday.com/220/eventhome.aspx

We are also offering pre-conference sessions on Friday, May 17, by 5 world-renowned presenters:

Denny Cherry: SQL Server Security http://sqlsecurity.eventbrite.com/
Adam Machanic: Surfing the Multicore Wave: Processors, Parallelism, and Performance http://surfmulticore.eventbrite.com/
Stacia Misner: Languages of BI http://languagesofbi.eventbrite.com/
Bill Pearson: Practical Self-Service BI with PowerPivot for Excel http://selfservicebi.eventbrite.com/
Eddie Wuerch: The DBA Skills Upgrade Toolkit http://dbatoolkit.eventbrite.com/

We have an early bird registration price of $119 until noon EST Friday, March 22.  After that the price goes to $149, a STEAL when you compare it to the PASS Summit price. :)

Please click on the links to register and for more information.  You can also follow the hash tag #SQLSatATL on Twitter for more news about this event.

Can't wait to see you all there!

posted @ Wednesday, March 20, 2013 6:52 PM | Filed Under [ Coolness Hotness ]

Monday, September 10, 2012

SQL Saturday #162 Cambridge

Despite the efforts of American Airlines, this past weekend I attended the first SQL Saturday in the UK!  Hosted by the SQLCambs Chapter of PASS and organized by Mark (b|t) & Lorraine Broadbent, ably assisted by John Martin (b|t), Mark Pryce-Maher (b|t) and other folks whose names I've unfortunately forgotten, it was held at the Crowne Plaza Hotel, which is completely surrounded by Cambridge University.

On Friday, they presented 3 pre-conference sessions given by the brilliant American Cloud & DBA Guru, Buck Woody (b|t), the brilliant Danish SQL Server Internals Guru, Mark Rasmussen (b|t), and the brilliant Scottish Business Intelligence Guru and recent Outstanding PASS Volunteer, Jen Stirrup (b|t).  While I would have loved to attend any of their pre-cons (having seen them present several times already), finances and American Airlines ultimately made that impossible.  But not to worry, I caught up with them during the regular sessions and at the speaker dinner.  And I got back the money they all owed me.  (Actually, I owed Mark some money.)

The schedule was jam-packed: even with only 4 tracks, there were 8 regular slots, a lunch session for sponsor presentations, and a 15-minute keynote given by Buck Woody, who, besides giving an excellent history of SQL Server at Microsoft (and before), also explained the source of the "unknown contact" image that appears in Outlook.  Hint: it's not Buck himself.

Amazingly, (and against all better judgment, ;) they allowed me to present at SQL Saturday 162!  I did a 5 minute Lightning Talk on Regular Expressions in SSMS.  I then did a regular 50 minute session on Constraints.  You can download the content for the regular session at that link, and for the regular expression presentation here.

I had a great time and had a great audience for both of my sessions.  You would never have guessed this was the first event for the organizers, everything went very smoothly, especially for the number of attendees and the relative smallness of the space.  The event sponsors also deserve a lot of credit for making themselves fit in a small area and for staying through the entire event until the giveaways at the very end.

Overall this was one of the best SQL Saturdays I've ever attended and I have to congratulate Mark B, Lorraine, John, Mark P-M, and all the volunteers and speakers for making this an astoundingly hard act to follow!  Well done!

posted @ Monday, September 10, 2012 6:21 PM | Feedback (0) | Filed Under [ Coolness ]

Tuesday, September 04, 2012

T-SQL Tuesday #34: Help! I Need Somebody!

Welcome everyone to T-SQL Tuesday Episode 34!  When last we tuned in, Mike Fal (b|t) hosted Trick Shots.  These highlighted techniques or tricks that you figured out on your own which helped you understand SQL Server better.

This month, I'm asking you to look back this past week, year, century, or hour...to a time when you COULDN'T figure it out.  When you were stuck on a SQL Server problem and you had to seek help.

In the beginning...


SQL Server has changed a lot since I started with it.  <Cranky Old Guy> Back in my day, Books Online was neither.  There were no blogs. Google was the third-place search site. There were perhaps two or three community forums where you could ask questions.  (Besides the Microsoft newsgroups...which you had to access with Usenet.  And endure the wrath of...Celko.)  Your "training" was reading a book, made from real dead trees, that you bought from your choice of brick-and-mortar bookstore. And except for your local user groups, there were no conferences, seminars, SQL Saturdays, or any online video hookups where you could interact with a person. You'd have to call Microsoft Support...on the phone...a LANDLINE phone.  And none of this "SQL Family" business!</Cranky Old Guy>

Even now, with all these excellent resources available, it's still daunting for a beginner to seek help for SQL Server.  The product is roughly 1247.4523 times larger than it was 15 years ago, and it's simply impossible to know everything about it.*  So whether you are a beginner, or a seasoned pro of over a decade's experience, what do you do when you need help on SQL Server?

That's so meta...


In the spirit of offering help, here are some suggestions for your topic:

  1. Tell us about a person or SQL Server community that has been helpful to you.  It can be about a technical problem, or not, e.g. someone who volunteered for your local SQL Saturday.  Sing their praises!  Let the world know who they are!
  2. Do you have any tricks for using Books Online?  Do you use the locally installed product, or are you completely online with BOL/MSDN/Technet, and why?
  3. If you've been using SQL Server for over 10 years, how has your help-seeking changed? Are you using Twitter, StackOverflow, MSDN Forums, or another resource that didn't exist when you started? What made you switch?
  4. Do you spend more time helping others than seeking help? What motivates you to help, and how do you contribute?
  5. Structure your post along the lyrics to The Beatles song Help! Audio or video renditions are particularly welcome! Lyrics must include reference to SQL Server terminology or community, and performances must be in your voice or include you playing an instrument.

These are just suggestions, you are free to write whatever you like.  Bonus points if you can incorporate ALL of these into a single post.  (Or you can do multiple posts, we're flexible like that.)  Help us help others by showing how others helped you!

Legalese, Your Rights, Yada yada...


If you would like to participate in T-SQL Tuesday please be sure to follow the rules below:

  • Your blog post must be published between Tuesday, September 11, 2012 00:00:00 GMT and Wednesday, September 12, 2012 00:00:00 GMT.
  • Include the T-SQL Tuesday logo (above) and hyperlink it back to this post.
  • If you don’t see your post in trackbacks, add the link to the comments below.

If you are on Twitter please tweet your blog using the #TSQL2sDay hashtag.  I can be contacted there as @sql_r, in case you have questions or problems with comments/trackback.  I'll have a follow-up post listing all the contributions as soon as I can.

Thank you all for participating, and special thanks to Adam Machanic (b|t) for all his help and for continuing this series!

posted @ Tuesday, September 04, 2012 11:03 AM | Feedback (10) | Filed Under [ Coolness ]

Tuesday, August 14, 2012

T-SQL Tuesday #33: Trick Shots: Undocumented, Underdocumented, and Unknown Conspiracies!

Mike Fal (b | t) is hosting this month's T-SQL Tuesday on Trick Shots.  I love this choice because I've been preoccupied with sneaky/tricky/evil SQL Server stuff for a long time and have been presenting on it for the past year.  Mike's directives were "Show us a cool trick or process you developed…It doesn’t have to be useful", which most of my blogging definitely fits, and "Tell us what you learned from this trick…tell us how it gave you insight in to how SQL Server works", which is definitely a new concept.  I've done a lot of reading and watching on SQL Server Internals and even attended training, but sometimes I need to go explore on my own, using my own tools and techniques.  It's an itch I get every few months, and, well, it sure beats workin'.

I've found some people to be intimidated by SQL Server's internals, and I'll admit there are A LOT of internals to keep track of, but there are tons of excellent resources that clearly document most of them, and show how knowing even the basics of internals can dramatically improve your database's performance.  It may seem like rocket science, or even brain surgery, but you don't have to be a genius to understand it.

Although being an "evil genius" can help you learn some things they haven't told you about. ;)

This blog post isn't a traditional "deep dive" into internals, it's more of an approach to find out how a program works.  It utilizes an extremely handy tool from an even more extremely handy suite of tools, Sysinternals.  I'm not the only one who finds Sysinternals useful for SQL Server: Argenis Fernandez (b | t), Microsoft employee and former T-SQL Tuesday host, has an excellent presentation on how to troubleshoot SQL Server using Sysinternals, and I highly recommend it.  Argenis didn't cover the Strings.exe utility, but I'll be using it to "hack" the SQL Server executable (DLL and EXE) files.

Please note that I'm not promoting software piracy or applying these techniques to attack SQL Server via internal knowledge. This is strictly educational and doesn't reveal any proprietary Microsoft information.  And since Argenis works for Microsoft and demonstrated Sysinternals with SQL Server, I'll just let him take the blame for it. :P (The truth is I've used Strings.exe on SQL Server before I ever met Argenis.)

Once you download and install Strings.exe you can run it from the command line.  For our purposes we'll want to run this in the Binn folder of your SQL Server instance (I'm referencing SQL Server 2012 RTM):

cd "C:\Program Files\Microsoft SQL Server\MSSQL11\MSSQL\Binn"
C:\Program Files\Microsoft SQL Server\MSSQL11\MSSQL\Binn> strings *sql*.dll > sqldll.txt
C:\Program Files\Microsoft SQL Server\MSSQL11\MSSQL\Binn> strings *sql*.exe > sqlexe.txt

I've limited myself to DLLs and EXEs that have "sql" in their names.  There are quite a few more but I haven't examined them in any detail. (Homework assignment for you!)

If you run this yourself you'll get 2 text files, one with all the extracted strings from every SQL DLL file, and the other with the SQL EXE strings.  You can open these in Notepad, but you're better off using Notepad++, EditPad, Emacs, Vim or another more powerful text editor, as these will be several megabytes in size.

And when you do open it…you'll find…a TON of gibberish.  (If you think that's bad, just try opening the raw DLL or EXE file in Notepad.  And by the way, don't do this in production, or even on a running instance of SQL Server.)  Even if you don't clean up the file, you can still use your editor's search function to find a keyword like "SELECT" or some other item you expect to be there.  As dumb as this sounds, I sometimes spend my lunch break just scanning the raw text for anything interesting.  I'm boring like that.

Sometimes though, having these files available can lead to some incredible learning experiences.  For me the most recent time was after reading Joe Sack's post on non-parallel plan reasons.  He mentions a new SQL Server 2012 execution plan element called NonParallelPlanReason, and demonstrates a query that generates "MaxDOPSetToOne".  Joe (formerly on the Microsoft SQL Server product team, so he knows this stuff) mentioned that this new element was not currently documented and tried a few more examples to see what other reasons could be generated.

Since I'd already run Strings.exe on the SQL Server DLLs and EXE files, it was easy to run grep/find/findstr for MaxDOPSetToOne on those extracts.  Once I found which files it belonged to (sqlmin.dll) I opened the text to see if the other reasons were listed.  As you can see in my comment on Joe's blog, there were about 20 additional non-parallel reasons.  And while it's not "documentation" of this underdocumented feature, the names are pretty self-explanatory about what can prevent parallel processing. I especially like the ones about cursors – more ammo! - and am curious about the PDW compilation and Cloud DB replication reasons.

One reason completely stumped me: NoParallelHekatonPlan.  What the heck is a hekaton?  Google and Wikipedia were vague, and the top results were not in English.  I found one reference to Greek, stating "hekaton" can be translated as "hundredfold"; with a little more Wikipedia-ing this leads to hecto, the prefix for "one hundred" as a unit of measure.  I'm not sure why Microsoft chose hekaton for such a plan name, but having already learned some Greek I figured I might as well dig some more in the DLL text for hekaton.  Here's what I found:

hekaton_slow_param_passing
Occurs when a Hekaton procedure call dispatch goes to slow parameter passing code path
The reason why Hekaton parameter passing code took the slow code path
hekaton_slow_param_pass_reason

sp_deploy_hekaton_database
sp_undeploy_hekaton_database
sp_drop_hekaton_database
sp_checkpoint_hekaton_database
sp_restore_hekaton_database

e:\sql11_main_t\sql\ntdbms\hekaton\sqlhost\sqllang\hkproc.cpp
e:\sql11_main_t\sql\ntdbms\hekaton\sqlhost\sqllang\matgen.cpp
e:\sql11_main_t\sql\ntdbms\hekaton\sqlhost\sqllang\matquery.cpp
e:\sql11_main_t\sql\ntdbms\hekaton\sqlhost\sqllang\sqlmeta.cpp
e:\sql11_main_t\sql\ntdbms\hekaton\sqlhost\sqllang\resultset.cpp

Interesting!  The first 4 entries (in red) mention parameters and "slow code".  Could this be the foundation of the mythical DBCC RUNFASTER command?  Have I been passing my parameters the slow way all this time?

And what about those sp_xxxx_hekaton_database procedures (in blue)? Could THEY be the secret to a faster SQL Server? Could they promise a "hundredfold" improvement in performance?  Are these special, super-undocumented DIB (databases in black)?

I decided to look in the SQL Server system views for any objects with hekaton in the name, or references to them, in hopes of discovering some new code that would answer all my questions:

SELECT name FROM sys.all_objects 
    WHERE name LIKE '%hekaton%'
SELECT name FROM sys.all_objects 
    WHERE object_definition(OBJECT_ID) LIKE '%hekaton%'

Which revealed:

name
------------------------

(0 row(s) affected)

name
------------------------
sp_createstats
sp_recompile
sp_updatestats

(3 row(s) affected)

Hmm.  Well that didn't find much.  Looks like these procedures are seriously undocumented, unknown, perhaps forbidden knowledge. Maybe a part of some unspeakable evil?

(No, I'm not paranoid, I just like mysteries and thought that punching this up with that kind of thing might keep you reading.  I know I'd fall asleep without it.)

OK, so let's check out those 3 procedures and see what they reveal when I search for "Hekaton":

sp_createstats:

-- filter out local temp tables, Hekaton tables, and tables for which current user has no permissions
-- Note that OBJECTPROPERTY returns NULL on type="IT" tables, thus we only call it on type='U' tables

OK, that's interesting, let's go looking down a little further:

((@table_type<>'U') or (0 = OBJECTPROPERTY(@table_id, 'TableIsInMemory'))) and -- Hekaton table

Wellllll, that tells us a few new things:

  1. There's such a thing as Hekaton tables (UPDATE: I'm not the only one to have found them!)
  2. They are not standard user tables and probably not in memory. (UPDATE: I misinterpreted this because I didn't read all the code when I wrote this blog post.)
  3. The OBJECTPROPERTY function has an undocumented TableIsInMemory option

Let's check out sp_recompile:

--  (3) Must not be a Hekaton procedure.

And once again go a little further:

if (ObjectProperty(@objid, 'IsExecuted') <> 0 AND
    ObjectProperty(@objid, 'IsInlineFunction') = 0 AND
    ObjectProperty(@objid, 'IsView') = 0 AND
    -- Hekaton procedure cannot be recompiled
    -- Make them go through schema version bumping branch, which will fail
        ObjectProperty(@objid, 'ExecIsCompiledProc') = 0)

And now we learn that hekaton procedures also exist, they can't be recompiled, there's a "schema version bumping branch" somewhere, and OBJECTPROPERTY has another undocumented option, ExecIsCompiledProc.  (If you experiment with this you'll find the option returns NULL; I think it only works when called from a system object.)

This is neat! Sadly sp_updatestats doesn't reveal anything new; the comments about hekaton are the same as in sp_createstats.  But we've ALSO discovered undocumented features for the OBJECTPROPERTY function, which we can now search for:

SELECT name, object_definition(OBJECT_ID) FROM sys.all_objects 
    WHERE object_definition(OBJECT_ID) LIKE '%OBJECTPROPERTY(%'

I'll leave that to you as more homework.  I should add that searching the system procedures was recommended long ago by the late, great Ken Henderson, in his Guru's Guide books, as a great way to find undocumented features.  That seems to be really good advice!

Now if you're a programmer/hacker, you've probably been drooling over the last 5 entries for hekaton (in green), because these are the names of source code files for SQL Server!  Does this mean we can access the source code for SQL Server?  As The Oracle suggested to Neo, can we return to The Source???

Actually, no.

Well, maybe a little bit.  While you won't get the actual source code from the compiled DLL and EXE files, you'll get references to source files, debugging symbols, variables and module names, error messages, and even the startup flags for SQL Server.  And if you search for "DBCC" or "CHECKDB" you'll find a really nice section listing all the DBCC commands, including the undocumented ones.  Granted those are pretty easy to find online, but you may be surprised what those web sites DIDN'T tell you! (And neither will I, go look for yourself!)  And as we saw earlier, you'll also find execution plan elements, query processing rules, and who knows what else.  It's also instructive to see how Microsoft organizes their source directories, how various components (storage engine, query processor, Full Text, AlwaysOn/HADR) are split into smaller modules. There are over 2000 source file references, go do some exploring!

So what did we learn?  We can pull strings out of executable files, search them for known items, browse them for unknown items, and use the results to examine internal code to learn even more things about SQL Server.  We've even learned how to use command-line utilities!  We are now 1337 h4X0rz!  (Not really.  I hate that leetspeak crap.)

Although, I must confess I might've gone too far with the "conspiracy" part of this post.  I apologize for that, it's just my overactive imagination.  There's really no hidden agenda or conspiracy regarding SQL Server internals.  It's not The Matrix.  It's not like you'd find anything like that in there:

Attach Matrix Database
DM_MATRIX_COMM_PIPELINES
MATRIXXACTPARTICIPANTS
dm_matrix_agents



Alright, enough of this paranoid ranting!  Microsoft are not really evil!  It's not like they're The Borg from Star Trek:

ALTER FEDERATION DROP
ALTER FEDERATION SPLIT
DROP FEDERATION

#tsql2sday

posted @ Tuesday, August 14, 2012 7:30 PM | Feedback (0) | Filed Under [ Coolness Hotness ]
