With CLUE as (Select * from Random_Thought ORDER BY Common_Sense DESC)

SQL Server thoughts, observations, and comments

Tuesday, April 03, 2012

Unhelpful Help

Up until SQL 2012, I recommended installing Books Online (BOL) anywhere you installed SQL Server.  It made looking up reference information simpler, especially when you were on a server that didn’t have direct Internet access.  That all changed today.  I started the new Help Viewer with a local copy of BOL.  I actually found what I was looking for and closed the app.  Or so I thought.  Then I noticed something.  A little parasite had attached itself to my system. 

 

[screenshot of the leftover Help “agent” process]

 

Yep, the “Help” system left an “agent” behind.  Now I shouldn’t have to tell you that running application helper agents on server platforms is a bad idea.  And it gets worse: there is no way to configure the app so that it does NOT start the parasite agent each time you restart Help.  So the solution becomes “do not install Help on production server platforms.”  Which is pretty unhelpful.

posted @ Tuesday, April 03, 2012 1:42 PM | Feedback (0) |

Wednesday, December 28, 2011

So long and thanks for the fish…

This marks my last post as a SQLPASS Board member.  I learned a lot during my year of service, and I thank everyone involved for this opportunity.  I would especially like to thank the Chapter leaders and Regional Mentors for Virtual Chapters, who (mostly) patiently taught me about Virtual Chapters.  I hope the changes I put in place will help strengthen and grow VCs and PASS going forward.  I would also like to thank everyone who encouraged me to reach beyond my comfort zone and accept a leadership position within the PASS organization. 

My overall principle was to be a good steward of the PASS community.  Could I have done more?  Always. Did I do enough?  I hope so.  But PASS is a volunteer organization and my time, like yours, is limited.  I have other obligations in life that supersede PASS.  Now I have more time for some of those.  I won’t be going away or leaving the SQL Community.  I will still contribute to the community and support PASS, just in a different role.  Time to let somebody else enjoy the hot seat for a while.

Finally, everyone who voted (not just for me) deserves thanks.  More voters, more engaged voters, strong candidates, and a vigorous debate were all I wanted out of declaring as a candidate last year.  This year the SQL community got exactly that.

Thank you.

posted @ Wednesday, December 28, 2011 4:06 PM | Feedback (0) | Filed Under [ PASS ]

Tuesday, November 29, 2011

A Rose by Any Other Name..

 

It is always a good start when you can steal a title line from one of the best writers in the English language.  Let’s hope I can make the rest of this post live up to the opening. 

One recurring problem with SQL Server is moving databases to new servers.  Client applications use a variety of ways to resolve SQL Server names, some of which are not changed easily <cough SharePoint /cough>.  If you happen to be using default instances on both the source and target SQL Server, then the solution is pretty simple.  You create (or bug the network admin until she creates) two DNS “A” records.  One points the old name to the new IP address.  The other creates a new alias for the old server, since the original system name is now redirected.  Note this will redirect ALL traffic from the old server to the new server, including RDP and file share connection attempts. 

 


Figure 1 – Microsoft DNS MMC Snap-In

 


Figure 2 – DNS New Host Dialog Box

Both records are necessary so you can still access the old server via an alternate name.

Server Role    IP Address      Name     Alias
Source         10.97.230.60    SQL01    SQL01_Old
Target         10.97.230.80    SQL02    SQL01

Table 1 – Alias List
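
If your DNS admin prefers a command line, the same records can be created with dnscmd.exe.  A minimal sketch, assuming a DNS server named DNS01 and a zone named contoso.local (both hypothetical placeholders; update or remove the original SQL01 record first):

    REM Point the old name at the new server's IP address
    dnscmd DNS01 /RecordAdd contoso.local SQL01 A 10.97.230.80

    REM Keep the old box reachable under an alternate name
    dnscmd DNS01 /RecordAdd contoso.local SQL01_Old A 10.97.230.60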

If you (or somebody else) set up connections via IP address, you deserve to have to go to each app and fix it by hand.  That is the only way to fix that particular foul-up.

If you have to deal with Named Instances as either a source or a target, then it gets more complicated.  The standard fix is to use SQL Server Configuration Manager (or one of its earlier incarnations) to create a SQL client alias to redirect the connection.  This can be a pain when you have to install and configure it on multiple client servers.  The good news is that SQL Server Configuration Manager AND all of its earlier versions simply write a few registry keys.  Extracting the keys into a .reg file makes centralized, automated deployment a snap.

If the client is a 32-bit system, you only have to extract the native key.  If it is a 64-bit system, you have to extract both the native key and the WoW (32-bit on 64-bit host) key.

First, pick a development system to create the actual registry key.  If you do this repeatedly, you can simply edit an existing registry file.  Create the entry using SQL Server Configuration Manager.  You must use a 64-bit system to create the WoW key.  The following example redirects the named instance “SQL01\SQLUtility” to a default instance on “SQL02”.

 


Figure 3 – SQL Server Configuration Manager - Native

Figure 3 shows the native key listing.


Figure 4 – SQL Server Configuration Manager – WoW

If you think you don’t need the WoW key because your app is 64-bit, think again.  SQL Server Management Studio is a 32-bit app, as are most SQL test utilities.  Always create both keys for 64-bit target systems.

Now that the keys exist, we can extract them into a .reg file. Fire up REGEDIT and browse to the following location:  HKLM\Software\Microsoft\MSSQLServer\Client\ConnectTo.  You can also search the registry for the string value of one of the server names (old or new).

Right click on the “ConnectTo” label and choose “Export”.  Save with an appropriate name and location.  The resulting file should look something like this:


Figure 5 – SQL01_Alias.reg
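
The contents are just a header, the key, and one string value per alias.  A sketch assuming a TCP/IP alias (DBMSSOCN is the registry tag for the TCP/IP network library, and backslashes in value names are doubled in .reg files):

    Windows Registry Editor Version 5.00

    [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo]
    "SQL01\\SQLUtility"="DBMSSOCN,SQL02"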

Repeat the process with the location: HKLM\Software\Wow6432Node\Microsoft\MSSQLServer\Client\ConnectTo
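
If you would rather script the export step as well, reg.exe does the same thing as the REGEDIT right-click (same paths as above):

    reg export "HKLM\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo" SQL01_Alias.reg
    reg export "HKLM\SOFTWARE\Wow6432Node\Microsoft\MSSQLServer\Client\ConnectTo" SQL01_Alias_WoW.reg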

Note that if you have multiple alias entries, ALL of the entries will be exported.  In that case, you can edit the file and remove the extra aliases.

You can also merge the exports into a single file.  Just leave a blank line between keys, like this:


Figure 6 – SQL01_Alias_All.reg
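
Roughly, the merged file carries both keys, separated by the blank line (same hypothetical alias as above, native key first, WoW key second):

    Windows Registry Editor Version 5.00

    [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo]
    "SQL01\\SQLUtility"="DBMSSOCN,SQL02"

    [HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\MSSQLServer\Client\ConnectTo]
    "SQL01\\SQLUtility"="DBMSSOCN,SQL02"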

Of course, if you have an automatic way to deploy, it makes sense to have an automatic way to un-deploy.  To delete a registry key via a .reg file, simply edit the file and prefix the key path with a “-” sign, like so:


Figure 7 – SQL01_Alias_UNDO.reg
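
The undo file is the same thing with minus signs.  Note this removes the entire ConnectTo key, including any other aliases in it; to delete a single alias instead, leave the key line alone and set the value itself to a minus ("SQL01\\SQLUtility"=-):

    Windows Registry Editor Version 5.00

    [-HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo]

    [-HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\MSSQLServer\Client\ConnectTo]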

Now we have the ability to move any database to any server without having to install or change any applications on any client server.  The whole process should be transparent to the applications, which makes planning and coordinating database moves a far simpler task.

posted @ Tuesday, November 29, 2011 9:45 AM | Feedback (0) |

Tuesday, November 01, 2011

Virtual Lab part 2–Templates, Patterns, Baselines

Once you have a good virtualization platform chosen, whether it is a desktop, server or laptop environment, the temptation is to build “X”.  “X” may be a SharePoint lab, a Virtual Cluster, an AD test environment or some other cool project that you really need RIGHT NOW.  That would be doing it wrong.

My grandfather taught woodworking and cabinetmaking for twenty-seven years at a trade school in Alabama.  He was the first instructor hired at that school and the only teacher for the first two years.  His students built tables, chairs, and workbenches so the school could start its HVAC courses.   Visiting as a child, I also noticed many extra “helper” stands, benches, holders, and gadgets all built from wood. 

What does that have to do with a virtual lab, you ask?  Well, that is the same approach you should take.  Build stuff that you will use.  Not for solving a particular problem, but to let the Virtual Lab be part of your normal troubleshooting toolkit.

Start with basic copies of various Operating Systems.  Load and patch server and desktop OS environments.  This also helps build your collection of ISO files, another essential element of a Virtual Lab.  Once you have these “baseline” images, you can use your virtualization software’s snapshot capability to freeze the image.  Clone the snapshot and you have a brand new, fully patched machine in mere moments.  You may have to sysprep some of the Microsoft OS environments if you are going to create a domain environment or experiment with clustering.  That is still much faster than loading and patching from scratch.
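
For reference, the sysprep step is a one-liner (standard sysprep switches; run it inside the guest right before you shut down and snapshot):

    C:\Windows\System32\sysprep\sysprep.exe /generalize /oobe /shutdown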

So once you have a stock of raw materials (baseline images in this case), where should you start?  Again, my grandfather’s workshop gives us the answer.  In the shop, it was workbenches and tables to hold large workpieces that made the equipment more useful.  In a Windows environment, the same role falls to the fundamental network services:  DHCP, DNS, Active Directory, Routing, File Services, and Storage services.  Plan your internal network setup.  Build out an AD controller with all the features listed.  Make the actual domain an isolated domain so it will not care where you take it.  Add the Microsoft iSCSI Software Target.  Once you have this single system, you can leverage it for almost any network environment beyond a simple stand-alone system.

Having these templates and fundamental infrastructure elements ready to run means I can build a quick lab in minutes instead of hours.  My solutions are well-tested, my processes fully documented with screenshots, and my plans validated well before I have to make any changes to client systems.  The work I put in is easily returned in increased value and client satisfaction.

posted @ Tuesday, November 08, 2011 6:43 PM | Feedback (0) | Filed Under [ SQL General ]

Thursday, November 03, 2011

SQL 2012 Licensing Thoughts

The only thing more controversial than new Federal Tax plans is new Licensing plans from Microsoft.  In both cases, everyone calculates several numbers. 

  1. First, will I pay more or less under this plan? 
  2. Second, will my competition pay more or less than now? 
  3. Third, will <insert interesting person/company here> pay more or less? 

Not that items 2 and 3 are meaningful; that is just how people think.

Much like tax plans, the devil is in the details, so let’s see how this looks.  Microsoft shows it here: http://www.microsoft.com/sqlserver/en/us/future-editions/sql2012-licensing.aspx

First up is a switch from per-socket to per-core licensing.  Anyone who didn’t see something like this coming should rapidly search for a new line of work, because they are not paying attention.  The explosion of multi-core processors has made SQL Server a bargain.  Microsoft is in business to make money, and the old per-socket model was not going to do that going forward.

Per-core licensing also simplifies virtualization licensing: Physical Core = Virtual Core, at least for licensing.  Oversubscribe your processors?  That’s your lookout.  You still pay for what is exposed to the VM.  The cool part is you can seamlessly move physical and virtual workloads around and the licenses follow.  The catch is you have to have Software Assurance to make the licenses mobile.  Nice touch there.

Let’s have a moment of silence for the late, unlamented, largely ignored Workgroup Edition.  To quote the Microsoft FAQ:  “Standard becomes our sole edition for basic database needs”.  Considering I haven’t encountered a single instance of SQL Server Workgroup Edition in the wild, I don’t think this will be all that controversial.

As for pricing, it looks like a wash with current per-socket pricing, assuming four-core sockets.  Interestingly, four is the minimum core count Microsoft proposes to grant when transitioning per-socket licenses to per-core licenses under Software Assurance.  Reading the fine print shows that if you are using more cores, you will get more core licenses:

From the licensing FAQ:

15. How do I migrate from processor licenses to core licenses?  What is the migration path?

Licenses purchased with Software Assurance (SA) will upgrade to SQL Server 2012 at no additional cost. EA/EAP customers can continue buying processor licenses until your next renewal after June 30, 2012. At that time, processor licenses will be exchanged for core-based licenses sufficient to cover the cores in use by processor-licensed databases (minimum of 4 cores per processor for Standard and Enterprise, and minimum of 8 EE cores per processor for Datacenter).

Looks like the folks who invested in the AMD 12-core chips will make out like bandits, since each processor license converts to twelve core licenses instead of the four-core minimum.

Now, on to something new: SQL Server Business Intelligence Edition.  Yep, finally a BI-specific SKU, licensed for server+CAL configurations only.  Note that Enterprise Edition still supports the complete feature set; the BI Edition is intended for smaller shops who want the full BI feature set without needing Enterprise Edition scale (or costs).  No, you don’t get ColumnStore, Compression, or Partitioning in the BI Edition.  Those are Enterprise scale features, ThankYouVeryMuch.  Then again, your starting licensing costs are about one sixth of an Enterprise Edition system (based on an 8-core server).

The only part of the message I am missing is whether the current Failover Licensing Policy will change.  Do we need to fully or partially license failover servers?  That is a detail I definitely want to know.

posted @ Thursday, November 03, 2011 12:34 PM | Feedback (6) | Filed Under [ SQL General Microsoft Denali ]

Friday, August 19, 2011

Tell me a Story

 

I recently had a friend ask me to review his resume.  He is a very experienced DBA with excellent skills.  If I had an opening I would have hired him myself.  But not because of the resume.  I know his skill set and skill levels, but there is no way his standard resume can convey that.  A bare bones list of job titles and skills does not set you apart from your competition, nor does it convey whether you have junior or senior level skills and experience.  The solution is to not use the standard format.

Tell me a story.  I want to know what you were responsible for.  Describe a tough project and how you saved time/money/personnel on that project.  Link your work activity to business value.  Drop some technical bits in there since we do work in a technical field, but show me what you can do to add value to my business well above what I would pay you.  That will get my attention.

The resume exists for one primary and one secondary reason.  The primary reason is to get the interview.  A resume won’t get you a job, so don’t expect it to.  The secondary reason is to give you and the interviewer a starting point for conversations.  If I can say “Tell me more about when….” and reference an item from your resume, then that is great for both of us.  Of course, you had better be able to tell me more, both from the technical and the business side, at least if I am hiring for a senior or higher level position. 

As for the junior DBAs, go ahead and tell your story too.  Don’t worry about how simple or basic your projects or solutions seem.  It is how you solved the problem and what you learned that I am looking for.  If you learn rapidly and think like a DBA, I can work with that, regardless of your current skill level.

posted @ Friday, August 19, 2011 10:45 AM | Feedback (2) |

Friday, July 15, 2011

Virtual Lab (part 1)

 

 

No, not that kind of Virtual Lab.  I mean a virtualization-based laboratory so you can experiment with test systems.  While there are many uses and configurations for a virtual lab, my primary configuration is a laptop-based environment: I am often at client sites that do not allow outbound access to remote systems for security reasons, so I need something I can take with me.  Therefore, this series will focus on building out that type of environment.

The first requirement is a laptop with a LOT of CPU and memory capacity.  8GB is the minimum I consider useful for a VM host; 16GB is better.  I use a Lenovo W510 (the current model is the W520).  Most multi-core processors have hardware virtualization extensions, so anything except the lightest netbook can technically do virtualization.

Disk space is also a big need.  I use an external drive to store my extra VMs (mine just lacks the pretty pattern on the case).  I wish I had the resources to afford a large SSD, but we all deal with an imperfect world.

So, now that we have a good hardware platform, we need to choose a virtualization platform.  Virtual PC is pretty much out since it does not support 64-bit guests.  This totally eliminates current-generation Server Operating Systems. 

Hyper-V is a possibility.  I experimented with “boot from VHD” to dual-boot my laptop without losing my regular OS.  It works very well, but I found myself reluctant to “disconnect” myself from my regular work platform in order to work on my virtual lab environment.  I wanted something that would run concurrently with my other workstation applications.  For my home lab, I use a dedicated Hyper-V box.

I used to use VMLite, which is based on the open-source 3.x version of VirtualBox.  The big problem with VMLite is that it only supports three concurrently running VMs.  And yes, this becomes a problem in environments that are even moderately complex, such as a SharePoint server with a SQL cluster.  That takes a minimum of four VMs.

Next up was VirtualBox without the VMLite wrapper.  The cool thing is that VirtualBox can read and write VDI, VHD, and VMDK files (the three major types of virtual disk files). VirtualBox used to have a Media Manager where you registered virtual disks.   One goal for version 4.x was to remove this and have virtual disks registered to specific VMs, making each VM self-contained.  Great idea.  Terrible implementation. 

If you build a system using VHD files and differencing disks, VirtualBox registers and displays the BASE disk instead of the differencing disk in the VM.  It registers the disk by a UUID instead of the disk path.  Technically, the XML config file is supposed to contain both path and UUID information.  This becomes important later.  If the BASE disk is already registered in any VM, VirtualBox won’t actually write any disk path information into the VM configuration file.  Since they removed the explicit “register this disk” commands from the Media Manager and the command line tool vboxmanage.exe, you can’t fix this without editing the undocumented XML VM configuration files directly.  The frustrating part is that there are elements to fix this within the code, but they are intentionally blocked.  You can’t create a differencing disk of any type without creating a VM snapshot, and the “clone VM” functionality is pathetic.  You can’t force a VM to register a disk and write the path to the configuration file.  You can’t get VirtualBox to create a VHD differencing disk.

Manually create a differencing disk using DISKPART.exe, wrap it in a VM, and it works great… right up until you exit VirtualBox.  Start up VirtualBox again and it can’t find the disk anymore, so the VM is marked “Inaccessible”.  Create a new VM, add the disk, refresh the original VM, and it is back.  Right up until the next time you close VirtualBox.  A lot of investigation led me to see that VirtualBox does not write the path information for VHD files into the VM configuration correctly, especially if they share a common base disk without a corresponding VM snapshot. #FAIL
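
For reference, the manual differencing-disk step is a single DISKPART command (the file paths here are made up; DISKPART has been able to create differencing VHDs since Windows 7 / Server 2008 R2):

    DISKPART> create vdisk file="D:\VMs\Lab_diff.vhd" parent="D:\VMs\Lab_base.vhd"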

Don’t get me started on their idea of “community” support.  Read the forums and you will see typical open source arrogance towards anyone who isn’t a Linux guru, even when asking on the Windows forums.  Makes me appreciate the SQL community that much more.

That leaves us with the big dog: VMware Workstation.  I finally broke down and loaded it up.  It only works with VMDK files, but the management functions are feature-complete.  It isn’t as fast as VirtualBox, but I needed a platform that works, not one I constantly have to work on.  Sometimes you really do get what you pay for.

posted @ Friday, July 15, 2011 8:21 PM | Feedback (6) |

Wednesday, July 13, 2011

SQLSaturday #89 Atlanta 2011

The schedule is out and it is a hot one.  We have a super lineup of regional and national level speakers as well as some brand new local ones.  We have Adam Machanic (blog, twitter) coming down from Boston to share his expertise.  We even managed to coax Bob Ward (blog, twitter), Senior Escalation Engineer with Microsoft Product Support Services, out of his lair near Dallas to come and be one of our Featured Speakers.  He is giving the same session content he gave to packed rooms at the 2010 PASS Summit.

Our Pre-cons couldn’t have better presenters.  For the BI side, we have John Welch (blog, twitter) and for the DBAs we have Kevin Kline (blog, twitter).  Both are top-level experts in their fields and are well worth the $99 pre-con fee.

The location is also top-notch.  Georgia State University has graciously allowed us use of their Alpharetta facility, normally used for MBA classes, for our event.  This lets us up the total attendee count to about 400, but it is filling up fast.

For those who still need convincing, how about some numbers:

400: Registration capacity – going fast.

42: Total Sessions (includes lunch presentations)

14: Number of MVPs Presenting

8: Total Tracks

6: Blog Sponsors

5: Number of Gold Sponsors

3: Number of Silver Sponsors

3: Number of Microsoft Employees Speaking

2: High-value Pre-conference Seminars

1: Fantastic Event

 

So go register and I will see you there.

posted @ Wednesday, July 13, 2011 11:03 AM | Feedback (0) |

Tuesday, July 12, 2011

The first thing you will hate about SQL Denali…

OK, this is really the first thing “I” hate about SQL Denali, but I bet a fair number of you will hate this too. Once again, Microsoft has enforced the dictum that everyone must be either a Developer or an IT Pro. As Data Professionals, many of us have suffered as our employers can’t figure out where to hang us in the company Organization Chart, usually sticking us between Dev and Ops and letting the managers (and us) sort it out. This time, Microsoft has decided we are “Developers” and we shall get our help via the Web. Admittedly, Google makes a better index for Books Online (BOL) than BOL’s own index does, but that is no excuse for crippling the documentation.

If you want a local copy of BOL, you will have to go through an insane number of steps to get it fully onto your machine. It gets worse if this is a production machine (or one operating under production protocols) where you don’t have direct Internet access. Let’s step through a BOL install to see just how bad it is.

First, you have to find BOL. It isn’t in the main download (CTP3). Tickle a search engine with the magic words “SQL Denali Feature Pack” and you will find yourself here eventually:

http://www.microsoft.com/download/en/details.aspx?id=26726

OK. Hit the “Download” button.

Gotcha!! You jumped to the actual feature pack page (http://www.microsoft.com/download/en/confirmation.aspx?id=26726) while downloading a file with the following essential instructions:

[screenshot of the downloaded “instructions” file]

Thanks Microsoft, I never would have figured that out on my own.

Scroll waaaayyy down and eventually get to here:

[screenshot of the Books Online item on the feature pack page]

It is right between the Remote Blob Store download and the Upgrade Advisor Download, exactly where you would expect it to be.

Note this does not download any files (despite the essential instructions downloaded to get you here), it takes you to yet another download page:

http://www.microsoft.com/download/en/details.aspx?id=347

Jackpot, not only do we get a real download, we finally get an explanation:

[screenshot of the download page description]

OK, click the download, and 200 MB later we have an install executable. Well, not exactly. We have a self-extracting ZIP file that copies its contents somewhere (the default is down in an AppData temp folder).

[screenshot of the extracted folder]

Note the folder size and the file download size are identical. Yes, zero compression. A little digging tells us why:

[screenshot of the folder contents: .cab files]

Cab files. They are already compressed.

Meanwhile most of us are waiting for the installer phase to start. And waiting…. And waiting…..

Hmm. No installer app. Remember the web page above? No installer; we have to go to a separate app and through yet another unique step to install local help.

For this part to work, you have to have loaded the Denali CTP3 SQL Server Management Studio.

Bring up the Help Library Manager. You can do this from Management Studio or from the Start menu.

[screenshot of launching the Help Library Manager]

First, you have to tell it to use local help because that isn’t the default:

[screenshot of the online/local help setting]

Then you have to actually LOAD the help (for those counting, we are now up to three copies: the raw download, the source folders, and the actual local copy).

[screenshot of the install-content option]

Hang on, we are almost there. Browse to the files you extracted from the self-extracting archive (you do remember where you put them, right?)

[screenshot of the browse dialog]

One more “Simon Says” moment while you tell Help Library Manager to add the files you took four web pages to find, then had to download, then had to extract.

[screenshot of adding the content files]

Simple, obvious, and easy. NOT!!!!

Given install paths like this, Oracle can win the usability wars just by standing still.

posted @ Tuesday, July 12, 2011 4:29 PM | Feedback (6) | Filed Under [ SQL General Microsoft Denali ]

Wednesday, June 29, 2011

PASS Election Process 2011

Two years ago I wrote about how screwed up the SQL PASS elections process was, mainly due to the small candidate to seat ratio. That post included a harsh comment comparing the election to a game of musical chairs. Last year several of my colleagues urged me to run for the board, putting my seat where my mouth is (hmm, that doesn’t sound quite right). That led to being involved with the most controversial election ever for PASS, although the controversy was not about me. We listened and created an Election Review Committee to improve the process.
My view is we had a process that focused on transparency (Good!) while still lacking key direction or vision. The PASS board, along with recommendations from our Election Review Committee, has come out with the latest revision to the Elections Process.  Bill Graziano has a post here with the announcement and the process. I am pleased to say this process is much stronger regarding direction, purpose, and vision.
I would like to highlight and discuss a few key points.
1) PASS Experience Required: The board has explicitly made PASS experience at more than the speaker level a mandatory requirement. We are looking for PASS leadership. You have to know what it is we do and how we do it in order to be eligible.

2) The Nominations Committee (Nom-Com) now has specific requirements for its members. The short version: if you were eligible for a leadership (non-speaker, non-MVP) comp to Summit 2010, you are eligible for the Nom-Com. The actual document has a more detailed list. Not only do the candidates for Nom-Com come from this pool, this is the same group that elects Nom-Com members. The criteria for inclusion may miss a few qualified members, but this is the best-fit answer that does not require sifting through the 70,000 members on the mailing list to decide who is qualified. I would rather see those resources put towards actual community support than micro-tweaking this process.

3) The Nom-Com has two distinct roles. One is to make sure that candidates’ written applications meet the minimum requirements for Directorship. The second is to stack-rank the applicants after conducting interviews. In addition, there is an “Unqualified” rank that requires a unanimous vote of the Nom-Com. We don’t expect to use this, but it is there “just in case”. This dual role was not clear last year, leading to a lot of noise and confusion about the process.

4) All candidates, including current directors running for re-election, must go through the same process. (Yes, this means me too.)

5) Some parts of the process, specifically the interviews, are intentionally confidential. The process is very similar to a job interview for an executive position. People ask tough questions and need honest answers. There are a lot of things discussed during interviews such as these that could unnecessarily cause misunderstandings and community conflict, especially when taken out of context. Some things should stay confidential. Feel free to disagree. Please post the transcript of your last three job interviews along with your dissent.

6) Finally, the Board of Directors is still the ultimate backstop. You elected us; that makes it our responsibility.
I think this process addresses many of the issues from last year. It makes the goals and qualifications for the Nom-Com and the Candidates much more explicit. It also better defines the dual jobs of the Nom-Com. 
All in all, this qualifies as a major version upgrade.

posted @ Wednesday, June 29, 2011 10:07 AM | Feedback (4) | Filed Under [ PASS ]
