Despite the best-laid plans, circumstances or project scope sometimes change, and the definition of one or more tables in your database must be altered after they have been created. It's just a fact of life for a DBA.
Frequently these changes affect more than one database in your environment. The development, test, and eventually even the production databases must have their definitions updated to reflect the newly implemented changes.
For cases like this, I generally prefer to script out the necessary changes rather than use the Management Studio tools. Scripting reduces the likelihood of an accidental, errant change while updating the 100+ tables in the database: if the script works correctly against one database or table, the rest will be changed in exactly the same way. That's not to say you shouldn't test; you should. But the changes will be far more consistent when they are implemented via a script rather than by hand.
Recently I was tasked with adding a column to every user table in a database. There were lots of tables in this database, and doing it by hand was not a good alternative. Of course, if the changes were successful in Dev, they would need to be propagated to Test and Production as well. So, I used T-SQL to dynamically create a script for me.
There are a number of ways to accomplish this; a cursor or a WHILE loop comes to mind. However, since I knew that the new column name did not exist in any of the tables, I could take a shortcut and make use of the sp_msforeachtable system stored procedure. I described its use in a prior blog post.
The first step is to use the stored procedure to generate T-SQL code to implement the change. For this example, we want to add a datetime column called Date_Created to each table in the AdventureWorks database. In a new query window, type the following code:
EXEC sp_msforeachtable 'PRINT ''ALTER TABLE ? ADD Date_Created DATETIME DEFAULT GETDATE();''';
When this script is executed, the Messages pane of the query window is filled with a list of DDL (Data Definition Language) statements to implement our change. The output is shown in the following figure.
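For reference, the generated statements take this general form; the table names below are just a few illustrative examples from AdventureWorks (sp_msforeachtable substitutes each table's quoted two-part name for the `?` placeholder):

```sql
ALTER TABLE [Person].[Address] ADD Date_Created DATETIME DEFAULT GETDATE();
ALTER TABLE [HumanResources].[Employee] ADD Date_Created DATETIME DEFAULT GETDATE();
ALTER TABLE [Sales].[SalesOrderHeader] ADD Date_Created DATETIME DEFAULT GETDATE();
...
```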
After reviewing these DDL statements to ensure they are indeed what we wish to execute, we can copy and paste them into a query window and run them.
Of course, this implementation doesn't have error handling. If a column by that name already exists, the DDL statement will fail for that table. If you are unsure whether the column already exists, you should use one of the other implementations (a cursor, a WHILE loop, etc.) so you can check for the presence of the column before issuing the ALTER TABLE statement.
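As a sketch of that safer approach, a cursor over sys.tables can check for the column before generating each statement. The column name here follows the example above; this is untested pseudocode for you to adapt, not production-ready script:

```sql
DECLARE @tbl NVARCHAR(512);

-- Build the quoted two-part name for every user table
DECLARE tbl_cur CURSOR FOR
    SELECT QUOTENAME(SCHEMA_NAME(schema_id)) + N'.' + QUOTENAME(name)
    FROM sys.tables;

OPEN tbl_cur;
FETCH NEXT FROM tbl_cur INTO @tbl;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Only generate the ALTER if the column does not already exist
    IF COL_LENGTH(@tbl, 'Date_Created') IS NULL
        PRINT N'ALTER TABLE ' + @tbl
            + N' ADD Date_Created DATETIME DEFAULT GETDATE();';

    FETCH NEXT FROM tbl_cur INTO @tbl;
END

CLOSE tbl_cur;
DEALLOCATE tbl_cur;
```

As with the sp_msforeachtable version, this only prints the DDL; you still review the output before running it.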
I hope you find this useful. If you have any favorite scripting techniques or comments on this one, please feel free to share in the comments section below.
The past couple of years have seen a great increase in interest in server consolidation and virtualization. Once considered a nearly impossible ideal, today more and more companies are taking a serious look at its benefits. And those benefits are quite alluring.
I have talked with colleagues who swear by virtualization. They claim that when properly configured, SQL Server can perform every bit as well on a virtual server as on a dedicated box. So for them, the benefits greatly outweigh any potential drawbacks.
Although I don't have a great deal of experience with server virtualization, I do make quite extensive use of desktop virtualization. In fact, my laptop does not have a single piece of SQL Server or .NET related software installed on it. Everything that I do professionally, I do in a virtualized environment.
A recent request from a long-time client further highlighted the benefits of desktop virtualization. The client uses a ten-year-old VB6.0 / Crystal Reports 8.0 application that I support. The application has been quite stable and has met their needs for some time now, and as a result I haven't used VB6.0 or CR8.0 in years. However, the client recently identified some relatively minor enhancements that would improve the application. Without desktop virtualization, creating the development environment would have been rather difficult. I wouldn't have had these older tools on my laptop already, and I certainly wouldn't have wanted to install an older set of tools on top of a newer version.
But with virtualization, no problem. I can simply copy an existing virtual machine and install the required software. Existing virtual machines would not be affected. In this case I already had a virtual machine configured for this scenario, but if I hadn't, it would have been easy to create one.
I have six or seven virtual machines, each configured for a different purpose. For example, I have virtual machines with the following configurations and uses.
- SQL Server 2005 / Visual Studio 2005 - This is my primary development machine that I use for most of my clients.
- SQL Server 2000 / Visual Studio 2000 - Some clients still use an older set of technologies, so I have a machine configured with that set of software.
- SQL Server 2000 / Visual Basic 6.0 - A few of my clients still use VB6.0 applications that I created for them years ago, so I have a virtual machine configured for this purpose.
- SQL Server 2008 - When I speak at conferences and user group meetings, I like having a dedicated environment that I can quickly configure without affecting my production environments.
- SQL Server 2005 / SQL Server 2008 - I used the system to test upgrade paths when writing some technical documentation for a recent project.
- Base O/S install - I use this as the basis for new virtual machines that I create. It only has a fully patched operating system installed on it.
The beauty of this layout is that I can easily copy an existing virtual machine for a new purpose as need be. This provides me with a pristine environment in which to work.
If you're using desktop virtualization, I'd like to hear about your experiences. Drop me a comment below. If you're not already using desktop virtualization, I'd highly recommend it.