SQL Server Discovery

Better, faster, cheaper ...pick two.

Tuesday, August 31, 2010

Test database connectivity

If you need to test connectivity from your workstation to a remote database server, here is a handy trick to accomplish it.

Right-click on a blank area of your Windows desktop and choose New > Text Document:

[Screenshot: Create new text file]

Rename and save the file as something like TestConn.udl:

[Screenshot: TestConn.udl file]

Notice the file extension is .udl, which stands for Universal Data Link.

[Screenshot: Data Link Properties]

Double-click the file to launch the Data Link Properties applet and choose the data provider connectivity you wish to test:

[Screenshot: Provider selection]

Click Next, then enter the server name, authentication details and database, and click Test Connection:

[Screenshot: Server, authentication and database settings]

If the test succeeds, a confirmation message displays:

[Screenshot: Test connection succeeded]

After the test succeeds, click OK to save the file.

You can view the valid connection string just created by right-clicking the file and choosing 'Open With' > Notepad:

[Screenshot: Open With > Notepad]

You can now copy and paste the connection string into code, scripts or client tools to make a successful database connection:

[Screenshot: Connection string in Notepad]
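For reference, the saved .udl file is just a small text file. A typical one for SQL Server using Windows authentication looks something like this (the database and server names here are placeholders; your file will contain whatever you entered in the applet):

```ini
[oledb]
; Everything after this line is an OLE DB initstring
Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=MyDatabase;Data Source=MYSERVER
```

The single line after the comment is the connection string you copy into your application.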

 

Hope this helps!

posted @ Tuesday, August 31, 2010 3:33 PM | Feedback (1) |

Monday, April 19, 2010

SQL Server 2008 - Management Studio issue

This is a known issue with SQL Server 2008: certain DDL operations, like altering a column datatype through the Management Studio GUI, fail.

For example, in Object Explorer, navigate to a table column, right-click the column and choose Modify. Change the column datatype or length, then save, and this error message displays:

[Screenshot: Saving changes is not permitted error]

 
To work around this problem, go to the Query Editor and issue the following DDL statement instead (alternatively, uncheck "Prevent saving changes that require table re-creation" under Tools > Options > Designers in Management Studio):

ALTER TABLE dbo.FTPFile ALTER COLUMN CmdLine VARCHAR(100);
GO
 

The column change is successfully applied now.

posted @ Monday, April 19, 2010 12:51 PM | Feedback (6) |

Wednesday, February 11, 2009

Find System information using SQL Server

If you don't have local login rights on a server but need to find out what system resources it has, here is a little T-SQL script that displays useful information like CPU count, memory and virtual memory size:


USE [master];
GO
SELECT
    cpu_count
,   hyperthread_ratio
,   physical_memory_in_bytes / 1048576 AS 'mem_MB'
,   virtual_memory_in_bytes / 1048576 AS 'virtual_mem_MB'
,   max_workers_count
,   os_error_mode
,   os_priority_class
FROM
    sys.dm_os_sys_info;


Hope it helps!

posted @ Wednesday, February 11, 2009 6:31 PM | Feedback (4) |

Tuesday, November 27, 2007

Database Inventory - Part II

I worked on my Database Inventory script a bit and am releasing an enhanced version of the original.

When run in the master database, the script displays a list of all user databases on the specified SQL Server named instance, along with logical file name, file type, file path and file size columns.


USE [master]
GO

SELECT
    CASE
        WHEN sysaltfiles.groupid <> 0 THEN 'data'
        WHEN sysaltfiles.groupid = 0 THEN 'log'
    END AS 'file_type'
,   sysdatabases.name AS 'db_name'
,   sysaltfiles.name AS 'logical_file_name'
,   sysaltfiles.filename AS 'physical_file_name'
,   (sysaltfiles.size * 8 / 1024) AS 'file_size(MB)' -- file size in MB
FROM
    dbo.sysdatabases
JOIN
    dbo.sysaltfiles ON
    (dbo.sysdatabases.dbid = dbo.sysaltfiles.dbid)
WHERE
    sysdatabases.dbid NOT IN (1, 2, 3, 4)
ORDER BY
    dbo.sysdatabases.name

Hope this helps! 

posted @ Tuesday, November 27, 2007 2:08 PM | Feedback (0) |

Tuesday, March 27, 2007

SQL Server 2005 Best Practice Analyzer

I've been testing the re-released Best Practice Analyzer called SQL Server 2005 Best Practice Analyzer 2.0.

I find this tool very handy for identifying database security and performance issues and making fix recommendations, even though it is still a CTP and not a final release yet.

It does NOT have to be installed on the remote server, just on a DBA's Management Studio workstation. It DOES, however, require that you have admin privileges on the remote SQL Server in order to access the Windows registry.

Give it a spin and see what you think.

posted @ Tuesday, March 27, 2007 3:33 PM | Feedback (6) |

Wednesday, February 28, 2007

Gather a Database Inventory

As a DBA for a large corporation, I manage literally hundreds of databases. To help keep track of all the database and log files, I needed a way to quickly gather an inventory of all databases located on each SQL named instance, so I came up with this simple little script, which also includes a file size in megabytes column:


USE [Master]
GO 
SELECT 
	sysdatabases.name AS 'db_name'  
,	CASE
		WHEN sysaltfiles.groupid = 1 THEN 'data'
		WHEN sysaltfiles.groupid = 0 THEN 'log'
	END AS 'file_type'
,	sysaltfiles.name AS 'file_name'
,	(sysaltfiles.size * 8 / 1024) AS 'file_size(MB)' -- file size in MB
FROM 
	dbo.sysdatabases
JOIN
	dbo.sysaltfiles ON
	(dbo.sysdatabases.dbid=dbo.sysaltfiles.dbid)
WHERE
	sysdatabases.dbid NOT IN ('1','2','3','4')
ORDER BY
	dbo.sysdatabases.name


Run this script in the master database and it will output a database list that you can export to a spreadsheet for reporting or further analysis. If anyone knows a better way to do this, or has suggestions for additional useful information to include, let me know.

posted @ Wednesday, February 28, 2007 9:40 AM | Feedback (7) |

Wednesday, May 31, 2006

Fast importing of text files with format files and bulk insert

When creating SQL Server 2000 ETL solutions like importing text files, it is common for developers to employ DTS packages to accomplish results quickly. The downside of this approach is that DTS packages have certain programmability limitations and don't always offer the best performance.

If you are importing very large raw text files that don't include formatting like column headers, a high-performance solution is to use BULK INSERT with format files.

For example, the format file might be named major.fmt (note the .fmt file extension) and would be placed in the same file system directory as the import source text files.

Here is a sample of what a format file looks like: 

8.0
4
1   SQLCHAR   0   0   "\""       0   x        SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   0   "\",\""    1   WTN      SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   0   "\",\""    2   Major    SQL_Latin1_General_CP1_CI_AS
4   SQLCHAR   0   0   "\"\r\n"   3   MktSeg   SQL_Latin1_General_CP1_CI_AS


Each row describes one field: host field order, data type, prefix length, data length, field terminator, server column order, server column name and collation. Look in SQL Server Books Online for more details about how to use format files.

Now you can perform BULK INSERTS from the source text file like this:


BULK INSERT Database_name..MajorAccount
FROM '\\ServerName\Textfiles$\major.txt'
WITH (FORMATFILE = '\\ServerName\Textfiles$\major.fmt')


Your data loads will now complete much faster than trying to accomplish the same results with DTS packages in SQL Server 2000.

Hope this helps!

posted @ Wednesday, May 31, 2006 9:24 PM | Feedback (4) |

Friday, April 28, 2006

Linked Server to Text Files - SQL2000

Sometimes it's handy to create a Linked Server to text files in order to perform distributed queries or bulk inserts of flat-file data into SQL Server.

To demonstrate this capability with SQL Server 2000, create a text file copy of the Categories table from the Northwind database, and place the text file, named CATEGORIES.TXT, in a subdirectory named TEXTFILES on a hard drive attached to the SQL Server machine.
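For illustration, the first few lines of CATEGORIES.TXT might look like this (sample rows in the shape of Northwind's Categories table; the exact contents of your extract may differ):

```text
CategoryID,CategoryName,Description
1,Beverages,"Soft drinks, coffees, teas, beers, and ales"
2,Condiments,"Sweet and savory sauces, relishes, spreads, and seasonings"
3,Confections,"Desserts, candies, and sweet breads"
```

A header row plus comma-delimited values is what the Text driver expects when column headers are enabled in schema.ini, as shown further below.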

Then add a Linked Server to the text file with a script like this:


exec sp_addlinkedserver NW_TEXT, 'Jet 4.0', 'Microsoft.Jet.OLEDB.4.0', 'Z:\TEXTFILES', NULL, 'Text'


Alternatively, you can add a Linked Server in Enterprise Manager:

Step 1: [Screenshot: Linked Server Properties]

Step 2: [Screenshot: New Linked Server dialog box]

Step 3: [Screenshot: Linked Server Properties - Security]

Step 4: [Screenshot: Linked Server - Server Options tab]

This now allows you to execute queries against the text file like this:


select * from NW_TEXT...categories#txt



You can use a schema.ini file (Text File Driver) to provide formatting like column headers for a text file and make the file(s) browsable, just like tables are when displaying them from the Linked Server node in Enterprise Manager.

For this example, create a schema.ini file that looks like this:


[CATEGORIES.TXT]
Format=CSVDelimited
CharacterSet=ANSI
ColNameHeader=True
Col1=CategoryID Short
Col2=CategoryName Text Width 15
Col3=Description Text Width 100


For further information, see Schema.ini (Text File Driver) on Microsoft MSDN.

One caveat I ran across, though, is that the maximum number of columns or fields you can specify in a schema.ini file is 255. So if you have a really W-I-D-E data file with lots of columns and are considering a Linked Server with a schema.ini file, be aware of this limitation.

Hope this helps.
-pj

posted @ Friday, April 28, 2006 2:59 PM | Feedback (7) |

Tuesday, February 28, 2006

Linked Server to Teradata - SQL2000

1.) Download and install the Teradata OLE DB driver on the target server where SQL Server 2000 is hosted.

2.) Right-click the Linked Servers node under the Security node in Enterprise Manager and select “New Linked Server...”

3.) On the General tab:
     -Choose a Linked Server name.
     -Select “OLE DB Provider for Teradata“ from the “Provider Name“ dropdown list under “Other data source“.
     -Enter the name of the Teradata database in the “Product Name“ field.
     -Enter the domain name or IP address of the server the Teradata database is hosted on in the “Data Source“ field.
     -Leave the “Provider String“ field blank.

[Screenshot: Linked Server Properties - Teradata]

4.) Click the “Provider Options“ button and
     -Check the “Allow InProcess“ checkbox
     -Check the “Index as Access Path“ checkbox

[Screenshot: Teradata Linked Server - Provider Options]

5.) On the Security Tab:
     -Under login section, select “Be made using this security context” Radio Button
     -Enter your Teradata login name in the “Remote Login” field
     -Enter your Teradata password in the “With Password” field

[Screenshot: Teradata Linked Server - Security]

6.) On the Server Options tab, make sure the following options are checked:
     - Data Access
     - Use Remote Collation

[Screenshot: Teradata Linked Server - Server Options]

7.) Click OK, and you should be able to expand the Tables node underneath the Linked Server you just created; the tables will appear as browsable objects.

For fast and efficient distributed queries use the OPENQUERY method like this:


SELECT * 
   FROM OPENQUERY
   (
   TERADATA_DATABASE, 
   'select 
       field1
   ,   field2
   from 
      table1'
   )


This performs all the query processing on the Teradata source server before pulling the results across the wire to your local SQL Server machine, making the operation much faster.

Hope this helps those of you needing to perform Teradata ETL with SQL Server 2000.

posted @ Tuesday, February 28, 2006 4:02 PM | Feedback (3) |

Wednesday, November 30, 2005

Linked Server to Teradata - SQL2005

Maybe this is common knowledge among those who have to connect to Teradata stores from SQL Server, but I needed to accomplish this from SQL Server 2005 and didn't find any published resources on how to do it. With a little experimentation, this is what I came up with:

1) Fetch the Teradata OLE DB driver from teradata.com and install it on the SQL Server 2005 system.

2) In SQL Server Management Studio (the successor to Enterprise Manager), create a new Linked Server with the following configuration:
-Linked Server:  choose any name (ex: ENCORE)
-Provider:  OLE DB Provider for Teradata (select from dropdown list)
-Product Name:  Database name (ex: databasename)
-Data Source:  Server name (ex: server1.corp.domain.com)
-Provider String:  (leave blank)

[Screenshot: Linked Server - General page]

3) Type your Teradata username & password into the appropriate textboxes on the Security tab:

[Screenshot: Linked Server - Security page]

4) You can build a SQL query like this:


SELECT
     field1
,    field2
FROM
     ENCORE..B8_PROD_VIEW.table_name
-- (notice the four-part name with the second element skipped)


But using the OPENQUERY method will be much faster:

SELECT *
FROM OPENQUERY(ENCORE, 'SELECT field1 FROM B8_PROD_VIEW.table_name')


Voila! You have a dataset from the remote Teradata store returned inside SQL Server, ready for distributed queries like joining to local tables.

 

Hope this helps!

-pj

posted @ Wednesday, November 30, 2005 8:56 PM | Feedback (29) |

Wednesday, August 17, 2005

SQL Server 2005: Top 10 New Features

I ran across an article on Builder AU entitled "SQL Server 2005 Secrets", posted August 10, that should be more aptly titled "SQL Server 2005: Top 10 New Features". Either way, it's a good update on what to expect.

Here's another article entitled: SQL Server 2005: The BI Release from the August issue of SQL Server Magazine.

I'm most excited about Integration Services, which is the successor to DTS. It's more of a totally new product than a simple upgrade. My first impression is that it appears to be an industrial strength ETL tool that will squeeze the major players in this market space. Along with Analysis Services and Reporting Services, these products comprise a toolset for developing enterprise BI solutions that will be hard to beat.

Anybody have thoughts or experience with the new Integration Services that they would like to share?

posted @ Wednesday, August 17, 2005 10:42 AM | Feedback (8) |

Monday, May 02, 2005

Performance Tuning Webcast

I found a pretty good webcast by Kimberly Tripp on Performance Tuning:

MSDN Webcast: SQL Server 2000 Performance Tuning - Finding the Right Balance of Indexes – Level 200

It's probably more detailed than what the average DBA needs, but if you really want to squeeze the last bit of performance out of your database, this could be the ticket.

-pj

posted @ Monday, May 02, 2005 7:53 PM | Feedback (2) |

Saturday, April 30, 2005

MCDBA - Here I come!

Well, I took the SQL Server Administration Exam (70-228) and passed with a score of 757 yesterday. It was my first MS Exam so I was a little nervous.

Next up: SQL Server Development (70-229), then C# Web Applications (70-315) and finally Windows 2000 Server Administration (70-215). If all goes as planned, I'll get these completed within a couple of months.

I'm more of a developer than administrator so I found the exam to be fairly difficult. I figured if I could survive my weakest subject on the first outing then it should be downhill after that. We'll see!

It's been a long time since college, but I actually kind of enjoyed the studying and have learned a lot I hope to apply in my job.

If anybody has any words of wisdom for studying, or opinions about whether I'm wasting my time, feel free to reply.

-pj

posted @ Saturday, April 30, 2005 4:28 PM | Feedback (3) |
