After about 6 months of work, the new book I've coauthored with Grant Fritchey (Blog|Twitter), Phil Factor (Blog|Twitter) and Alex Kuznetsov (Blog|Twitter) is out. They're all smart folks I talk to online, and this book is packed with good ideas backed by years of experience. The book contains a good deal of information about the things you need to think about when doing any kind of multi-person database development. Although it's meant for SQL Server, the principles can be applied to any database platform out there. In the book you will find information on: writing readable code, documenting...
This post is part of the monthly community event called T-SQL Tuesday, started by Adam Machanic (blog|twitter) and hosted by someone else each month. This month the host is Sankar Reddy (blog|twitter) and the topic is Misconceptions in SQL Server. You can follow posts for this theme on Twitter by looking at the #TSQL2sDay hashtag. Let me start by saying: This code is a crazy hack that should never be used unless you really, really have to. Really! And I don't think there's ever a time when you would really have to use it for real. Because...
Having the SSMS Tools Pack out in the wild enables me to get much "joy" from the different bug reports people send me. And let me tell you, people send me back some seriously weird errors. But the most unexpected error message I've seen so far was the OverflowException when calling System.Data.SqlClient.SqlDataReader.GetDecimal(Int32 i). It turns out that the SQL Server decimal data type is not mapped to the .Net decimal over its whole range. The .Net decimal type only maps to the SQL Server one in the decimal(29, X) range. That means that if you have a decimal column that has...
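As an illustration of the boundary (the exact repro is an assumption, not taken from the original post): .Net's System.Decimal stores a 96-bit integer plus a scaling factor, so its maximum value has 29 digits, while SQL Server's decimal goes up to 38 digits of precision.

```sql
-- A 38-digit value is perfectly valid for SQL Server's decimal(38, 0):
select cast(replicate('9', 38) as decimal(38, 0)) as TooBigForDotNet;
-- But reading this column from .Net with SqlDataReader.GetDecimal()
-- throws an OverflowException, because System.Decimal tops out at
-- 79,228,162,514,264,337,593,543,950,335 (29 digits).
```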
I've written an article here on SQL Team about Application Locks in SQL Server 2005. Application locks aren't a well-known area of locking in SQL Server, but they can be very useful for special scenarios. They work in an analogous way to the lock() construct in .Net and are basically user-defined mutexes in SQL Server. It's the second part in the series of articles on locking: Part 2: Application Locks (or Mutexes) in SQL Server 2005 Part 1: Introduction to locking in SQL Server 2005
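A minimal sketch of the mutex-like pattern using sp_getapplock (the resource name and timeout are hypothetical; the full treatment is in the linked article):

```sql
begin tran;

-- Acquire an exclusive application lock named 'MyResource'.
-- Other sessions requesting the same name will wait, just like .Net's lock().
exec sp_getapplock
    @Resource    = 'MyResource',
    @LockMode    = 'Exclusive',
    @LockOwner   = 'Transaction',
    @LockTimeout = 5000;  -- ms to wait before giving up

-- ... critical section: only one session at a time gets here ...

-- With @LockOwner = 'Transaction' the lock is also released
-- automatically at commit or rollback.
exec sp_releaseapplock @Resource = 'MyResource', @LockOwner = 'Transaction';

commit tran;
```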
it's my opinion that today we unfortunately live in a world that requires some kind of harder problem solving skills from maybe 20% of the population. And I'm being very very generous. So putting that percentage with the question in the title would give an answer similar to: "Because not every one can do it." DOH, right? I'd say you're wrong since everyone had a problem to solve at least once in a lifetime. So what's the problem? Why is it perceived as hard? Let's take a little trip through your past. When you were just an infant, what was...
In a previous post I explained where the SSIS FastParse option can be found when importing data from flat files.
I finally had some time to explore all four of SQL Server 2005's bulk import utilities:
1. bcp
2. Bulk Insert
3. OpenRowset with BULK option
4. SQL Server Integration Services - SSIS
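As a minimal sketch of option 2 (the table name, file path, and delimiters here are assumptions for illustration):

```sql
-- Load a comma-delimited flat file into a staging table.
bulk insert Northwind.dbo.OrdersStaging
from 'c:\data\orders.txt'
with (
    fieldterminator = ',',
    rowterminator   = '\n',
    tablock  -- table lock allows minimally logged, faster loads
);
```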
I ran each bulk import option 12 times, disregarded the best and worst times, and averaged the remaining ten.
A select statement which returns all rows in northwind..orders uses a clustered index scan.
That's great, since it has to go through all of the data... But look at this example:
use northwind
go
set nocount on
declare @i int, @time datetime
set @i = 0
set @time = getdate()
while @i < 50
begin
    select * from orders
    set @i = @i + 1
end
set @time = getdate() - @time
print('iters scan:' + convert(varchar(20), @time, 114))
go
declare @i int, @time datetime
set @i = 0
set @time = getdate()
while @i < 50
begin
    select * from orders where orderid > -1
    set...
Well, there was once a debate on SqlTeam in which I claimed that IsNull is slower than Coalesce when used in the same fashion:
IsNull (col1, 'somevalue') vs Coalesce(col1, 'somevalue')
So I did a little testing. First I inserted 500,000 rows into a table with 5 columns:
create table TestTable (col1 varchar(50), col2 varchar(50), col3 varchar(50), col4 varchar(50), col5 varchar(50))
in which, for each 100k rows, one column was filled with 50 chars and the rest were null.
Then I ran both of the statements (one for coalesce and one for isnull) 10 times and took 10 elapsed-time measures. Out of those 10 elapsed times...
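The two statements compared were of this shape; the timing harness below uses set statistics time as one way to capture elapsed times (the original test's exact measurement method may differ):

```sql
set statistics time on;

-- IsNull version
select isnull(col1, 'somevalue') from TestTable;

-- Coalesce version
select coalesce(col1, 'somevalue') from TestTable;

set statistics time off;
```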