Sunday, August 30, 2009
New Blog Address

I’ve decided to invest more time in writing blog posts in English… I’ve blogged intensively in Italian so far, but have only seldom posted in English.

I’ve decided to start from scratch and be more consistent about it, and as with all new starts I needed a new blog, which I’ve opened here:

See you there and a BIG thanks to SQL Team for hosting me all these years!

posted @ Sunday, August 30, 2009 9:20 PM | Feedback (0)
Wednesday, July 22, 2009
SSIS Package Parameters

With the latest release of DTLoggedExec, I’ve also added a nice feature to make package runtime configuration a little easier, introducing the concept of “Package Parameters”.

I’ve now updated the documentation, adding details on how to use this new feature (it’s *really* easy, but *really* useful) and, as usual, you can find it on the DTLoggedExec wiki:

posted @ Wednesday, July 22, 2009 9:23 AM | Feedback (0)
Wednesday, July 08, 2009
DTLoggedExec mentioned in official SQL Server documentation!

Recently I discovered that DTLoggedExec is now officially mentioned in the SQL Server Books Online documentation:

Implementing Logging in Packages

I’m really proud of that! It gives me even more will and determination to make this tool more and more powerful and useful.

I have some more features I want to add. If you’re coming to PASS 2009, I’ll have a session on SSIS Logging, so you’ll be able to see it in action with your own eyes:

posted @ Wednesday, July 08, 2009 8:39 AM | Feedback (0)
Monday, June 29, 2009
DTLoggedExec: Here comes CSV Logging!

Over these last days I've released the latest version of DTLoggedExec.

I had planned to release it in May, but I added a lot of stuff, and then I realized that logging so much information without being able to automatically import it into SQL Server for further analysis made no sense.

So I decided to hold off a little bit and also improve the DTLoggedExec db, which I had first released just as an example of what can be done with the Data Profiling feature, and which has now grown to be the official repository of all the data that DTLoggedExec can log.

Of course, with the added capability of logging to CSV files, this improvement was really needed.

This new feature also brought the need to associate CSV Logging with DataFlow Profiling, so I had to change the code a little bit so that everything gets a consistent and unique Execution ID, which allows all the data to be tied together, no matter whether it comes from DataFlow Profiling or CSV Logging.
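This shared-identifier idea can be sketched in a few lines. The snippet below is a Python illustration with made-up record shapes (DTLoggedExec itself is a .NET tool, so this is not its actual code): the runner generates a single execution GUID up front and stamps every record with it.

```python
import uuid

# Illustrative sketch, not actual DTLoggedExec code: one GUID is generated per
# execution and attached to every record, whether the record comes from
# DataFlow Profiling or from CSV Logging.
execution_guid = str(uuid.uuid4())

profile_rows = [
    {"execution_guid": execution_guid, "component": "Source", "rows": 1000},
    {"execution_guid": execution_guid, "component": "Destination", "rows": 1000},
]
log_rows = [
    {"execution_guid": execution_guid, "event": "OnPreExecute"},
    {"execution_guid": execution_guid, "event": "OnPostExecute"},
]

# Because both data sets carry the same GUID, they can be tied back together
# later, e.g. after being bulk-loaded into SQL Server.
def records_for_execution(guid, *sources):
    return [r for src in sources for r in src if r["execution_guid"] == guid]

combined = records_for_execution(execution_guid, profile_rows, log_rows)
print(len(combined))  # all four records belong to the same execution
```

Once the data is inside SQL Server, the GUID becomes the join key between the profiling tables and the log tables.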

Add to this some bug fixes and a very handy feature that lets you get rid of the complex SET syntax for making your packages parametric, and you can imagine that some work had to be done.

Plus, before publishing any new release I want to test it for a while myself, just to be sure that no very basic bug goes unnoticed, and voilà: some months of delay.

Anyway, now it's there, and I'm sure you'll enjoy the new features:

  • Added logging of the expressions used by variables for which the "EvaluateAsExpression" property is true
  • Fixed the bug that prevented Custom Events from being logged properly (CodePlex id #20513)
  • Fixed a bug that prevented all container properties from being logged properly in case of error
  • Added support for the "Params" command-line option
  • Released the CSV log provider
  • Handled exceptions raised when a loaded configuration tries to configure non-existent objects
  • Changed the ILogProvider interface to allow Log Providers to get the ExecutionGUID value from the main program
  • Changed the Console and Null log providers to support the new ILogProvider interface
  • Changed the header of the *.dtsDataFlowProfile file in order to store the package, version and execution GUIDs
  • Changed the .sql files that import profiled DataFlow data into SQL Server
  • Updated the DTLoggedExecDB database
  • Released scripts to load the CSV data produced by the CSV Log Provider into SQL Server

I'm now in the process of updating the documentation. I plan to finish it this week and over the weekend, but using the new features is quite easy, and as usual I have included some samples in the released code to show how to use them.

So, if you cannot wait, go and play; otherwise just hold on a while, while I update the documentation here:

One last note about the new CSV Log Provider. It is completely functional and you can start using it right now, but this is just the first release, and I plan to improve it more and more. I still have to deeply test its performance impact, so play with it freely, but be warned that performance may be lower than with the Console Log Provider. Internally it packs data into XML format so that it can be stored in SQL Server easily, but this has some cost (you know, XML is flexible, standard, whatever you want, but it's surely not the most performant format).
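The packing step can be pictured like this. The snippet is a minimal Python sketch with hypothetical field names (not the real provider's format), shown only to illustrate why the XML step adds overhead compared to writing raw console lines:

```python
import xml.etree.ElementTree as ET

# Minimal sketch (hypothetical field names): the whole event becomes a single
# XML string that fits in one CSV column and can later be shredded inside
# SQL Server via the xml data type.
def pack_event(event: dict) -> str:
    root = ET.Element("event")
    for key, value in event.items():
        child = ET.SubElement(root, key)
        child.text = str(value)
    return ET.tostring(root, encoding="unicode")

xml_payload = pack_event({"source": "Data Flow Task", "message": "started"})
print(xml_payload)  # <event><source>Data Flow Task</source><message>started</message></event>
```

The extra per-event serialization work is where the performance cost mentioned above comes from.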

In the future I plan to change the Log Provider interface a little bit again, so that everything can be redirected to the Log Provider, including the initial messages. That way the CSV log can contain everything that can currently be found in the Console Log Provider, even the initialization messages, so that inside SQL Server you can have the complete picture of what happened, right from the initialization of DTLoggedExec, just as happens now with the Console Log Provider. Anyway, I wanted to share this with you so that you can have an idea of the future improvements; for now, just playing with the CSV log provider can already solve a lot of nice problems, for example auditing packages :).

More on that coming.... ;)

posted @ Monday, June 29, 2009 11:40 PM | Feedback (0)
Sunday, April 26, 2009
Input Director – Software KVM Switch and even more!

When I’m at the office or at home, I usually use both my desktop PC and my laptop. The problem with this configuration is that I have two monitors, one per computer, and that’s fine, but also two mice and two keyboards. I’d like to have only one mouse and one keyboard while still being able to use the two different computers with their own monitors.

I’ve searched for a while for a program or a hardware device that would allow me to do that, and I’ve finally found what I was looking for: Input Director!

And it’s also free!!!

One simple word: fabulous! Finally, one mouse and one keyboard to manage all the computers I have! Give it a try, you’ll surely be satisfied!

posted @ Sunday, April 26, 2009 5:19 PM | Feedback (0)
Saturday, April 04, 2009
SQL Bits Demo & Slides available online

For all those who came to SQLBits, and also for all those who didn’t manage to come and would like the slides and demos to play with my DTLoggedExec tool, everything can be found here:

Unfortunately, due to some technical problems, the video recording of the session is not available.

I’ve also uploaded the slides to SlideShare so that anyone can easily access them:

posted @ Saturday, April 04, 2009 5:43 PM | Feedback (0)
Wednesday, February 25, 2009
PASS Europe 2009

This year I’ll also speak at PASS Europe 2009, where I’ll deliver a session dedicated to helping developers maximize the performance of their SQL queries using a set-based approach, instead of the simpler but usually less scalable and efficient row-by-row solution:

Set Based solution: an approach for developers

Set-based solutions are what every developer needs for performance and scalability in a database solution. Unfortunately, thinking in sets can seem difficult at the beginning, so many developers go for a seems-to-be-easier procedural approach. But this "easier" way leads to lower performance and poor scalability and, in the end, to way more complex code that is difficult to maintain and bug-prone. In this session we'll see how to use a set-based approach to solve problems, even those that to most developers seem solvable only with a procedural approach, and then we'll compare the two approaches to show the difference in terms of scalability, performance and complexity. You'll be amazed by the performance and maintainability boost you can give to your applications!
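The session is about T-SQL, but the row-by-row vs. set-based contrast is easy to illustrate in any language. Here is a small Python sketch (purely illustrative, not taken from the session material):

```python
# Purely illustrative example: find the customer ids that appear in both lists.
customers = list(range(1_000))
orders = list(range(0, 1_000, 3))

# Row-by-row (procedural): one scan of the orders per customer -> O(n*m).
def customers_with_orders_rbar(customers, orders):
    result = []
    for c in customers:
        for o in orders:
            if c == o:
                result.append(c)
                break
    return result

# Set-based: declare the result as an intersection and let the engine
# (here Python's hash-based set, in T-SQL the query optimizer) figure out
# how to compute it -> roughly O(n + m), and far less code.
def customers_with_orders_set(customers, orders):
    return sorted(set(customers) & set(orders))

# Same answer, very different cost and complexity.
assert customers_with_orders_rbar(customers, orders) == customers_with_orders_set(customers, orders)
```

The point is the same one the abstract makes: say *what* you want, not *how* to loop for it, and the engine can pick a better plan than your hand-written iteration.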

In general, I must say that I’m impressed by the quality of this edition; the PASS European Conference is really getting better every year! Just look at the speaker list and you’ll have a good idea of what to expect. Then go and check the agenda: I’m pretty sure that you’ll be amazed, as I was! All the sessions are so interesting that I’d like to see every one of them!

Last – but not least – remember that if you register before March 6 you can take advantage of the Early Bird discount.

posted @ Wednesday, February 25, 2009 12:32 PM | Feedback (0)
Wednesday, February 18, 2009
SQLBits, Manchester, March 28th 2009

On the 28th of March, one of the biggest SQL Server conferences in Europe will take place: SQLBits, now at its fourth edition! It’s completely free and it’s held on a Saturday, so you don’t have any work excuses for not participating :-)

SQLBits will be hosted by the Manchester Metropolitan University and, with 28 sessions, will cover the whole SQL Server platform. You can take a look at the agenda here

while registration is done through this link

and if you plan to attend you’d better be fast, since 330 people have already registered (!!!) and there aren’t many free seats left.

This time I’ll also be there, and I’ll deliver a session related to Integration Services, focusing on how to deeply monitor SSIS package performance over time:

Here I’ll show, among other things, how to use DTLoggedExec to build a really rocking monitoring solution for ETL processes, without gettin’ mad doing it, losing yourself among RowCount transformations, Log Providers, Event Handlers and so on.

See you there!

posted @ Wednesday, February 18, 2009 11:10 AM | Feedback (0)
Friday, February 13, 2009
Smart Business Intelligence Solutions with SQL Server 2008 now available

Finally, “Smart Business Intelligence Solutions with SQL Server 2008” is now available in stores worldwide!

  • ISBN: 0735625808
  • ISBN-13: 9780735625808
  • Format: Paperback, 624pp
  • Publisher: Microsoft Press
  • Pub. Date: February 2009

This book is dedicated to everyone who wants or needs to enter the big, complex, exciting world of Business Intelligence on the Microsoft platform. A BI solution is made up of different technologies and specific architecture and design decisions, and this book shows the reader how to properly start a BI solution, avoiding all the typical startup problems and doubts, and giving advice taken from real-world scenarios. For this reason I’m sure you’ll appreciate the book even if you already have some experience in creating BI solutions.

It covers DWH Design, Analysis Services, Integration Services, Reporting Services and Data Mining: all the things you need to be able to handle in your BI project.

I’ve contributed two chapters on Integration Services to this book, and since this is my first book contribution, I’m also particularly proud of it :).

Happy reading!

posted @ Friday, February 13, 2009 3:46 PM | Feedback (0)
Thursday, January 08, 2009
SQL Server Performance Workshop: Siebel, JD Edwards, PeopleSoft

If you’re using SQL Server as the database for one of the well-known enterprise applications like Siebel, JD Edwards or PeopleSoft, you may be very interested in the workshops held by Frank McBath, a well-known expert in the Microsoft and Oracle integration field (by the way, did you know that a Microsoft Oracle Center of Excellence exists?) with a strong knowledge of SQL Server.

The workshop will focus on these topics:

  • Top Reasons for SQL Server 2005/2008
  • Query Repro: Reproducing Problem Queries Outside of the Application and Tools to Fix Them
  • Performance Tuning Methodology: A Repeatable Process
  • Hands on Performance Tuning Lab
  • Databases & Disk Drives: Architecting a Scalable Solution
  • SQL Server Settings
  • Case Study Handouts
  • Architecting for RSCI
  • SQL Server Compression
  • Hyper-V and Virtualization

and will be available for free in these cities:

Zurich January 21 & 22
Estonia January 26 & 27
Stockholm January 29 & 30
New York City February 4 & 5
Los Angeles February 17 & 18
San Francisco February 25 & 26

I hope to be able to attend the Zurich or Tallinn workshop, and I strongly suggest not missing this opportunity!

If you’re interested, you can find all the information you need here:

posted @ Thursday, January 08, 2009 6:45 PM | Feedback (0)
Sunday, January 04, 2009
DTLoggedExec Relased: profile your dataflow!

With the help of these last holidays, I've been able to release the latest version of DTLoggedExec, the DTExec replacement tool I've been developing since 2006.

Besides the usual deep logging of the Control Flow, this new version brings a very important and powerful feature: the ability to log, and thus profile, the Dataflow. This means that it is possible to deeply monitor the performance of your ETL over time, also verifying how every change you make impacts it. Here are some examples from a customer engagement:

(Example charts: Quality of Service; DataFlow Performance, Overall Processed Rows; Package DataFlow Performance Over Time)


But where's the news here? Creating a Dataflow using RowCount transformations already allows you to do that! Yeah, I know, but here's the trick: you don't need to develop your packages using any specific transformation or technique in order to get such logging and profiling! Just run the package using DTLoggedExec with the profiling option and that's it! Logging and profiling out of the box, served fresh!

DTLoggedExec is compatible with SQL Server 2005 and SQL Server 2008, and with any processor architecture.

To help people start using it, I've created a specific website you can refer to:

For any questions, don't hesitate to check the FAQ section of the website or to contact me directly!

The executable and source code are hosted on CodePlex, under a Creative Commons license:

posted @ Sunday, January 04, 2009 2:03 PM | Feedback (0)
Thursday, July 31, 2008
Next version of DTLoggedExec: Dataflow Profiling in sight!

It's been a long time since I've been able to work consistently on my DTLoggedExec tool. Fortunately, in these last days I've been able to spare some free time to continue the work, and within a few days in September 2008 I should be able to release a new version of it.

This new version will contain a new (and, IMHO, very important) feature: DTLoggedExec will be able to profile DataFlow executions, generating a CSV-like file with all the information on how many rows have been processed by each dataflow component.

You can see an example of this file here:

I suggest downloading it so that you can start to play and see how many things you can do with all that information.

Here is the Package used as a sample:

Of course, with such a file it's very easy to import everything into an SSIS performance database that you can create for that purpose, so that you can run queries like this one:


With that you can aggregate the data and monitor how your package is performing on a daily basis.

These are the data currently available in the .dtsprofile file:


DTLoggedExec is capable of correctly handling complex package situations: if you have a DataFlow inside a loop, it detects it and populates the "ExecutionNumber" value accordingly, so that you can profile dataflow execution data for each iteration. Of course, it also provides a timestamp, so you can calculate how many rows are processed per minute (or per second), creating a graph where you can see the trendline of your package execution times against the number of processed rows (for example).
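The rows-per-second calculation described above can be sketched like this (Python, with hypothetical field names; this is not the actual .dtsprofile format):

```python
from datetime import datetime

# Hypothetical profile rows (field names are illustrative only): cumulative
# row counts sampled over time for one dataflow component.
rows = [
    {"execution_number": 1, "component": "Source",
     "timestamp": datetime(2008, 7, 31, 10, 0, 0), "rows": 60_000},
    {"execution_number": 1, "component": "Source",
     "timestamp": datetime(2008, 7, 31, 10, 1, 0), "rows": 120_000},
]

# Rows processed per second between two consecutive samples of the same component.
def throughput(first, second):
    seconds = (second["timestamp"] - first["timestamp"]).total_seconds()
    return (second["rows"] - first["rows"]) / seconds

print(throughput(rows[0], rows[1]))  # 1000.0 rows per second
```

Plot that value per execution over days or weeks and you get exactly the kind of trendline analysis discussed here.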

Basically, you can now do some serious performance trendline analysis of your packages without having to instrument the packages themselves. All the data you need will be extracted by DTLoggedExec itself, right from the SSIS engine.

I now really feel that DTLoggedExec is what DTExec should have been right from the start.

Of course, DTLoggedExec will also work with SQL Server 2008 :-)

posted @ Thursday, July 31, 2008 11:42 AM | Feedback (0)
Wednesday, June 18, 2008
Using SQL Server PowerShell Snapin with other Powershell shells

SQL Server 2008 has strong integration with PowerShell, but it actually also provides a custom shell, SQLPS, which is somewhat limited for me. Well, actually it is limited generally speaking, since it is what's called a "minishell".

"SQLPS.exe is a Minishell (also called “custom shell”). It is a form of pre-packaging of Powershell functionality, and it is available to anyone who wants to do this (make-shell). It is regular Powershell, albeit with limitations that the Powershell team decided to impose on it – it is a ‘closed’ shell, which doesn’t allow adding other snapins.

We are shipping SQLPS to make life of our DBA’s a whole lot easier. If they need to have quick access to the SQL providers, assemblies, cmdlets, default security settings, everything is there. We could have possibly done this through a startup script but not everything can be accomplished this way. We are changing the default security settings, without affecting the settings for overall Powershell. Minishells have their own settings."

Of course it gives PowerShell access right out of the box and requires no configuration, but I need something more developer-friendly when I have to write complex PowerShell scripts.

More in general, when I use PowerShell I'd like to use my preferred IDE (currently PowerGUI). Is it possible to configure a shell other than SQLPS to behave the same way? Fortunately, yes!

It's just a matter of adding the snap-ins so that we can use the SQL: drive:

# Manual procedure
Get-PSSnapin -Registered
Add-PSSnapin SqlServerCmdletSnapin
Add-PSSnapin SqlServerProviderSnapin

We can also automate everything:

# Verbose version
Get-PSSnapin -Registered | Where-Object { $_.Name -like "SqlServer*" } | ForEach-Object { Write-Host "Adding Snapin $($_.Name)"; Add-PSSnapin $_.Name }

# Contracted version
gsnp -Registered | ? { $_.Name -like "SqlServer*" } | % { "Adding Snapin " + $_.Name; asnp $_.Name }

Then we also need to make sure that the default formatting when dealing with SQL Server objects is the one used by SQLPS. Again, all we need to do is register the formatting and type data:

# Load Formatting Types Data
Update-TypeData "C:\Program Files\Microsoft SQL Server\100\Tools\Binn\SQLProvider.Types.ps1xml"
Update-FormatData "C:\Program Files\Microsoft SQL Server\100\Tools\Binn\SQLProvider.Format.ps1xml"

That's it! We can now enjoy SQL Server through our preferred PowerShell shell! The SQL PSDrive, cmdlets and everything else are ready to be used :-)

posted @ Wednesday, June 18, 2008 11:32 AM | Feedback (0)
Wednesday, June 27, 2007
SQL Server Best Practices and Performance Tuning for Oracle Enterprise Applications Workshop

A few days ago, Frank McBath sent me an interesting email regarding a free workshop that will be delivered in Zurich on the 10th and 11th of July.

The workshop is really interesting, as you can see for yourself:

"There are many courses in the market that cover either the application or the database, but very little that focuses on how the two work together. The goal of this workshop is provide insight into that area-- where the database meets the enterprise application. With examples from Siebel, PeopleSoft, JD Edwards and SAP."

Info and registration here:

Don't miss it if you have the chance to go.

Other schedules and locations can be found here:

posted @ Wednesday, June 27, 2007 3:20 PM | Feedback (0)
Tuesday, June 05, 2007
New Version of DTLoggedExec Released

After a while, I've released a new version of DTLoggedExec, the DTExec replacement:

  • Compiled to support 32-bit and 64-bit platforms (both executables are available in the zipped file)
  • Updated the ConsoleLogProvider to add more detailed logging of OnError events. Now all properties of the failing task, along with the related connection properties, are logged. This is IDEAL for post-mortem debugging
  • Fixed a little bug that prevented packages from being loaded from SQL Server using SQL Authentication
  • Display the loaded package version
  • Added a help file
  • Added sample packages
  • Added usage samples
  • More info and download on CodePlex!


posted @ Tuesday, June 05, 2007 9:27 PM | Feedback (0)