
RocketScientist's Miscellaneous Ramblings

Be careful with excel data import

Stupid, stupid, stupid Excel.

It always looks like it imports data right. It tries really hard to guess at how to import data and convert it. And then it runs into data like this:

"00920"

Which, for some stupid reason, despite the quotes, it interprets as a number. Which it isn't. It's a zip code. In Puerto Rico. Which, if you then copy-paste into SQL Server via Access linked tables (yeah, in a minute), leaves you with 920 in the zip field. And if you do a BETWEEN with that, you'll end up seeing Puerto Rico in the middle of Southern California.
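The failure mode (and the after-the-fact cleanup) can be sketched in a few lines. This is an illustrative Python sketch, not part of the actual Excel-to-Access-to-SQL chain, and `excel_style_import` is a hypothetical stand-in for Excel's type guessing:

```python
def excel_style_import(value):
    # Mimic Excel's type guessing: anything that parses as a number
    # becomes one, quotes or no quotes -- and the leading zeros are gone.
    stripped = value.strip('"')
    try:
        return int(stripped)
    except ValueError:
        return stripped

def repair_zip(value):
    # After the damage is done: pad back out to 5 digits.
    return str(value).zfill(5)

damaged = excel_style_import('"00920"')     # the Puerto Rico zip above
print(damaged)                              # 920 -- and as a string, '920'
                                            # sorts BETWEEN '90000' AND '93000'
print(repair_zip(damaged))                  # 00920
```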

Yeah, it's a long, stupid, non-integrated chain of user interaction to get the data into SQL Server, prone to lots of user error, plus all the "virus portal" programs like Excel, Access and Outlook (just missing Word for the 4-for-4 Office advertisement).

Three different users reported three different problems, which all turned out to be this problem, in three different tables where we store addresses.

And the first person to recommend DTS as a "solution" for this gets knocked upside the head with a Louisville Slugger.

posted on Tuesday, November 18, 2003 4:29 PM

Feedback

# re: Be careful with excel data import

Sounds like DTS could be a solution for this.... :P
11/18/2003 5:06 PM | Tim

# re: Be careful with excel data import

I am serious when I say use DTS. The problem you faced can be overcome easily when importing data from Excel. I have done it myself.
11/18/2003 5:47 PM | DavidM

# re: Be careful with excel data import

DTS works, but instead of exporting as Excel, you may want to export as a delimited text file. This method usually works best for me when I have to use an intermediate step. Export to CSV (or whatever format you need, depending on your data), then use DTS to import from the CSV file.
11/18/2003 6:10 PM | Shannon J Hager

# re: Be careful with excel data import

The problem is that DTS doesn't work. Because as soon as it was in DTS, someone would want to schedule it. And as soon as they scheduled it, I'd need to install the Excel Virus Portal on my production server. No. Huh-uh.

The biggest single problem I have with DTS is that it is a completely different app between when you run it interactively and when you schedule it.

Oh, and the file is CSV. Excel just doesn't handle CSV data types the obvious way.
11/18/2003 8:28 PM | rocketscientist

# re: Be careful with excel data import

If the file is CSV then there is no need for Excel, is there? Your statements are illogical; maybe you're leaving something out. "And as soon as they scheduled it, I'd need to install the Excel". You do not need Excel if you are using DTS to import data from a CSV. That is the reason for using DTS with CSV.
11/18/2003 8:47 PM | Shannon J Hager

# re: Be careful with excel data import

OK Chris, why are you going from CSV to Excel to Access to SQL Server? Doing a Microsoft commercial or something?

Regardless, I'll describe something I no longer have the documentation for, but it might point you in the right direction. I ran across a situation where I was opening an Excel file as a database from within VB (using Jet OLEDB, I think). Sometimes I would get null values from cells that contained real data. What turned out to be going on was that it would do a 10-row scan to try to guess the datatype for each column. There was a Registry setting I was able to change to force it to scan the entire sheet before it decided what datatype I had.

MAYBE you're running into a similar scenario: Excel tries to guess the datatype, decides it's a number, and thus strips your leading zeroes. Of course you could always import into a temp table and then pad with leading zeroes to 5 characters upon insertion into the real table. Or you could create a computed column in the real table that does the padding and use the computed column in your BETWEEN statement instead of the real column.
11/18/2003 11:31 PM | AjarnMark

# re: Be careful with excel data import

OK, I found it. Check http://support.microsoft.com/default.aspx?scid=http://support.microsoft.com:80/support/kb/articles/q257/8/19.asp&NoWebContent=1 and look about 1/3 way down for "Rows to Scan". This only talks about the ADO connection. It's just a WAG on my part that it might also apply to the Excel Import.

You can also check the TypeGuessRows setting in HKLM\Software\Microsoft\Jet\4.0\Engines\Excel\TypeGuessRows

or the MaxScanRows in HKLM\Software\Microsoft\Jet\4.0\Engines\Text\MaxScanRows
11/18/2003 11:43 PM | AjarnMark

# re: Be careful with excel data import

AM, that's EXACTLY the right way to do it. Perfectly and exactly. Which means it'll never happen in the organization I'm working with now :). It was a one-time (ok, a one-time 3 times) import, and people never check their work. The stuff that is automated I deal with using BCP, because BCP works :). However, the import from Excel thing not only works, but works amazingly well, and I recommend it if you have enough brain cells in your organization to carry it through. I don't have the time, the energy, or the budget for the baseball bats I'd go through to teach our end users how to use DTS, Excel or anything. I'm amazed they all wear shoes that tie instead of slip-ons, because that means they've mastered certain intricacies I thought were above their capability.

Second, Shannon: how can you import a CSV file into SQL Server? I mean, a real, honest-to-goodness CSV? Little hint: you can't. Not possible. At least, not without using DTS, which will use the same CSV reader as Excel, which will screw up the data. Here's why: Microsoft defined the CSV format with double quotes around strings (note that our friends in Redmond conveniently ignore this spec when they pull data INTO Excel, but they'll create it going OUT of Excel). SQL Server can't deal with the double quotes the same way. For example, it's perfectly legal to have commas in a quoted string in a CSV file. BCP explodes spectacularly when you do that (or, even worse, it just corrupts the data in all following columns on each line). So the sensible thing to do is to pull the CSV into Excel, let Excel screw up all the data, then put the screwed-up data into SQL Server.
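For what it's worth, the quoted-comma case is easy to demonstrate. A minimal Python sketch (obviously not one of the tools in the chain being discussed) shows the naive comma-split failure next to a real CSV parse:

```python
import csv
import io

# A perfectly legal CSV line: a quoted field with an embedded comma,
# plus a quoted zip code.
line = '"Smith, John","00920","San Juan"\n'

# Naive comma-splitting -- the BCP-style failure: the embedded comma
# shifts every following column over by one.
naive = line.strip().split(",")
print(naive)   # ['"Smith', ' John"', '"00920"', '"San Juan"']

# A real CSV parser keeps the quoted field whole and leaves the zip
# as text, leading zero intact.
row = next(csv.reader(io.StringIO(line)))
print(row)     # ['Smith, John', '00920', 'San Juan']
```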

Furthermore, let's just guess that the file came from a vendor, and one of the users needed to manipulate the data a little before doing anything with it. Excel is the obvious tool for this; however, it can't handle CSV correctly. Not even Microsoft's own bastardized double-quoted version of CSV.


11/19/2003 9:15 AM | rocketscientist

# re: Be careful with excel data import

I import "honest-to-goodness CSV" files into SQL Server frequently. I use DTS, of course, which is why I've said that in every reply to this post so far.
11/19/2003 8:43 PM | Shannon J Hager

# re: Be careful with excel data import

Watch out for this too; it bit me in the a**.

When connecting to an Excel Spreadsheet using the DAO OpenRecordset method, some values in an Excel column may be returned as Null when the underlying value is not a Null. This typically occurs when numeric and text datatypes are intermixed within the same Excel column.

Add the option IMEX=1; to the Excel connection string to force mixed data types to convert to text. (Only use this when just reading the data and not manipulating the Excel spreadsheet, as it sets it to Import mode; see the link for further details.)

http://support.microsoft.com/default.aspx?scid=kb;EN-US;194124
1/20/2004 3:06 PM | Chris

# ways to do the import w/o DTS

Run a CSV-to-XML converter on your data (or write your own in VBScript), then use SQL Server's XML bulk load?

It works well from a batch file, is supported by MS, and has a smaller server footprint than what I'd really like to recommend - Microsoft SFU's Perl.




5/3/2004 1:22 PM | John Sequeira

# I'd never heard of SYLK format before...

and I'd heartily suggest NOT doing a google search for it from work. Apparently, it's the trade name for a....uh...personal lubricant. Not to say this isn't a good idea, and I'm looking into it, just thought that it was a bit....odd.
6/22/2004 3:58 PM | rocketscientist

# Reading data

Please send me information on how to read data from Excel into MySQL-Front using OptiPerl 3.
9/7/2004 6:44 AM | Rabothe

# re: Be careful with excel data import

Have you tried formatting the zip code column as "00000" in Excel? I remember having this issue with those pesky New England zip codes.
11/5/2004 6:17 PM | Bill

# re: Be careful with excel data import

I am facing one mysterious problem while trying to import data from Excel into a SQL Server database. Some of the cell values are not being uploaded to the database, and I am not able to reproduce the problem on some systems.
7/15/2005 12:04 AM | Commander

# re: Be careful with excel data import

Hi!
I am trying to import an Excel file into SQL Server using BCP. I don't want to use DTS. Can you please suggest an approach?
8/24/2005 4:56 AM | Arunabh Baruah


# Importing Data into SQL Server

I have to take data from several sources and import it into SQL Server. The sources are a variety of databases, of which I know nothing at the moment. Is the best way to import them by exporting them as CSV files? What should I use to import them? Can anyone help?
9/6/2005 4:49 AM | Paul Middleton

# re: Be careful with excel data import

I typically prefer tab-delimited over comma-delimited. There seems to be less chance of an embedded tab in a data stream than an embedded comma.

If you can get delimited data, that's probably best from a space-usage and import-speed perspective. If you can only get fixed-width (column-aligned) data, that's fine too, but it'll take up more space (lots of embedded spaces).
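As a quick sketch of the tab-delimited point (Python's csv module standing in here for whatever tool actually writes the file):

```python
import csv
import io

# The same kind of row that breaks naive comma parsing:
# the name field contains a comma.
rows = [["Smith, John", "00920", "19.95"]]

buf = io.StringIO()
# With a tab delimiter, the embedded comma needs no quoting at all.
csv.writer(buf, delimiter="\t").writerows(rows)
print(repr(buf.getvalue()))   # 'Smith, John\t00920\t19.95\r\n'
```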

rs.
9/6/2005 6:22 PM | rocketscientist

# re: Be careful with excel data import

THIS IS VERY SIMPLE
9/15/2005 1:11 AM | santa singh

# re: Be careful with excel data import

SYLK format works just fine for me. I've been using it ever since my MS Multiplan days! SYLK is the RTF of the spreadsheet world, and I have found it to be rock-steady over a 15-year period, still working today on exchanges between Excel 2000 and Excel "everything else". I don't know about all that other stuff you guys are talking about, but SYLK definitely protects your data and its format.
10/18/2005 8:10 AM | FangSmith

# re: Be careful with excel data import

It's very simple, man.
I used DTS on SQL Server 2000 and saved the package as a VB6 .bas file.

Create a new VB project file,
copy-paste the following code, and run it.

'****************************************************************
'Microsoft SQL Server 2000
'Visual Basic file generated for DTS Package
'File Name: C:\excel import\New Package.bas
'Package Name: New Package
'Package Description: DTS package description
'Generated Date: 26/10/2005
'Generated Time: 11:56:19
'Generated by Deepak Nayak
'****************************************************************

Option Explicit
Public goPackageOld As New DTS.Package
Public goPackage As DTS.Package2
Private Sub Main()

Set goPackage = goPackageOld

goPackage.Name = "New Package"
goPackage.Description = "DTS package description"
goPackage.WriteCompletionStatusToNTEventLog = False
goPackage.FailOnError = False
goPackage.PackagePriorityClass = 2
goPackage.MaxConcurrentSteps = 4
goPackage.LineageOptions = 0
goPackage.UseTransaction = True
goPackage.TransactionIsolationLevel = 4096
goPackage.AutoCommitTransaction = True
goPackage.RepositoryMetadataOptions = 0
goPackage.UseOLEDBServiceComponents = True
goPackage.LogToSQLServer = False
goPackage.LogServerFlags = 0
goPackage.FailPackageOnLogFailure = False
goPackage.ExplicitGlobalVariables = False
goPackage.PackageType = 0


Dim oConnProperty As DTS.OleDBProperty

'---------------------------------------------------------------------------
' create package connection information
'---------------------------------------------------------------------------

Dim oConnection As DTS.Connection2

'------------- a new connection defined below.
'For security purposes, the password is never scripted

Set oConnection = goPackage.Connections.New("Microsoft.Jet.OLEDB.4.0")

oConnection.ConnectionProperties("Data Source") = "C:\myExcel.xls" 'your excel file name
oConnection.ConnectionProperties("Extended Properties") = "Excel 8.0;HDR=YES;"

oConnection.Name = "Connection 1"
oConnection.ID = 1
oConnection.Reusable = True
oConnection.ConnectImmediate = False
oConnection.DataSource = "C:\myExcel.xls"
oConnection.ConnectionTimeout = 60
oConnection.UseTrustedConnection = False
oConnection.UseDSL = False

'If you have a password for this connection, please uncomment and add your password below.
'oConnection.Password = "<put the password here>"

goPackage.Connections.Add oConnection
Set oConnection = Nothing

'------------- a new connection defined below.
'For security purposes, the password is never scripted

Set oConnection = goPackage.Connections.New("SQLOLEDB")

oConnection.ConnectionProperties("Integrated Security") = "SSPI"
oConnection.ConnectionProperties("Persist Security Info") = True
oConnection.ConnectionProperties("Initial Catalog") = "CMS"
oConnection.ConnectionProperties("Data Source") = "(local)"
oConnection.ConnectionProperties("Application Name") = "DTS Import/Export Wizard"

oConnection.Name = "Connection 2"
oConnection.ID = 2
oConnection.Reusable = True
oConnection.ConnectImmediate = False
oConnection.DataSource = "(local)"
oConnection.ConnectionTimeout = 60
oConnection.Catalog = "CMS" 'your SQL DATABASE NAME
oConnection.UseTrustedConnection = True
oConnection.UseDSL = False

'If you have a password for this connection, please uncomment and add your password below.
'oConnection.Password = "<put the password here>"

goPackage.Connections.Add oConnection
Set oConnection = Nothing

'---------------------------------------------------------------------------
' create package steps information
'---------------------------------------------------------------------------

Dim oStep As DTS.Step2
Dim oPrecConstraint As DTS.PrecedenceConstraint

'------------- a new step defined below

Set oStep = goPackage.Steps.New
'Replace the database name (CMS) with your database name and the table name (Payments$) with your table name
oStep.Name = "Create Table [CMS].[dbo].[Payments$] Step"
oStep.Description = "Create Table [CMS].[dbo].[Payments$] Step"
oStep.ExecutionStatus = 1
oStep.TaskName = "Create Table [CMS].[dbo].[Payments$] Task"
oStep.CommitSuccess = False
oStep.RollbackFailure = False
oStep.ScriptLanguage = "VBScript"
oStep.AddGlobalVariables = True
oStep.RelativePriority = 3
oStep.CloseConnection = False
oStep.ExecuteInMainThread = False
oStep.IsPackageDSORowset = False
oStep.JoinTransactionIfPresent = False
oStep.DisableStep = False
oStep.FailPackageOnError = False

goPackage.Steps.Add oStep
Set oStep = Nothing

'------------- a new step defined below

Set oStep = goPackage.Steps.New

oStep.Name = "Copy Data from Payments$ to [CMS].[dbo].[Payments$] Step"
oStep.Description = "Copy Data from Payments$ to [CMS].[dbo].[Payments$] Step"
oStep.ExecutionStatus = 1
oStep.TaskName = "Copy Data from Payments$ to [CMS].[dbo].[Payments$] Task"
oStep.CommitSuccess = False
oStep.RollbackFailure = False
oStep.ScriptLanguage = "VBScript"
oStep.AddGlobalVariables = True
oStep.RelativePriority = 3
oStep.CloseConnection = False
oStep.ExecuteInMainThread = True
oStep.IsPackageDSORowset = False
oStep.JoinTransactionIfPresent = False
oStep.DisableStep = False
oStep.FailPackageOnError = False

goPackage.Steps.Add oStep
Set oStep = Nothing

'------------- a precedence constraint for steps defined below

Set oStep = goPackage.Steps("Copy Data from Payments$ to [CMS].[dbo].[Payments$] Step")
Set oPrecConstraint = oStep.PrecedenceConstraints.New("Create Table [CMS].[dbo].[Payments$] Step")
oPrecConstraint.StepName = "Create Table [CMS].[dbo].[Payments$] Step"
oPrecConstraint.PrecedenceBasis = 0
oPrecConstraint.Value = 4

oStep.PrecedenceConstraints.Add oPrecConstraint
Set oPrecConstraint = Nothing

'---------------------------------------------------------------------------
' create package tasks information
'---------------------------------------------------------------------------

'------------- call Task_Sub1 for task Create Table [CMS].[dbo].[Payments$] Task (Create Table [CMS].[dbo].[Payments$] Task)
Call Task_Sub1(goPackage)

'------------- call Task_Sub2 for task Copy Data from Payments$ to [CMS].[dbo].[Payments$] Task (Copy Data from Payments$ to [CMS].[dbo].[Payments$] Task)
Call Task_Sub2(goPackage)

'---------------------------------------------------------------------------
' Save or execute package
'---------------------------------------------------------------------------

'goPackage.SaveToSQLServer "(local)", "sa", ""
goPackage.Execute
goPackage.UnInitialize
'to save a package instead of executing it, comment out the executing package line above and uncomment the saving package line
Set goPackage = Nothing

Set goPackageOld = Nothing
MsgBox "Job Over"
End Sub


'------------- define Task_Sub1 for task Create Table [CMS].[dbo].[Payments$] Task (Create Table [CMS].[dbo].[Payments$] Task)
Public Sub Task_Sub1(ByVal goPackage As Object)

Dim oTask As DTS.Task
Dim oLookup As DTS.Lookup

Dim oCustomTask1 As DTS.ExecuteSQLTask2
Set oTask = goPackage.Tasks.New("DTSExecuteSQLTask")
Set oCustomTask1 = oTask.CustomTask

oCustomTask1.Name = "Create Table [CMS].[dbo].[Payments$] Task"

oCustomTask1.Description = "Create Table [CMS].[dbo].[Payments$] Task"
oCustomTask1.SQLStatement = "CREATE TABLE [CMS].[dbo].[Payments$] (" & vbCrLf
oCustomTask1.SQLStatement = oCustomTask1.SQLStatement & "[Date] smalldatetime NULL, " & vbCrLf
oCustomTask1.SQLStatement = oCustomTask1.SQLStatement & "[Reference] float NULL, " & vbCrLf
oCustomTask1.SQLStatement = oCustomTask1.SQLStatement & "[Amount] float NULL, " & vbCrLf
oCustomTask1.SQLStatement = oCustomTask1.SQLStatement & "[Code] nvarchar (255) NULL, " & vbCrLf
oCustomTask1.SQLStatement = oCustomTask1.SQLStatement & "[Purchase Ledger] float NULL, " & vbCrLf
oCustomTask1.SQLStatement = oCustomTask1.SQLStatement & "[S/W] float NULL, " & vbCrLf
oCustomTask1.SQLStatement = oCustomTask1.SQLStatement & "[PAYE/NIC] float NULL, " & vbCrLf
oCustomTask1.SQLStatement = oCustomTask1.SQLStatement & "[Other] float NULL, " & vbCrLf
oCustomTask1.SQLStatement = oCustomTask1.SQLStatement & "[Bank Charges] float NULL, " & vbCrLf
oCustomTask1.SQLStatement = oCustomTask1.SQLStatement & "[Co-op] float NULL, " & vbCrLf
oCustomTask1.SQLStatement = oCustomTask1.SQLStatement & "[B of Scot] float NULL" & vbCrLf
oCustomTask1.SQLStatement = oCustomTask1.SQLStatement & ")"
oCustomTask1.ConnectionID = 2
oCustomTask1.CommandTimeout = 0
oCustomTask1.OutputAsRecordset = False

goPackage.Tasks.Add oTask
Set oCustomTask1 = Nothing
Set oTask = Nothing

End Sub

'------------- define Task_Sub2 for task Copy Data from Payments$ to [CMS].[dbo].[Payments$] Task (Copy Data from Payments$ to [CMS].[dbo].[Payments$] Task)
Public Sub Task_Sub2(ByVal goPackage As Object)

Dim oTask As DTS.Task
Dim oLookup As DTS.Lookup

Dim oCustomTask2 As DTS.DataPumpTask2
Set oTask = goPackage.Tasks.New("DTSDataPumpTask")
Set oCustomTask2 = oTask.CustomTask

oCustomTask2.Name = "Copy Data from Payments$ to [CMS].[dbo].[Payments$] Task"
oCustomTask2.Description = "Copy Data from Payments$ to [CMS].[dbo].[Payments$] Task"
oCustomTask2.SourceConnectionID = 1
oCustomTask2.SourceSQLStatement = "select `Date`,`Reference`,`Amount`,`Code`,`Purchase Ledger`,`S/W`,`PAYE/NIC`,`Other`,`Bank Charges`,`Co-op`,`B of Scot` from `Payments$`"
oCustomTask2.DestinationConnectionID = 2
oCustomTask2.DestinationObjectName = "[CMS].[dbo].[Payments$]"
oCustomTask2.ProgressRowCount = 1000
oCustomTask2.MaximumErrorCount = 0
oCustomTask2.FetchBufferSize = 1
oCustomTask2.UseFastLoad = True
oCustomTask2.InsertCommitSize = 0
oCustomTask2.ExceptionFileColumnDelimiter = "|"
oCustomTask2.ExceptionFileRowDelimiter = vbCrLf
oCustomTask2.AllowIdentityInserts = False
oCustomTask2.FirstRow = 0
oCustomTask2.LastRow = 0
oCustomTask2.FastLoadOptions = 2
oCustomTask2.ExceptionFileOptions = 1
oCustomTask2.DataPumpOptions = 0

Call oCustomTask2_Trans_Sub1(oCustomTask2)


goPackage.Tasks.Add oTask
Set oCustomTask2 = Nothing
Set oTask = Nothing

End Sub

Public Sub oCustomTask2_Trans_Sub1(ByVal oCustomTask2 As Object)

Dim oTransformation As DTS.Transformation2
Dim oTransProps As DTS.Properties
Dim oColumn As DTS.Column
Set oTransformation = oCustomTask2.Transformations.New("DTS.DataPumpTransformCopy")
oTransformation.Name = "DirectCopyXform"
oTransformation.TransformFlags = 63
oTransformation.ForceSourceBlobsBuffered = 0
oTransformation.ForceBlobsInMemory = False
oTransformation.InMemoryBlobSize = 1048576
oTransformation.TransformPhases = 4

Set oColumn = oTransformation.SourceColumns.New("Date", 1)
oColumn.Name = "Date"
oColumn.Ordinal = 1
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 7
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.SourceColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.SourceColumns.New("Reference", 2)
oColumn.Name = "Reference"
oColumn.Ordinal = 2
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 5
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.SourceColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.SourceColumns.New("Amount", 3)
oColumn.Name = "Amount"
oColumn.Ordinal = 3
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 5
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.SourceColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.SourceColumns.New("Code", 4)
oColumn.Name = "Code"
oColumn.Ordinal = 4
oColumn.Flags = 102
oColumn.Size = 255
oColumn.DataType = 130
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.SourceColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.SourceColumns.New("Purchase Ledger", 5)
oColumn.Name = "Purchase Ledger"
oColumn.Ordinal = 5
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 5
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.SourceColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.SourceColumns.New("S/W", 6)
oColumn.Name = "S/W"
oColumn.Ordinal = 6
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 5
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.SourceColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.SourceColumns.New("PAYE/NIC", 7)
oColumn.Name = "PAYE/NIC"
oColumn.Ordinal = 7
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 5
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.SourceColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.SourceColumns.New("Other", 8)
oColumn.Name = "Other"
oColumn.Ordinal = 8
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 5
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.SourceColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.SourceColumns.New("Bank Charges", 9)
oColumn.Name = "Bank Charges"
oColumn.Ordinal = 9
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 5
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.SourceColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.SourceColumns.New("Co-op", 10)
oColumn.Name = "Co-op"
oColumn.Ordinal = 10
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 5
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.SourceColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.SourceColumns.New("B of Scot", 11)
oColumn.Name = "B of Scot"
oColumn.Ordinal = 11
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 5
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.SourceColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.DestinationColumns.New("Date", 1)
oColumn.Name = "Date"
oColumn.Ordinal = 1
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 135
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.DestinationColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.DestinationColumns.New("Reference", 2)
oColumn.Name = "Reference"
oColumn.Ordinal = 2
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 5
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.DestinationColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.DestinationColumns.New("Amount", 3)
oColumn.Name = "Amount"
oColumn.Ordinal = 3
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 5
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.DestinationColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.DestinationColumns.New("Code", 4)
oColumn.Name = "Code"
oColumn.Ordinal = 4
oColumn.Flags = 102
oColumn.Size = 255
oColumn.DataType = 130
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.DestinationColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.DestinationColumns.New("Purchase Ledger", 5)
oColumn.Name = "Purchase Ledger"
oColumn.Ordinal = 5
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 5
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.DestinationColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.DestinationColumns.New("S/W", 6)
oColumn.Name = "S/W"
oColumn.Ordinal = 6
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 5
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.DestinationColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.DestinationColumns.New("PAYE/NIC", 7)
oColumn.Name = "PAYE/NIC"
oColumn.Ordinal = 7
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 5
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.DestinationColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.DestinationColumns.New("Other", 8)
oColumn.Name = "Other"
oColumn.Ordinal = 8
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 5
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.DestinationColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.DestinationColumns.New("Bank Charges", 9)
oColumn.Name = "Bank Charges"
oColumn.Ordinal = 9
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 5
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.DestinationColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.DestinationColumns.New("Co-op", 10)
oColumn.Name = "Co-op"
oColumn.Ordinal = 10
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 5
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.DestinationColumns.Add oColumn
Set oColumn = Nothing

Set oColumn = oTransformation.DestinationColumns.New("B of Scot", 11)
oColumn.Name = "B of Scot"
oColumn.Ordinal = 11
oColumn.Flags = 118
oColumn.Size = 0
oColumn.DataType = 5
oColumn.Precision = 0
oColumn.NumericScale = 0
oColumn.Nullable = True

oTransformation.DestinationColumns.Add oColumn
Set oColumn = Nothing

Set oTransProps = oTransformation.TransformServerProperties


Set oTransProps = Nothing

oCustomTask2.Transformations.Add oTransformation
Set oTransformation = Nothing

End Sub
10/26/2005 2:41 AM | Deepak R. Nayak

# re: Be careful with excel data import

Does anyone know where to change the number of rows OLE DB uses to read in Excel files?
11/18/2005 11:33 AM | BBQman

# re: Be careful with excel data import

If you know what the file looks like, BCP all the data into a #temp table via a stored procedure, with all fields set as varchar. Strip out the quotes you don't need, then move it across to a user table, converting the fields you need to int or other data types.
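That staging pattern (everything as text first, convert later) can be sketched outside SQL too. A hypothetical Python version, with made-up column names, just to show the shape:

```python
def clean(field):
    # Strip whitespace, then the surrounding quotes you don't need.
    return field.strip().strip('"')

def to_typed(row):
    # Convert only what is truly numeric; the zip stays text because
    # its leading zeros are data.
    name, zip_code, amount = (clean(f) for f in row)
    return {"name": name, "zip": zip_code, "amount": float(amount)}

staged = ['"Ortiz, Ana"', '"00920"', '"19.95"']   # one staged varchar row
print(to_typed(staged))
# {'name': 'Ortiz, Ana', 'zip': '00920', 'amount': 19.95}
```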
12/1/2005 6:09 PM | Pfitzy

# re: Be careful with excel data import

P.S. DTS blows: too slow for any data volume of serious size, and it won't come close to BCP unless particular criteria are met in the data structure. However, DTS has a positive in that it can be learned by monkeys via the GUI.

BCP takes a bit more smarts but provides awesome capabilities when combined with dynamic T-SQL. I operate in tab-delimited text where possible, as it makes the files much clearer; all you need to know is how to deal with true NULL values (hex 00).
12/1/2005 6:16 PM | Pfitzy

# re: Be careful with excel data import

I like cheese...
4/21/2006 4:50 PM | The Whiff

# re: Be careful with excel data import

How do I import data from an Excel sheet into an MS SQL table using VB?
9/19/2006 7:58 AM | chaudhari bharat v.
