Load Excel/Delimited files as tables in SQL Server

I am being asked to load data for the data quality assurance team. Someone else is extracting the data from the application data source for me. I receive the data as .xls/.xlsx files or sometimes as a delimited file if there are more than 300k records.

So when anywhere from 20 to 60 files are handed over to me, I am expected to load the data in a 'few hours' to half a day. What would be your first approach to the task described above? No answers are out of scope. I have all the tools you'd probably mention; I am only limited by my experience. What do you think: SSIS? The data quality team likes the data loaded with the appropriate data types (not everything as VARCHAR(255)).

Just a few requirements: each file needs to be loaded as its own table (one that does not already exist), using the file's header row as the field names and the filename as the table name. Here are the two approaches I've been using:

  1. I have been using a VB module which converts all xls files in a directory to Access objects, then uses an ODBC connection to upload each file as a table into SQL Server. The drawback is that it runs slowly from my local machine. Would it run faster if it were run on the server, possibly with the connection changed from ODBC to something else? Code below:

  2. Another method I used was BULK INSERT, where I (a) use the header row to create the table and then (b) load the file with BULK INSERT.
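For reference, approach 2 can be sketched in T-SQL like this. The path, file name, and table name are placeholders, the header parsing assumes a simple comma-separated header with no embedded commas, and columns are created as VARCHAR(255) purely for illustration (types would be tightened afterwards):

```sql
-- (a) read only the header row into a one-column staging table
CREATE TABLE #header (line VARCHAR(MAX));

BULK INSERT #header
FROM 'C:\Data\file1.csv'
WITH (ROWTERMINATOR = '\n', LASTROW = 1);

-- (b) build a CREATE TABLE statement from the header, then load the rest of the file
DECLARE @cols NVARCHAR(MAX), @sql NVARCHAR(MAX);

SELECT @cols = '[' + REPLACE(line, ',', '] VARCHAR(255), [') + '] VARCHAR(255)'
FROM #header;

SET @sql = 'CREATE TABLE dbo.[file1] (' + @cols + ');';
EXEC (@sql);

BULK INSERT dbo.[file1]
FROM 'C:\Data\file1.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);
```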

What is the better approach? Is there an SSIS tutorial on how to load a directory of files, each as its own table, into SQL Server?

I created a package from this tutorial: http://beyondrelational.com/modules/12/tutorials/24/tutorials/9011/getting-started-with-ssis-part-6-import-export-wizard-part-1.aspx

Could a loop be added to repeat the process for multiple files?


asked Aug 28, 2012 at 06:59 PM in Default


Can you request that all files are the same format? Having them all delimited will be a big help.

How can you tell the data type from a CSV file? Surely they need to provide the DDL with the data?

Aug 28, 2012 at 08:33 PM Fatherjack ♦♦

2 answers

I would link to them either as an Excel or Text (CSV) source. Sure, it isn't as fast as BCP, but you are keeping all that import stuff at arm's length and don't then need staging tables. You can run SQL queries against the linked tables in order to import the data, and do all your transformations in SQL. Robyn and I explained quite a bit about how to do it here: http://www.simple-talk.com/sql/t-sql-programming/sql-server-excel-workbench/ and I recently showed how to do CSV-format imports via a linked server here: http://www.simple-talk.com/sql/t-sql-programming/the-tsql-of-csv-comma-delimited-of-errors/
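As a rough illustration of this kind of ad-hoc query against an Excel source (assuming the ACE OLE DB provider is installed and 'Ad Hoc Distributed Queries' is enabled; the path, table name, and sheet name are placeholders):

```sql
-- HDR=YES treats the first spreadsheet row as column names
SELECT *
INTO dbo.[file1]   -- creates the target table from the source's columns
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\Data\file1.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]');
```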


answered Sep 12, 2012 at 04:45 PM


Phil Factor

I'd love to see an updated article that uses the ACE drivers, Phil.

Sep 16, 2013 at 03:31 AM Jeff Moden

This is what I would do:

  1. Get the files in CSV (comma-separated) format if possible.

  2. Use a loop to go through all the files.

  3. Use BULK INSERT to load the data; here is the syntax for BULK INSERT:
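The basic shape of a BULK INSERT statement looks something like this (the file path and table name are placeholders):

```sql
BULK INSERT dbo.[file1]
FROM 'C:\Data\file1.csv'
WITH (
    FIELDTERMINATOR = ',',   -- column delimiter
    ROWTERMINATOR   = '\n',  -- row delimiter
    FIRSTROW        = 2      -- skip the header row
);
```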


P.S. One option is to use xp_cmdshell when working with OS files.
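For example, the file list for the loop in step 2 could be pulled with xp_cmdshell (it must first be enabled via sp_configure, and the directory here is a placeholder):

```sql
-- xp_cmdshell must be enabled first:
--   EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE;
DECLARE @files TABLE (fname NVARCHAR(255));

INSERT INTO @files (fname)
EXEC master.sys.xp_cmdshell 'dir /b C:\Data\*.csv';

SELECT fname FROM @files WHERE fname IS NOT NULL;  -- one row per file to load
```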


answered Sep 11, 2012 at 08:39 PM


Last Updated: Oct 14, 2013 at 03:11 PM

Copyright 2018 Redgate Software.