Passing a table of rows from Excel/Access to a SQL Server stored proc
Given a scenario of passing a table of values (say anywhere from 5,000 or 10,000 rows up to 50,000 rows) from a sheet in Excel or a table in Access to a SQL Server stored procedure, I have seen articles on the net that list TVPs, XML, CLR etc. as options. Sadly, TVPs are not available in classic ADO/VBA. XML appears to me an easier option than CLR (although I have seen articles saying CLR is faster than XML).

However, my basic question is this: why don't I find any articles on the net about "pulling" data from the client machine into SQL Server from a SQL Server stored procedure? (The procedure would be executed from the client using classic ADO in Excel VBA or an Access pass-through query.) An example of such a procedure:

```sql
CREATE PROC mysp_something
    @mylinkedserver nvarchar(200)
AS
DECLARE @mystr nvarchar(500);
SET @mystr = 'insert into sqlserverTableName select * from '
           + @mylinkedserver + '...AccessTableName';
-- (or alternatively using OPENROWSET etc.)
EXEC sp_executesql @mystr;
```

Of course, the main issue with this method is that one has to use dynamic SQL (which can be insecure), since each client has to pass path/file or linked server details to the stored procedure. However, assuming one still uses dynamic SQL while taking due care of security concerns: what would be the speed of this method, and how would it compare with the various other methods listed above? What is the reason this method is never listed anywhere as a means of transferring a table of data from client to SQL Server? Am I missing something basic? For example, possibly there will be Windows permission issues when SQL Server accesses the client — but what if the client Excel/.mdb file is on a shared folder? Also, if the client has a dynamic IP, this will need to be retrieved by SQL Server; however, these issues can be tackled if one uses dynamic SQL.
Would very much appreciate a clarification on this, as it has been troubling me a lot: does the method described above have some basic flaw because of which it should never be used?
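For reference, here is a rough sketch of the XML option mentioned above (the procedure, table, column, and element names are all made up for illustration — the client would build the XML string in VBA and pass it as a single parameter, so no linked server or file path is needed):

```sql
-- Hypothetical sketch: shred an XML parameter into rows server-side.
CREATE PROC mysp_ImportFromXml
    @xmlData xml
AS
BEGIN
    -- Insert one row per <row> element in the incoming document.
    INSERT INTO sqlserverTableName (Col1, Col2)
    SELECT
        r.value('(Col1)[1]', 'nvarchar(100)'),
        r.value('(Col2)[1]', 'int')
    FROM @xmlData.nodes('/rows/row') AS t(r);
END
```

The client side would call this with an XML string such as `<rows><row><Col1>abc</Col1><Col2>1</Col2></row>...</rows>` built from the sheet, which works fine from classic ADO since it is just one nvarchar/xml parameter.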
Why are you using a sproc to do the import? You could equally have the Excel files placed into a predefined import folder and use SSIS/DTS to load the data into the server. That way you are not messing around trying to make things fit on the fly. The data is predefined, the target table is predefined; then use the built-in data import/export tools to do the job.
I still agree with @WilliamD: this is in SSIS's wheelhouse. But you do bring up a point which makes that answer harder — what if you need to deal with SQL Server Express editions and there are absolutely no Standard Edition instances available?

From looking at the stored procedure, it could work, but it is a very fragile solution, in good part because of what you pointed out: it depends upon the Access/Excel file being in a specific location and available. Basically, it sounds like you are doing the following:

1. Access (or Excel) calls the SQL Server stored procedure, passing in the linked server connection.
2. SQL Server acquires a connection to Access (or Excel) and retrieves data.

At this point, you are coupling the spreadsheets and the SQL Server instance together pretty tightly. If somebody moves the Excel spreadsheet, the linked server connection would need to be updated. And if somebody makes a copy of the spreadsheet, you end up with a terrible situation: they could edit the copy, click the "Send my data over!" button, and... have the old version's data sent over. Now you have a "bug."

I think the reason you haven't seen this particular solution isn't that it is impossible or could not work, but rather that the range in which it is a good answer is rather constrained. The way I see it, you would probably need the following conditions to hold true for it to be valuable:

1. At least some of the people using the process are on SQL Server Express installations and have absolutely no access to SQL Server Standard or above.
2. There is enough institutional knowledge (or fear!) to keep people from moving the Excel/Access files around without updating the SQL Server linked servers.
3. The SQL Server instance acts more as a historical repository than a real-time source of truth. Maybe the scenario is that people go on the road with their laptops and enter information into an Access form. When they get back to the office (or hotel), they upload the data to the central server for further analysis.
4. SQL Server Compact Edition, [SQLite](http://www.sqlite.org/), and other product choices are out of the question.

So I wouldn't say that this should _never_ be used. Instead, I would say that there are a limited number of circumstances in which it could be a really good solution (in comparison to other, better-known alternatives), and with Compact Edition/SQLite around, that scope gets smaller. On the downside, there are some configuration issues to keep in mind — dynamic IPs would make things even worse — and unless you build in a number of robustness checks (making sure the linked server connection exists, that it is valid, that the current file is the same one listed in the linked server definition, that people haven't monkeyed around with the structure of the files, etc.), there will be a high probability of some pretty substantial maintenance costs.
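To illustrate the kind of robustness check I mean, the procedure from the question could at least verify that the named linked server is actually registered before building the dynamic string, and quote the name to reduce the injection surface (this is only a sketch; the table names are the question's placeholders):

```sql
CREATE PROC mysp_something
    @mylinkedserver nvarchar(200)
AS
BEGIN
    -- Guard: fail fast with a clear message if the linked server is not
    -- configured, rather than erroring deep inside the dynamic SQL.
    IF NOT EXISTS (SELECT 1 FROM sys.servers WHERE name = @mylinkedserver)
    BEGIN
        RAISERROR('Linked server %s is not configured on this instance.',
                  16, 1, @mylinkedserver);
        RETURN;
    END

    DECLARE @mystr nvarchar(500);
    -- QUOTENAME brackets the server name so it can only be used as an
    -- identifier, not as arbitrary SQL text.
    SET @mystr = 'insert into sqlserverTableName select * from '
               + QUOTENAME(@mylinkedserver) + '...AccessTableName';
    EXEC sp_executesql @mystr;
END
```

You would still want the further checks listed above (file identity, file structure) on top of this; this only covers the most basic failure mode.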