I've inherited a horrible stored procedure that loads an XSL file from a text file into a single TEXT column in one row. It BULK INSERTs into a temporary table and then cursors through that table, appending the lines together into a TEXT variable using TEXTPTR and UPDATETEXT, all inside a transaction. What could go wrong? Well, it has suddenly started occasionally not loading some of the start of the file. No errors; it just sometimes misses the first n bytes (always the same number of bytes).

I have a recipe like this that works:

    create table #FileContents
    (
        LineNumber int identity,
        LineContents nvarchar(4000)
    );

    declare @FileName varchar(255);
    declare @NewLine char(2) = char(13) + char(10);
    declare @CmdLine varchar(300);
    declare @XSL varchar(max);

    set @FileName = '';
    set @CmdLine = 'type ' + @FileName;

    insert #FileContents
    exec master.dbo.xp_cmdshell @CmdLine;

    select @XSL = isnull(@XSL, '') + @NewLine + isnull(LineContents, '')
    from #FileContents
    order by LineNumber;

Is there a cleaner way? Maybe one that doesn't use xp_cmdshell?
Here is a sample showing how to insert the contents of the file into the table without using xp_cmdshell:

    create table #FileContents
    (
        LineNumber int identity (1, 1),
        LineContents nvarchar(4000)
    );

    insert into #FileContents
    select f.BulkColumn
    from openrowset
    (
        bulk 'C:\Useless\Temp\some_file.txt',
        single_clob
    ) f;

    select * from #FileContents;

The results display (because this is what was in the file):

    LineNumber  LineContents
    ----------- --------------------------
    1           Hello, Jerry
                Hello, Newman

Of course it goes without saying that the file path is relative to the server.

Oleg
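Since the end goal is a single variable rather than a table, a minimal sketch (reusing the same hypothetical file path as above) can skip the temp table entirely and assign straight from OPENROWSET:

    declare @XSL varchar(max);

    select @XSL = f.BulkColumn
    from openrowset
    (
        bulk 'C:\Useless\Temp\some_file.txt',
        single_clob
    ) f;

Note that single_clob reads the file as varchar(max); for a UTF-16 (Unicode) file, single_nclob returns nvarchar(max) instead, and single_blob returns varbinary(max) if you want the raw bytes untouched.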
If you can create a CLR assembly with EXTERNAL_ACCESS permission, then I would go that route. Given that you have an nvarchar(4000) column, you will want to split larger text files into smaller chunks. You will also want it not to epic fail on Unicode. A CLR table-valued function could be a neat way to achieve both of the above without opening the massive hole that xp_cmdshell does.
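A rough sketch of what such a CLR table-valued function might look like in C# (the class, method, and column names here are all hypothetical, not from the original posts). It streams the file back as numbered chunks of at most 4000 characters, and StreamReader's BOM detection handles common Unicode encodings:

    using System;
    using System.Collections;
    using System.Collections.Generic;
    using System.Data.SqlTypes;
    using System.IO;
    using Microsoft.SqlServer.Server;

    public class FileReaderFunctions
    {
        // Streaming TVF: yields (chunk number, chunk text) pairs,
        // each chunk at most 4000 characters to fit nvarchar(4000).
        [SqlFunction(
            FillRowMethodName = "FillRow",
            TableDefinition = "LineNumber int, LineContents nvarchar(4000)")]
        public static IEnumerable ReadFileChunks(SqlString path)
        {
            int chunkNumber = 0;
            char[] buffer = new char[4000];

            // detectEncodingFromByteOrderMarks lets StreamReader pick up
            // UTF-8/UTF-16 files correctly instead of mangling them.
            using (var reader = new StreamReader(path.Value, true))
            {
                int read;
                while ((read = reader.Read(buffer, 0, buffer.Length)) > 0)
                {
                    yield return new KeyValuePair<int, string>(
                        ++chunkNumber, new string(buffer, 0, read));
                }
            }
        }

        // Called by SQL Server once per yielded row to populate the columns.
        public static void FillRow(object row,
            out SqlInt32 lineNumber, out SqlString lineContents)
        {
            var pair = (KeyValuePair<int, string>)row;
            lineNumber = new SqlInt32(pair.Key);
            lineContents = new SqlString(pair.Value);
        }
    }

The assembly would then be registered with CREATE ASSEMBLY ... WITH PERMISSION_SET = EXTERNAL_ACCESS and bound to a T-SQL function with CREATE FUNCTION ... AS EXTERNAL NAME; the exact names depend on how you deploy it.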