
I am trying to send the results of a SQL query to an email as an Excel attachment using msdb.dbo.sp_send_dbmail.

The result of my query has 701 rows, but when executing msdb.dbo.sp_send_dbmail, the Excel sheet I receive in the email only shows the top 40 rows; the rest of the data is missing.

My query is as below:

 DECLARE
     @tab char(1) = CHAR(9)
 
 EXEC msdb.dbo.sp_send_dbmail
     @profile_name = 'admin',
     @recipients = 'xxxxxxxxxxxx',
     @query = 'SET NOCOUNT ON 
     EXECUTE  [TMS].[dbo].[SearchTickets] 
    @CustomerId="C5C2594A-29A1-47FC-A284-2F42A1F5D4D3"
   ,@ProjectId=null
   ,@CustomerReferenceNumber=null
   ,@ProblemDetails=null
   ,@AssignedTo=null
   ,@AssignedTeamId=null
   ,@CurrentStatus=null
   ,@RespondedBy=null
   ,@ApplicationId=null
   ,@PriorityId=null
   ,@DetailedCategoryId=null
   ,@SubCategoryId=null
   ,@TicketNo=null
   ,@CallTypeCode=null
   ,@CallSourceCode=null
   ,@ResolvedBy=null
   ,@ContactName=null
   ,@Submitter=null
   ,@ProblemSummary=null
   ,@AssignedDateGreater=null
   ,@AssignedDateLesser=null
   ,@ResolvedDateGreater=null
   ,@ResolvedDateLesser=null
   ,@IsResponseOverDue=null
   ,@IsResolutionOverDue=null
   ,@Country=null',
     @subject = 'Testing',
     @attach_query_result_as_file = 1,
     @query_attachment_filename='filename.xls',
     @query_result_separator=@tab,
     @query_result_no_padding=1,
     @query_result_header = 1,
     @query_result_width = 32767

asked Nov 28 at 10:45 AM in Default

dantenan

How big is the file you're receiving? sp_send_dbmail has a file size limit.

Nov 28 at 10:47 AM ThomasRushton ♦♦

Hi Thomas, thanks for responding to my question.

If I manually copy the results of the query (701 rows) and paste them into an Excel sheet, the file size shows as 236 KB.

If there is a limit, is there a way to extend it? If so, please let me know how.

Thanks and Regards

Nov 28 at 11:35 AM dantenan

@dantenan If you manually copy the results of a query and paste them into an Excel sheet, the size of the file is much, much smaller than the size of the file produced by the msdb.dbo.sp_send_dbmail procedure because of the unfortunate file name, or rather extension, that you use. The extension is XLS (the old Excel 97-2003 format), and these files are far greater in size than their respective XLSX counterparts. It looks like you have the correct setting @query_result_no_padding=1, which according to the documentation should override the @query_result_width parameter and thus reduce the file size, but XLS might be a poor choice. Please try changing it to XLSX. The Database Mail default limit for all attachments is 1 MB, and it is possible that you exceed it.

Nov 28 at 02:47 PM Oleg

@Oleg Regarding the default limit of 1 MB: I have already changed the maximum file size limit to 10 MB in the Database Mail Configuration Wizard.

And when I change the extension to .xlsx, I am unable to open the file I receive in Excel; it throws an error saying the file might be corrupted.

The file I receive through email is 64 KB, the same whether I keep the extension as .xls or .xlsx.

Nov 28 at 05:46 PM dantenan

This is the file I am receiving through email with the .xls extension (attached below).

I have attached two Excel sheets: one manually copied and the other received through email.

I want the email copy to also have all the rows of the query result, as in the manually copied Excel sheet.

gil.xls (65.6 kB)
Nov 28 at 05:55 PM dantenan

4 answers

This problem does not appear to be related to any data or configuration issues. The attachments appear to be truncated before the email request is forwarded to the SMTP server, snipping the file somewhere in the middle of a record. If the problem were related to some sort of mysterious row count limitation, then the files would be truncated at the end of the last record, not in the middle. Running the following query will show some information about the sent mail items:

 select
     ma.mailitem_id, mi.recipients, mi.[subject], mi.query,
     [filename], ma.filesize, datalength(ma.attachment) actual_attachment_size,
     cast(ma.attachment as nvarchar(max)) attachment_text
     from msdb.dbo.sysmail_attachments ma inner join msdb.dbo.sysmail_mailitems mi 
         on ma.mailitem_id = mi.mailitem_id
     where filename = '[file_name_goes_here]';

The data in the filesize and actual_attachment_size columns should of course match, but it is a good idea to confirm. If the reported filesize numbers are close to about 65,000, then the problem is at the source (the msdb procedure sending email), not the configuration. The configuration may be checked by running the following query:

 select
     paramname, [description], paramvalue / 1024 size_in_kb
     from msdb.dbo.sysmail_configuration
     where paramname = 'MaxFileSize';

The default is 1,000,000 bytes (which comes to about 976 KB, a bit less than 1 MB).
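
If the limit does turn out to be in play, the value can also be raised from T-SQL instead of the Database Mail Configuration Wizard. A minimal sketch, using 10 MB as an example value (the parameter is specified in bytes):

 -- raise the Database Mail attachment size limit to 10 MB (value is in bytes)
 exec msdb.dbo.sysmail_configure_sp
     @parameter_name  = 'MaxFileSize',
     @parameter_value = '10485760';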

I believe that the problem is related to the "sp_send_dbmail procedure truncates attachments to 64 KB" bug, which was fixed by Microsoft about 3 years ago. Here is the quote from the Symptoms section:

Assume that you use Database Mail and the sp_send_dbmail stored procedure with the @query and @attach_query_result_as_file options in Microsoft SQL Server 2014. The stored procedure runs a query and sets the result set as an attachment of the email. When the query has a large result set, the attachment is truncated to a 64 kilobyte (KB) maximum.

According to this article, the issue was fixed in cumulative update 4 (CU4) for SQL Server 2014, which was released on October 21, 2014, build number 12.0.2430.0. If the SQL Server in question is indeed 2014 and its build number is smaller than 12.0.2430.0, then cumulative update 4 or higher, or better still the latest service pack, needs to be applied to resolve this attachment truncation issue.
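
The build number can be checked with a quick query; 12.0.2430.0 or higher means the CU4 fix is already in place:

 -- check the build of the instance in question
 select
     serverproperty('ProductVersion') as build_number,
     serverproperty('ProductLevel') as product_level,
     serverproperty('Edition') as edition;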

Hope this helps.

Oleg


answered Nov 30 at 01:53 PM

Oleg

@Oleg Yes, I am using SQL Server 2014 and its build number is smaller than 12.0.2430.0.

I have started downloading the CU4 update and will let you know the result after applying it.

Thanks once again for your time

Nov 30 at 03:45 PM dantenan

@dantenan It would be better to just install the latest service pack instead, because all service packs always include whatever was included in all previous cumulative updates, but CU4 as a minimum should do the trick for now.

Nov 30 at 06:37 PM Oleg

@Oleg Yes, I am finally able to see all 701 rows in the Excel sheet after upgrading SQL Server 2014 with CU4.

Below is the link to the CU4 download:

https://support.microsoft.com/en-us/help/2999197/cumulative-update-package-4-for-sql-server-2014

Dec 01 at 03:39 AM dantenan

@Oleg I have attached my Excel sheet, please go through it.

Even after changing TicketDetail.TICKET_PROBLEM_SUMMARY AS ProblemSummary,

to read this instead:

cast(ltrim(rtrim(replace(replace(TicketDetail.TICKET_PROBLEM_SUMMARY, char(13), ''''), char(10), '' ''))) as varchar(max) ) AS ProblemSummary

I still see the column values spilling across other rows.

I also want to fix the things below in the Excel sheet; could you please let me know if there is a way to fix them?

1) How can I adjust the width of each column to fit the data it contains? Every time I open the Excel sheet I have to resize the columns manually to read the complete line in each column.

2) And at the bottom of the Excel sheet there is a message displaying "701 rows executed"; I would also like to get rid of this message.

Thanks and Regards

Dec 01 at 03:55 AM dantenan


PFA

messages-menu.png (227.1 kB)

answered Nov 29 at 07:32 AM

dantenan

@dantenan I just tested some sample data on my end and I have no problems sending out an email with an attachment containing thousands of rows. It really looks like your problem is related to the data and the procedure, so please try the following:

  • In the procedure text, add the line reading set nocount on after the line reading BEGIN

  • In the procedure text, change the data type of @vSQL to nvarchar(max) from varchar(8000), so make the line read DECLARE @vSQL nvarchar(max)

  • Toward the bottom of procedure text, please remove (or comment out) the line reading print @vSQL

  • Change the line reading Execute (@vSQL) to read exec sp_executesql @statement = @vSQL

  • You can change @query_result_header = 0 back to 1 to get your headers back, because removing the PRINT statement already gets rid of that query text.
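
Put together, the relevant parts of the procedure would end up looking roughly like this. This is only a sketch: the parameter list and the code that builds @vSQL are omitted, and only the lines mentioned above are shown in place.

 -- sketch only: parameters and the code that builds @vSQL are omitted
 ALTER PROCEDURE [dbo].[SearchTickets]
     -- ...existing parameters unchanged...
 AS
 BEGIN
     SET NOCOUNT ON;                           -- added right after BEGIN

     DECLARE @vSQL nvarchar(max);              -- was varchar(8000)

     -- ...existing code that builds @vSQL...

     -- print @vSQL;                           -- removed, otherwise the query text lands in the attachment
     EXEC sp_executesql @statement = @vSQL;    -- was: Execute (@vSQL)
 END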

Please let me know if the above works.

Nov 29 at 01:58 PM Oleg

@dantenan Also, and this one is important: it looks like the TicketDetail.TICKET_PROBLEM_SUMMARY column values have line breaks in some rows, which will cause a problem with delimited files (the values of the other columns will shift). The resulting column MUST NOT be returned as varchar(8000) even if it is defined as such; it should be converted to varchar(max). For example, you can replace the line breaks with spaces to fix the column-shifting problem and cast the column as varchar(max) by changing the text in the procedure currently reading

 TicketDetail.TICKET_PROBLEM_SUMMARY AS ProblemSummary,

to read this instead:

 cast(ltrim(rtrim(replace(replace(TicketDetail.TICKET_PROBLEM_SUMMARY, 
     char(13), ''''), char(10), '' ''))) as varchar(max)
 ) AS ProblemSummary

If there are any other varchar(8000) columns, please cast them to varchar(max) as well.
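
For reference, here is the same cleanup as a standalone check outside of the dynamic SQL (so the single quotes are not doubled here); @sample is just a made-up value containing a line break:

 -- standalone check of the cleanup; @sample is a made-up value containing CR/LF
 declare @sample varchar(8000) = 'line one' + char(13) + char(10) + 'line two';
 select cast(ltrim(rtrim(replace(replace(@sample, char(13), ''), char(10), ' '))) as varchar(max)) as ProblemSummary;
 -- returns: line one line two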

Nov 29 at 02:13 PM Oleg

@Oleg First of all, thanks for your detailed step-by-step information.

I was able to implement the steps you mentioned above,

and by doing this I was able to get rid of the select query in the first row of the Excel sheet.

But the rows are still only 46 out of 701. I will attach the Excel file, please go through it.

Nov 29 at 06:33 PM dantenan

PFA of the Excel sheet after the changes in the stored proc.

gil2.csv (65.5 kB)
Nov 29 at 06:35 PM dantenan

@dantenan This is interesting. The data in both spreadsheets (with and without the select statement included) has practically the same number of characters (32,768 in today's file and 32,781 in yesterday's file). The size is reported as 65 KB for each file, which probably means that the procedure generates the files in Unicode. Something causes the files to be truncated once 32,767 characters are reached, which is the value you set for the query result width. It is almost as though the engine believes that the query result consists of one row and one column, and that column is chopped off once the limit is reached.

Just to confirm that this is the case, please put the print back in and execute the procedure manually. This will yield the select statement which is then fed to dynamic execution. Now, take this (now static) select statement and provide it as the value of the @query parameter of the procedure sending email. I highly suspect that this will produce the file with all 701 rows included in it. If what I suspect is correct, this exercise will show the way to cure the problem.
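
The test call would look something like the sketch below; the placeholder comment is where the printed select statement goes, and the remaining parameter values are simply the ones from the original question:

 declare @tab char(1) = char(9);

 exec msdb.dbo.sp_send_dbmail
     @profile_name = 'admin',
     @recipients = 'xxxxxxxxxxxx',
     @query = N'SET NOCOUNT ON;
         /* paste the select statement printed by the procedure here */',
     @subject = 'Static query test',
     @attach_query_result_as_file = 1,
     @query_attachment_filename = 'static_test.xls',
     @query_result_separator = @tab,
     @query_result_no_padding = 1,
     @query_result_header = 1;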

Nov 29 at 07:58 PM Oleg

PFA of the Excel sheet.

gil.zip (86.0 kB)

answered Dec 01 at 03:55 AM

dantenan

@Oleg And I would also like to get rid of the line which appears in the 2nd row of the Excel sheet, right after the header row.

Thanks and Regards.

Dec 01 at 03:57 AM dantenan

@Oleg https://stackoverflow.com/questions/35309739/define-proper-width-in-csv-generated-by-sp-send-dbmail

Please see the above link; this is exactly what is happening in my Excel file.

Dec 01 at 07:45 AM dantenan

@dantenan In order to get rid of the "701 row(s) affected" message you are going to have to use SET NOCOUNT ON; this will remove the message. As far as formatting is concerned, no, it is not possible with the send mail procedure. You call the procedure with instructions to use the tab character as the field delimiter and then attach the results as a file. The file that the procedure generates and attaches is NOT a real Excel spreadsheet; it is merely tab-delimited text which happens to sit in a file with an XLS extension, courtesy of the value of the query attachment filename parameter. Giving the file an XLS extension does not magically make it an Excel file. Excel will still open such a file, but because the guts of the file are just tab-delimited text values with no formatting, the file will end up unformatted, and it will take a single click in the top-left corner followed by a double-click on the border between the A and B column headers to auto-size all columns at once.

Sending out properly formatted Excel files with the results of the stored procedure is possible, but not by the means of the msdb procedure used to send emails.

Dec 01 at 02:58 PM Oleg

@dantenan The wrapping of values onto the next line is not caused by the ProblemSummary column, which is already handled, but by the TICKET_PROBLEM_DETAIL column, which you left out for some reason. I mean that if you have columns which might have line breaks in their values, then you have to handle all of them in the same fashion, just like you did with ProblemSummary.

Dec 01 at 03:06 PM Oleg

@Oleg Got it, thanks a lot once again for your time. The information you provided was really helpful.

Dec 01 at 06:35 PM dantenan


@Oleg As discussed, this is the stored procedure.

There is a parameter @CustomerId where I am trying to add multiple values.

Thanks and Regards


answered Dec 07 at 03:44 PM

dantenan

@dantenan Because the statement is dynamic, there is not much to do to add support for multiple customers. First, increase the size of the @CustomerId parameter from varchar(50) so it can accommodate a list of customers.

Once this is done, please find the line of code which needs to be modified. Currently, it reads:

 if(@CustomerId <> '00000000-0000-0000-0000-000000000000' AND @CustomerId IS not NULL)
 BEGIN
     select @vSQL = @vSQL + 'dbo.Project.FK_CUSTOMER_ORG_ID = ''' + convert(varchar(36), @CustomerId) + '''' 
 END

The select statement needs to be modified to read this:

 select @vSQL = @vSQL + 'dbo.Project.FK_CUSTOMER_ORG_ID in (''' + 
     replace(replace(@CustomerId, ', ', ','), ',', ''',''') + ''')'

This will add support for multiple customer ID values, provided that they are passed to the procedure as a comma-delimited list. For example, 2 customers:

 3EE99669-FC20-442E-81AA-0A58D3C84C9F,6692892C-E2A6-412F-89AD-CC058B5A4033
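
The nested replace calls turn that list into a quoted IN list; a quick way to see the fragment they produce:

 -- shows the WHERE fragment produced for the two-customer example above
 declare @CustomerId varchar(200) =
     '3EE99669-FC20-442E-81AA-0A58D3C84C9F,6692892C-E2A6-412F-89AD-CC058B5A4033';
 select 'dbo.Project.FK_CUSTOMER_ORG_ID in (''' +
     replace(replace(@CustomerId, ', ', ','), ',', ''',''') + ''')' as where_fragment;
 -- returns: dbo.Project.FK_CUSTOMER_ORG_ID in ('3EE99669-...','6692892C-...')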

Also, the current procedure has a bug: if @CustomerId is null, the procedure raises an error instead of running for all customers.

2 days ago Oleg

@Oleg Thanks a lot for your time, it's working like gold.

2 days ago dantenan