question

David 2 1 asked

Question About JSON and External Table

Hi there,

I have a table that holds JSON data of varying lengths in an NVARCHAR(MAX) column. Storing this column within the table itself causes queries on the table to run exceptionally slowly. Is it possible to store this JSON data externally so it is handled by the file system? I know that FILESTREAM allows VARBINARY(MAX) to be stored externally, but I don't know how best to handle this large NVARCHAR(MAX) JSON column to speed up queries on this table. TIA

-- Edit to include example [2017-06-23]

```sql
CREATE TABLE [dbo].[TestJSON](
    [ID] [int] NOT NULL,
    [ROWGUID] [uniqueidentifier] NOT NULL,
    [JSONDATA] [varbinary](max) FILESTREAM NULL,
    [ZIPDATA] [varbinary](max) FILESTREAM NULL,
    [JSON] AS (CONVERT([nvarchar](max), [JSONDATA])) PERSISTED,
    [Submitted] [datetime] NOT NULL
);

-- Executes in 1 second:
SELECT TOP 1 ID, ROWGUID, JSONDATA, ZIPDATA
FROM TestJSON
WHERE ID = 32
ORDER BY Submitted DESC;

-- Executes in 1 hour 12 minutes 3 seconds:
SELECT TOP 1 ID, ROWGUID, JSONDATA, ZIPDATA, JSON
FROM TestJSON
WHERE ID = 32
ORDER BY Submitted DESC;
```

JSON is a persisted computed column. It doesn't matter whether it's persisted or not; the query still takes over an hour to complete. Also, the JSON data is too large to provide sample data here, as each value is around 12 MB in size. Both query plans are almost identical with identical cost: both use the clustered index (cost ~26%) and a sort (~73%). When the JSON column is included, the plan also contains a Filter and a Compute Scalar, but their cost is ~0%.

![alt text](/storage/temp/4157-capture.jpg)
Tags: tsql, table, nvarchar, json
capture.jpg (79.2 KiB)
4 comments

Just a follow-up question -- "Storing this column within the table itself causes the query on the table to run exceptionally slowly." -- could you please provide a bit more info? Are you using this column in the WHERE clause as a predicate? Are you returning it in the SELECT? Just curious what "slowly" means here.
1 Like 1 ·
Added additional info above.
1 Like 1 ·
In addition to what @DenisT says, if you could provide some samples (table schema, indexes, small data sample) to show what it is that you're doing, and maybe some performance information (actual query plan) for the slow stuff, we might be able to offer more constructive help.
0 Likes 0 ·
I cannot reproduce this in my test instance. I assumed that ROWGUID is your clustering key, right? Looking at the plan, my guess is that it is not taking advantage of what's called a pushdown predicate. Basically, you scan the whole clustered index, compute the JSON column for every row, and only then does the Filter keep the rows where ID = 32.
0 Likes 0 ·
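One way to sidestep that ordering of operations is to resolve the qualifying row first and only then touch the LOB columns, so the conversion runs once instead of per scanned row. This is a sketch, not tested against the poster's data; it assumes ROWGUID uniquely identifies a row, as a uniqueidentifier column typically would:

```sql
-- Hypothetical rewrite: locate the single row first (cheap columns only),
-- then join back to fetch and convert the large JSON value for that row alone.
;WITH pick AS (
    SELECT TOP (1) ROWGUID
    FROM dbo.TestJSON
    WHERE ID = 32
    ORDER BY Submitted DESC
)
SELECT t.ID, t.ROWGUID, t.JSONDATA, t.ZIPDATA,
       CONVERT(nvarchar(max), t.JSONDATA) AS [JSON]
FROM dbo.TestJSON AS t
INNER JOIN pick AS p
    ON p.ROWGUID = t.ROWGUID;
```

The inner query can use the same plan as the fast one-second variant; the expensive CONVERT is then applied to a single row rather than inside the scan-plus-sort.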

1 Answer

David 2 1 answered
ID is the clustered index. No idea what a pushdown predicate is. Can you expand? JSON is an NVARCHAR(MAX) -- is that not too large to take advantage of indexing? Can you provide an example? I still believe the problem lies with the large JSON column being processed through the SQL engine. Persisted or not, it is slow.
2 comments

A pushdown predicate is when the predicate is handled by the Index Scan operator instead of a Filter. Why do you need a computed column? Can you just CONVERT in SELECT? Have you looked at how many rows are being returned in both plans by the Scan? What version of SQL Server are you running this on?
0 Likes 0 ·
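To make the predicate cheap to push down, a narrow supporting index is one option. This is an assumption on my part (the thread only confirms that ID is the clustered key), but an index like the following would let the `TOP (1) ... ORDER BY Submitted DESC` be answered by a seek on a few small columns instead of a scan and sort over rows carrying 12 MB LOB values:

```sql
-- Hypothetical index: keyed on the predicate column and the sort column,
-- carrying ROWGUID so the key lookup back to the LOB data is a single row.
CREATE NONCLUSTERED INDEX IX_TestJSON_ID_Submitted
    ON dbo.TestJSON (ID, Submitted DESC)
    INCLUDE (ROWGUID);
```

With this in place, the optimizer can find the newest row for ID = 32 first and perform the expensive conversion only for that row.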
Thanks. Unfortunately it doesn't matter whether the column is persisted or I place the CONVERT in the SELECT; performance doesn't improve either way. I don't think an NVARCHAR(MAX) column can take advantage of indexing because of the column size?
0 Likes 0 ·
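That's correct as far as index keys go: SQL Server does not allow (MAX) types as index key columns. But the full value usually doesn't need to be indexed; if searching within the JSON is ever required, one common workaround is a short computed prefix. This is a hypothetical sketch built from the table in the question (the `JsonPrefix` column name and 450-character cutoff are my own choices, not from the thread):

```sql
-- NVARCHAR(MAX) can't be an index key, but a bounded computed prefix can.
-- 450 characters keeps the key under the 900-byte index key limit.
ALTER TABLE dbo.TestJSON
    ADD JsonPrefix AS
        CAST(LEFT(CONVERT(nvarchar(max), JSONDATA), 450) AS nvarchar(450))
        PERSISTED;

CREATE NONCLUSTERED INDEX IX_TestJSON_JsonPrefix
    ON dbo.TestJSON (JsonPrefix);
```

For the query in the question, though, no index on the JSON content is needed at all; the slowness comes from converting the LOB for every scanned row, not from filtering on it.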
