Hi there, I am trying to bulk insert a large batch of table-stored JSON data into a separate table, where each JSON element gets its own row. I am using a cursor for this, illustrated below, because I need to mark each individual JSON row as completed once it is done. As this is a cursor it is taking quite some time, so is there a quicker way using a set-based query? The script uses the parseJSON function from Phil Factor [ https://www.red-gate.com/simple-talk/sql/t-sql-programming/consuming-json-strings-in-sql-server/ ]:

```sql
DECLARE @id   INT;
DECLARE @cid  INT;
DECLARE @json NVARCHAR(MAX);
DECLARE @cur1 CURSOR;

SET @cur1 = CURSOR FOR
    SELECT ID, CustID FROM Forms WHERE Done IS NULL;

OPEN @cur1;
FETCH NEXT FROM @cur1 INTO @id, @cid;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Pull the JSON for this form
    SET @json = (SELECT CAST(JSONDATA AS NVARCHAR(MAX))
                 FROM Forms
                 WHERE ID = @id AND CustID = @cid);

    -- Shred it into one row per JSON element
    INSERT INTO FormsParseJSON (element_id, sequenceNo, parent_ID, Object_ID, NAME, StringValue, ValueType)
    SELECT element_id, sequenceNo, parent_ID, Object_ID, NAME, StringValue, ValueType
    FROM parseJSON(@json);

    -- Stamp the rows just inserted with their source keys
    UPDATE FormsParseJSON
    SET ID = @id, CustID = @cid
    WHERE ID IS NULL AND CustID IS NULL;

    -- Mark the source row as processed
    UPDATE Forms SET Done = 1 WHERE ID = @id AND CustID = @cid;

    FETCH NEXT FROM @cur1 INTO @id, @cid;
END

CLOSE @cur1;
DEALLOCATE @cur1;
```

TIA.
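For what it's worth, the set-based shape I imagine would look something like the sketch below. This is an assumption on my part, not tested code: it assumes parseJSON (a table-valued function) can be used as a CROSS APPLY target against each row of Forms, and that ID/CustID can be written directly in the INSERT rather than patched in afterwards with the NULL-key UPDATE:

```sql
-- Sketch only: one INSERT shreds every pending form's JSON at once,
-- carrying the source keys through so no follow-up UPDATE on
-- FormsParseJSON is needed.
INSERT INTO FormsParseJSON
    (ID, CustID, element_id, sequenceNo, parent_ID, Object_ID, NAME, StringValue, ValueType)
SELECT f.ID, f.CustID,
       p.element_id, p.sequenceNo, p.parent_ID, p.Object_ID,
       p.NAME, p.StringValue, p.ValueType
FROM Forms AS f
CROSS APPLY dbo.parseJSON(CAST(f.JSONDATA AS NVARCHAR(MAX))) AS p
WHERE f.Done IS NULL;

-- Then mark everything processed in one statement.
UPDATE Forms SET Done = 1 WHERE Done IS NULL;
```

I don't know how well a multi-statement TVF like parseJSON performs under CROSS APPLY, and the two statements above are no longer atomic per row (wrapping both in a transaction may be needed), which is partly why I am asking.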