Table Variable Tip
Table variables aren't affected by user transaction rollback
February 8, 2015
As a SQL practitioner, you might have wondered whether there's ever a good reason to use table variables instead of temporary tables. One of the benefits of table variables is that, like normal variables, a rollback of a user transaction doesn't affect them. If a single statement doesn't complete, its changes are rolled back. But once a single statement completes, a rollback of the user transaction—whether intentional or not—doesn't undo the changes to the table variable. Conversely, temporary tables are affected by a rollback just like normal tables are. Any changes that occur to a temporary table within a user transaction are undone if the transaction is rolled back.
This characteristic of table variables can be especially useful in error handling with transactions and triggers. Suppose that when an error occurs in a transaction or a trigger, or when some condition requires you to roll back the transaction, you need to save to a log table the data that was generated by the transaction.
If you first try to copy the data to the log table and then roll back the transaction, you undo the work that wrote to the log table. Furthermore, if you use TRY-CATCH to handle errors and your transaction enters a doomed state, you aren't allowed to make any writes to the user tables. If you issue a rollback and then try to copy the data to the log table, it's too late because the rollback undoes the changes that were created by the transaction.
This might seem like a catch-22 situation. The solution is to save the data to a table variable before you issue the rollback, then issue the rollback (which doesn't affect the table variable, unlike with temporary tables), and then in a new transaction copy the data from the table variable to the log table.
I'll demonstrate this technique with two examples: one with a doomed transaction and the other with a trigger.
Example with a Doomed Transaction
Run the code in Listing 1 to create a table called T1 and a sequence called SeqT1batch.
SET NOCOUNT ON;
USE tempdb;

IF OBJECT_ID(N'dbo.T1', N'U') IS NOT NULL DROP TABLE dbo.T1;
IF OBJECT_ID(N'dbo.SeqT1batch', N'SO') IS NOT NULL DROP SEQUENCE dbo.SeqT1batch;

CREATE SEQUENCE dbo.SeqT1batch AS INT MINVALUE 1 CYCLE CACHE 10000;

CREATE TABLE dbo.T1
(
  batch INT NOT NULL,
  step  INT NOT NULL,
  data  FLOAT NOT NULL,
  CONSTRAINT PK_T1 PRIMARY KEY(batch, step)
);
GO
Now, examine the code in Listing 2.
BEGIN TRAN;

DECLARE
  @batch AS INT = NEXT VALUE FOR dbo.SeqT1batch,
  @i AS INT = 1;

WHILE @i <= 5
BEGIN
  INSERT INTO dbo.T1(batch, step, data)
    VALUES(@batch, @i, LOG(ABS(CHECKSUM(NEWID()))%10));
  SET @i += 1;
END;

COMMIT TRAN;
GO
This code represents a process that runs as a single transaction. The process creates a new batch number using the sequence object and then adds five rows to T1 with the batch number, the step number, and the result of some computation.
I'm using a LOG function just as an illustration for code that can fail. In the case of LOG, it fails when you pass a nonpositive input.
I use randomization to generate an integer in the range 0 through 9, so on average 1 in every 10 computations fails (when the input is 0). Because the code issues five such inserts, a given execution of the code fails with probability 1 - 0.9^5, which is about 0.41, so roughly every other execution fails.
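If you want to convince yourself of that failure rate, here's a quick sanity check (a Python sketch, not part of the article's T-SQL; the simulation mimics the five-insert loop with a random digit per insert):

```python
import random

# Each computation draws an integer 0-9 and fails only on 0,
# so a batch of five inserts fails with probability 1 - 0.9**5.
exact = 1 - 0.9 ** 5

# Simulate many executions of the five-insert loop.
random.seed(42)
trials = 100_000
failures = sum(
    any(random.randrange(10) == 0 for _ in range(5))
    for _ in range(trials)
)
simulated = failures / trials

print(f"exact: {exact:.4f}, simulated: {simulated:.4f}")
```

The exact value comes out to about 0.41, so "roughly every other execution fails" is a fair approximation.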
This type of error happens to be a transaction-aborting error; therefore, as soon as it occurs, the transaction is rolled back and all changes are undone. Try running the code a few times until you get an error such as the following:
Msg 3623, Level 16, State 1, Line 40
An invalid floating point operation occurred.
Suppose that you need to add logic so that, in case of an error, the rows that were generated in the current batch are saved to a log table before the transaction is rolled back. Use the code in Listing 3 to create the log table.
IF OBJECT_ID(N'dbo.T1Log', N'U') IS NOT NULL DROP TABLE dbo.T1Log;

CREATE TABLE dbo.T1Log
(
  lsn INT NOT NULL IDENTITY
    CONSTRAINT PK_T1Log PRIMARY KEY,
  ts DATETIME2(0) NOT NULL DEFAULT(SYSDATETIME()),
  batch INT NOT NULL,
  step  INT NOT NULL,
  data  FLOAT NOT NULL
);
GO
Listing 4 demonstrates how you might want to implement the solution using a TRY-CATCH construct and a table variable.
BEGIN TRY

  BEGIN TRAN;

  DECLARE
    @batch AS INT = NEXT VALUE FOR dbo.SeqT1batch,
    @i AS INT = 1;

  WHILE @i <= 5
  BEGIN
    INSERT INTO dbo.T1(batch, step, data)
      VALUES(@batch, @i, LOG(ABS(CHECKSUM(NEWID()))%10));
    SET @i += 1;
  END;

  COMMIT TRAN;

END TRY
BEGIN CATCH

  IF XACT_STATE() <> 0
  BEGIN
    PRINT 'Error occurred. Saving partially generated data for batch '
      + CAST(@batch AS VARCHAR(36)) + ' to T1Log.';

    -- copy data from T1 into table variable
    DECLARE @T1Temp AS TABLE
    (
      batch INT NOT NULL,
      step  INT NOT NULL,
      data  FLOAT NOT NULL
    );

    INSERT INTO @T1Temp(batch, step, data)
      SELECT batch, step, data
      FROM dbo.T1
      WHERE batch = @batch;

    -- roll back transaction
    ROLLBACK TRAN;

    -- copy data from table variable into log table
    INSERT INTO dbo.T1Log(batch, step, data)
      SELECT batch, step, data
      FROM @T1Temp
      WHERE batch = @batch;
  END
  ELSE
    PRINT 'Error occurred. Transaction terminated. Cannot save data for batch '
      + CAST(@batch AS VARCHAR(36)) + ' to T1Log.';

END CATCH;
GO
When a dooming error occurs in a transaction within a TRY block, the transaction enters a failed, or doomed, state and control is passed to the CATCH block. Under a doomed state, you're not allowed to make changes to a user table, and eventually you'll have to roll back the transaction.
The good news is that you're allowed to write data to a table variable. So in the CATCH block the code declares a table variable, copies the rows of the current batch that did make it to T1 to the table variable, rolls back the transaction, and then copies the rows from the table variable to the log table.
Run this code a few times until you get a message saying that there was an error. Then query the log table by running the following code:
SELECT * FROM dbo.T1Log;
When I ran this code after three failures, I got the output that Figure 1 shows.
lsn  ts                   batch  step  data
---- -------------------- ------ ----- ------------------
1    2014-12-02 07:29:09  10     1     0.693147180559945
2    2014-12-02 07:29:09  10     2     2.19722457733622
3    2014-12-02 07:29:09  10     3     1.94591014905531
4    2014-12-02 07:29:09  10     4     2.19722457733622
5    2014-12-02 07:29:12  11     1     1.09861228866811
6    2014-12-02 07:29:13  12     1     1.79175946922805
7    2014-12-02 07:29:16  15     1     2.07944154167984
8    2014-12-02 07:29:16  15     2     0
9    2014-12-02 07:29:16  15     3     1.79175946922805
Example with a Trigger
Another case in which table variables can come in handy is with triggers. Suppose that in a certain condition you need to issue a rollback in a trigger, but you need to save the data from the inserted and/or deleted tables in log tables. If you first copy the data from the tables inserted and deleted to the log tables and then issue a rollback, the write to the log tables is undone. If you issue a rollback before copying the data, the inserted and deleted tables are emptied. The solution is to copy the data from the tables inserted and deleted to table variables, then issue a rollback, and then copy the data from the table variables to the log tables.
As an example, run the code in Listing 5 to create a table called T2 and an AFTER INSERT, UPDATE trigger on the table.
SET NOCOUNT ON;
USE tempdb;

IF OBJECT_ID(N'dbo.T2', N'U') IS NOT NULL DROP TABLE dbo.T2;

CREATE TABLE dbo.T2
(
  keycol INT NOT NULL IDENTITY
    CONSTRAINT PK_T2 PRIMARY KEY,
  data INT NOT NULL
);
GO

CREATE TRIGGER trg_T2_insert ON dbo.T2 AFTER INSERT, UPDATE
AS

PRINT 'Trigger trg_T2_insert at work...';
GO
For now the trigger contains just a PRINT statement saying that it's at work; normally it would do something more meaningful.
Suppose that when a user inserts or updates rows in T2 with a value of 0 in the data column in any of the rows, you're supposed to roll back the transaction. Why not use a simple CHECK constraint instead? Because you also need to write to a log table the rows that the user attempted to insert or update, along with the action type (INSERT or UPDATE) and a batch number, obtained from a sequence object, that represents the failed statement.
To achieve this, you'll need to create the log table, create a sequence object to number the failed statements, and alter the trigger with the new logic. Run the code in Listing 6 to create the sequence object SeqT2FailedStmt and the log table T2Log.
IF OBJECT_ID(N'dbo.T2Log', N'U') IS NOT NULL DROP TABLE dbo.T2Log;
IF OBJECT_ID(N'dbo.SeqT2FailedStmt', N'SO') IS NOT NULL DROP SEQUENCE dbo.SeqT2FailedStmt;

CREATE SEQUENCE dbo.SeqT2FailedStmt AS INT MINVALUE 1 CYCLE CACHE 10000;

CREATE TABLE dbo.T2Log
(
  lsn INT NOT NULL IDENTITY
    CONSTRAINT PK_T2Log PRIMARY KEY,
  ts DATETIME2(0) NOT NULL DEFAULT(SYSDATETIME()),
  failedstmt INT NOT NULL,
  theaction VARCHAR(6) NOT NULL,
  keycol INT NOT NULL,
  data INT NOT NULL
);
GO
Run the code in Listing 7 to alter the trigger, adding the new logic.
ALTER TRIGGER trg_T2_insert ON dbo.T2 AFTER INSERT, UPDATE
AS

PRINT 'Trigger trg_T2_insert at work...';

IF EXISTS(SELECT * FROM inserted WHERE data = 0)
BEGIN
  PRINT 'Rolling back transaction. Data can be found in T2Log.';

  DECLARE
    @theaction AS VARCHAR(6) =
      CASE WHEN EXISTS(SELECT * FROM deleted) THEN 'UPDATE' ELSE 'INSERT' END,
    @failedstmt AS INT = NEXT VALUE FOR dbo.SeqT2FailedStmt;

  -- copy data from the table inserted into table variable
  DECLARE @T2Temp AS TABLE
  (
    keycol INT NOT NULL,
    data INT NOT NULL
  );

  INSERT INTO @T2Temp(keycol, data)
    SELECT keycol, data FROM inserted;

  -- roll back the transaction
  ROLLBACK TRAN;

  -- copy data from table variable into log table
  INSERT INTO dbo.T2Log(keycol, failedstmt, theaction, data)
    SELECT keycol, @failedstmt, @theaction, data FROM @T2Temp;
END;
GO
The trigger uses an EXISTS predicate to check if any row with data = 0 exists in the table inserted. If so, it needs to log the rows from inserted in the table T2Log. The code determines whether the action was UPDATE or INSERT based on whether rows exist in the table deleted. It also obtains a new sequence value from the sequence SeqT2FailedStmt to represent the failed statement. The code then declares a table variable, copies the rows from the table inserted to the table variable, rolls back the transaction, and then copies the rows from the table variable to T2Log.
After you alter the trigger, run the following code to insert and update rows in T2:
INSERT INTO dbo.T2(data) VALUES(10),(20),(30);
GO
INSERT INTO dbo.T2(data) VALUES(0),(5),(15);
GO
INSERT INTO dbo.T2(data) VALUES(40),(50),(60);
GO
INSERT INTO dbo.T2(data) VALUES(25),(0),(35);
GO
INSERT INTO dbo.T2(data) VALUES(70),(80),(90);
GO
INSERT INTO dbo.T2(data) VALUES(45),(55),(0);
GO

UPDATE dbo.T2
  SET data = data - 50
WHERE data >= 50;
GO
The trigger will roll back some of the modifications. When it does, you'll get the following error message:
Trigger trg_T2_insert at work...
Rolling back transaction. Data can be found in T2Log.
Msg 3609, Level 16, State 1, Line 227
The transaction ended in the trigger. The batch has been aborted.
Query the table T2Log to see that the trigger managed to save the rows despite the rollback:
SELECT * FROM dbo.T2Log;
I got the output that Figure 2 shows.
lsn  ts                   failedstmt  theaction  keycol  data
---- -------------------- ----------- ---------- ------- -----
1    2014-12-02 07:37:42  1           INSERT     6       15
2    2014-12-02 07:37:42  1           INSERT     5       5
3    2014-12-02 07:37:42  1           INSERT     4       0
4    2014-12-02 07:37:42  2           INSERT     12      35
5    2014-12-02 07:37:42  2           INSERT     11      0
6    2014-12-02 07:37:42  2           INSERT     10      25
7    2014-12-02 07:37:42  3           INSERT     18      0
8    2014-12-02 07:37:42  3           INSERT     17      55
9    2014-12-02 07:37:42  3           INSERT     16      45
10   2014-12-02 07:37:42  4           UPDATE     15      40
11   2014-12-02 07:37:42  4           UPDATE     14      30
12   2014-12-02 07:37:42  4           UPDATE     13      20
13   2014-12-02 07:37:42  4           UPDATE     9       10
14   2014-12-02 07:37:42  4           UPDATE     8       0
Handy Table Variable Behavior
As it turns out, it's quite handy that table variables aren't affected by a user transaction rollback. This behavior is especially useful when you need to save data in a transaction that you need to roll back or in a transaction that has to roll back due to a dooming error. In my next article, I'll cover additional tips concerning table variables.