Question:
When testing some queries, I set the recovery model to FULL on a database and ran two identical 1M-row inserts, with and without TABLOCK.
For the insert with TABLOCK, I got around 9295 log records on a SQL Server 2008 instance and around 8714 log records on a SQL Server 2017 instance. For the insert without TABLOCK, I got around 1035659 records on the 2008 instance and around 1068599 records on the 2017 instance.
The reason for testing on SQL Server 2008 is to match the Data Loading Performance Guide's statement on the recovery models required for minimally logged operations:
Minimally logged operations are available only if your database is in
bulk-logged or simple recovery mode.
So, what am I seeing here if it is not minimal logging?
USE DatabaseName;
ALTER DATABASE DatabaseName SET RECOVERY FULL;
GO
BACKUP DATABASE DatabaseName TO DISK = '\locationDatabaseName.bak';
BACKUP LOG DatabaseName TO DISK = '\locationDatabaseName_log.trn';
GO
IF OBJECT_ID(N'dbo.Accounts', N'U') IS NOT NULL
BEGIN
DROP TABLE dbo.Accounts;
END;
GO
CREATE TABLE dbo.Accounts( AccountID INT PRIMARY KEY NOT NULL,
AccountName varchar(255),
DateCreated DATETIME2);
-- Insert 1M rows into dbo.Accounts without TABLOCK
GO
SET STATISTICS IO, TIME ON;
INSERT INTO dbo.Accounts (AccountID,AccountName,DateCreated)
SELECT TOP(1000000)
ROW_NUMBER() OVER (ORDER BY (SELECT NULL)),
'Name N ' + CAST(ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS varchar(255)),
DATEADD(MINUTE,-ROW_NUMBER() OVER (ORDER BY (SELECT NULL)),GETDATE())
FROM MASTER..SPT_VALUES SPT1
CROSS APPLY MASTER..SPT_VALUES SPT2;
GO
-- check the amount of records in the log file
SELECT COUNT(*)
FROM fn_dblog(NULL, NULL);
--1035659 rows
GO
-- clear the log
BACKUP LOG DatabaseName TO DISK = '\locationDatabaseName_log2.trn';
GO
--drop the table
IF OBJECT_ID(N'dbo.Accounts', N'U') IS NOT NULL
BEGIN
DROP TABLE dbo.Accounts;
END;
GO
-- create the table
CREATE TABLE dbo.Accounts( AccountID INT PRIMARY KEY NOT NULL,
AccountName varchar(255),
DateCreated DATETIME2);
-- Insert 1M rows into dbo.Accounts WITH TABLOCK
GO
SET STATISTICS IO, TIME ON;
INSERT INTO dbo.Accounts WITH(TABLOCK) (AccountID,AccountName,DateCreated)
SELECT TOP(1000000)
ROW_NUMBER() OVER (ORDER BY (SELECT NULL)),
'Name N ' + CAST(ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS varchar(255)),
DATEADD(MINUTE,-ROW_NUMBER() OVER (ORDER BY (SELECT NULL)),GETDATE())
FROM MASTER..SPT_VALUES SPT1
CROSS APPLY MASTER..SPT_VALUES SPT2;
GO
-- check the amount of records in the log file
SELECT COUNT(*)
FROM fn_dblog(NULL, NULL);
--9295 rows
Answer:
When using TABLOCK, the logging, even in the FULL recovery model, is sometimes called "efficient logging": instead of logging every inserted row individually, whole pages are logged. The same insert would be minimally logged in the SIMPLE and BULK_LOGGED recovery models, where only the page allocations go to the log; in the FULL recovery model the log contains fully formatted pages of data instead.
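If you want to see this in your own test, you can group the fn_dblog output by operation right after each insert. This is only a rough sketch (fn_dblog is undocumented, and dbo.Accounts is the test table from the question); with TABLOCK under FULL recovery you should see the output dominated by page-level records such as LOP_FORMAT_PAGE, while the non-TABLOCK run is dominated by one LOP_INSERT_ROWS record per row:
-- Sketch: summarize which kinds of log records the last insert produced.
-- Run right after the insert, before the next log backup clears the log.
SELECT Operation,
       Context,
       COUNT(*) AS LogRecordCount,
       SUM([Log Record Length]) AS TotalBytes
FROM fn_dblog(NULL, NULL)
GROUP BY Operation, Context
ORDER BY COUNT(*) DESC;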
So the number of records that go to the log when inserting with TABLOCK will be nearly the same in the SIMPLE and FULL recovery models, but their content is different: in SIMPLE/BULK_LOGGED the records carry only page allocation information, while in FULL they carry the complete pages.
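A simple way to see that difference in content, even when the record counts look similar, is to compare the total volume of log generated rather than just counting records. Again only a sketch, run right after each insert:
-- Sketch: total log volume generated since the last log backup.
-- With FULL recovery + TABLOCK the full page images make this figure large;
-- with BULK_LOGGED or SIMPLE + TABLOCK it stays small even though the
-- record count is similar.
SELECT COUNT(*) AS LogRecordCount,
       SUM([Log Record Length]) AS TotalLogBytes
FROM fn_dblog(NULL, NULL);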
And it really is full logging, because the insert operation can be fully reconstituted from these log records, whereas in SIMPLE/BULK_LOGGED the log contains only enough information to roll back the insert.
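If you want to confirm the Data Loading Performance Guide's statement directly, you could switch the recovery model and repeat the same TABLOCK insert. A sketch only, using the database name and table from the question:
-- Sketch: rerun the TABLOCK insert under BULK_LOGGED and compare.
ALTER DATABASE DatabaseName SET RECOVERY BULK_LOGGED;
GO
-- ...recreate dbo.Accounts and rerun the INSERT ... WITH (TABLOCK) batch here...
GO
-- Then check fn_dblog with the same summary queries as above: expect a similar
-- record count to the FULL-recovery TABLOCK run, but far fewer total bytes,
-- since only the allocations are logged.
ALTER DATABASE DatabaseName SET RECOVERY FULL; -- switch back when done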