I’ve got a table with more than 400 million rows and want to convert the datatype of one of its columns, specifically
If I execute my statement, I get the following error:
The transaction log for database ‘xxxx’ is full due to ‘ACTIVE_TRANSACTION’
So is there any way to update this table?
(My log file can have a maximum size of 150GB, I don’t have any more free space available.)
Scott’s answer made me realise you might have enough space for this:
Create a datetime2 column with a temporary name, and transfer the original column’s contents to it in batches, to prevent your log from running out of space (I’m assuming your database uses the SIMPLE recovery model).
Then drop the original column, and rename the new column to the old column name.
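A minimal sketch of that approach, assuming a hypothetical table `dbo.BigTable` with a key column `Id` and a `datetime` column `CreatedAt` (adjust the names and the batch size to your schema):

```sql
-- Add the new column under a temporary name.
ALTER TABLE dbo.BigTable ADD CreatedAt_tmp datetime2 NULL;
GO

-- Copy the values across in small batches so each transaction
-- stays small; under SIMPLE recovery the log space is reused
-- after every commit.
DECLARE @rows int = 1;
WHILE @rows > 0
BEGIN
    UPDATE TOP (50000) dbo.BigTable
    SET CreatedAt_tmp = CreatedAt
    WHERE CreatedAt_tmp IS NULL
      AND CreatedAt IS NOT NULL;
    SET @rows = @@ROWCOUNT;
    CHECKPOINT;  -- under SIMPLE recovery, lets the log truncate between batches
END
GO

-- Swap the columns: drop the original, rename the new one.
ALTER TABLE dbo.BigTable DROP COLUMN CreatedAt;
GO
EXEC sp_rename 'dbo.BigTable.CreatedAt_tmp', 'CreatedAt', 'COLUMN';
```

Note the `WHERE CreatedAt_tmp IS NULL` predicate is what lets the loop make progress and be restartable; an index that covers it helps a lot at this row count.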
I would recommend using SSIS:
-> Export the data to a flat file
-> Truncate the table and change the column’s data type to datetime2
-> Reload the data into the table in batch mode
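The middle step of the list above is the cheap one: once the table is empty, the type change is a metadata-only operation. A sketch, using the same hypothetical `dbo.BigTable` / `CreatedAt` names:

```sql
-- Truncation is minimally logged, unlike DELETE.
TRUNCATE TABLE dbo.BigTable;

-- On an empty table this completes instantly, since no rows
-- need to be rewritten.
ALTER TABLE dbo.BigTable
    ALTER COLUMN CreatedAt datetime2 NOT NULL;
```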
One option *might* be to BCP the current data out to a flat file (converting the column in question to its new data type on the way out), drop and recreate the table with the new data type, then BCP the data back in using the -b parameter so that each batch of n rows is committed as a separate transaction, preventing your transaction log from running out of space.
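Roughly, with hypothetical server, database, table, and file names (the conversion happens in the `queryout` query, and `-b` sets the per-transaction batch size on the way back in):

```shell
# Export, casting the column to datetime2 as it goes out:
bcp "SELECT Id, CAST(CreatedAt AS datetime2) AS CreatedAt FROM dbo.BigTable" \
    queryout bigtable.dat -c -S myserver -d mydb -T

# After dropping and recreating the table with the datetime2 column,
# import it, committing every 100,000 rows as its own transaction:
bcp dbo.BigTable in bigtable.dat -c -b 100000 -S myserver -d mydb -T
```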
This subject is covered in great depth by Kendra Little at this link.
The problem is that as you work your way down the table you trigger continuous page splits, on top of the reorganisation of the table itself.