Question :
Are there any best practices when configuring TDE in SQL Server 2008? On SQLMag, the article “Transparent Data Encryption FAQs” says that CPU usage may increase by up to 30%.
Other than adding server horsepower, is there anything else DBAs typically do when turning on TDE?
Answer :
- One additional point I have noticed: if you use the backup compression feature, it does not work well together with TDE. We have seen almost negligible compression rates on backups of TDE-encrypted databases, so bear this in mind if you rely on backup compression.
- You are probably already aware, but just to add: TDE is only available in Enterprise edition (and Developer edition), so factor this in when planning SQL Server for TDE.
- TDE does not provide the same granular control, specific to a user or database role, as is offered by cell-level encryption.
- Make sure the encryption keys are stored safely in a secure location that can be accessed in the event of a restore scenario. Familiarize yourself with restoring an encrypted database to a new server. (Originally a comment by Jonathan Fite.)
Firstly, before encrypting a database, take a backup of the master key and certificate and store them offline. Don’t wait until after you’ve applied TDE to do this. Also store the passwords in a password vault and make it clear which password corresponds to which object; you really don’t want to lose track of these.
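As a sketch of that backup step (the certificate name `TDECert` and the file paths are placeholders for your own):

```sql
USE master;

-- Back up the database master key in master; the password here goes into your vault.
BACKUP MASTER KEY
    TO FILE = 'D:\Backup\Keys\master_key.bak'
    ENCRYPTION BY PASSWORD = '<StrongMasterKeyPassword>';

-- Back up the TDE certificate together with its private key.
-- Without the private key file, the certificate cannot decrypt anything on restore.
BACKUP CERTIFICATE TDECert
    TO FILE = 'D:\Backup\Keys\TDECert.cer'
    WITH PRIVATE KEY (
        FILE = 'D:\Backup\Keys\TDECert.pvk',
        ENCRYPTION BY PASSWORD = '<StrongPrivateKeyPassword>'
    );
```

Move these files off the server once written; keeping the only copy of the key backups on the same machine as the encrypted data defeats the purpose.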
The impact that TDE has on a database depends entirely on the workload involved: I recently applied TDE to a data warehouse and the performance impact on the overnight load was negligible, which suggests that the process was not CPU-bound. However, that may not be true for your database, so if you can test the workload in a dev environment first, do so.
It is not just the data files that are encrypted: enabling TDE on any database also encrypts tempdb, so if other databases on the instance use tempdb heavily, they may take a performance hit too. Backups and snapshots are also encrypted.
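You can verify which databases on the instance are encrypted (tempdb will show up here as soon as any user database is encrypted) with a query along these lines:

```sql
-- List every database and its TDE status.
-- encryption_state: 2 = encryption in progress, 3 = encrypted, NULL = no DEK present.
SELECT d.name,
       d.is_encrypted,
       k.encryption_state
FROM sys.databases AS d
LEFT JOIN sys.dm_database_encryption_keys AS k
       ON d.database_id = k.database_id;
```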
Also consider whether this database needs to be restored to other environments (e.g. test or UAT). You will need to restore the certificate used to encrypt the database to those servers first. This may not sound like much of an issue, but if any of those environments are not running Enterprise or Developer edition, it can become costly. An alternative to applying TDE across the whole environment is to restore the database to an intermediate Enterprise/Developer instance and either scramble the sensitive data, or drop the encryption and create a new backup to be restored to the other environments. This second choice is probably not the most sensible, but it is always an option…
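On the target server, restoring the certificate before the database looks roughly like this (again, `TDECert`, `MyDatabase` and the paths are placeholders; the target instance needs its own master key first):

```sql
USE master;

-- Create a master key on the target instance if one does not already exist.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword>';

-- Recreate the TDE certificate from the backed-up certificate and private key files.
CREATE CERTIFICATE TDECert
    FROM FILE = 'D:\Backup\Keys\TDECert.cer'
    WITH PRIVATE KEY (
        FILE = 'D:\Backup\Keys\TDECert.pvk',
        DECRYPTION BY PASSWORD = '<StrongPrivateKeyPassword>'
    );

-- Only after the certificate is in place will the encrypted backup restore successfully.
RESTORE DATABASE MyDatabase
    FROM DISK = 'D:\Backup\MyDatabase.bak';
```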
When turning TDE on, two locks are applied to the database: an update lock (taken while the DDL executes) and a shared lock (taken by the encryption scan). TechNet explains this in full:
When TDE is enabled (or disabled), the database is marked as encrypted in the sys.databases catalog view and the DEK state is set to Encryption In Progress. The server starts a background thread (called the encryption scan or scan) that scans all database files and encrypts them (or decrypts them if you are disabling TDE). While the DDL executes, an update lock is taken on the database. The encryption scan, which runs asynchronously to the DDL, takes a shared lock. All normal operations that do not conflict with these locks can proceed. Excluded operations include modifying the file structure and detaching the database. While normal database writes to disk from the buffer pool are encrypted, log file writes may not be. The scan also forces a rollover for the virtual log file (VLF) to ensure that future writes to the log are encrypted.
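The sequence TechNet describes can be sketched in T-SQL as follows (certificate and database names are placeholders; on SQL Server 2008 the DEK algorithm can be AES_128, AES_192, AES_256 or TRIPLE_DES_3KEY):

```sql
USE master;
-- One-off server preparation: a master key and a certificate to protect the DEK.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword>';
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE protector certificate';
GO

USE MyDatabase;
-- The database encryption key (DEK) is protected by the server certificate.
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TDECert;
GO

-- Kicks off the background encryption scan described above.
ALTER DATABASE MyDatabase SET ENCRYPTION ON;

-- Monitor the scan: encryption_state 2 = in progress, 3 = encrypted.
SELECT DB_NAME(database_id) AS database_name,
       encryption_state,
       percent_complete
FROM sys.dm_database_encryption_keys;
```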
My experience was on data warehouses that were barely used during the day and heavily used overnight, so I was able to apply TDE with minimal disruption during the day. If you are encrypting an OLTP database in production, it is best to schedule this during a maintenance window to minimize issues.
Edit: Also on the point of compression; whilst it is true that TDE affects backup compression, it does not affect row/page compression (or columnstore compression on later versions). So if you want to balance out the loss of backup compression, you could compress objects within the database. Again, depending on your workload, you may or may not want to implement data compression, because it will put further stress on the CPU. There is an excellent TechNet article on compression implementation: https://technet.microsoft.com/en-us/library/dd894051%28v=sql.100%29.aspx
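As a sketch (the table name `dbo.FactSales` is a placeholder), you can estimate the savings before rebuilding a table with page compression:

```sql
-- Estimate how much space PAGE compression would save on a given table
-- before committing to the rebuild.
EXEC sp_estimate_data_compression_savings
     @schema_name      = 'dbo',
     @object_name      = 'FactSales',
     @index_id         = NULL,
     @partition_number = NULL,
     @data_compression = 'PAGE';

-- Apply page compression to the table (rebuilds the heap or clustered index).
ALTER TABLE dbo.FactSales
    REBUILD WITH (DATA_COMPRESSION = PAGE);
```

Running the estimate first lets you weigh the space savings against the extra CPU cost mentioned above.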