I have a testing database that I restore regularly from a production backup file in SQL Server 2008 R2 to get new data. We often edit, change, or update the stored procedures in the testing DB to test the data against production's stored procedures!
Is there a way to refresh/restore only the table(s) and not the stored procedures, so I don't have to re-apply my changes every time?
Both of these databases (Production and Testing) are in one server.
My DB size is about 5 GB and we can't afford third-party tools.
In SQL Server, there is no option to backup/restore just the tables. You could create an SSIS package to import data from specific/all tables and schedule it using SQL Agent, or run it manually.
As both databases (Production and Testing) are on the same server, the data load may be faster than pulling data from a remote server. To minimize the load on the production database, restore the backup to a new database and then import the data into the Test database.
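As a minimal T-SQL sketch of that restore-then-import flow (all database, file, logical-file, and table names below are placeholders, not taken from the question):

```sql
-- 1) Restore the production backup to a separate staging database,
--    relocating the data/log files so they don't collide with Production's:
RESTORE DATABASE StageDB
FROM DISK = N'C:\Backups\Prod.bak'
WITH MOVE N'Prod_Data' TO N'C:\Data\StageDB.mdf',
     MOVE N'Prod_Log'  TO N'C:\Data\StageDB.ldf',
     REPLACE;

-- 2) Copy table data into the test database (both are on the same server,
--    so a simple cross-database INSERT...SELECT works):
TRUNCATE TABLE TestDB.dbo.Orders;  -- use DELETE instead if FKs reference it
INSERT INTO TestDB.dbo.Orders WITH (TABLOCK)
SELECT * FROM StageDB.dbo.Orders;
```

Dropping StageDB afterwards keeps the staging copy from consuming space between refreshes.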
If you drop and recreate the tables for the refresh, make sure the indexes are created after the data import.
Or, if you choose to bulk import into an existing table, drop the indexes, recreate them after the load, and change the recovery model to simple or bulk-logged for the duration of the import.
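A sketch of that bulk-import pattern (index, table, and database names are illustrative only):

```sql
-- Minimally-logged loads need SIMPLE or BULK_LOGGED recovery:
ALTER DATABASE TestDB SET RECOVERY BULK_LOGGED;

-- Drop nonclustered indexes before the load...
DROP INDEX IX_Orders_CustomerID ON dbo.Orders;

-- ... bulk import the data here (BULK INSERT, bcp, SSIS, etc.) ...

-- ...then rebuild them once, after all rows are in:
CREATE INDEX IX_Orders_CustomerID ON dbo.Orders (CustomerID);

-- Restore the recovery model (and take a full backup if this were
-- a database whose log chain mattered):
ALTER DATABASE TestDB SET RECOVERY FULL;
```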
If you have a few very large tables, you can probably consider moving them to separate filegroups and doing a partial backup and restore.
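For instance, a filegroup-level backup might look like the following sketch (database and filegroup names are placeholders; note that a filegroup restore goes back into the same database, so this suits refreshing a staging copy rather than restoring directly into Test):

```sql
-- Back up only the filegroup holding the large tables:
BACKUP DATABASE ProdDB
   FILEGROUP = N'LargeTablesFG'
   TO DISK = N'C:\Backups\ProdDB_LargeTablesFG.bak';
```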
You can also try third-party tools like Dell LiteSpeed, Idera Virtual Database, or ApexSQL Restore for object-level restore.
Yes, you can do this, just not by using a database RESTORE. Instead, you would use the SqlPackage.exe utility that is part of SQL Server Data Tools (SSDT). This utility usually assists in migrating database projects in Visual Studio to SQL Server. But it can also extract schema and data from a database into a file, and then push whatever is in that file into another database.
I created a CMD script, GetDataFromProduction.cmd, that extracts the schema and data, and then publishes only the tables and data. The Stored Procedures, Functions, etc. are extracted into the .dacpac file (no way around that), but they can be excluded when publishing the content into a database. The following script calls SqlPackage.exe twice, once with /Action:Extract and once with /Action:Publish. It uses /p:ExtractAllTableData=True on the "extract" to grab the data for all tables.
@ECHO OFF

"C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe" ^
   /Action:Extract ^
   /SourceDatabaseName:TEMPTEST ^
   /SourceServerName:(local) ^
   /TargetFile:C:\TEMP\_Extract.dacpac ^
   /p:IgnoreExtendedProperties=True ^
   /p:ExtractAllTableData=True

ECHO Hit the "any" key to publish the extracted data
PAUSE

"C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe" ^
   /Action:Publish ^
   /SourceFile:C:\TEMP\_Extract.dacpac ^
   /TargetDatabaseName:TestPublish ^
   /TargetServerName:(local) ^
   /p:DropObjectsNotInSource=False ^
   /p:ExcludeObjectTypes=Aggregates;ApplicationRoles;Assemblies;AsymmetricKeys;^
BrokerPriorities;Certificates;Contracts;DatabaseRoles;DatabaseTriggers;^
ExtendedProperties;FileTables;MessageTypes;Permissions;Queues;RemoteServiceBindings;^
RoleMembership;Rules;ScalarValuedFunctions;SearchPropertyLists;Sequences;Services;^
Signatures;StoredProcedures;SymmetricKeys;Synonyms;TableValuedFunctions;^
UserDefinedDataTypes;UserDefinedTableTypes;ClrUserDefinedTypes;Views;Audits;Credentials;^
CryptographicProviders;DatabaseAuditSpecifications;Endpoints;ErrorMessages;^
EventNotifications;EventSessions;LinkedServerLogins;LinkedServers;Routes;^
ServerAuditSpecifications;ServerRoleMembership;ServerRoles;ServerTriggers;^
ColumnEncryptionKeys;ColumnMasterKeys;Defaults;ExternalDataSources;ExternalFileFormats;^
ExternalTables;Filegroups;FullTextCatalogs;FullTextStoplists;PartitionFunctions;^
PartitionSchemes;SecurityPolicies;Users;XmlSchemaCollections;DatabaseScopedCredentials;^
Logins

PAUSE
Now, if you do not want the data for all tables, then you can specify individual tables to get. Please note that there is no option to instruct SqlPackage to get all tables except one or two. If you want all but one or two, then you need to specify all of the tables to grab. When specifying individual tables, you need to first specify /p:ExtractAllTableData=False (it was True in the example above) and then add a new "property" to specify each table in "SchemaName.TableName" format:
   /p:TableData="dbo.Table1" ^
   /p:TableData="Orders.Order" ^
   /p:TableData="Orders.OrderItems"
As you can see, you can specify /p:TableData="SchemaName.TableName" as many times as you need. Unfortunately it does not appear that there is any ability to specify a wild-card for "SchemaName.%" or anything like that (because that would make things too easy, and what's the fun in that?). However, you can use the following to generate that list of tables to just copy-and-paste into the CMD script:
SELECT N' /p:TableData="' + SCHEMA_NAME(tbl.[schema_id]) + N'.' + OBJECT_NAME(tbl.[object_id]) + N'" ^' FROM sys.tables tbl;
Please also note that while this method does do exactly what is being requested here, I have no basis for estimating at what data volume this becomes unmanageable. I suppose if a database is too large for /p:ExtractAllTableData=True, then you can break the migration into pieces by specifying /p:ExtractAllTableData=False and a manageable subset of tables, repeating that across two or more of these scripts.
That being said, it does handle the data sync quite nicely in that it will make whatever changes are necessary to get the existing schema and data to match whatever is in the .dacpac file. You don’t need to mess with truncating the tables first or anything like that.
If you do not already have Visual Studio and/or SSDT installed, you can get SSDT by itself from: Download Latest SQL Server Data Tools
On a related topic:
We often edit, change, or update the stored procedures in the testing DB to test the data against production's stored procedures!
Why are you changing the Stored Procedures? Doing this, or needing to do this, might point to a problem in your release process and/or architecture that could be improved. Generally speaking, you shouldn't be making code changes to match a different set of data. That implies that you are either hard-coding Customer info in the code (very bad for several reasons) and/or hard-coding environment-sensitive info, such as URLs, file paths, etc. None of that should be hard-coded; it should all be pulled from config files and/or values in a config table. And you can have a config table with an EnvironmentID field so that all environments can have the exact same data; then you just need a single function to return the current environment's value, which is used to filter against the config table. This makes the release process easier when adding/updating/removing config data, since the same release script (which usually includes values for all environments) runs in all environments and doesn't need any IF conditions to check which environment it is running in.
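A minimal sketch of that EnvironmentID pattern (all object names and values here are illustrative, not from the question):

```sql
-- Config rows for every environment live in every environment:
CREATE TABLE dbo.AppConfig
(
    EnvironmentID  TINYINT       NOT NULL,  -- e.g. 1 = Prod, 2 = Test
    ConfigKey      VARCHAR(100)  NOT NULL,
    ConfigValue    NVARCHAR(500) NOT NULL,
    CONSTRAINT PK_AppConfig PRIMARY KEY (EnvironmentID, ConfigKey)
);
GO

-- The one per-instance difference: a function that reports which
-- environment this server is (hard-coded here for simplicity):
CREATE FUNCTION dbo.GetCurrentEnvironmentID()
RETURNS TINYINT
AS
BEGIN
    RETURN 2;  -- this instance is Test
END;
GO

-- Procedures then read config without any environment-specific IF logic:
SELECT ConfigValue
FROM   dbo.AppConfig
WHERE  EnvironmentID = dbo.GetCurrentEnvironmentID()
AND    ConfigKey = 'FileSharePath';
```

Because the table contents are identical everywhere, refreshing Test from a Production backup no longer clobbers environment-specific settings.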
Comment from O.P. on this answer:
unfortunately I can’t install the package without getting an approval
This approach technically doesn’t require installing on the server running SQL Server.
SqlPackage is a client tool, just like SQLCMD, SSMS, etc. It just needs to be installed on a computer that can connect to the source server. So if I.T. won't let you install it on your local PC, then perhaps you can find a dev or QA server somewhere to install it on.
Community Wiki answer created from comments left on the question:
One option would be to store the stored procedures in a separate database (and use synonyms where necessary). There is no way to say “just back up tables.” You’d have to use replication or some other mechanism, or just restore everything to a copy and then import/export/SSIS/BCP just the tables.
“We can’t afford to have third party tools” – sometimes you can’t afford NOT to. While re-writing common software all over again technically saves you from writing a check, that does not make it free. Please read The cost of reinventing the wheel (and have your boss, or whoever authorizes software purchases and/or pays your salary, read it too). – Aaron Bertrand
Consider keeping your stored procedures (and other database objects) under source control so you can redeploy after a production restore. – Dan Guzman
Or, you can script out the stored procedures and functions from the development database to a file using SSMS; restore the prod backup over the development DB; then run the script to replace the prod procedures and functions with the dev ones.
- In SSMS, right-click the development database, select “Tasks”, then “Generate scripts”.
- In the resulting wizard, select “Stored Procedures” (and, if appropriate, “User-defined functions”).
- In the next window, click the "Advanced" button to open a dialog where you can set options for the script generation. Scroll down to the "Script DROP and CREATE" option and set it to "Script DROP and CREATE". Click "OK".
- Provide the path and name for the script file (definitely write to a single file for this), then finish the wizard.
The resulting file will have the necessary code to replace your production procedures and functions with your development ones.