In SQL Server, are there any other options for data import and export?

Question :

I’ve been working with SQL Server since SQL Server 7, so I know about BCP, BULK INSERT, OPENROWSET, and SSIS.

My question is a simple one:

  • Do you know of any other ways to import/export data using SQL Server
    Express, versions 2005 through 2014 (and soon 2016)?

I want to make sure that there isn’t anything else, because BCP (which is what I’m using right now) requires the use of xp_cmdshell and, for security reasons, an xp_cmdshell proxy account. I’m being challenged at work and want to confirm that nothing else exists, or is new, in Microsoft SQL Server (Express) before I respond.
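For context, the BCP-via-xp_cmdshell approach described above typically looks something like the following sketch (the database, table, output path, and instance name are placeholders):

```sql
-- Sketch: exporting a table with BCP from T-SQL via xp_cmdshell.
-- Requires xp_cmdshell to be enabled and, for non-sysadmin callers,
-- an xp_cmdshell proxy account -- the security concern noted above.
EXEC master.dbo.xp_cmdshell
    'bcp MyDatabase.dbo.MyTable out C:\temp\MyTable.dat -c -T -S .\SQLEXPRESS';
```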

Answer :

SQLCMD can be used to export data. The default output is sometimes unwieldy for that, but you can use some command-line options to turn off the headers, trim trailing spaces, and to use a comma separator instead of the default space:

sqlcmd -Q "SET NOCOUNT ON; SELECT TOP(10) name, object_id, type FROM sys.objects;" ^
  -h-1 -W -s ","

Returns:

sysrscols,3,S
sysrowsets,5,S
sysclones,6,S
sysallocunits,7,S
sysfiles1,8,S
sysseobjvalues,9,S
syspriorities,17,S
sysdbfrag,18,S
sysfgfrag,19,S
sysdbfiles,20,S

Use the -o option to send the output to a file:

sqlcmd -Q "SET NOCOUNT ON; SELECT TOP(10) name, object_id, type FROM sys.objects;" ^
  -h-1 -W -s "," -o C:\temp\SysObjects.txt

A custom .NET program (console app, Windows app, or even web app) can be created to import and/or export data. For importing, since Table-Valued Parameters (TVPs) are not an option here, it is sometimes faster to transform/serialize the data into XML and then execute a stored procedure that accepts the XML as a parameter and, using the .nodes() function, deserializes it back into rows that can be inserted via an INSERT INTO Table (fields) SELECT x.value(...), ... FROM @XmlParam.nodes(...) construct. Of course, you wouldn’t want to send 1 million rows in this manner, so you would break the import file into batches and send 100 – 500 rows at a time. If there is a lot of data, then straight INSERT statements might be better, but in that case you would want to wrap each batch in an explicit transaction using SqlTransaction.
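As a rough sketch of the receiving side of that pattern (the table, procedure, and attribute names here are hypothetical), the app would serialize each batch of rows as XML and call a procedure like this:

```sql
-- Hypothetical import procedure: the app sends a batch of rows as
-- <rows><row c1="..." c2="..."/>...</rows> in a single XML parameter.
CREATE PROCEDURE dbo.ImportBatch
    @XmlParam XML
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.MyImportTable (Col1, Col2)
    SELECT x.value('@c1', 'NVARCHAR(100)'),
           x.value('@c2', 'INT')
    FROM @XmlParam.nodes('/rows/row') AS t(x);
END;
```

A single small batch would then be sent as, for example:

```sql
EXEC dbo.ImportBatch
    @XmlParam = N'<rows><row c1="abc" c2="1"/><row c1="def" c2="2"/></rows>';
```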


Depending on how comfortable you (or someone you work with) are with .NET programming, you can use SQLCLR to build your own custom import and/or export functionality. This would work in all editions of SQL Server in all versions starting with SQL Server 2005.
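Broadly, wiring a compiled .NET assembly into SQL Server as a SQLCLR procedure looks something like the following sketch (the assembly path, namespace, class, and method names are placeholders):

```sql
-- Enable CLR integration (off by default).
EXEC sp_configure 'clr enabled', 1;
RECONFIGURE;

-- Register a compiled .NET assembly containing the import/export logic.
CREATE ASSEMBLY MyImportExport
FROM 'C:\Assemblies\MyImportExport.dll'
WITH PERMISSION_SET = EXTERNAL_ACCESS;  -- required for file / network access

-- Expose a static method in that assembly as a T-SQL stored procedure.
CREATE PROCEDURE dbo.ExportToFile
    @Query NVARCHAR(MAX),
    @FilePath NVARCHAR(4000)
AS EXTERNAL NAME MyImportExport.[MyNamespace.Exporters].ExportToFile;
```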

Along these lines, there is a commercially available library of SQLCLR functions and procedures called SQL# (which I am the creator of). Some of the functions and stored procedures handle importing and exporting data. The File_SplitIntoFields Stored Procedure returns a result set where each row is a line from the input file, but each field in the file is a separate column. By default the datatype for each field is NVARCHAR(MAX), but there is an optional input parameter to specify what the exact datatype of each column should be. This makes it rather easy to do:

INSERT INTO dbo.MyImportTable (Col1, Col2, Col3, Col4)
  EXEC SQL#.File_SplitIntoFields
    @FilePath = N'',
    @RegExDelimiter = N',',
    @DataTypes = N'NVARCHAR(100),INT,DATETIME,BIT';

And there is an equivalent Stored Procedure called INET_SplitIntoFields that does the same thing, but rather than sourcing the data from a file, it streams it directly from a URL (hence no need to download the file locally first).

For exporting, there is a Stored Procedure called DB_BulkExport that combines many of the features of SQLCMD and SSIS. I created it after needing to do several exports and not being able to decide between SQLCMD and SSIS: each has its pros and cons, and each has one or two needed features that the other lacks. For example, DB_BulkExport doesn’t need a hard-coded output field specification and can handle SELECT * queries (something I couldn’t figure out how to manage in SSIS but was easy with SQLCMD). It can also handle text-qualification of fields without the source query having to concatenate the double-quotes in (which makes for a very ugly and hard-to-maintain query). In fact, it can be set to text-qualify ALL fields, or just the fields that need it based on their datatype (strings, dates, GUIDs, etc.).
