Bulk Copy Data into SQL Server with PowerShell
January 13, 2014
The fastest way to get data into SQL Server is to use bulk copy methods, whether you use the old bcp utility or use the same capability in other ways. T-SQL supports the BULK INSERT command as well, allowing you to move data into SQL Server quickly from a flat file.
When working in PowerShell, you may find that you want to load data from some source into SQL Server for tracking or analysis as well. This could be server statistics or any other data that's useful to keep in a database table.
ADO.NET provides the ability to bulk load data using the SqlBulkCopy object. There are two important caveats when working with SqlBulkCopy. The first is that the destination table must match the incoming data exactly: no additional columns, even if they have identity or default constraints (unless you define explicit column mappings). The second is that the source data must be in the form of an ADO.NET DataTable object.
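As an aside, if your destination table does have extra columns, the SqlBulkCopy ColumnMappings collection lets you map the source columns explicitly. Here's a minimal sketch, assuming $cn is an open SqlConnection like the one created later in this article; dbo.SomeWiderTable is a hypothetical table with more columns than the source data.

$bc = new-object System.Data.SqlClient.SqlBulkCopy $cn
$bc.DestinationTableName = "dbo.SomeWiderTable"   # hypothetical wider table
# With explicit mappings, only the mapped columns are copied; unmapped
# destination columns (identity, defaults) are left for SQL Server to fill.
[void]$bc.ColumnMappings.Add("Name", "Name")
[void]$bc.ColumnMappings.Add("Size", "Size")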
Out-DataTable Function
A while back, PowerShell MVP Marc van Orsouw created a function called Out-DataTable and published it to his blog. That original reference has since ceased to exist, but another PowerShell MVP, Chad Miller, has enhanced the function and keeps it publicly available at SourceForge.net.
This function is extremely useful: it takes any PowerShell object and converts it to a DataTable object. You pipe the original object to the function and assign the result to a variable, and that variable will hold a DataTable object.
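To give a sense of what the function does, here's a minimal sketch of the idea. This is not Chad Miller's implementation; his version also handles .NET type mapping, nulls, and other edge cases that matter for bulk copy.

function Out-DataTable {
    # Minimal sketch only: build columns from the first object's properties,
    # then add one row per pipeline object. All columns are typed as strings.
    [CmdletBinding()]
    param([Parameter(Mandatory=$true, ValueFromPipeline=$true)][PSObject]$InputObject)
    begin { $dt = New-Object System.Data.DataTable }
    process {
        if ($dt.Columns.Count -eq 0) {
            foreach ($p in $InputObject.PSObject.Properties) { [void]$dt.Columns.Add($p.Name) }
        }
        $row = $dt.NewRow()
        foreach ($p in $InputObject.PSObject.Properties) { $row[$p.Name] = $p.Value }
        $dt.Rows.Add($row)
    }
    end { ,$dt }  # the leading comma stops PowerShell from unrolling the DataTable
}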
So, let's say we want to capture the size and free space of each logical drive on our server, and we want to load that into SQL Server for later analysis. We can use a Windows Management Instrumentation (WMI) query to collect the disk information and select the Name, Size, and FreeSpace properties from the results. We also want the current date and time, and for that we can add a calculated property (a hash table with Name and Expression keys) to the same select, so our PowerShell object has all four properties.
# DriveType=3 restricts the WMI query to local fixed disks
$logdsk = gwmi -query "select * from Win32_LogicalDisk where DriveType=3" -computername 'WS12SQL' |
    select @{Name="DiskDTM"; Expression={get-date}}, Name, FreeSpace, Size
We need to create a table that has these same four columns in SQL Server, and we'll do that in T-SQL.
USE ServerAnalysis
GO
CREATE TABLE dbo.LogicalDisk (
    [DiskDTM] datetime,
    [Name] varchar(50),
    [FreeSpace] bigint,
    [Size] bigint
)
GO
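If you'd rather stay in PowerShell for this step, and assuming the sqlps module that ships with SQL Server 2012 is available, Invoke-Sqlcmd can create the same table:

Import-Module sqlps -DisableNameChecking
Invoke-Sqlcmd -ServerInstance 'WS12SQL' -Database 'ServerAnalysis' -Query @"
CREATE TABLE dbo.LogicalDisk (
    [DiskDTM] datetime,
    [Name] varchar(50),
    [FreeSpace] bigint,
    [Size] bigint
)
"@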
Once we've created the PowerShell object with the data we need, we can pipe it to the Out-DataTable function and assign the results to a variable we'll use for our input to the SqlBulkCopy object.
$dtable = $logdsk | Out-DataTable
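Before loading anything, a quick sanity check confirms the conversion worked; both of these are standard DataTable members:

$dtable.GetType().FullName   # should report System.Data.DataTable
$dtable.Rows.Count           # one row per local fixed disk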
Next, we'll open a connection to SQL Server so we can load the data. We use the ADO.NET SqlConnection object, opening the connection as soon as we create it.
$cn = new-object System.Data.SqlClient.SqlConnection("Data Source=WS12SQL;Integrated Security=SSPI;Initial Catalog=ServerAnalysis")
$cn.Open()
Create the SqlBulkCopy Object
Now, we're ready to create the SqlBulkCopy object using that connection. We'll set its DestinationTableName property to the name of our table, then call the WriteToServer method, with our DataTable object as its argument, to load the data into SQL Server. Once the load completes, we close the connection and we're done.
$bc = new-object System.Data.SqlClient.SqlBulkCopy $cn
$bc.DestinationTableName = "dbo.LogicalDisk"
$bc.WriteToServer($dtable)
$cn.Close()
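If you want the load to be a bit more defensive, here's a hedged variant of the same steps; it assumes $cn has been created but not yet opened, and the BatchSize and BulkCopyTimeout values (both real SqlBulkCopy properties) are illustrative rather than recommendations:

try {
    $cn.Open()
    $bc = new-object System.Data.SqlClient.SqlBulkCopy $cn
    $bc.DestinationTableName = "dbo.LogicalDisk"
    $bc.BatchSize = 5000           # rows sent per round trip; tune for your volume
    $bc.BulkCopyTimeout = 60       # seconds before WriteToServer gives up
    $bc.WriteToServer($dtable)
}
finally {
    $cn.Close()                    # runs even if WriteToServer throws
}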
This process makes it fairly easy to get large amounts of data into SQL Server very quickly.