
Quickly move data between instances for testing using only PowerShell


Quick tip: how many times has a dev asked you to move PROD/QA data to DEV to test a feature? Of course, not the whole DB (that would be too easy via a scripted backup/restore), but a single table, or a subset of table data?

How do you usually deal with it? With the SSMS Import Data Wizard? (ugh..) Exporting to a flat file and reimporting? Linked servers? External tools? Restoring the whole database on the dev instance and extracting the single table? Ignoring the dev until they go away?

The easiest way for me is using PowerShell, like so:

### Parameters
$SourceInstance = 'SOURCEINSTANCE'
$SourceDB = 'SOURCEDB'
$SourceSchema = 'dbo'
$SourceTable = 'SOURCETABLE'

$DestinationInstance = 'DESTINATIONINSTANCE'
$DestinationDB = 'DESTINATIONDATABASE'
$DestinationSchema = 'dbo'
$DestinationTable = 'DESTINATIONTABLE'

### End of parameters

# Read the source table into memory (SqlServer module)
$DataTable = Read-SqlTableData -TableName $SourceTable -SchemaName $SourceSchema -DatabaseName $SourceDB -ServerInstance $SourceInstance -Verbose

# Convert to a single DataTable and bulk insert into the destination (dbatools module)
$DataTable | ConvertTo-DbaDataTable | Write-DbaDbTableData -Table $DestinationTable -Schema $DestinationSchema -Database $DestinationDB -SqlInstance $DestinationInstance -Truncate -KeepIdentity -AutoCreateTable -KeepNulls -BatchSize 50000

I’m using the SqlServer and dbatools PowerShell modules to accomplish such a tedious task in the fastest way possible.
The Write-DbaDbTableData cmdlet is pretty neat because it can automatically create the destination table if it doesn’t exist, truncate the table if it does (or append, your choice), keep identity values and nulls if necessary, and everything is done via a bulk insert with a configurable batch size.
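
For example, if you’d rather append to an existing table than truncate it, just drop the -Truncate switch; a minimal variation of the call above:

# Append instead of truncate: omit -Truncate
# (omit -AutoCreateTable too if the copy should fail when the table is missing)
$DataTable | ConvertTo-DbaDataTable | Write-DbaDbTableData -Table $DestinationTable -Schema $DestinationSchema -Database $DestinationDB -SqlInstance $DestinationInstance -KeepIdentity -AutoCreateTable -KeepNulls -BatchSize 50000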

I’m pretty sure the example above can be integrated into a fully automated alignment process, give it a try!
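
As a starting point, here’s a minimal sketch of a multi-table alignment loop, reusing the parameter variables from above; the table list is a hypothetical placeholder:

# Hypothetical list of tables to keep aligned between the two instances
$TablesToCopy = @('Customers', 'Orders', 'OrderLines')

foreach ($Table in $TablesToCopy) {
    # Same copy pipeline as above, once per table
    Read-SqlTableData -TableName $Table -SchemaName $SourceSchema -DatabaseName $SourceDB -ServerInstance $SourceInstance |
        ConvertTo-DbaDataTable |
        Write-DbaDbTableData -Table $Table -Schema $DestinationSchema -Database $DestinationDB -SqlInstance $DestinationInstance -Truncate -KeepIdentity -AutoCreateTable -KeepNulls -BatchSize 50000
}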

Performance Note

I had to include ConvertTo-DbaDataTable in the Write-DbaDbTableData pipeline because otherwise the cmdlet is not able to do a proper bulk insert: instead of getting all the rows from $DataTable and sending them to SQL Server in bulk, it fetches one row at a time from the pipeline and performs a separate bulk insert for each single row.
If you’re having performance issues with Write-DbaDbTableData, keep this piece of information in mind.
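
If you want to measure the difference on your own data, a quick comparison with Measure-Command (same placeholder parameters as above):

# One bulk insert per single row: slow
Measure-Command {
    $DataTable | Write-DbaDbTableData -Table $DestinationTable -Schema $DestinationSchema -Database $DestinationDB -SqlInstance $DestinationInstance -Truncate -KeepIdentity -AutoCreateTable -KeepNulls -BatchSize 50000
}

# One bulk insert per 50000-row batch: fast
Measure-Command {
    $DataTable | ConvertTo-DbaDataTable | Write-DbaDbTableData -Table $DestinationTable -Schema $DestinationSchema -Database $DestinationDB -SqlInstance $DestinationInstance -Truncate -KeepIdentity -AutoCreateTable -KeepNulls -BatchSize 50000
}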


3 comments

  • EMANUELE MEAZZO very nice work!
    However, in working with the same problem, I was able to avoid “ConvertTo-DbaDataTable in the Write-DbaDbTableData pipeline”
    by piping object to object as follows: “Invoke-DbaQuery -As DataTable | Write-DbaDbTableData -InputObject” (see the sketch after the comments)

    • Nice catch, this way you don’t even need the SqlServer module’s cmdlet (even if everybody should have it installed regardless 😀 )
      I used the “mixed approach” because this way I can apply any transformation to the $DataTable while copying it from server to server, as opposed to post-copy (e.g. masking sensitive data), to comply with whatever policy is in place.

    • I use the approach you described too; the only problem with Write-DbaDbTableData is that the table has to exist. It won’t auto-create it, which I wish it did! That is when Emanuele’s solution works best.
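
For reference, here’s a minimal sketch of the dbatools-only approach mentioned in the comments, with an optional in-memory transformation before writing; the SELECT statement and the 'Email' column are hypothetical placeholders, everything else reuses the parameter variables from above:

# dbatools-only variant: Invoke-DbaQuery returns a single DataTable object
$DataTable = Invoke-DbaQuery -SqlInstance $SourceInstance -Database $SourceDB -Query "SELECT * FROM $SourceSchema.$SourceTable" -As DataTable

# Optional: transform the rows in memory before writing, e.g. mask sensitive data
# ('Email' is a hypothetical column name)
foreach ($Row in $DataTable.Rows) { $Row['Email'] = 'masked@example.com' }

# Note: as pointed out above, the destination table must already exist here
Write-DbaDbTableData -InputObject $DataTable -Table $DestinationTable -Schema $DestinationSchema -Database $DestinationDB -SqlInstance $DestinationInstance -Truncate -KeepIdentity -KeepNulls -BatchSize 50000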
