DB -> Scripts compare far, far slower than Scripts -> DB

Compares and synchronizes SQL database content.

Moderators: Chris Auckland, David Atkinson, richardjm, david connell

DB -> Scripts compare far, far slower than Scripts -> DB

Postby TetonSig » Sat Jan 28, 2012 8:41 am

Using SQL Source Control 3 hooked up with Mercurial. Large databases with lots of records. Normally we don't keep any data under source control, so the Data folder in our scripts folder is usually empty.

We have an automated deployment process through TeamCity that does a checkout from Mercurial, and then runs two steps (1) a command line SQL Compare from the scripts folder to the deployment server and then (2) a command line SQL Data Compare from the scripts folder to the deployment server.

This runs fine.

I set up two more steps today to generate rollback scripts, so now we have the two above plus two more: (3) a command line SQL Compare from the deployment server to the scripts folder and (4) a command line SQL Data Compare from the deployment server to the scripts folder.
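For illustration, assuming the build steps shell out to the Red Gate command-line tools with the /Argfile switch (the argfile names and step layout here are hypothetical, not taken from the actual TeamCity configuration), the four steps look roughly like:

```shell
# (1) Schema deploy: scripts folder -> deployment server
sqlcompare /Argfile:SchemaDeploy.xml

# (2) Data deploy: scripts folder -> deployment server
sqldatacompare /Argfile:DataDeploy.xml

# (3) Schema rollback script: deployment server -> scripts folder
sqlcompare /Argfile:SchemaRollback.xml

# (4) Data rollback script: deployment server -> scripts folder (the slow step)
sqldatacompare /Argfile:DataRollback.xml
```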

Step (2) almost always takes very little time (because the Data folder is usually empty). The times we do have data under source control, it runs longer. No problem.

Step (4) so far always gets to "Comparing databases" and then hangs.

I checked the SQL Server, and it is always running a SELECT against a large table with a couple hundred thousand rows.

It appears to be doing things differently because of the order of comparison, but it should check the Data folder in my scr2 scripts folder first and see that nothing (or at most very few tables) is there to compare.

I am calling SQL Data Compare with an arguments file in both cases. Here are the arguments files:

(2) Runs fine

<?xml version="1.0"?>
<commandline>
<server2>servername</server2>
<database2>databasename</database2>
<username2>username</username2>
<password2>password</password2>
<scr1>c:\prod\db\server\db</scr1>
<include>identical</include>
<scriptfile>Report/DataSyncScript.txt</scriptfile>
<force/>
<verbose/>
</commandline>

(4) Won't complete

<?xml version="1.0"?>
<commandline>
<server1>servername</server1>
<database1>databasename</database1>
<username1>username</username1>
<password1>password</password1>
<scr2>c:\prod\db\server\database</scr2>
<include>identical</include>
<scriptfile>Report/DataSyncScript.txt</scriptfile>
<force/>
<verbose/>
</commandline>

Things I have tried:
Doing the same compare through the UI: runs quickly, as expected
Removing <include>identical</include>: no difference, still very slow
Rearranging the XML elements into several different orders: no difference, still very slow

Any help would be greatly appreciated.

Thank you,
Jason Catlett
TetonSig
 
Posts: 21
Joined: Fri Jan 06, 2012 10:45 pm

Postby David Atkinson » Sat Jan 28, 2012 1:20 pm

The second comparison is taking longer because all your tables are being compared.

When the source is a scripts folder SQL Data Compare will only select tables that are defined in the scripts folder. Otherwise its deployment script will try to remove transactional data tables from the target.

I think that to solve your issue, we need to match this behavior when the target is a scripts folder. I can't think of a case in which anyone would want all the tables to be considered here, as that would generate a script that tries to populate a scripts folder with transactional data, which is definitely not a good idea. I take it you probably just want a script that considers the objects in your scripts folder and ignores the others?

I think that your scenario is probably the most typical. If we get requests for all objects to be considered, we can add that as a command line option.

I'll suggest this change to the project team to see what they think.

David Atkinson
Product Manager
Red Gate Software
David Atkinson
 
Posts: 1124
Joined: Mon Dec 05, 2005 4:54 pm
Location: Twitter: @dtabase

Postby TetonSig » Sat Jan 28, 2012 6:07 pm

Great, thanks. I figured that's what was happening. I appreciate you checking to see if the behavior can be changed.

Any other suggestions on what we might do in the interim?

I assume migration scripts would be an option, but we're using Mercurial, so they're not available to us yet (ETA?).

Our current workaround is to have each developer understand that we can't currently generate rollbacks for data. They are responsible for "knowing" when they're pushing data between environments, writing a rollback script for the final release, and attaching it to the "release ticket" with the other autogenerated scripts.

Postby David Atkinson » Sat Jan 28, 2012 9:28 pm

In the meantime can you specify your tables in the <include> section of the xml file? There's an example at http://www.red-gate.com/supportcenter/c ... 0711000189
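As a sketch of what that might look like, here is the step (4) argfile from above with explicit <include> entries added. The table names dbo.Config and dbo.LookupValues are hypothetical, and the Table:\[schema\].\[name\] pattern syntax is my reading of the SQL Data Compare command-line documentation, so verify the escaping against your version:

```xml
<?xml version="1.0"?>
<commandline>
<server1>servername</server1>
<database1>databasename</database1>
<username1>username</username1>
<password1>password</password1>
<scr2>c:\prod\db\server\database</scr2>
<!-- only consider the static data tables kept in the Data folder -->
<include>Table:\[dbo\].\[Config\]</include>
<include>Table:\[dbo\].\[LookupValues\]</include>
<scriptfile>Report/DataSyncScript.txt</scriptfile>
<force/>
<verbose/>
</commandline>
```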

How many static data tables do you have?

Migration scripts only support 'forward' scripts, not 'backward' ones, so they wouldn't solve your problem. We'll soon be evaluating what it would take to support Mercurial, so we can't make a decision until that work is done.

David

Postby TetonSig » Sat Jan 28, 2012 11:34 pm

I actually just finished a little project last night to generate those argument XML files based on a set of tables. (We currently have 16 databases on 4 servers under source control.)

I was only planning on generating them when we had an overall build or other environment changes, but now that you mention it, I suppose I could modify that process to read the names of the files in the Data directory and translate them into include elements.
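That generation step could be sketched roughly as follows. This is a minimal sketch, not the actual project: the element names are copied from the argfiles earlier in the thread, the Table:\[schema\].\[name\] include pattern is my recollection of the command-line syntax, and the assumption that SQL Source Control names data scripts after the table (e.g. "dbo.Config.sql") should be checked against your repository.

```python
import xml.etree.ElementTree as ET
from pathlib import Path

def build_argfile(data_dir, server, database, username, password, out_path):
    """Write a SQL Data Compare argument file whose <include> entries are
    derived from the .sql files found in the scripts folder's Data directory."""
    root = ET.Element("commandline")
    ET.SubElement(root, "server1").text = server
    ET.SubElement(root, "database1").text = database
    ET.SubElement(root, "username1").text = username
    ET.SubElement(root, "password1").text = password
    # scr2 points at the scripts folder itself, one level above Data
    ET.SubElement(root, "scr2").text = str(Path(data_dir).parent)
    for script in sorted(Path(data_dir).glob("*.sql")):
        # Assumed naming convention: data scripts are named after the table,
        # e.g. "dbo.Config.sql" -> include pattern "Table:\[dbo\].\[Config\]"
        name = script.stem
        pattern = "Table:\\[" + name.replace(".", "\\].\\[") + "\\]"
        ET.SubElement(root, "include").text = pattern
    ET.SubElement(root, "scriptfile").text = "Report/DataSyncScript.txt"
    ET.SubElement(root, "force")
    ET.SubElement(root, "verbose")
    ET.ElementTree(root).write(out_path, encoding="utf-8", xml_declaration=True)
```

Running this once per database as part of the build would keep each argfile's include list in sync with whatever data scripts are actually under source control.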

Very good. Thanks for the idea.

Postby TetonSig » Sun Jan 29, 2012 2:06 am

Implemented your idea and we're back to normal run times. Thanks.

