Execute exported .sql scripts generated by SQL Data compare?

Compares and synchronizes SQL database content.

Moderators: Chris Auckland, David Atkinson, richardjm, david connell


Postby crutch » Mon Nov 24, 2008 4:43 pm

I'm trying to merge two databases using the scripts generated by SQL Data Compare, but I've run up against a wall and I'm not sure how to get around it. Basically, the two databases are part of a replication system, and a series of unique ID columns causes the synchronization wizard to error out.

Exporting the results to a .sql script works great, but the script ends up being over 500 MB in size. I then take this script and manually modify the IDs that are causing issues, basically using a search and replace.

The problem I'm having is executing the script after the changes are made. SSMS can't handle anything that large, and SQLCMD gives me "Sqlcmd: Error: Scripting error." whenever I try to use the .sql script as input. At first I thought it was related to my search and replace on the IDs, but if I take the file exported by SQL Data Compare and use it as the input file unmodified, I get the same error, so the problem is obviously somewhere else. Is there an easier/better way to execute these scripts?
crutch
 
Posts: 1
Joined: Mon Nov 24, 2008 4:32 pm

Postby Chris Auckland » Mon Dec 01, 2008 3:17 pm

Unfortunately, SQL Server Management Studio is unable to run such large scripts. The way around this is to switch off transactions in SQL Data Compare and generate the script in the usual way. You then parse the script in a small application, written in, say, Visual Basic or Perl, that partitions the SQL script into smaller chunks and sends the inserts to SQL Server in batches. As you will not have any transactional integrity, you should take a backup before you run the application.

Unfortunately, there isn't a facility in SQL Data Compare to do this at the moment, but I believe it has been logged as a design request (SDC-799).

Other workarounds would be to use a WHERE clause to restrict the amount of data being synchronized, or to select fewer objects per synchronization and run several smaller syncs.
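For example (the column name here is hypothetical — use whatever column distinguishes the rows you need), a WHERE clause like this in the project's table options would limit the comparison to recent rows only:

```sql
-- Hypothetical filter: only compare rows modified this year
ModifiedDate >= '20080101'
```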

A VBScript might look like this; it injects a GO every 100 lines:

Code:
Option Explicit

ReadLineTextFile()

Function ReadLineTextFile
    Const ForReading = 1, ForWriting = 2
    Dim fso, fin, fout, i, lineinterval, stuff
    lineinterval = 100

    i = 0
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set fin = fso.OpenTextFile("input.sql", ForReading, False)
    ' Open the output file, creating it if it doesn't exist.
    Set fout = fso.OpenTextFile("output.sql", ForWriting, True)

    fout.WriteLine "--Start of script"

    Do While Not fin.AtEndOfStream
        If i = lineinterval Then
            ' Inject a batch separator every "lineinterval" lines.
            fout.WriteLine "GO"
            fout.WriteLine ""
            i = 0
        Else
            stuff = fin.ReadLine()
            fout.WriteLine stuff
            i = i + 1
        End If
    Loop

    fout.WriteLine "GO"
    fout.WriteLine "--End of script"

    fin.Close
    fout.Close
End Function
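If VBScript isn't convenient, the same chunking idea is easy to express in another language. Here's a rough Python equivalent (file names are placeholders; adjust to taste):

```python
def split_sql_script(lines, interval=100):
    """Insert a GO batch separator every `interval` lines of a SQL
    script, mirroring the VBScript above."""
    out = ["--Start of script"]
    count = 0
    for line in lines:
        if count == interval:
            # Inject a batch separator before continuing.
            out.append("GO")
            out.append("")
            count = 0
        out.append(line.rstrip("\n"))
        count += 1
    out.append("GO")
    out.append("--End of script")
    return out

# Usage: read input.sql, write output.sql (placeholder file names)
# with open("input.sql") as fin, open("output.sql", "w") as fout:
#     fout.write("\n".join(split_sql_script(fin)) + "\n")
```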
Chris
Chris Auckland
 
Posts: 757
Joined: Tue Oct 24, 2006 2:12 pm
Location: Red Gate Software Ltd.

