Unfortunately SQL Server Management Studio is unable to run such large scripts. The way around this is to switch off transactions whilst doing the SQL Data Compare synchronization and obtain the script in the usual way. You then have to parse the script in an application, written in, say, Visual Basic or Perl, that partitions the SQL script into smaller chunks and sends the inserts to SQL Server in batches. As you will not have any transactional integrity, you should take a backup before you attempt to run the application.
There isn't a facility in SQL Data Compare to do this at the moment, but I believe it has been suggested as a design request (SDC-799).
Other workarounds might be to use a WHERE clause to restrict the amount of data being synchronized, or to select fewer objects and run several smaller synchronizations.
A VBScript to do this might look like the following; it injects a GO statement every 100 lines:
Const ForReading = 1, ForWriting = 2
Dim fso, fin, fout, i, lineinterval, stuff
lineinterval = 100
i = 0
Set fso = CreateObject("Scripting.FileSystemObject")
Set fin = fso.OpenTextFile("input.sql", ForReading, False)
' Create the output file, overwriting it if it already exists
Set fout = fso.OpenTextFile("output.sql", ForWriting, True)
fout.WriteLine "--Start of script"
Do While Not fin.AtEndOfStream
    ' Start a new batch once lineinterval lines have been copied
    If i = lineinterval Then
        fout.WriteLine "GO"
        i = 0
    End If
    stuff = fin.ReadLine()
    fout.WriteLine stuff
    i = i + 1
Loop
fout.WriteLine "GO"
fout.WriteLine "--End of script"
fin.Close
fout.Close
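One caveat with splitting purely by line count: if an INSERT statement spans several lines, a GO can land in the middle of it and break the batch. If you would rather prototype the parser in another language, here is a sketch of a slightly safer splitter in Python — the `batch_size` value and the "statement ends with a semicolon" heuristic are my own assumptions for illustration, not anything SQL Data Compare guarantees about its scripts:

```python
def split_script(lines, batch_size=100):
    """Yield batches of script lines, closing a batch only after a line
    that ends a statement (here assumed to be one ending in ';'),
    so a multi-line INSERT is never cut in half."""
    batch = []
    for line in lines:
        batch.append(line)
        # Close the batch once it is big enough AND the statement is complete
        if len(batch) >= batch_size and line.rstrip().endswith(";"):
            yield batch
            batch = []
    if batch:
        # Flush any trailing partial batch
        yield batch

if __name__ == "__main__":
    # 150 two-line INSERTs = 300 lines in total
    script = ["INSERT INTO t (a, b)", "VALUES (1, 2);"] * 150
    for n, batch in enumerate(split_script(script, batch_size=100), start=1):
        print("-- batch", n, "holds", len(batch), "lines")
```

Each yielded batch can then be written to the output file followed by a GO line, exactly as the VBScript above does, but without the risk of splitting a statement.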