I am using the most recent version of the WANdisco Subversion client to manage an SVN server. I have an 11.8 GB repository that I have tried to mirror with svnsync. I am on Windows; both the source and the mirror are on the same local drive.
I started this process last night before I left and saw that it ran fine for nearly 4 hours -- I could see the destination growing past 6 GB. I came in this morning to find it had failed with an out-of-memory error, somewhere near revision 20,000.
So I restarted it today and have been watching memory. The destination repository has grown past 6.5 GB. At the same time, the memory footprint on my Windows 7 64-bit machine with 4 GB of physical memory has also grown: svnsync is now consuming 1,628,292 KB of memory, and it just keeps growing. As it approaches revision 20,000, available physical memory is shrinking toward zero.
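For reference, this is roughly how the mirror was set up; the paths here are hypothetical, not the ones from my machine, and note that svnsync refuses to touch the destination until a pre-revprop-change hook exists there:

```shell
# Create the destination repository and allow revprop changes on it
# (svnsync stores its bookkeeping in revision properties).
svnadmin create C:/Repos/MyMirror
echo exit 0 > C:/Repos/MyMirror/hooks/pre-revprop-change.bat

# Register the source with the mirror, then start copying revisions.
svnsync initialize file:///C:/Repos/MyMirror file:///C:/Repos/MyRepository
svnsync sync file:///C:/Repos/MyMirror
```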
Any help, updates, or workarounds for this?
Yep, it failed on the second try. It ran for about 4 hours and 45 minutes.
In the Windows shell it shows:
Copied properties for revision 20157
Transmitting file data. Out of memory - terminating application
Looking at the source repository: 11.8 GB, 58,317 files, 4,110 folders
Looking at the sync repository: 7.08 GB, 40,366 files, 51 folders
51 folders seems strange, but who knows... perhaps that number would rise to 4,110 if I were able to run to completion.
So, doubling my 4 GB of RAM to 8 GB would probably solve the problem in the short term.
But this sure smells like a memory leak to me.
It might be related to issue 3593: http://subversion.tigris.org/issues/...ug.cgi?id=3593
If the problem is a memory leak that accumulates over many revisions, then you can work around it simply by starting a new sync process on the partial copy. You may need to manually delete the svn:sync-lock revision property first.
Continuing the sync is a viable workaround. I did have to delete the svn:sync-lock revision property, which I did with the command: svn propdel svn:sync-lock --revprop -r 0 file:///MyRepository
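For anyone else hitting this, the restart-until-done workaround can be scripted. This is a minimal sketch (the resync_loop name and the example URL are my own, not from the thread); it retries svnsync until it exits cleanly, clearing the stale lock each time a run dies:

```shell
# Retry "svnsync sync" until it succeeds. Each restart begins at the
# first unsynced revision, so memory leaked by the previous run is
# released when that process exits.
resync_loop() {
    dst="$1"
    until svnsync sync "$dst"; do
        echo "svnsync exited abnormally; clearing lock and retrying..." >&2
        # A crashed run leaves svn:sync-lock set on revision 0.
        svn propdel svn:sync-lock --revprop -r 0 "$dst"
    done
}

# Example (adjust the URL to your mirror):
# resync_loop file:///C:/Repos/MyMirror
```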
Thanks for the work around.