On Mon, 4 Aug 2003 06:28:42 -0400, "Chavous P. Camp" <ccamp at scconsultants.net> wrote:

>The problem is it doesn't go away after time. They turn into energizer
>bunny processes. And, really, we aren't committing that large of files.
>They are all text files (source code) less than 100k. Doing a quick
>sample, the largest ,v file I see is 54k. Okay, wait a sec. Some folks
>uploaded some binary files running around 100-150k, but still - nothing
>in the megabyte range. Our entire repository is only 12.5M and only has
>1800 files, for an average of somewhere around 7k per file, if I'm
>calculating correctly.

That shouldn't be a problem... you can get away with a few MB before
you'll notice any load problem, even on a smallish machine. It's 10s and
100s of MB that start to scale badly.

>The other thing is, the client has LONG disconnected while the processor
>is still spiraling...

The only other thing I can think of is an old sserver problem that was
fixed a few versions ago. The current devel version is even more
bulletproof, so I can merge those changes into the release if it's still
causing a problem.

>At first I thought it was the atomic commits, but I turned those off
>long before posting to the list. THEY caused other problems. :)

Atomic commits don't work particularly well... They'll not be needed
soon, though (hopefully).

Tony
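[Editor's note: the "quick sample" above (12.5M over roughly 1800 files, about 7k each) is easy to reproduce with a short script. The sketch below is illustrative only and is not part of the original thread; the repo_stats name and the /var/lib/cvs path are assumptions. It just walks a repository copy and reports ,v file count, total size, average size, and the largest file.]

    import os

    def repo_stats(repo_root):
        """Walk a CVS repository and report ,v file count, total size,
        average size, and the largest file."""
        sizes = []
        for dirpath, _dirnames, filenames in os.walk(repo_root):
            for name in filenames:
                if name.endswith(",v"):
                    path = os.path.join(dirpath, name)
                    sizes.append((os.path.getsize(path), path))
        if not sizes:
            print("no ,v files found under %s" % repo_root)
            return
        total = sum(size for size, _ in sizes)
        largest_size, largest_path = max(sizes)
        print("files:   %d" % len(sizes))
        print("total:   %.1f MB" % (total / (1024.0 * 1024.0)))
        print("average: %.1f KB" % (total / 1024.0 / len(sizes)))
        print("largest: %.1f KB  %s" % (largest_size / 1024.0, largest_path))

    if __name__ == "__main__":
        repo_stats("/var/lib/cvs")  # hypothetical repository path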