I've managed to get the CVSNT server up and running on a Windows 2000 server. As I understand it, the normal scenario with CVS is to have a central repository from which developers check out files, edit them on their own machines, and check them back in. This gets more complex when you're working with web technologies, because the files need to be accessible to a web server at edit time. That means each developer needs a local web server running while editing and testing files prior to committing them back to the repository.

All fine so far, but consider this: my team is developing web applications in ASP, and every application consists of at least two and sometimes three websites. The files (even checked-out files) have to be stored on a server rather than a workstation so that they get backed up each night. This rules out using IIS on the workstations anyway: unless you are running Win2k Server you are restricted to one site per machine (and 10 connections), and we need 2 or 3 sites per ongoing project. So we're going to have to check files out into folders on a server instead of onto the workstations, which is fine.

I think this is the bit that's bothering me: for every site in every project, every developer needs

a) an IIS website
b) a DNS record to access the site
c) a 'working' folder on a server somewhere for the files.

Now say there are 3 sites for one project and a developer has two projects on the go; that's 6 websites just for that developer. For a team of 5, that's 30 websites, 30 DNS entries and 30 working folders. Which means I also need:

d) a server admin to look after it all.

Unless I'm missing a better way of working, that is. Any suggestions?

drew
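P.S. To make the working-folder part concrete, the checkout each developer would do onto the server share would look roughly like this (I'm assuming sspi authentication; the server, share, repository path and module names below are just placeholders):

    rem map the developer's working folder on the file server
    net use W: \\fileserver\working\drew
    W:

    rem check out each of the project's sites from the CVSNT repository
    cvs -d :sspi:cvsserver:/cvsrepo checkout projectA-www
    cvs -d :sspi:cvsserver:/cvsrepo checkout projectA-admin

    rem edit/test against the IIS site whose home directory points at
    rem W:\projectA-www, then commit the changes back
    cd projectA-www
    cvs commit -m "updated the login page" default.asp

The corresponding IIS website and DNS record would then just point at that folder on the server, which is exactly the per-site, per-developer overhead I'm trying to avoid.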