hgwebdir.cgi
author:      Vadim Gelfer <vadim.gelfer@gmail.com>
date:        Thu, 15 Jun 2006 12:55:58 -0700
changeset:   2434:a2df85adface
parent:      1829:b0f6af327fd4
child:       2506:d0db3462d568
permissions: -rw-r--r--
http server: support persistent connections. Only "hg serve" is affected so far; an HTTP server running the CGI script will not use persistent connections (FastCGI support would help there). Clients that support keepalive can use a single TCP connection for all commands during a clone or pull, which greatly lowers the latency of the binary search during a pull over a WAN. If the server does not know the content-length, it forces the connection to close at the end of the response; the right fix is chunked transfer-encoding, but closing is easier and does not hurt performance, since the only command affected is "changegroup", which is always the last command during a pull.
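
The close-on-unknown-length behaviour described above can be sketched roughly as follows (an illustrative sketch only, not Mercurial's actual code; the function name and parameters are invented for this example):

    def response_headers(content_length=None, client_wants_keepalive=True):
        # Keep the connection open only when the body length is known up
        # front, so the client can tell where the response ends.
        headers = []
        if content_length is not None and client_wants_keepalive:
            headers.append(('Content-Length', str(content_length)))
            headers.append(('Connection', 'keep-alive'))
        else:
            # Length unknown (e.g. a streamed changegroup): without chunked
            # transfer-encoding, the only way to mark the end of the body is
            # to close the connection.
            headers.append(('Connection', 'close'))
        return headers

During a pull, every response except the final "changegroup" has a known length, so only the last one forces a close.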

#!/usr/bin/env python
#
# An example CGI script to export multiple hgweb repos; edit as necessary

import cgitb, sys
cgitb.enable()
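# cgitb.enable() makes uncaught Python tracebacks show up in the browser
# rather than only in the web server's error log, which helps while
# getting the script configured.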

# sys.path.insert(0, "/path/to/python/lib") # if not a system-wide install
from mercurial import hgweb

# The config file looks like this.  You can have paths to individual
# repos, collections of repos in a directory tree, or both.
#
# [paths]
# virtual/path1 = /real/path1
# virtual/path2 = /real/path2
#
# [collections]
# /prefix/to/strip/off = /root/of/tree/full/of/repos
#
# collections example: say directory tree /foo contains repos /foo/bar
# and /foo/quux/baz.  Given this config section:
#   [collections]
#   /foo = /foo
# the repos will be listed as bar and quux/baz.

# Alternatively, you can pass a list of ('virtual/path', '/real/path')
# tuples, or a dictionary with entries like 'virtual/path': '/real/path'.
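# For example (the paths below are placeholders, not real repositories):
#
#   h = hgweb.hgwebdir([('virtual/path', '/real/path')])
#   h = hgweb.hgwebdir({'virtual/path': '/real/path'})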

h = hgweb.hgwebdir("hgweb.config")
h.run()