view hgwebdir.cgi @ 2079:ee96ca273f32

New lazy index code for revlogs.

This tunes for large repositories. It does not read the whole index file in one big chunk, but tries to buffer reads in more reasonable chunks instead.

Search speeds are improved in two ways. When trying to find a specific sha hash, it searches from the end of the file backward, since more recent entries are more likely to be relevant, especially the tip. Also, it can load only the mapping of nodes to revlog index numbers; loading the map uses less cpu (no struct.unpack) and much less memory than loading both the map and the index.

This cuts down the time for hg tip on the 80,000-changeset kernel repo from 3.69s to 1.8s. Most commands that pull a single rev out of a big index get roughly the same benefit, and commands that read the whole index are no slower.
author mason@suse.com
date Tue, 04 Apr 2006 16:47:12 -0400
parents b0f6af327fd4
children d0db3462d568
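As a rough sketch of the idea described in the message above (not the actual lazyindex code in revlog.py; the entry size, node offset, and function names below are invented for illustration), a backward, chunk-buffered search and a map-only load over a hypothetical fixed-size index might look like this:

ENTRY_SIZE = 76     # hypothetical fixed index-entry size, not the real revlog layout
NODE_OFFSET = 56    # hypothetical offset of the 20-byte node hash inside an entry
CHUNK = 64 * 1024   # buffer reads in modest chunks instead of slurping the whole file

def find_rev(indexfile, node):
    """Search the index backward in chunks: recent entries (the tip)
    are hit first, and a match avoids reading the rest of the file."""
    f = open(indexfile, 'rb')
    try:
        f.seek(0, 2)
        pos = f.tell()
        while pos > 0:
            start = max(0, pos - CHUNK)
            start -= start % ENTRY_SIZE          # align to an entry boundary
            f.seek(start)
            data = f.read(pos - start)
            # walk the entries in this chunk from last to first
            for i in range(len(data) - ENTRY_SIZE, -1, -ENTRY_SIZE):
                if data[i + NODE_OFFSET:i + NODE_OFFSET + 20] == node:
                    return (start + i) // ENTRY_SIZE
            pos = start
        return None
    finally:
        f.close()

def load_nodemap(indexfile):
    """Build only the node -> rev mapping, slicing out the hash bytes
    instead of unpacking full entries with struct.unpack."""
    nodemap = {}
    f = open(indexfile, 'rb')
    try:
        rev = 0
        while True:
            data = f.read(CHUNK - CHUNK % ENTRY_SIZE)
            if not data:
                break
            for i in range(0, len(data), ENTRY_SIZE):
                nodemap[data[i + NODE_OFFSET:i + NODE_OFFSET + 20]] = rev
                rev += 1
        return nodemap
    finally:
        f.close()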
line source

#!/usr/bin/env python
#
# An example CGI script to export multiple hgweb repos, edit as necessary

import cgitb, sys
cgitb.enable()

# sys.path.insert(0, "/path/to/python/lib") # if not a system-wide install
from mercurial import hgweb

# The config file looks like this.  You can have paths to individual
# repos, collections of repos in a directory tree, or both.
#
# [paths]
# virtual/path = /real/path
# virtual/path = /real/path
#
# [collections]
# /prefix/to/strip/off = /root/of/tree/full/of/repos
#
# collections example: say directory tree /foo contains repos /foo/bar,
# /foo/quux/baz.  Give this config section:
#   [collections]
#   /foo = /foo
# Then repos will list as bar and quux/baz.

# Alternatively you can pass a list of ('virtual/path', '/real/path') tuples
# or use a dictionary with entries like 'virtual/path': '/real/path'
# (see the example after this script)

h = hgweb.hgwebdir("hgweb.config")
h.run()
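For example, a hypothetical variant of the script above that skips the config file and passes a list of ('virtual/path', '/real/path') tuples directly, as mentioned in the comment; the paths shown are placeholders:

repos = [('projects/alpha', '/srv/hg/alpha'),
         ('projects/beta', '/srv/hg/beta')]
h = hgweb.hgwebdir(repos)
h.run()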