------------------------------------------------------------
A probabilistic model and algorithms for evolution of genes by duplications.
Jens Lagergren (invited talk)
The Royal Institute of Technology, Stockholm
Comparative studies of genomes identify functionally
important genomic elements, elucidate evolutionary
mechanisms, and make it possible to translate knowledge
of one organism to another, e.g. from a model organism to
human. The divergence time for a pair of species or,
more generally, the species tree (an edge-weighted tree
describing the evolutionary history of a set of species)
is highly relevant in comparative studies. Gene duplication
is a type of mutation believed to be important in the
creation of genes with new functions; it is also central in the
definition of ``same function'' (orthology).
We will describe a probabilistic model for gene duplications,
where a gene tree, describing the evolution of a gene family,
evolves inside a species tree. A reconciliation is a mathematical
description of such an evolution. Given a gene tree and a species tree,
the likelihood of a reconciliation is the probability that it is the
correct reconciliation. It turns out that the likelihood can be computed
by dynamic programming. Markov Chain Monte Carlo techniques are then
used to compute the posterior probability distribution for
reconciliations.
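As background for the model above (and separate from its probabilistic likelihood computation), the classical parsimony reconciliation maps each gene-tree node to the lowest common ancestor (LCA) of the species below it; a minimal Python sketch, with illustrative species names and tree shapes:

```python
# Sketch of the classical LCA (parsimony) reconciliation of a gene tree
# into a species tree; all names and tree shapes are illustrative.

SPECIES_PARENT = {"human": "HM", "mouse": "HM", "HM": "root", "frog": "root"}

def ancestors(s):
    """Chain from species-tree node s up to the root (inclusive)."""
    chain = [s]
    while s in SPECIES_PARENT:
        s = SPECIES_PARENT[s]
        chain.append(s)
    return chain

def lca(a, b):
    """Lowest common ancestor of two species-tree nodes."""
    anc_a = ancestors(a)
    for s in ancestors(b):
        if s in anc_a:
            return s

# Gene tree as nested (left, right) tuples; leaves carry their species.
gene_tree = (("human", "mouse"), ("human", "frog"))

def reconcile(node):
    """Map each gene-tree node to a species-tree node; report duplications."""
    if isinstance(node, str):
        return node, []
    (ls, le), (rs, re) = reconcile(node[0]), reconcile(node[1])
    m = lca(ls, rs)
    events = le + re
    # A duplication is forced when a child maps to the same species node.
    if m in (ls, rs):
        events.append(("duplication", m))
    return m, events

mapping, events = reconcile(gene_tree)
print(mapping, events)
```

Here the second copy of the human gene groups with frog, so the most parsimonious explanation is a duplication at the root.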
------------------------------------------------------------
Elementary bounds on Poincar\'e and log-Sobolev constants
for decomposable Markov chains
Mark Jerrum
(joint work with Jung-Bae Son, Prasad Tetali and Eric Vigoda)
Edinburgh
Abstract: We consider finite-state Markov chains that can be naturally
decomposed
into smaller ``projection'' and ``restriction'' chains. This
decomposition may be inductive, in that the restriction chains
are smaller copies of the initial chain. We provide expressions for
Poincar\'e (respectively, log-Sobolev) constants of the initial
Markov chain in terms of Poincar\'e (respectively, log-Sobolev)
constants of the projection and restriction chains, together
with a further parameter. In the case of the Poincar\'e constant,
our bound is always at least as good
as existing ones, and, depending on the value of the extra parameter,
may be much better. There appears to be no previously published
decomposition result for the log-Sobolev constant.
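For reference, the two constants in question are the standard ones (a reminder of definitions, not part of the new results):

```latex
% For a reversible chain with transition matrix P, stationary
% distribution \pi, and Dirichlet form
%   \mathcal{E}(f,f) = \tfrac12 \sum_{x,y} \pi(x) P(x,y) (f(x)-f(y))^2,
\lambda \;=\; \inf_{f \ \text{nonconstant}}
    \frac{\mathcal{E}(f,f)}{\operatorname{Var}_\pi(f)},
\qquad
\rho \;=\; \inf_{\operatorname{Ent}_\pi(f^2) \neq 0}
    \frac{\mathcal{E}(f,f)}{\operatorname{Ent}_\pi(f^2)},
% where \operatorname{Ent}_\pi(g) = \mathrm{E}_\pi[g \log(g / \mathrm{E}_\pi g)].
```

Larger constants correspond to faster mixing; the results above bound them for the composed chain in terms of those of the pieces.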
------------------------------------------------------------
Sampling colorings of the grid
Leslie Ann Goldberg
(joint work with Russ Martin and Mike Paterson)
Edinburgh
In this talk, we consider the problem of uniformly sampling proper
q-colorings of the 2-dimensional Cartesian lattice Z^2. In fact, we
will be interested in sampling colorings of finite rectangular
regions. The ``Glauber dynamics'' Markov chain is a simple Markov
chain whose stationary distribution is uniform on q-colorings. In
this talk, we discuss what is known about the mixing properties of the
chain.
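For concreteness, one step of the Glauber dynamics on a small rectangular region can be sketched as follows (the region size, q, and step count are illustrative):

```python
import random

def glauber_step(coloring, neighbors, q):
    """One step of Glauber dynamics: pick a vertex uniformly at random and
    recolor it uniformly among the colors unused by its neighbors."""
    v = random.choice(list(coloring))
    banned = {coloring[u] for u in neighbors[v]}
    allowed = [c for c in range(q) if c not in banned]
    if allowed:  # with q >= 5 on Z^2 (degree <= 4) this is never empty
        coloring[v] = random.choice(allowed)

# A 3x3 region of Z^2 with q = 5 colors (illustrative parameters).
w, h, q = 3, 3, 5
verts = {(x, y) for x in range(w) for y in range(h)}
neighbors = {(x, y): [(x + dx, y + dy)
                      for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                      if (x + dx, y + dy) in verts]
             for (x, y) in verts}
# Start from a proper coloring and run the chain; each step keeps it proper.
coloring = {(x, y): (x + 2 * y) % q for (x, y) in verts}
for _ in range(1000):
    glauber_step(coloring, neighbors, q)
```

The chain's stationary distribution is uniform over proper q-colorings of the region; how fast it mixes is exactly the question the talk addresses.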
------------------------------------------------------------
Approximating Random Instances of MAX-3SAT
Marek Karpinski
(joint work with W. Fernandez de la Vega)
University of Bonn
Abstract: We prove that MAX-3SAT can be approximated
in polynomial time to within a factor below 9/8 on
random instances. Some intriguing open problems and
connections to other optimization problems are also
discussed.
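For context, the trivial baseline that the factor below 9/8 improves upon is a uniformly random assignment, which satisfies each 3-clause on distinct variables with probability 7/8 (an 8/7-approximation in expectation); a quick exhaustive check:

```python
from itertools import product

def satisfies(clause, assignment):
    """clause: tuple of signed variable indices, e.g. (1, -2, 3)."""
    return any((assignment[abs(l)] if l > 0 else not assignment[abs(l)])
               for l in clause)

# Over all 2^3 assignments to a 3-clause on distinct variables,
# exactly 7 of 8 satisfy it.
clause = (1, -2, 3)
count = sum(satisfies(clause, {1: a, 2: b, 3: c})
            for a, b, c in product([False, True], repeat=3))
print(count)  # 7
```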
------------------------------------------------------------
Approximation Schemes for Metric Minimum Bisection and Partitioning
W. Fernandez de la Vega
(joint work with Marek Karpinski and Claire Kenyon)
CNRS, Orsay
Abstract: We design polynomial time approximation schemes
(PTASs) for Metric MIN-BISECTION, i.e., the problem
of dividing a given finite metric space into
two halves so as to minimize the sum of distances across the cut.
The method extends to partitioning problems with arbitrary size
constraints.
The main ingredients in our algorithms are a lower bound for
the minimum bisection, the use of biased sampling, and the
now well-known device of ``exhaustive sampling''. As a by-product
we reduce by a logarithmic factor
the sample size in the approximate solution of dense
MAX-CUT by linearized quadratic programs.
-----------------------------------------------------------
On the Non-Approximability of the Tutte Polynomial
Dominic Welsh
Oxford
The Tutte polynomial of a graph or matroid contains as specializations
the following well known invariants:
a) The Jones polynomial of an alternating knot.
b) The weight enumerator of a linear code over any finite field.
c) The characteristic polynomial of a hyperplane arrangement.
d) The chromatic and flow polynomials of a graph.
e) The Ehrhart polynomial of a unimodular zonotope.
In 1990, with F. Jaeger and D.L. Vertigan, we showed that unless
#P=P (which is most unlikely) there is no polynomial time
evaluation algorithm except at 8 special points and along one special
curve of the plane. In this talk I shall first briefly describe the above
applications and the current state of knowledge on
the much more difficult problem of how well the above
functions can be approximated. In particular I shall present
new results with D. L. Vertigan showing that unless NP=RP
a large part of the plane admits no fully polynomial randomized
approximation scheme.
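The polynomial itself is defined by deletion-contraction: T(G) = T(G-e) + T(G/e) for an ordinary edge e, T = x*T(G/e) for a bridge, and T = y*T(G-e) for a loop. A minimal (exponential-time, as the hardness results above lead one to expect) sketch for small multigraphs:

```python
from collections import Counter

def tutte(edges):
    """Tutte polynomial of a multigraph given as a list of (u, v) edges,
    via deletion-contraction; returns {(i, j): coeff} for x^i y^j."""
    if not edges:
        return {(0, 0): 1}
    (u, v), rest = edges[0], edges[1:]
    if u == v:                                   # loop: T = y * T(G - e)
        return {(i, j + 1): c for (i, j), c in tutte(rest).items()}
    if is_bridge(edges, u, v):                   # bridge: T = x * T(G / e)
        return {(i + 1, j): c
                for (i, j), c in tutte(contract(rest, u, v)).items()}
    result = Counter(tutte(rest))                # ordinary edge:
    for ij, c in tutte(contract(rest, u, v)).items():
        result[ij] += c                          # T = T(G - e) + T(G / e)
    return dict(result)

def contract(edges, u, v):
    """Merge v into u (contracting edge uv) in the remaining edge list."""
    repl = lambda w: u if w == v else w
    return [(repl(a), repl(b)) for a, b in edges]

def is_bridge(edges, u, v):
    """True if deleting the first edge (u, v) disconnects u from v."""
    rest = edges[1:]
    reach, frontier = {u}, [u]
    while frontier:
        w = frontier.pop()
        for a, b in rest:
            for x, y in ((a, b), (b, a)):
                if x == w and y not in reach:
                    reach.add(y)
                    frontier.append(y)
    return v not in reach

# The triangle K3: T(x, y) = x^2 + x + y.
print(tutte([(0, 1), (1, 2), (0, 2)]))
```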
-----------------------------------------------------------
Approximating Longest Directed Path
Andreas Bj\"orklund
(joint work with Thore Husfeldt and Sanjeev Khanna)
Abstract:
We investigate the hardness of approximating the longest path
and
the longest cycle in directed graphs on $n$ vertices. We show that
neither of these two problems can be approximated in polynomial time
within $n^{1-\epsilon}$ for any $\epsilon>0$ unless
P=NP. In particular, the result holds for
digraphs of constant bounded outdegree that
contain a Hamiltonian cycle. Assuming the stronger complexity
conjecture that Satisfiability cannot be solved in subexponential
time, we show that there is no polynomial time algorithm that always
finds a path of length $\Omega(\log^{2+\epsilon} n)$, or a cycle of
length $\Omega(\log^{1+\epsilon} n)$, for any constant $\epsilon>0$
in these graphs. In contrast we show that there is a polynomial time
algorithm always finding a path of length $\Omega(\log^2 n/\log\log
n)$ in these graphs. This separates the approximation hardness of
Longest Path and Longest Cycle in this class of graphs. Furthermore,
we present a polynomial time algorithm that finds paths of length
$\Omega(n)$ in most digraphs of constant bounded outdegree.
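To fix definitions, the quantity being approximated is the number of edges on a longest simple directed path; a brute-force (exponential-time) sketch on a toy digraph, in contrast to the polynomial-time algorithms of the talk:

```python
def longest_path(adj):
    """Number of edges on a longest simple directed path in a small
    digraph given as an adjacency dict; exhaustive DFS, exponential time."""
    best = 0
    def dfs(v, visited, length):
        nonlocal best
        best = max(best, length)
        for w in adj.get(v, ()):
            if w not in visited:
                dfs(w, visited | {w}, length + 1)
    for v in adj:
        dfs(v, {v}, 0)
    return best

# A directed 5-cycle plus the chord 2 -> 0 (illustrative):
# the longest simple path, e.g. 0,1,2,3,4, has 4 edges.
adj = {0: [1], 1: [2], 2: [3, 0], 3: [4], 4: [0]}
print(longest_path(adj))  # 4
```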
------------------------------------------------------------
Uniform Hashing in Constant Time and Linear Space (To appear
at STOC 2003)
Rasmus Pagh (joint work with Anna Östlin)
ITU Copenhagen
Abstract:
Many algorithms and data structures employing hashing have been
analyzed under the uniform hashing assumption, i.e., the assumption
that hash functions behave like truly random functions. Starting with
the discovery of universal hash functions, many researchers have
studied to what extent this theoretical ideal can be realized by hash
functions that do not take up too much space and can be evaluated
quickly. In this paper we present an almost ideal solution to this
problem: A hash function that, on any set of n inputs, behaves like a
truly random function with high probability, can be evaluated in
constant time on a RAM, and can be stored in O(n) words, which is
optimal. For many hashing schemes this is the first hash function that
makes their uniform hashing analysis come true, with high probability,
without incurring overhead in time or space.
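For contrast with the uniform hashing ideal, the classical 2-universal (Carter-Wegman) family mentioned as the starting point above fits in a few lines (the prime and range below are illustrative):

```python
import random

def make_universal_hash(m, p=(1 << 61) - 1):
    """Draw h(x) = ((a*x + b) mod p) mod m from the Carter-Wegman
    2-universal family; p must exceed all keys (here a Mersenne prime)."""
    a = random.randrange(1, p)
    b = random.randrange(0, p)
    return lambda x: ((a * x + b) % p) % m

h = make_universal_hash(100)
buckets = [h(x) for x in range(1000)]
```

Such a function guarantees only pairwise independence of hash values, which is exactly the gap between universal and uniform hashing that the construction above closes.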
-----------------------------------------------------------
Novel Architectures for P2P Applications: the Continuous-Discrete Approach
Moni Naor
(joint work with Udi Wieder)
Weizmann Institute
We propose a new approach for constructing P2P networks based on a
dynamic decomposition of a continuous space into cells
corresponding to processors. We demonstrate the power of these
design rules by suggesting two new architectures, one for DHT
(Distributed Hash Table) and the other for dynamic expander
networks. The DHT network, which we call Distance Halving,
allows logarithmic routing and load, while preserving constant
degrees. Our second construction builds a network that is
guaranteed to be an expander.
The resulting topologies are simple to maintain and implement.
Their simplicity makes it easy to modify and add protocols.
We present a provably good protocol for relieving hot
spots and a construction with high fault tolerance.
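As a rough illustration of the continuous-discrete approach (the equal-cell decomposition and parameters are simplifying assumptions), the Distance Halving continuous graph puts edges x -> x/2 and x -> x/2 + 1/2 on [0,1); discretizing into n equal cells yields a constant-degree network:

```python
from fractions import Fraction as F

def cell_neighbors(i, n):
    """Cells of [0,1) (split into n equal cells) whose segment overlaps
    the image of cell i under the two halving maps."""
    lo, hi = F(i, n), F(i + 1, n)
    out = set()
    for shift in (F(0), F(1, 2)):                 # left edge, right edge
        a, b = lo / 2 + shift, hi / 2 + shift     # image segment [a, b)
        for j in range(n):
            if max(a, F(j, n)) < min(b, F(j + 1, n)):
                out.add(j)
    return sorted(out)

n = 8
graph = {i: cell_neighbors(i, n) for i in range(n)}
print(graph[0])  # [0, 4]
```

Each image segment has length 1/(2n), so it overlaps at most two cells per map, giving the constant degree mentioned above.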
-----------------------------------------------------------
Hidden Translation and Orbit Coset in Quantum Computing
Miklos Santha
(joint work with G. Ivanyos, K. Friedl, F. Magniez and P. Sen)
LRI, Orsay
Abstract:
We give efficient quantum algorithms for
Hidden Translation (HTP) and Hidden Subgroup (HSP) in
a large class of non-abelian groups including solvable groups
of bounded exponent and of bounded derived series.
Our algorithms are recursive. For the base case, we
solve efficiently HTP in Z_{p}^{n},
whenever p is a fixed prime.
For the induction step, we introduce the problem Orbit Coset (OC)
generalizing both HTP and HSP, and prove a
self-reducibility result:
OC in a finite group G
is reducible to OC in G/N and subgroups of N, for
any solvable normal subgroup N of G.
The talk will concentrate on the ``classical''
probabilistic part of the results.
------------------------------------------------------------
Utilitarian Link Assignment
Russell Martin
(joint work with Petra Berenbrink, Leslie Ann Goldberg
and Paul Goldberg)
Edinburgh
Abstract:
This paper studies a resource allocation problem introduced
by Koutsoupias and Papadimitriou. The scenario is modelled
as a multiple-player game in which each player selects one
of a finite number of known resources. The cost to the player
is the total weight of all players who choose that resource,
multiplied by the ``delay'' of that resource. Recent papers
have studied the Nash equilibria and social optima of this
game in terms of the $L_\infty$ cost metric, in which the
social cost is taken to be the maximum cost to any player.
We study the $L_1$~variant of this game, in which the
social cost is taken to be the sum of the costs to the
individual players, rather than the maximum of these costs.
We bound the ratio between the social optimum and the cost
of Nash equilibria in special cases where the resource delays
are identical, or when each player's weight contribution is
the same. We also study the variation in cost between
different Nash equilibria in these cases.
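The cost model can be made concrete in a few lines; the instance below (unit weights, two identical unit-delay resources) is illustrative:

```python
def costs(assignment, weights, delays):
    """Per-player costs in the KP load-balancing game: each player pays
    (total weight on its chosen resource) * (that resource's delay)."""
    load = {}
    for player, res in assignment.items():
        load[res] = load.get(res, 0) + weights[player]
    return {p: load[r] * delays[r] for p, r in assignment.items()}

weights = {"a": 1, "b": 1, "c": 1}
delays = {0: 1, 1: 1}
assignment = {"a": 0, "b": 0, "c": 1}   # two players share resource 0
c = costs(assignment, weights, delays)
social_L1 = sum(c.values())             # L_1 social cost (this talk)
social_Linf = max(c.values())           # L_infty social cost (prior work)
print(social_L1, social_Linf)           # 5 2
```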
------------------------------------------------------
Chips on Wafers, or Packing Rectangles into Grids
Mattias Andersson
(joint work with Joachim Gudmundsson and Christos Levcopoulos)
Lund University
Abstract: A set of rectangles S is said to be grid packed
if there exists a rectangular grid (not necessarily regular)
such that every rectangle lies in the grid and there is at
most one rectangle of S in each cell.
The area of a grid packing is the area of a minimal bounding
box that contains all the rectangles in the grid packing.
We present a polynomial-time approximation algorithm that, given a
set S of rectangles and a real constant eps>0, produces a grid
packing of S whose area is at most (1+eps) times the area of an
optimal packing. If eps is chosen large enough, the running time of
the algorithm is linear.
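The area objective can be made concrete: once rectangles are assigned to grid cells, each grid column must be as wide as its widest rectangle and each row as tall as its tallest; a minimal sketch on an illustrative assignment:

```python
def grid_packing_area(cells):
    """Area of the minimal bounding box of a grid packing.  `cells` maps
    (row, col) -> (width, height) of the single rectangle placed there."""
    col_w, row_h = {}, {}
    for (r, c), (w, h) in cells.items():
        col_w[c] = max(col_w.get(c, 0), w)
        row_h[r] = max(row_h.get(r, 0), h)
    return sum(col_w.values()) * sum(row_h.values())

# Illustrative 2x2 packing of four rectangles: column widths 3 and 1,
# row heights 2 and 2, so the bounding box is 4 x 4.
cells = {(0, 0): (3, 1), (0, 1): (1, 2), (1, 0): (2, 2), (1, 1): (1, 1)}
print(grid_packing_area(cells))  # 16
```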