"Information Theory and its Applications to Biology, Finance and Physics" at the Stefan Banach International Mathematical Center (Warsaw, Poland), May 21-26, 2001
Jonathan D. H. Smith
Department of Mathematics, Iowa State University, Ames, Iowa 50011, U.S.A.
Tel: +1 (515) 294-8172; Fax: +1 (515) 294-5454
E-mail: [email protected]
Website: www.math.iastate.edu/jdhsmith/homepage.html
Received: 8 June 2001 / Published: 11 June 2001
The conference was held at the Banach Centre [1] under the European Union's Centres of Excellence programme. The organisers were M. Broom (Sussex), I. Csiszar (Budapest), D. Petz (Budapest), Zh. Reznikova (Novosibirsk), B. Ryabko (Novosibirsk), D. Samperi (New York), L. Stettner (Warsaw), F. Topsøe (Copenhagen), and I. Vajda (Prague). Here is a listing of most of the speakers, roughly (and slightly artificially) subdivided according to the area of their presentation [2]:
General:
T. Downarowicz, P. Harremoes, M. Janzura, R. Rudnicki, F. Topsøe, I. Vajda
Biology:
M. Broom, A. Elitzur, J. Plotkin, J.D.H. Smith
Finance:
R. Baviera, T. Choulli, M. Frittelli, L. Gulko, D. Samperi, J. Siqueira, W. Stummer
Physics:
C.A. Fuchs, R. Jozsa, S.-K. Lin, D. Petz, M.B. Ruskai
Amongst the general talks, Topsøe and Harremoes led the conference with their game-theoretic treatment of maximum entropy modelling. Instead of entropy maximisation using Lagrange multipliers according to the now classical method of Gibbs, Jaynes, Jeffreys et al., they considered entropy maximisation as a primal problem dual to the minimisation of the so-called "risk," by which they meant the expected length of a binary Huffman encoding of the states of a system. Downarowicz dealt with issues of encoding in topological dynamical systems. Rudnicki discussed general conditional entropy measures in the context of Markov semigroups. Janzura treated the relationship between Gibbs random fields and information theory. Vajda discussed the convergence of information measures arising from discretisation of the parameter space of statistical distributions.
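For contrast with this dual view, the following minimal sketch (an illustration only, using Jaynes' biased-die example with an assumed mean constraint; the function name and numbers are mine) shows the now classical Lagrange-multiplier route to a maximum-entropy distribution, fitting the multiplier numerically so that the exponential-family solution meets the constraint.

# Classical maximum entropy: maximising -sum_i p_i log p_i subject to
# sum_i p_i x_i = m yields p_i proportional to exp(-lam * x_i), with the
# Lagrange multiplier lam chosen so that the mean constraint holds.
import numpy as np
from scipy.optimize import brentq

def maxent_given_mean(x, m):
    """Maximum-entropy distribution on the points x with mean m."""
    def mean_at(lam):
        w = np.exp(-lam * x)
        return (w / w.sum()) @ x
    lam = brentq(lambda l: mean_at(l) - m, -50.0, 50.0)  # fit the multiplier
    w = np.exp(-lam * x)
    return w / w.sum()

x = np.arange(1, 7)               # faces of a die (Jaynes' example)
p = maxent_given_mean(x, 4.5)     # observed mean 4.5 rather than 3.5
print(p, p @ x)

In the game-theoretic picture of the talk, the same distribution arises as one player's equilibrium strategy, with the risk-minimising code as the opponent's.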
Possibly reflecting the vastness of the field and the difficulty of the problems involved, the talks in biology were the most diverse. Broom gave a general survey of some of the main areas of biology in which information theory is currently being applied. These areas include bioinformatics in molecular biology (particularly involving genome and protein sequences), the origin of language, game-theoretic analyses of evolution (following Maynard Smith), and population entropy (following Demetrius). Plotkin presented information-theoretic models for the evolution of language, notably at the level of word formation. Smith presented an entropy maximisation approach to demographic modelling. Elitzur's talk was more general, discussing information-theoretic aspects of living systems. He stressed the invariance of "important" information under transformations such as change of spatial location. By analogy with the sensitivity of a scientific instrument (such as a Newtonian telescope), he also pointed out how a growing population amplifies the selection value of mutations.
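As a concrete anchor for the population-entropy thread, here is a minimal sketch (my illustration; Demetrius' construction starts from a demographic Leslie matrix, which is omitted here) of the entropy rate of a finite Markov chain, the quantity on which population entropy is built.

# Entropy rate H = -sum_i pi_i sum_j P_ij log2 P_ij of a row-stochastic
# transition matrix P, where pi is the stationary distribution of the chain.
import numpy as np

def entropy_rate(P):
    """Entropy rate (bits per step) of the chain with transition matrix P."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])  # left eigenvector
    pi = pi / pi.sum()
    logP = np.zeros_like(P)
    mask = P > 0
    logP[mask] = np.log2(P[mask])
    return float(-np.sum(pi[:, None] * P * logP))

P = np.array([[0.9, 0.1],         # a two-state chain (numbers assumed)
              [0.4, 0.6]])
print(entropy_rate(P))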
The finance talks, including those of Baviera, Samperi and Stummer, concentrated on the problem of asset pricing in incomplete markets. Frittelli presented minimisation of the entropy of a martingale measure as the dual of utility maximisation. Choulli applied the concepts of the Kakutani-Hellinger distance and the Hellinger process. Siqueira discussed informational statistical decision analysis. Gulko's talk was quite direct, showing how entropy maximisation yields the pricing patterns actually observed in the markets. A lively discussion following his talk examined the extent to which such basic methods could handle dynamical processes. In any case, the consensus was that the information-theoretic approach is more general and realistic than the earlier stochastic differential equation approach of Black and Scholes.
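To make the entropy-pricing idea concrete, here is a minimal sketch (the grid, rate, and forward price are hypothetical, and Gulko's Entropy Pricing Theory imposes considerably more structure): maximise entropy over terminal prices subject to a single martingale constraint, then price a call as the discounted expected payoff.

# Maximum-entropy risk-neutral distribution over a grid of terminal
# prices, constrained so that the expected terminal price equals the
# forward; the solution is again an exponential family in S_T.
import numpy as np
from scipy.optimize import brentq

S_T = np.linspace(50.0, 150.0, 101)   # terminal price grid (assumed)
forward = 105.0                        # martingale constraint on E[S_T]
r, T = 0.03, 1.0                       # rate and maturity (assumed)

def maxent_pdf(lam):
    w = np.exp(lam * S_T)
    return w / w.sum()

lam = brentq(lambda l: maxent_pdf(l) @ S_T - forward, -1.0, 1.0)
q = maxent_pdf(lam)

strike = 100.0
call = np.exp(-r * T) * (q @ np.maximum(S_T - strike, 0.0))
print(f"maxent call price: {call:.2f}")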
The physics talks generally addressed topics in quantum information theory, ranging from questions of the basic physics to discussions of the mathematical foundations in functional analysis. Fuchs and Jozsa summarised the ideas underlying quantum computation and quantum information theory, including Schumacher's Theorem as a quantum analogue of the Shannon Coding Theorem. They dealt with quantum non-locality as a process of information transmission along a "Feynman zig-zag" proceeding from one location backward in time to the common source, and then forward in time to the second location. Fuchs also formulated the von Neumann entropy as a contour integral involving the trace of operators, and showed how a comparable analysis of the determinant led to the concept of subentropy, classically a concave function maximised by the uniform distribution. Ruskai's talks established a bridge between the physical ideas and the mathematical background, including the geometry of the Bloch sphere and many of the key inequalities involving operators. Petz also gave a detailed treatment of the relative entropy concept for operators, including a new form of Schumacher's Theorem. Finally, back in the classical context, Lin emphasised the connection between symmetry and entropy in physical systems. Subsequent discussions suggested that many aspects of this topic remain to be explored, particularly involving different kinds of symmetry at different levels of a complex system.
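For reference, the von Neumann entropy S(rho) = -Tr(rho log rho) running through these talks reduces, on the eigenvalues of a density matrix, to the Shannon formula; the following minimal sketch (qubit numbers assumed; the contour-integral formulation and subentropy are not reproduced) computes it directly.

# Von Neumann entropy from the eigenvalues of a density matrix rho:
# S(rho) = -sum_k p_k log2 p_k, where the p_k are the eigenvalues.
import numpy as np

def von_neumann_entropy(rho):
    """Entropy in bits of the density matrix rho."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]              # drop numerically zero eigenvalues
    return float(-np.sum(p * np.log2(p)))

rho = np.array([[0.50, 0.25],     # a partially mixed qubit state (assumed)
                [0.25, 0.50]])
print(von_neumann_entropy(rho))   # about 0.81 bits, less than maximal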
References and Notes