We address the problem of computing approximate marginals in Gaussian probabilistic models using mean field and fractional Bethe approximations. We define the Gaussian fractional Bethe free energy in terms of the moment parameters of the approximate marginals, derive a lower and an upper bound on the fractional Bethe free energy, and establish a necessary condition for the lower bound to be bounded from below. It turns out that this condition is identical to the pairwise normalizability condition, which is known to be a sufficient condition for the convergence of the message passing algorithm. We show that stable fixed points of the Gaussian message passing algorithm are local minima of the Gaussian Bethe free energy. By constructing a counterexample, we disprove the conjecture that unboundedness of the free energy implies divergence of the message passing algorithm.
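The message passing algorithm referred to here is Gaussian belief propagation. As a rough illustration only (not the paper's code), the sketch below runs GaBP in information form on a small model p(x) ∝ exp(-½ xᵀJx + hᵀx) and checks pairwise normalizability via the walk-summability criterion of Malioutov et al. (2006), which they show is equivalent to pairwise normalizability. The function names and the example model are assumptions made for illustration.

```python
import numpy as np

def pairwise_normalizable(J):
    # Check walk-summability (equivalent to pairwise normalizability per
    # Malioutov et al. 2006): the spectral radius of the absolute
    # partial-correlation matrix |R| must be strictly less than 1.
    d = np.sqrt(np.diag(J))
    R = np.eye(len(J)) - J / np.outer(d, d)   # zero diagonal, -J_ij/sqrt(J_ii*J_jj) off-diagonal
    return np.max(np.abs(np.linalg.eigvals(np.abs(R)))) < 1.0

def gaussian_bp(J, h, iters=100):
    # Gaussian belief propagation for p(x) ∝ exp(-x'Jx/2 + h'x).
    # Returns approximate marginal means and variances.
    n = len(h)
    Lam = np.zeros((n, n))   # Lam[i, j]: precision of message i -> j
    eta = np.zeros((n, n))   # eta[i, j]: potential of message i -> j
    edges = [(i, j) for i in range(n) for j in range(n) if i != j and J[i, j] != 0]
    for _ in range(iters):
        for i, j in edges:
            # cavity parameters at node i, excluding the message from j
            J_cav = J[i, i] + Lam[:, i].sum() - Lam[j, i]
            h_cav = h[i] + eta[:, i].sum() - eta[j, i]
            Lam[i, j] = -J[i, j] ** 2 / J_cav
            eta[i, j] = -J[i, j] * h_cav / J_cav
    P = np.diag(J) + Lam.sum(axis=0)          # approximate marginal precisions
    mu = (h + eta.sum(axis=0)) / P            # approximate marginal means
    return mu, 1.0 / P

# Hypothetical 3-node chain model used only to exercise the sketch.
J = np.array([[2.0, 0.8, 0.0],
              [0.8, 2.0, 0.8],
              [0.0, 0.8, 2.0]])
h = np.array([1.0, 0.0, -1.0])
print(pairwise_normalizable(J))               # True: the model is walk-summable
mu, var = gaussian_bp(J, h)
print(mu, np.linalg.solve(J, h))              # at convergence the GaBP means match the exact means
```

At a fixed point the means are exact while the variances are only approximate, which is the usual behaviour of Gaussian belief propagation and is consistent with the fixed-point analysis discussed in the abstract.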
Monday, 30 May 2011
Saturday, 28 May 2011
Recent progress in neural networks
In the past several years, much progress has been made in neural network technology. Neural networks have been used in many signal processing applications to classify different sets of patterns. This paper presents some of the trends relevant to the application of neural network technology. No comprehensive review of the state of the art is attempted. Rather, the emphasis is placed selectively on certain current trends and new ideas that, in the author's opinion, show promise for the future. However, the reader should note that neural network technology is in a state of flux, with several competing theoretical models and approaches. The paper deals with the practical aspects of research in neural networks. The basic concepts of neural networks and progress in learning algorithms are briefly reviewed, followed by a discussion of trends relevant to hardware implementations of these networks. Finally, hybrids comprising neural networks, expert systems, and genetic algorithms are considered.