Journal Article
Abarbanel, HDI, Rozdeba PJ, Shirman S.  2018.  Machine learning: Deepest learning as statistical data assimilation problems. Neural Computation. 30:2025-2055. doi:10.1162/neco_a_01094

We formulate an equivalence between machine learning and the formulation of statistical data assimilation as used widely in physical and biological sciences. The correspondence is that layer number in a feedforward artificial network setting is the analog of time in the data assimilation setting. This connection has been noted in the machine learning literature. We add a perspective that expands on how methods from statistical physics and aspects of Lagrangian and Hamiltonian dynamics play a role in how networks can be trained and designed. Within the discussion of this equivalence, we show that adding more layers (making the network deeper) is analogous to adding temporal resolution in a data assimilation framework. Extending this equivalence to recurrent networks is also discussed. We explore how one can find a candidate for the global minimum of the cost functions in the machine learning context using a method from data assimilation. Calculations on simple models from both sides of the equivalence are reported. Also discussed is a framework in which the time or layer label is taken to be continuous, providing a differential equation, the Euler-Lagrange equation and its boundary conditions, as a necessary condition for a minimum of the cost function. This shows that the problem being solved is a two-point boundary value problem familiar in the discussion of variational methods. The use of continuous layers is denoted "deepest learning." These problems respect a symplectic symmetry in continuous layer phase space. Both Lagrangian and Hamiltonian versions of these problems are presented. Their implementation in discrete time/layer steps, using well-studied methods that respect the symplectic structure, is addressed. The Hamiltonian version provides a direct rationale for backpropagation as a solution method for a certain two-point boundary value problem.
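The layer-number/time correspondence described in this abstract can be illustrated with a short sketch (not the authors' code): a residual feedforward pass written as an explicit Euler discretization of a continuous-"layer" ODE, so that adding layers corresponds to refining the temporal resolution. The tanh activation, weight scale, and step-size convention are illustrative assumptions.

```python
import numpy as np

def f(x, W, b):
    """Layer vector field dx/ds = f(x); tanh is an arbitrary illustrative choice."""
    return np.tanh(W @ x + b)

def forward(x0, W, b, n_layers):
    """Residual forward pass = Euler integration of dx/ds = f(x) over s in [0, 1]."""
    h = 1.0 / n_layers            # layer index plays the role of time
    x = x0
    for _ in range(n_layers):     # each layer is one Euler time step
        x = x + h * f(x, W, b)
    return x

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 3)) * 0.1
b = np.zeros(3)
x0 = np.ones(3)

# A deeper network discretizes the same continuous-layer ODE more finely,
# so the two outputs approximate the same flow map at s = 1:
shallow = forward(x0, W, b, 10)
deep = forward(x0, W, b, 1000)
```

In this picture the "deepest learning" limit of the abstract is the continuum of layers, where the forward pass becomes the ODE itself and the network weights parameterize its vector field.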

Kadakia, N, Rey D, Ye J, Abarbanel HDI.  2017.  Symplectic structure of statistical variational data assimilation. Quarterly Journal of the Royal Meteorological Society. 143:756-771. doi:10.1002/qj.2962

Data assimilation variational principles (4D-Var) exhibit a natural symplectic structure among the state variables x(t) and their time derivatives ẋ(t). We explore the implications of this structure in both Lagrangian coordinates {x(t), ẋ(t)} and Hamiltonian canonical coordinates {x(t), p(t)} through a numerical examination of the chaotic Lorenz 1996 model in ten dimensions. We find that there are a number of subtleties associated with discretization, boundary conditions, and symplecticity, suggesting differing approaches when working in the Lagrangian versus the Hamiltonian description. We investigate these differences in detail, and accordingly develop a protocol for searching for optimal trajectories in a Hamiltonian space. We find that casting the problem into canonical coordinates can, in some situations, considerably improve the quality of predictions.
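The chaotic test system named in this abstract, the Lorenz 1996 model in ten dimensions, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the forcing F = 8, RK4 integrator, and step size are conventional choices assumed here.

```python
import numpy as np

def lorenz96(x, F=8.0):
    """Lorenz 1996 vector field: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F,
    with cyclic indices handled via np.roll."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt, F=8.0):
    """One fourth-order Runge-Kutta step for the Lorenz 1996 model."""
    k1 = lorenz96(x, F)
    k2 = lorenz96(x + 0.5 * dt * k1, F)
    k3 = lorenz96(x + 0.5 * dt * k2, F)
    k4 = lorenz96(x + dt * k3, F)
    return x + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

D = 10                            # ten dimensions, as in the paper
x = 8.0 * np.ones(D)              # x = F is a fixed point of the model
x[0] += 0.01                      # small perturbation to leave the fixed point
for _ in range(1000):             # integrate onto the chaotic attractor
    x = rk4_step(x, 0.01)
```

A trajectory like this would serve as the "truth" from which noisy observations are generated when testing a 4D-Var search for optimal trajectories.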