Published:
Recurrent neural networks (RNNs) are a class of powerful open-system models in machine learning, designed to learn from sequential data. They have also enjoyed synergies with disciplines such as signal processing, optimization, systems and control theory, neuroscience, and network science. The explosion of real-time data (physical or not) and the promising potential of using dynamical systems (physically realizable or not) for computation and for learning from data are opening up a wide range of foundational and practical problems. However, the dearth of rigorous analysis limits the usefulness of RNNs in addressing scientific questions. A deep understanding of the working mechanism of RNNs and related models is therefore pivotal for shedding light on the properties of large, adaptive architectures and for facilitating the systematic design of the next generation of networks.
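As a purely illustrative sketch of the kind of input-driven dynamical system an RNN defines, the code below implements a generic Elman-style recurrence h_{t+1} = tanh(W_h h_t + W_x x_t + b) with a linear readout in NumPy. The architecture, parameter shapes, and random inputs are assumptions made only for this example; they do not correspond to any specific model studied in this line of work.

```python
import numpy as np

# Illustrative sketch (assumed, generic): an Elman-style RNN viewed as a
# discrete-time dynamical system driven by an input sequence.

rng = np.random.default_rng(0)

def init_params(n_in, n_hidden, n_out):
    """Random parameters for the recurrence and a linear readout."""
    return {
        "W_x": rng.normal(0, 1.0 / np.sqrt(n_in), (n_hidden, n_in)),
        "W_h": rng.normal(0, 1.0 / np.sqrt(n_hidden), (n_hidden, n_hidden)),
        "b":   np.zeros(n_hidden),
        "W_o": rng.normal(0, 1.0 / np.sqrt(n_hidden), (n_out, n_hidden)),
    }

def rnn_forward(params, inputs):
    """Iterate h_{t+1} = tanh(W_h h_t + W_x x_t + b) and read out y_t = W_o h_t."""
    h = np.zeros(params["W_h"].shape[0])
    outputs = []
    for x_t in inputs:                    # inputs: sequence of input vectors
        h = np.tanh(params["W_h"] @ h + params["W_x"] @ x_t + params["b"])
        outputs.append(params["W_o"] @ h)
    return np.array(outputs), h

# Example usage on a random input sequence of length 20 with 3-dimensional inputs.
params = init_params(n_in=3, n_hidden=16, n_out=1)
xs = rng.normal(size=(20, 3))
ys, h_final = rnn_forward(params, xs)
print(ys.shape, h_final.shape)            # (20, 1), (16,)
```

Viewing the hidden-state update as an input-driven map is what connects RNNs to the dynamical-systems and control-theoretic perspectives mentioned above.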
Published:
Many models of noisy systems are too complex to be analyzed and solved analytically, or even numerically, when a large range of time scales is involved. A wide class of high-dimensional dynamical systems can, however, be approximated by deriving lower-dimensional reduced models. The reduced models are often easier to analyze and faster to integrate numerically, while still capturing the essential features of the full system. When a separation of time scales exists, one can apply and develop multiscale methods such as averaging and homogenization to carry out this model reduction. This is in fact one of the main goals of statistical physics, where one is given a microscopic (first-principles) model of the full system.
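To make the averaging idea concrete, here is a hedged toy example; the drift, noise strength, and parameter values are assumptions chosen only for illustration and are not taken from the work described above. A slow variable x is driven by a fast Ornstein-Uhlenbeck variable y with time-scale separation eps. For frozen x, the fast variable has invariant law N(x, 1), so the averaged (reduced) equation is dX/dt = X - X^3, and for small eps the full simulation should approximately track it.

```python
import numpy as np

# Toy slow-fast system (assumed for illustration):
#   slow:  dx/dt = y - x**3
#   fast:  dy    = (x - y)/eps dt + sqrt(2/eps) dW
# For frozen x, y is an OU process with invariant law N(x, 1), so averaging
# over the fast variable gives the reduced equation dX/dt = X - X**3.

rng = np.random.default_rng(1)

def simulate_full(eps, T=5.0, dt=1e-4, x0=0.5, y0=0.0):
    """Euler-Maruyama for the coupled slow-fast system; returns the slow path."""
    n = int(T / dt)
    x, y = x0, y0
    xs = np.empty(n)
    for i in range(n):
        dW = rng.normal(0.0, np.sqrt(dt))
        x_new = x + (y - x**3) * dt
        y_new = y + (x - y) / eps * dt + np.sqrt(2.0 / eps) * dW
        x, y = x_new, y_new
        xs[i] = x
    return xs

def simulate_averaged(T=5.0, dt=1e-4, x0=0.5):
    """Forward Euler for the averaged (reduced) equation dX/dt = X - X**3."""
    n = int(T / dt)
    X = x0
    Xs = np.empty(n)
    for i in range(n):
        X += (X - X**3) * dt
        Xs[i] = X
    return Xs

xs_full = simulate_full(eps=1e-3)
Xs_avg = simulate_averaged()
print("endpoint, full model:    ", xs_full[-1])
print("endpoint, averaged model:", Xs_avg[-1])  # close for small eps
```

The reduced equation is both cheaper to integrate (no stiff fast variable) and easier to analyze, which is exactly the appeal of averaging and homogenization described above.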
Published:
Nonequilibrium dynamics, classical or quantum, are far more common in nature than equilibrium ones. Understanding nonequilibrium behavior is therefore a fundamental task in statistical mechanics, and it could even inspire developments in other fields such as machine learning and neuroscience. However, it is more challenging than understanding equilibrium behavior, and it often requires new mathematical techniques and physical insights.