Category Archives: Minds & Machines
Criticality in deep neural nets
In the previous post, we introduced mean field theory (MFT) as a means of approximating the partition function for interacting systems. In particular, we used this to determine the critical point at which the system undergoes a phase transition, and … Continue reading
Posted in Minds & Machines
2 Comments
Free energy, variational inference, and the brain
In several recent posts, I explored various ideas that lie at the interface of physics, information theory, and machine learning: We’ve seen, à la Jaynes, how the concepts of entropy in statistical thermodynamics and information theory are unified, perhaps the … Continue reading
Posted in Minds & Machines
3 Comments
Deep learning and the renormalization group
In recent years, a number of works have pointed to similarities between deep learning (DL) and the renormalization group (RG) [1-7]. This connection was originally made in the context of certain lattice models, where decimation RG bears a superficial resemblance … Continue reading
Posted in Minds & Machines, Physics
1 Comment
Variational autoencoders
As part of one of my current research projects, I’ve been looking into variational autoencoders (VAEs) for the purpose of identifying and analyzing attractor solutions within higher-dimensional phase spaces. Of course, I couldn’t resist diving into the deeper mathematical theory … Continue reading
Posted in Minds & Machines
2 Comments
Restricted Boltzmann machines
As a theoretical physicist making one's first foray into machine learning, one is immediately captivated by the fascinating parallel between deep learning and the renormalization group. In essence, both are concerned with the extraction of relevant features via a process … Continue reading
Posted in Minds & Machines
2 Comments
Boltzmann machines
I alluded previously that information geometry had many interesting applications, among them machine learning and computational neuroscience more generally. A classic example is the original paper by Amari, Kurata, and Nagaoka, Information Geometry of Boltzmann Machines [1]. This paper has … Continue reading
Posted in Minds & Machines
Leave a comment
Information geometry (part 3/3)
Insofar as quantum mechanics can be regarded as an extension of (classical) probability theory, most of the concepts developed in the previous two parts of this sequence can be extended to quantum information theory as well, thus giving rise to … Continue reading
Posted in Minds & Machines, Physics
2 Comments
Information geometry (part 2/3)
In the previous post, we introduced the $\alpha$-connection, and alluded to a dualistic structure between $\nabla^{(\alpha)}$ and $\nabla^{(-\alpha)}$. In particular, the cases $\alpha=\pm 1$ are intimately related to two important families of statistical models, the exponential or e-family with affine connection $\nabla^{(1)}$, and … Continue reading
Posted in Minds & Machines, Physics
2 Comments
Information geometry (part 1/3)
Information geometry is a rather interesting fusion of statistics and differential geometry, in which a statistical model is endowed with the structure of a Riemannian manifold. Each point on the manifold corresponds to a probability distribution function, and the metric … Continue reading
Posted in Minds & Machines, Physics
Leave a comment