A quick look at the gist of the backpropagation idea for calculating the gradients of neural networks, which feed into the stochastic gradient descent algorithm. Check it out!
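As a rough illustration of the idea, here is a minimal sketch of backpropagation for a network with one hidden layer. The sigmoid activations, quadratic cost, and all names are my own illustrative assumptions, not necessarily what the post uses.

```python
# Minimal backpropagation sketch: one hidden layer, sigmoid activations,
# quadratic cost C = 0.5 * ||a2 - y||^2 (illustrative choices only).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop(x, y, W1, b1, W2, b2):
    """Return gradients of C with respect to (W1, b1, W2, b2)."""
    # Forward pass: keep the activations, the backward pass reuses them.
    a1 = sigmoid(W1 @ x + b1)
    a2 = sigmoid(W2 @ a1 + b2)
    # Backward pass: chain rule, layer by layer (sigmoid' = a * (1 - a)).
    delta2 = (a2 - y) * a2 * (1.0 - a2)          # dC/dz2
    delta1 = (W2.T @ delta2) * a1 * (1.0 - a1)   # dC/dz1
    return np.outer(delta1, x), delta1, np.outer(delta2, a1), delta2

# Tiny smoke test with random parameters.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)
gW1, gb1, gW2, gb2 = backprop(rng.normal(size=3), np.array([0.0, 1.0]),
                              W1, b1, W2, b2)
```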
Machine learning and stochastic gradient descent
I very briefly describe gradient descent (GD) and how it is used in the context of machine learning: the stochastic gradient descent algorithm (SGD). The idea is simple: divide your data into randomly chosen mini-batches and use each mini-batch to estimate the gradient of your cost function. Use that estimate to do GD iterations at fixed mini-batch; going over all your mini-batches defines an epoch. Go over many epochs, and your network will hopefully learn!
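In code, the whole recipe fits in a few lines. Here is a minimal sketch, assuming a user-supplied grad(params, batch) function; the names and the toy usage are mine, not taken from the post.

```python
# Minimal SGD sketch: shuffle, slice into mini-batches, take one GD step
# per batch; one full pass over the data is an epoch.
import numpy as np

def sgd(params, data, grad, lr=0.1, batch_size=32, epochs=10, seed=0):
    rng = np.random.default_rng(seed)
    for _ in range(epochs):
        idx = rng.permutation(len(data))           # fresh random mini-batches
        for start in range(0, len(data), batch_size):
            batch = data[idx[start:start + batch_size]]
            params = params - lr * grad(params, batch)  # GD step, batch fixed
    return params

# Toy usage: minimizing 0.5*E[(p - x)^2] recovers the sample mean.
data = np.random.default_rng(1).normal(loc=3.0, size=1000)
print(sgd(0.0, data, lambda p, b: p - b.mean()))   # -> close to 3.0
```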
Machine learning and linear algebra
Toward QMC: Generating field configurations
Toward QMC: Calculating thermal expectation values
I show the first notions of how to calculate thermal expectation values in quantum Monte Carlo. I work out an example for the particle-number operator and show expressions for the one-body density matrix. Here I assume that the field configurations are given; more on how to obtain those samples next time!
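As a pointer to the main object of the episode: the thermal expectation value, which after the decomposition over field configurations becomes a weighted average. The shorthand below is mine and may differ from the post's notation.

```latex
% Thermal expectation value of an observable \hat{O} at inverse temperature \beta:
\langle \hat{O} \rangle
  = \frac{\mathrm{Tr}\!\left[ e^{-\beta \hat{H}} \hat{O} \right]}
         {\mathrm{Tr}\, e^{-\beta \hat{H}}}
% After decomposing over field configurations \sigma with weights W(\sigma):
  = \frac{\sum_{\sigma} W(\sigma)\, \langle \hat{O} \rangle_{\sigma}}
         {\sum_{\sigma} W(\sigma)}
```

For the particle-number operator $\hat{N} = \sum_i \hat{c}_i^\dagger \hat{c}_i$, each fixed-configuration average $\langle \hat{N} \rangle_\sigma$ is built from the diagonal entries of the one-body density matrix.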
Toward QMC: Spacetime determinant identities
Toward QMC: Trace-determinant identities. Part 2.
This is the fourth episode of the “Toward QMC” series. In the previous episode, I started to show how a crucial identity is proven in the case of fermions. Here I give more details on how to carry out that proof for products of multiple operators and explain what happens in the bosonic case; the identity itself is reproduced below for reference. Check it out!
Video. Notes.
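For reference, the identity in question, in its standard form and in my own notation, with $\hat{c}^\dagger A \hat{c} \equiv \sum_{ij} \hat{c}_i^\dagger A_{ij} \hat{c}_j$:

```latex
% Fermions: the trace over Fock space reduces to a determinant in the
% single-particle space.
\mathrm{Tr}\!\left[ \prod_{k=1}^{n} e^{\hat{c}^\dagger A_k \hat{c}} \right]
  = \det\!\left( \mathbb{1} + \prod_{k=1}^{n} e^{A_k} \right)
% Bosons: same structure, with a minus sign and an inverse determinant
% (the geometric sums converge when the eigenvalues of the matrix
% product lie inside the unit circle).
\mathrm{Tr}\!\left[ \prod_{k=1}^{n} e^{\hat{c}^\dagger A_k \hat{c}} \right]
  = \det\!\left( \mathbb{1} - \prod_{k=1}^{n} e^{A_k} \right)^{-1}
```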
Toward QMC: Trace-determinant identities. Part 1.
Toward QMC: Hubbard-Stratonovich transformation
This is the second episode of the “Toward QMC” series. I discuss how interactions enter the thermodynamics of quantum systems from an operational standpoint. The Hubbard-Stratonovich transformation allows us to take a step forward by decomposing the interaction factor as an integral over external fields; the underlying Gaussian identity is reproduced below. Check it out! More details on the algebra next time.
Video. Notes.
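The engine of the transformation is the elementary Gaussian identity, written here for a single scalar $a$; in the application, an operator combination takes the place of $a$, so a two-body interaction factor becomes an integral over a field coupled to one-body operators.

```latex
% Complete the square: -x^2/2 + xa = -(x - a)^2/2 + a^2/2.
e^{\frac{1}{2} a^{2}}
  = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} dx\;
    e^{-\frac{1}{2} x^{2} + x a}
```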