Machine learning and stochastic gradient descent

I very briefly describe gradient descent (GD) and how it is used in machine learning in the form of the stochastic gradient descent (SGD) algorithm. The idea is simple: divide your data into randomly chosen mini-batches and use a mini-batch to estimate the gradient of your cost function. Use that estimate to do GD iterations at fixed …
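
To make that concrete, here is a minimal sketch of one SGD epoch on a toy least-squares problem. This is my own illustration rather than code from the post: the quadratic cost, the batch size, and the fixed learning rate `lr` are all assumptions.

```python
import numpy as np

def sgd_epoch(w, X, y, lr=0.05, batch_size=32, rng=None):
    """One SGD epoch on the least-squares cost 0.5 * ||X w - y||^2 / n.

    Sketch only: the quadratic cost, batch size and fixed learning rate
    are illustrative assumptions, not taken from the post.
    """
    rng = rng or np.random.default_rng()
    n = len(y)
    # Shuffle the data, then walk through it in randomly chosen mini-batches.
    order = rng.permutation(n)
    for start in range(0, n, batch_size):
        idx = order[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        # Mini-batch estimate of the gradient of the cost function.
        grad = Xb.T @ (Xb @ w - yb) / len(idx)
        # Plain GD step at a fixed learning rate, using the noisy gradient.
        w = w - lr * grad
    return w

# Usage: recover w_true from noisy linear data y = X @ w_true + noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.1 * rng.normal(size=1000)

w = np.zeros(5)
for epoch in range(50):
    w = sgd_epoch(w, X, y, rng=rng)
print(np.round(w - w_true, 3))  # residual error should be close to zero
```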

Toward QMC: Trace-determinant identities. Part 2.

This is the fourth episode of the “Toward QMC” series. In the previous episode, I started to show how a crucial identity is proven in the case of fermions. Here I give more details on how to carry out that proof for products of multiple operators and explain what happens in the bosonic case. Check …
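
For reference, the identity at play here is presumably the standard trace-determinant identity for products of exponentials of quadratic (one-body) operators; the notation below, including the sign conventions in the exponents, is my assumption rather than the post's:

\[
\operatorname{Tr}\!\left[\, e^{-c^\dagger A_1 c}\, e^{-c^\dagger A_2 c} \cdots e^{-c^\dagger A_n c} \,\right]
= \det\!\left( \mathbb{1} \pm e^{-A_1} e^{-A_2} \cdots e^{-A_n} \right)^{\pm 1},
\qquad
c^\dagger A_k c \equiv \sum_{ij} c_i^\dagger (A_k)_{ij}\, c_j ,
\]

with the upper signs for fermions and the lower signs for bosons.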