Lecture 12: Integration (12)
jeromylui

I was curious about the part where the slide says Monte Carlo integration needs a lot of samples. How many samples do we mean by "a lot"? Also, just to confirm, does convergence just refer to bringing the variance down? Is there a certain value we would want to bring the variance down to?
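
One way to make "convergence" concrete: the standard deviation of a Monte Carlo estimator falls off as 1/√N, so cutting the error by 10x costs 100x more samples. There is no single magic N; "a lot" depends on the variance of the integrand and how much error you can tolerate. A small Python sketch (my own example, not from the slides; the integrand sin(πx) and the trial counts are arbitrary choices):

```python
import math
import random

def mc_estimate(f, n, rng):
    """Uniform Monte Carlo estimate of the integral of f over [0, 1]."""
    return sum(f(rng.random()) for _ in range(n)) / n

def empirical_std(f, n, trials, seed=0):
    """Standard deviation of the N-sample estimator across independent trials."""
    rng = random.Random(seed)
    estimates = [mc_estimate(f, n, rng) for _ in range(trials)]
    mean = sum(estimates) / trials
    return math.sqrt(sum((e - mean) ** 2 for e in estimates) / trials)

f = lambda x: math.sin(math.pi * x)  # true integral is 2/pi

# The estimator's error shrinks roughly as 1/sqrt(N):
# 100x more samples buys only about a 10x smaller standard deviation.
for n in (100, 10_000):
    print(n, empirical_std(f, n, trials=200))
```

So "converged" in practice usually means the remaining noise (variance) is below whatever threshold your application needs, e.g. below the visible noise floor in a rendered image, not some universal fixed value.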

Hsifnus

Just as a deep learning network might converge performance-wise in its training process at different speeds depending on its hyperparameters, what kinds of "hyperparameters" would affect the convergence of Monte Carlo integration?

zehric

@Hsifnus I think Monte Carlo integration is a generic way to do numerical integration. The only "hyperparameter" I can see is the probability distribution used to draw samples. As we see later in these slides, the choice of probability density function makes a large difference in the speed of convergence.
