
Commit 889e5b0

Merge branch 'master' of github.com:maxbiostat/Computational_Statistics
2 parents: 5da7684 + f3c4b78

3 files changed (+39, -0 lines changed)

README.md

Lines changed: 37 additions & 0 deletions
@@ -36,14 +36,32 @@ Books marked with [a] are advanced material.
- [Random Number Generation](https://www.iro.umontreal.ca/~lecuyer/myftp/papers/handstat.pdf) by [Pierre L'Ecuyer](http://www-labs.iro.umontreal.ca/~lecuyer/);
- [Non-Uniform Random Variate Generation](http://www.nrbook.com/devroye/) by the great [Luc Devroye](http://luc.devroye.org/);
- Walker's [Alias method](https://en.wikipedia.org/wiki/Alias_method) is a fast way to generate discrete random variables (see the sketch after this list);
- [Rejection Control and Sequential importance sampling](http://stat.rutgers.edu/home/rongchen/publications/98JASA_rejection-control.pdf) (1998), by Liu et al., discusses how to improve importance sampling by controlling rejections.
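For concreteness, here is a minimal Python sketch of the alias method (Vose's variant of the table construction). The function names and the toy distribution are illustrative, not taken from the references above:

```python
import random

def build_alias_table(probs):
    """Build the probability and alias tables in O(n) (Vose's variant)."""
    n = len(probs)
    scaled = [p * n for p in probs]
    prob, alias = [0.0] * n, [0] * n
    small = [i for i, s in enumerate(scaled) if s < 1.0]
    large = [i for i, s in enumerate(scaled) if s >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        prob[s], alias[s] = scaled[s], l
        scaled[l] += scaled[s] - 1.0           # large column donates the mass column s is missing
        (small if scaled[l] < 1.0 else large).append(l)
    for i in small + large:                    # leftovers have weight 1 up to rounding error
        prob[i] = 1.0
    return prob, alias

def alias_draw(prob, alias):
    """Draw one category in O(1): pick a column uniformly, then flip a biased coin."""
    i = random.randrange(len(prob))
    return i if random.random() < prob[i] else alias[i]

prob, alias = build_alias_table([0.1, 0.2, 0.3, 0.4])
sample = [alias_draw(prob, alias) for _ in range(10)]
```

After the O(n) setup, every draw costs one uniform index and one biased coin flip, which is what makes the method attractive when many draws from the same discrete distribution are needed.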
### Markov chains
- [These](https://pages.uoregon.edu/dlevin/MARKOV/markovmixing.pdf) notes from David Levin and Yuval Peres are excellent and cover a great deal of interesting material on Markov processes.
### Markov chain Monte Carlo
- Charlie Geyer's [website](http://users.stat.umn.edu/~geyer/) is a treasure trove of material on Statistics in general and on MCMC methods in particular.
See, for instance, [On the Bogosity of MCMC Diagnostics](http://users.stat.umn.edu/~geyer/mcmc/diag.html).
- [Efficient construction of reversible jump Markov chain Monte Carlo proposal distributions](http://www2.stat.duke.edu/~scs/Courses/Stat376/Papers/TransdimMCMC/BrooksRobertsRJ.pdf) is a nice paper on the construction of efficient proposals for reversible jump/transdimensional MCMC.

#### Hamiltonian Monte Carlo
The two definitive texts on HMC are [Neal (2011)](https://arxiv.org/pdf/1206.1901.pdf) and [Betancourt (2017)](https://arxiv.org/pdf/1701.02434.pdf).
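For intuition, here is a bare-bones Python sketch of HMC with a leapfrog integrator and a Metropolis correction, in the spirit of the algorithm those texts describe. The step size, number of steps and the Gaussian toy target are arbitrary illustrative choices, not tuned recommendations:

```python
import numpy as np

def leapfrog(q, p, grad_logp, eps, n_steps):
    """Leapfrog integration of Hamilton's equations for H(q, p) = -logp(q) + |p|^2 / 2."""
    p = p + 0.5 * eps * grad_logp(q)           # initial half step for the momentum
    for _ in range(n_steps - 1):
        q = q + eps * p                        # full step for the position
        p = p + eps * grad_logp(q)             # full step for the momentum
    q = q + eps * p
    p = p + 0.5 * eps * grad_logp(q)           # final half step for the momentum
    return q, p

def hmc(logp, grad_logp, q0, eps=0.1, n_steps=20, n_samples=2000, seed=0):
    rng = np.random.default_rng(seed)
    q = np.asarray(q0, dtype=float)
    draws = np.empty((n_samples, q.size))
    for i in range(n_samples):
        p = rng.standard_normal(q.size)                       # resample auxiliary momenta
        q_prop, p_prop = leapfrog(q.copy(), p.copy(), grad_logp, eps, n_steps)
        # accept or reject using the change in the Hamiltonian
        log_alpha = (logp(q_prop) - 0.5 * p_prop @ p_prop) - (logp(q) - 0.5 * p @ p)
        if np.log(rng.uniform()) < log_alpha:
            q = q_prop
        draws[i] = q
    return draws

# toy target: standard bivariate normal, logp(q) = -|q|^2 / 2
draws = hmc(lambda q: -0.5 * q @ q, lambda q: -q, np.zeros(2))
```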
#### Normalising Constants
[This](https://radfordneal.wordpress.com/2008/08/17/the-harmonic-mean-of-the-likelihood-worst-monte-carlo-method-ever/) post by Radford Neal explains why the Harmonic Mean Estimator (HME) is a _terrible_ estimator of the evidence.
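To make the object concrete, here is a small Python sketch (a toy example of my own, not from the post) that computes the HME for a Beta-Binomial model, where the exact evidence is available for comparison:

```python
import numpy as np
from scipy.special import betaln, gammaln

rng = np.random.default_rng(1)
n, y, a, b = 50, 12, 1.0, 1.0                  # data and Beta(a, b) prior on p

log_choose = gammaln(n + 1) - gammaln(y + 1) - gammaln(n - y + 1)
log_evidence = log_choose + betaln(y + a, n - y + b) - betaln(a, b)   # exact log evidence

# Posterior draws are available in closed form here: p | y ~ Beta(y + a, n - y + b)
p = rng.beta(y + a, n - y + b, size=100_000)
log_lik = log_choose + y * np.log(p) + (n - y) * np.log1p(-p)

# HME: 1 / mean(1 / L(p_s)), computed in log space for numerical stability
log_hme = np.log(p.size) - np.logaddexp.reduce(-log_lik)

print(f"exact log evidence = {log_evidence:.3f}, HME estimate = {log_hme:.3f}")
```

The estimator is driven by the rare posterior draws with very small likelihood, so it converges extremely slowly and is largely insensitive to the prior, which is the heart of Neal's criticism.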
#### Sequential Monte Carlo and Dynamic models
- [This](https://link.springer.com/book/10.1007/978-3-030-47845-2) book by Nicolas Chopin and Omiros Papaspiliopoulos is a great introduction (as its title says) to SMC.

SMC finds application in many areas, but dynamic (linear) models deserve a special mention. The seminal 1997 [book](https://link.springer.com/book/10.1007/b98971) by West and Harrison remains the _de facto_ text on the subject.
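As a concrete toy example (my own, not taken from either book), here is a minimal Python sketch of a bootstrap particle filter for a local-level model, the simplest dynamic linear model:

```python
import numpy as np

rng = np.random.default_rng(42)

# Local-level model: x_t = x_{t-1} + N(0, sigma_x^2),  y_t = x_t + N(0, sigma_y^2)
T, sigma_x, sigma_y = 100, 0.5, 1.0
x_true = np.cumsum(rng.normal(0.0, sigma_x, T))
y = x_true + rng.normal(0.0, sigma_y, T)

# Bootstrap filter: propagate with the prior, weight with the likelihood, resample
N = 1000
particles = rng.normal(0.0, 1.0, N)
filtering_mean = np.empty(T)
for t in range(T):
    particles = particles + rng.normal(0.0, sigma_x, N)        # propagate
    logw = -0.5 * ((y[t] - particles) / sigma_y) ** 2          # log weights (up to a constant)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    filtering_mean[t] = np.sum(w * particles)                  # estimate of E[x_t | y_{1:t}]
    particles = particles[rng.choice(N, size=N, p=w)]          # multinomial resampling
```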
## Optimisation
#### The EM algorithm
- This elementary [tutorial](https://zhwa.github.io/tutorial-of-em-algorithm.html) is simple but effective.
@@ -61,8 +79,27 @@ See, for instance, [On the Bogosity of MCMC Diagnostics](http://users.stat.umn.e
- In [Markov Chain Monte Carlo Maximum Likelihood](https://www.stat.umn.edu/geyer/f05/8931/c.pdf), Charlie Geyer shows how one can use MCMC to do maximum likelihood estimation when the likelihood cannot be written in closed-form.
This paper is an example of MCMC methods being used outside of Bayesian statistics.

- [This](https://github.com/maxbiostat/Computational_Statistics/blob/master/supporting_material/1997_Dunbar_CollegeMaths.pdf) paper discusses the solution of Problem A in [assignment 0 (2021)](https://github.com/maxbiostat/Computational_Statistics/blob/master/assignments/warmup_assignment.pdf).
#### Reparametrisation
Sometimes a clever way to make it easier to compute expectations with respect to a target distribution is to _reparametrise_ it. Here are some resources (a short sketch follows below):

- A YouTube video: [Introduction to the concepts and a simple example](https://www.youtube.com/watch?v=gSd1msFFZTw);
- [Hamiltonian Monte Carlo for Hierarchical Models](https://arxiv.org/abs/1312.0906) by M. J. Betancourt and Mark Girolami;
- [A General Framework for the Parametrization of Hierarchical Models](https://projecteuclid.org/journals/statistical-science/volume-22/issue-1/A-General-Framework-for-the-Parametrization-of-Hierarchical-Models/10.1214/088342307000000014.full) by Omiros Papaspiliopoulos, Gareth O. Roberts, and Martin Sköld;
- [Efficient parametrisations for normal linear mixed models](https://www.jstor.org/stable/2337527?seq=1#metadata_info_tab_contents) by Alan E. Gelfand, Sujit K. Sahu and Bradley P. Carlin.

See [#4](https://github.com/maxbiostat/Computational_Statistics/issues/4). Contributed by @lucasmoschen.
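As a quick illustration of the idea (a toy Python example of my own, using Neal's funnel), the centred and non-centred parametrisations below describe exactly the same joint distribution, but the latter hands an MCMC sampler a much friendlier geometry:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Neal's funnel: v ~ N(0, 3^2), x | v ~ N(0, exp(v))

# Centred parametrisation: work directly with (v, x)
v_c = rng.normal(0.0, 3.0, n)
x_c = rng.normal(0.0, np.exp(v_c / 2.0))

# Non-centred parametrisation: x = exp(v/2) * eta with eta ~ N(0, 1) a priori,
# so a sampler only ever explores the well-behaved pair (v, eta)
v_nc = rng.normal(0.0, 3.0, n)
eta = rng.normal(0.0, 1.0, n)
x_nc = np.exp(v_nc / 2.0) * eta

# Same distribution for x under both parametrisations
print(np.quantile(x_c, [0.05, 0.5, 0.95]))
print(np.quantile(x_nc, [0.05, 0.5, 0.95]))
```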
#### Variance reduction
- [Rao-Blackwellisation](http://www.columbia.edu/~im2131/ps/rao-black.pdf) is a popular technique for obtaining estimators with lower variance. I recommend the recent International Statistical Review [article](https://arxiv.org/abs/2101.01011) by Christian Robert and Gareth Roberts on the topic.
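A minimal Python illustration of the idea (a toy example of my own): to estimate E[X] in a Poisson-Gamma mixture, averaging the conditional expectation E[X | λ] = λ beats averaging the raw draws of X.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# lambda ~ Gamma(shape = 3, scale = 2) and X | lambda ~ Poisson(lambda), so E[X] = 6
lam = rng.gamma(shape=3.0, scale=2.0, size=n)
x = rng.poisson(lam)

crude = x.mean()                 # plain Monte Carlo average of X
rao_blackwell = lam.mean()       # average of E[X | lambda] = lambda

# Both are unbiased; the Rao-Blackwellised estimator has smaller variance
# (law of total variance: Var(X) = E[Var(X | lambda)] + Var(E[X | lambda])).
print(crude, rao_blackwell)
print(x.var(ddof=1) / n, lam.var(ddof=1) / n)   # estimated Monte Carlo variances
```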
### Extra (fun) resources
- A [Visualisation](https://chi-feng.github.io/mcmc-demo/app.html) of MCMC for various algorithms and targets.

In these blogs and websites you will often find interesting discussions on computational, numerical and statistical aspects of applied Statistics and Mathematics.

- Christian Robert's [blog](https://xianblog.wordpress.com/);

annotated_bibliography.md

Lines changed: 2 additions & 0 deletions
@@ -25,3 +25,5 @@ In this seminal paper, Dempster, Laird and Rubin introduce the Expectation-Maxim
13. [The No-U-Turn Sampler: Adaptively Setting Path Lengths in Hamiltonian Monte Carlo (2014)](https://arxiv.org/abs/1111.4246). Matt Hoffman and Andrew Gelman introduce a novel algorithm that tunes the step size and the path length (number of leapfrog steps) of HMC automatically.
The No-U-Turn Sampler (NUTS), as it came to be christened, is the building block of what would later become the main algorithm implemented in [Stan](https://mc-stan.org/).

14. In [A tutorial on adaptive MCMC](https://people.eecs.berkeley.edu/~jordan/sail/readings/andrieu-thoms.pdf), Christophe Andrieu and Johannes Thoms give a very nice overview of the advantages and pitfalls (!) of adaptive MCMC. Pay special heed to Section 2.
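As a rough illustration of the kind of scheme discussed there (a Python sketch of my own, not code from the paper), here is a random-walk Metropolis sampler whose proposal scale is adapted with a Robbins-Monro recursion towards the usual one-dimensional target acceptance rate of about 0.44, with diminishing adaptation:

```python
import numpy as np

def adaptive_rwm(logp, x0, n_iter=20_000, target_accept=0.44, seed=0):
    """Random-walk Metropolis with Robbins-Monro adaptation of the log proposal scale."""
    rng = np.random.default_rng(seed)
    x, log_scale = float(x0), 0.0
    chain = np.empty(n_iter)
    for i in range(n_iter):
        proposal = x + np.exp(log_scale) * rng.standard_normal()
        accept_prob = min(1.0, np.exp(logp(proposal) - logp(x)))
        if rng.uniform() < accept_prob:
            x = proposal
        # Diminishing adaptation: gamma_i -> 0, so the transition kernel eventually freezes
        gamma = (i + 1) ** (-0.6)
        log_scale += gamma * (accept_prob - target_accept)
        chain[i] = x
    return chain

# toy run: adapt the proposal scale for a standard normal target
draws = adaptive_rwm(lambda z: -0.5 * z * z, x0=5.0)
```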
Binary file not shown (1.69 MB).
