Statistical Reasoning
This is a draft version. The course description may still change; check this page again at a later date.
Faculty  Science and Engineering 
Year  2018/19 
Course code  WISR11 
Course name  Statistical Reasoning 
Level(s)  bachelor 
Language of instruction  English 
Period  semester I a 
ECTS  5 
Timetable  rooster.rug.nl 
Full course name  Statistical Reasoning  
Learning outcomes  This course provides an introduction to Bayesian Statistics. Upon completion of the course the students are able
1. to formulate and model statistical real-world problems in terms of Bayesian probabilistic models;
2. to analytically derive posterior distributions for standard Bayesian models with conjugate prior distributions, e.g. Bernoulli-Beta, Binomial-Beta, Poisson-Gamma, Normal-Normal, Normal-Gamma, Normal-Normal-Gamma;
3. to compute predictive distributions and full conditional distributions for standard Bayesian models with conjugate prior distributions;
4. to design Monte Carlo (MC) approximation algorithms and Markov Chain Monte Carlo (MCMC) sampling algorithms for probabilities and models which are analytically infeasible;
5. to implement those Monte Carlo (MC) and Markov Chain Monte Carlo (MCMC) algorithms in the R programming environment. 
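Learning outcome 2 can be illustrated with the simplest conjugate pair, the Bernoulli-Beta model. The following is a standard textbook derivation, not notation taken from this course:

```latex
% Bernoulli observations x_1, ..., x_n with a Beta(a, b) prior on theta
p(\theta \mid x_{1:n})
  \;\propto\; \underbrace{\prod_{i=1}^{n} \theta^{x_i} (1-\theta)^{1-x_i}}_{\text{likelihood}}
  \;\cdot\; \underbrace{\theta^{a-1} (1-\theta)^{b-1}}_{\text{Beta}(a,b)\ \text{prior}}
  \;=\; \theta^{a+s-1} (1-\theta)^{b+n-s-1},
  \qquad s = \sum_{i=1}^{n} x_i .
```

The right-hand side is the kernel of a Beta(a + s, b + n - s) density, so the posterior stays in the Beta family: this closure under updating is exactly what "conjugate prior" means.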

Description  Our knowledge about the world is uncertain and continuously updated by observing data and learning from data. Statistics is the science which deals with the analysis of data, and two conceptually different paradigms can be distinguished: Frequentist and Bayesian Statistics. This course gives an introduction to Bayesian Statistics, while the course unit 'Statistics' deals with frequentist Statistics. In frequentist Statistics, model parameters are assumed to be unknown constants which have to be estimated from data. In Bayesian Statistics, model parameters are assumed to be random variables whose distributions are unknown, and the goal is to infer those distributions. The Bayesian approach is to formulate prior belief about the unknown parameters in terms of prior distributions and to update those prior distributions in light of data, so as to obtain the parameters' posterior distributions. For more complex Bayesian models, computationally expensive Markov Chain Monte Carlo (MCMC) sampling algorithms have to be designed to generate posterior distribution samples. It can be shown that both concepts (Bayesian vs. Frequentist) yield identical results asymptotically, i.e. for large informative data sets. However, in real-world applications the available data are often rather sparse and noisy. In such settings Bayesian approaches can be beneficial, e.g. when genuine 'pre-knowledge' is available or when information can be coupled by hierarchical models. The course covers the themes: Bayes' theorem, Bayesian models, conjugate priors, various standard conjugate Bayesian models, posterior distributions, marginal likelihoods, predictive distributions, full conditional distributions, Monte Carlo approximations, Markov chains, Markov Chain Monte Carlo (MCMC) simulations, Metropolis-Hastings MCMC sampling, Gibbs MCMC sampling, Graphical Model representations and Hierarchical Bayesian models. The computer practical will start with an introductory tutorial on R programming. 
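To give a flavour of the Metropolis-Hastings MCMC sampling mentioned above: the sketch below targets the Bernoulli-Beta posterior, for which the analytic answer is known, so the sampler can be checked against it. The course practical uses R; this is an illustrative Python sketch with made-up parameter values, not course material.

```python
import math
import random

def log_post(theta, k, n, a, b):
    """Unnormalised log posterior: Binomial(k successes in n) x Beta(a, b) prior."""
    if theta <= 0.0 or theta >= 1.0:
        return float("-inf")
    return (k + a - 1) * math.log(theta) + (n - k + b - 1) * math.log(1 - theta)

def metropolis_hastings(k, n, a=1.0, b=1.0, iters=20000, step=0.1, seed=1):
    """Random-walk Metropolis-Hastings on theta in (0, 1)."""
    random.seed(seed)
    theta = 0.5
    samples = []
    for _ in range(iters):
        prop = theta + random.gauss(0.0, step)  # symmetric proposal
        # Accept with probability min(1, post(prop) / post(theta));
        # an out-of-range proposal has log density -inf and is never accepted.
        log_ratio = log_post(prop, k, n, a, b) - log_post(theta, k, n, a, b)
        if random.random() < math.exp(min(0.0, log_ratio)):
            theta = prop
        samples.append(theta)
    return samples

# Toy data: 7 successes in 10 trials, flat Beta(1, 1) prior.
samples = metropolis_hastings(k=7, n=10)
mcmc_mean = sum(samples[5000:]) / len(samples[5000:])  # discard burn-in
# The analytic posterior is Beta(8, 4), whose mean is 8/12.
```

Because the target here is conjugate, the same posterior could be derived analytically; the point of the sketch is the sampler mechanics, which carry over unchanged to models where no closed form exists.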

Hours per week  
Teaching methods  Lecture (LC), Assignment (ASM), Practical work (PRC)  
Assessment
Assignment (AST), Written exam (WE)
(Assessment takes place through homework assignments and a written exam, according to Final = 0.1 x max(HW1, WE) + 0.1 x max(HW2, WE) + 0.1 x max(HW3, WE) + 0.7 x WE, provided WE >= 4.5; otherwise Final = WE. Here HWi is the grade for the i-th homework set and WE is the final written exam grade.) 
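The grading rule above can be written out as a small function. This is an unofficial sketch of the stated formula, with illustrative grades only:

```python
def final_grade(hw, we):
    """Final grade per the stated scheme.

    hw: the three homework grades [HW1, HW2, HW3]; we: written exam grade.
    Homework only counts when the written exam grade is at least 4.5;
    each homework term is replaced by WE whenever WE is higher.
    """
    if we >= 4.5:
        return sum(0.1 * max(h, we) for h in hw) + 0.7 * we
    return we

# Example: HW grades 8, 6, 7 and exam grade 7.0
# -> 0.1*8 + 0.1*7 + 0.1*7 + 0.7*7 = 7.1
grade = final_grade([8, 6, 7], 7.0)
```

Note that a homework grade below the exam grade never lowers the final grade, since each term takes the maximum of the homework and exam grades.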

Course type  bachelor  
Coordinator  dr. M.A. Grzegorczyk  
Lecturer(s)  dr. M.A. Grzegorczyk  
Required literature


Entry requirements  This course builds on 'Probability Theory' from the first BSc year.  
Remarks  
Included in
