<?xml version="1.0" encoding="UTF-8"?>
<collection xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.loc.gov/MARC21/slim http://www.loc.gov/standards/marcxml/schema/MARC21slim.xsd" xmlns="http://www.loc.gov/MARC21/slim">
 <record>
  <leader>00000cam a22000004i 4500</leader>
  <controlfield tag="001">UP-99796217612665619</controlfield>
  <controlfield tag="003">Buklod</controlfield>
  <controlfield tag="005">20180203150953.0</controlfield>
  <controlfield tag="006">m    |o  d |      </controlfield>
  <controlfield tag="007">ta</controlfield>
  <controlfield tag="008">180203s2014    xx     d     r    |||| u|</controlfield>
  <datafield tag="020" ind1=" " ind2=" ">
   <subfield code="a">9781118357729</subfield>
  </datafield>
  <datafield tag="035" ind1=" " ind2=" ">
   <subfield code="a">(iLib)UPD-00360912548</subfield>
  </datafield>
  <datafield tag="040" ind1=" " ind2=" ">
   <subfield code="a">DLC</subfield>
   <subfield code="d">DSTC</subfield>
   <subfield code="e">rda</subfield>
  </datafield>
  <datafield tag="041" ind1=" " ind2=" ">
   <subfield code="a">eng</subfield>
  </datafield>
  <datafield tag="042" ind1=" " ind2=" ">
   <subfield code="a">DMLUC</subfield>
  </datafield>
  <datafield tag="090" ind1=" " ind2=" ">
   <subfield code="a">QA 276.4</subfield>
   <subfield code="b">V66 2014</subfield>
  </datafield>
  <datafield tag="100" ind1="1" ind2=" ">
   <subfield code="a">Voss, Jochen,</subfield>
   <subfield code="e">author.</subfield>
  </datafield>
  <datafield tag="245" ind1="1" ind2="3">
   <subfield code="a">An introduction to statistical computing :</subfield>
   <subfield code="b">a simulation-based approach /</subfield>
   <subfield code="c">Jochen Voss, School of Mathematics, University of Leeds, UK.</subfield>
  </datafield>
  <datafield tag="264" ind1=" " ind2="1">
   <subfield code="a">Chichester, West Sussex :</subfield>
   <subfield code="b">Wiley,</subfield>
   <subfield code="c">2014.</subfield>
  </datafield>
  <datafield tag="300" ind1=" " ind2=" ">
   <subfield code="a">xii, 382 pages :</subfield>
   <subfield code="b">illustrations ;</subfield>
   <subfield code="c">26 cm</subfield>
  </datafield>
  <datafield tag="336" ind1=" " ind2=" ">
   <subfield code="a">text</subfield>
   <subfield code="2">rdacontent</subfield>
  </datafield>
  <datafield tag="337" ind1=" " ind2=" ">
   <subfield code="a">unmediated</subfield>
   <subfield code="2">rdamedia</subfield>
  </datafield>
  <datafield tag="338" ind1=" " ind2=" ">
   <subfield code="a">volume</subfield>
   <subfield code="2">rdacarrier</subfield>
  </datafield>
  <datafield tag="490" ind1="0" ind2=" ">
   <subfield code="a">Wiley series in computational statistics</subfield>
  </datafield>
  <datafield tag="504" ind1=" " ind2=" ">
   <subfield code="a">Includes bibliographical references (pages 375-377) and index.</subfield>
  </datafield>
  <datafield tag="505" ind1="0" ind2=" ">
   <subfield code="a">1. Random number generation -- 1.1. Pseudo random number generators -- 1.1.1. The linear congruential generator -- 1.1.2. Quality of pseudo random number generators -- 1.1.3. Pseudo random number generators in practice -- 1.2. Discrete distributions -- 1.3. The inverse transform method -- 1.4. Rejection sampling -- 1.4.1. Basic rejection sampling -- 1.4.2. Envelope rejection sampling -- 1.4.3. Conditional distributions -- 1.4.4. Geometric interpretation -- 1.5. Transformation of random variables -- 1.6. Special-purpose methods -- 1.7. Summary and further reading -- Exercises -- 2. Simulating statistical models -- 2.1. Multivariate normal distributions -- 2.2. Hierarchical models -- 2.3. Markov chains -- 2.3.1. Discrete state space -- 2.3.2. Continuous state space -- 2.4. Poisson processes -- 2.5. Summary and further reading -- Exercises -- 3. Monte Carlo methods -- 3.1. Studying models via simulation -- 3.2. Monte Carlo estimates -- 3.2.1. Computing Monte Carlo estimates -- 3.2.2. Monte Carlo error -- 3.2.3. Choice of sample size -- 3.2.4. Refined error bounds -- 3.3. Variance reduction methods -- 3.3.1. Importance sampling -- 3.3.2. Antithetic variables -- 3.3.3. Control variates -- 3.4. Applications to statistical inference -- 3.4.1. Point estimators -- 3.4.2. Confidence intervals -- 3.4.3. Hypothesis tests -- 3.5. Summary and further reading -- Exercises -- 4. Markov Chain Monte Carlo methods -- 4.1. The Metropolis-Hastings method -- 4.1.1. Continuous state space -- 4.1.2. Discrete state space -- 4.1.3. Random walk Metropolis sampling -- 4.1.4. The independence sampler -- 4.1.5. Metropolis-Hastings with different move types -- 4.2. Convergence of Markov Chain Monte Carlo methods -- 4.2.1. Theoretical results -- 4.2.2. Practical considerations -- 4.3. Applications to Bayesian inference -- 4.4. The Gibbs sampler -- 4.4.1. Description of the method -- 4.4.2. Application to parameter estimation -- 4.4.3. Applications to image processing -- 4.5. Reversible Jump Markov Chain Monte Carlo -- 4.5.1. Description of the method -- 4.5.2. Bayesian inference for mixture distributions -- 4.6. Summary and further reading -- Exercises -- 5. Beyond Monte Carlo -- 5.1. Approximate Bayesian Computation -- 5.1.1. Basic Approximate Bayesian Computation -- 5.1.2. Approximate Bayesian Computation with regression -- 5.2. Resampling methods -- 5.2.1. Bootstrap estimates -- 5.2.2. Applications to statistical inference -- 5.3. Summary and further reading -- Exercises -- 6. Continuous-time models -- 6.1. Time discretisation -- 6.2. Brownian motion -- 6.2.1. Properties -- 6.2.2. Direct simulation -- 6.2.3. Interpolation and Brownian bridges -- 6.3. Geometric Brownian motion -- 6.4. Stochastic differential equations -- 6.4.1. Introduction -- 6.4.2. Stochastic analysis -- 6.4.3. Discretisation schemes -- 6.4.4. Discretisation error -- 6.5. Monte Carlo estimates -- 6.5.1. Basic Monte Carlo -- 6.5.2. Variance reduction methods -- 6.5.3. Multilevel Monte Carlo estimates -- 6.6. Application to option pricing -- 6.7. Summary and further reading -- Exercises -- Appendix A. Probability reminders -- A.1. Events and probability -- A.2. Conditional probability -- A.3. Expectation -- A.4. Limit theorems -- A.5. Further reading -- Appendix B. Programming in R -- B.1. General advice -- B.2. R as a Calculator -- B.2.1. Mathematical operations -- B.2.2. Variables -- B.2.3. Data types -- B.3. Programming principles -- B.3.1. Don't repeat yourself! -- B.3.2. Divide and conquer! -- B.3.3. Test your code! -- B.4. Random number generation -- B.5. Summary and further reading -- Exercises -- Appendix C. Answers to the exercises -- C.1. Answers for Chapter 1 -- C.2. Answers for Chapter 2 -- C.3. Answers for Chapter 3 -- C.4. Answers for Chapter 4 -- C.5. Answers for Chapter 5 -- C.6. Answers for Chapter 6 -- C.7. Answers for Appendix B.</subfield>
  </datafield>
  <datafield tag="520" ind1=" " ind2=" ">
   <subfield code="a">This is a book about exploring random systems using computer simulation and, thus, it combines two different topic areas which have always fascinated me: the mathematical theory of probability and the art of programming computers.</subfield>
  </datafield>
  <datafield tag="650" ind1=" " ind2="0">
   <subfield code="a">Mathematical statistics</subfield>
   <subfield code="x">Data processing.</subfield>
  </datafield>
  <datafield tag="905" ind1=" " ind2=" ">
   <subfield code="a">FO</subfield>
  </datafield>
  <datafield tag="852" ind1="0" ind2=" ">
   <subfield code="a">UPD</subfield>
   <subfield code="b">DSTC</subfield>
   <subfield code="h">QA 276.4</subfield>
   <subfield code="i">V66 2014</subfield>
  </datafield>
  <datafield tag="942" ind1=" " ind2=" ">
   <subfield code="a">Book</subfield>
  </datafield>
 </record>
</collection>
