
The Gibbs sampler has been used extensively in the statistics literature. Let X = (X_1, ..., X_d) be a vector of random variables and let C = {f_i(x_i | x_-i), i = 1, ..., d} be a set of conditionally specified distributions, which may be compatible or incompatible. The study of PICSD is closely related to the Gibbs sampler because the latter relies on iteratively drawing samples from C to form a Markov chain. Under mild conditions, the Markov chain converges to the desired joint distribution if C is compatible. However, if C is not compatible, then the Gibbs sampler can exhibit erratic behavior [9]. In this paper, our goal is to demonstrate the behavior of the Gibbs sampler (or the pseudo-Gibbs sampler, as it is not a true Gibbs sampler in the traditional sense of presumed compatible conditional distributions) for PICSD. Using several simple examples, we show mathematically that what a Gibbs sampler converges to is a function of the order of the sampling scheme in the Gibbs sampler. Furthermore, we show that if we follow a random order in sampling the conditional distributions at each iteration, i.e., use a random-scan Gibbs sampler [10], then the Gibbs sampler converges to a mixture of the joint distributions formed by each fixed-order (or, more formally, fixed-scan) combination when d = 2, but the result does not hold when d > 2. This result is a refinement of a conjecture put forward in Liu [11].

Two recent developments in the statistical and machine-learning literature underscore the importance of the current work. The first is the application of the Gibbs sampler to a dependency network, a type of generalized graphical model specified by conditional probability distributions [7]. One approach to learning a dependency network is to first specify the individual conditional models and then apply a (pseudo) Gibbs sampler to estimate the joint model. Heckerman et al. [7] acknowledged the possibility of incompatible conditional models but argued that when the sample size is large the degree of incompatibility will not be substantial and the Gibbs sampler is still applicable.
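The scan-order dependence described above can be illustrated numerically. The sketch below (our own toy example, not one from the paper; the conditional probabilities are hypothetical numbers chosen to be incompatible) builds the exact one-sweep transition kernel of a two-variable pseudo-Gibbs sampler under each fixed scan order and finds the stationary distribution of each kernel by power iteration:

```python
# Toy pseudo-Gibbs sampler on two binary variables with deliberately
# incompatible conditionals (hypothetical numbers): the two fixed scan
# orders yield different stationary distributions.

# P(X = x | Y = y): rows indexed by y, columns by x
f_x_given_y = [[0.9, 0.1],
               [0.4, 0.6]]
# P(Y = y | X = x): rows indexed by x, columns by y; the cross-product
# ratios of the two tables differ, so no joint distribution has both
# as its conditionals (i.e., they are incompatible).
f_y_given_x = [[0.3, 0.7],
               [0.8, 0.2]]

STATES = [(x, y) for x in (0, 1) for y in (0, 1)]

def kernel_x_then_y(state, new):
    """One sweep that updates X first (given old Y), then Y (given new X)."""
    (x, y), (x2, y2) = state, new
    return f_x_given_y[y][x2] * f_y_given_x[x2][y2]

def kernel_y_then_x(state, new):
    """One sweep that updates Y first, then X."""
    (x, y), (x2, y2) = state, new
    return f_y_given_x[x][y2] * f_x_given_y[y2][x2]

def stationary(kernel, iters=500):
    """Power iteration: pi_{t+1}(s') = sum_s pi_t(s) K(s, s')."""
    pi = {s: 0.25 for s in STATES}
    for _ in range(iters):
        pi = {s2: sum(pi[s] * kernel(s, s2) for s in STATES)
              for s2 in STATES}
    return pi

pi_xy = stationary(kernel_x_then_y)
pi_yx = stationary(kernel_y_then_x)
# With incompatible conditionals the limits differ; here the total
# variation gap per state works out to about 0.24.
print(max(abs(pi_xy[s] - pi_yx[s]) for s in STATES))
```

Replacing either conditional table with one derived from a genuine joint distribution makes the two stationary distributions coincide, which is the compatible case discussed above.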
Yet another example is the use of the fully conditional specification for multiple imputation of missing data [12, 13]. The method, also called multiple imputation by chained equations (MICE), makes use of a Gibbs sampler or other MCMC-based methods that operate on a set of conditionally specified models. For each variable with a missing value, an imputed value is created under an individual conditional-regression model. This kind of procedure has been viewed as combining the best features of many currently available multiple-imputation approaches [14]. Owing to its flexibility relative to compatible multivariate-imputation models [15] and its ability to handle different variable types (continuous, binary, and categorical), MICE has gained acceptance as a practical treatment of missing data, especially in high-dimensional data sets [16]. Popular as it is, MICE has the limitation of potentially encountering incompatible conditional-regression models, and it has been shown that an incompatible imputation model can lead to biased estimates from imputed data [17]. So far, very little theory has been developed to support the use of MICE [18]. A better understanding of the theoretical properties of applying the Gibbs sampler to PICSD could lead to important refinements of these imputation methods in practice.

The article is organized as follows. First, we provide basic background on the Gibbs chain and Gibbs sampler and define the scan order of a Gibbs sampler. In Section 3, we offer several analytic results concerning the stationary distributions of the Gibbs sampler under different scan patterns, together with a counter-example to a surmise about the Gibbs sampler under a random scan order. Section 4 describes two simple examples that numerically demonstrate the convergence behavior of a Gibbs sampler as a function of scan order, both by applying matrix algebra to the transition kernel and by MCMC-based computation.
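To make the chained-equations idea concrete, the following is a minimal sketch of one MICE-style update cycle, not the MICE software itself: each variable with missing entries is repeatedly re-imputed from a simple conditional model (here a deterministic univariate linear regression on the other variable; proper MICE instead draws imputations from a predictive distribution). All data values and variable names are hypothetical.

```python
# Minimal chained-equations (MICE-style) sketch on two toy variables.
# None marks a missing entry; hypothetical numbers throughout.

def fit_slope_intercept(xs, ys):
    """Least-squares fit y = a + b*x on the current (imputed) data."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((u - mx) * (v - my) for u, v in zip(xs, ys))
         / sum((u - mx) ** 2 for u in xs))
    return my - b * mx, b

x = [1.0, 2.0, 3.0, None, 5.0]
y = [2.1, None, 6.2, 8.1, 9.9]
miss_x = [i for i, v in enumerate(x) if v is None]
miss_y = [i for i, v in enumerate(y) if v is None]

# Step 1: crude starting values (observed means).
obs_x = [v for v in x if v is not None]
obs_y = [v for v in y if v is not None]
for i in miss_x: x[i] = sum(obs_x) / len(obs_x)
for i in miss_y: y[i] = sum(obs_y) / len(obs_y)

# Step 2: cycle through the conditional models until values stabilize.
for _ in range(20):
    a, b = fit_slope_intercept(y, x)      # conditional model for x given y
    for i in miss_x: x[i] = a + b * y[i]
    a, b = fit_slope_intercept(x, y)      # conditional model for y given x
    for i in miss_y: y[i] = a + b * x[i]
```

Because each conditional model is specified separately, nothing forces the two regressions to correspond to a single joint distribution, which is exactly the incompatibility risk noted above.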
Finally, in Section 5, we provide a brief discussion.

2 GIBBS CHAIN AND GIBBS SAMPLER

Continuing the notation in the previous section, let X = (X_1, ..., X_d) and x = (x_1, ..., x_d) with x_i in {1, 2, ..., c_i}, where c_i is the number of categories of X_i. A Gibbs sampler defined in the order of (X_1, ..., X_d) with respect to the conditional distributions C can be implemented as follows: pick an arbitrary starting value.
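The fixed-scan update just described can be sketched in code. The interface below (a `cond[i]` callable returning the conditional pmf of X_i given the current state) is our own assumption for illustration, not the paper's notation; the usage example uses trivially compatible fair-coin conditionals.

```python
# Minimal fixed-scan Gibbs sweep for d categorical variables.
import random

random.seed(1)

def gibbs_sweep(x, cond, order):
    """One fixed-scan sweep: update each coordinate i in the given order
    by drawing X_i from its conditional pmf given the current state x."""
    for i in order:
        pmf = cond[i](x)                  # probabilities over 0..c_i-1
        x[i] = random.choices(range(len(pmf)), weights=pmf)[0]
    return x

# Usage: two binary variables with compatible conditionals P(X_i = 1 | .) = 0.5.
cond = [lambda x: [0.5, 0.5], lambda x: [0.5, 0.5]]
x = [0, 0]
ones = 0
for _ in range(10000):
    gibbs_sweep(x, cond, order=[0, 1])
    ones += x[0]
print(ones / 10000)  # close to 0.5
```

Passing `order=[1, 0]` gives the other fixed scan; with incompatible conditionals the two long-run frequencies would differ, as shown in the next sections.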