A Bernoulli process is a finite or infinite sequence of independent random variables X1, X2, X3, …, such that

· for each i, the value of Xi is either 0 or 1;

· for all values of i, the probability p that Xi = 1 is the same.

In other words, a Bernoulli process is a sequence of independent identically distributed Bernoulli trials.
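Such a sequence is straightforward to simulate: each trial is 1 with probability p, independently of all the others. The following sketch uses only Python's standard library; the helper name `bernoulli_process` is illustrative, not a standard API.

```python
import random

def bernoulli_process(p, n, seed=None):
    """Generate n independent Bernoulli(p) trials as a list of 0s and 1s.
    (Illustrative helper, not a library function.)"""
    rng = random.Random(seed)
    # Each comparison rng.random() < p is an independent trial with success probability p.
    return [1 if rng.random() < p else 0 for _ in range(n)]

trials = bernoulli_process(p=0.5, n=10, seed=42)
```

Seeding the generator makes a particular finite realization reproducible, while the distribution of each trial is unchanged.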

Independence of the trials implies that the process is memoryless. Given that the probability p is known, past outcomes provide no information about future outcomes. (If p is unknown, however, the past informs about the future indirectly, through inferences about p.)

If the process is infinite, then from any point the future trials constitute a Bernoulli process identical in distribution to the whole process; this is known as the fresh-start property.
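Memorylessness can be checked empirically: in a long simulated sequence, the frequency of a success immediately after a success should match the frequency after a failure, and both should be close to p. A minimal sketch, assuming p = 0.3:

```python
import random

rng = random.Random(0)
p = 0.3
trials = [1 if rng.random() < p else 0 for _ in range(200_000)]

# Outcomes conditioned on the previous trial's value:
after_success = [b for a, b in zip(trials, trials[1:]) if a == 1]
after_failure = [b for a, b in zip(trials, trials[1:]) if a == 0]

# By independence, both conditional success frequencies should be near p.
freq_after_success = sum(after_success) / len(after_success)
freq_after_failure = sum(after_failure) / len(after_failure)
```

That the two conditional frequencies agree reflects the point above: once p is known, past outcomes carry no information about future ones.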

The two possible values of each Xi are often called “success” and “failure”. Thus, when expressed as a number 0 or 1, the outcome may be called the number of successes on the ith “trial”.

Two other common interpretations of the values are true or false and yes or no. Under any interpretation of the two values, the individual variables Xi may be called Bernoulli trials with parameter p.

In many applications time passes between trials, as the index i increases. In effect, the trials X1, X2, …, Xi, … happen at “points in time” 1, 2, …, i, …. That passage of time and the associated notions of “past” and “future” are not necessary, however. Most generally, any Xi and Xj in the process are simply two from a set of random variables indexed by {1, 2, …, n} in the finite case, or by {1, 2, 3, …} in the infinite case.

A single experiment with only two possible outcomes, often referred to as “success” and “failure” and usually encoded as 1 and 0, can be modeled with a Bernoulli distribution. Several random variables and probability distributions besides the Bernoulli may be derived from the Bernoulli process:

· The number of successes in the first n trials, which has a binomial distribution B(n, p)

· The number of failures needed to get r successes, which has a negative binomial distribution NB(r, p)

· The number of failures needed to get one success, which has a geometric distribution NB(1, p), a special case of the negative binomial distribution

The negative binomial variables may be interpreted as random waiting times.
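The derived distributions above can all be obtained by counting along one simulated process. A sketch, assuming p = 0.4 (the helper `waiting_failures` is an illustrative name):

```python
import random

def waiting_failures(rng, p, r):
    """Count failures before the r-th success in a Bernoulli(p) process
    (a negative binomial NB(r, p) draw; r = 1 gives the geometric case)."""
    failures = successes = 0
    while successes < r:
        if rng.random() < p:
            successes += 1
        else:
            failures += 1
    return failures

rng = random.Random(7)
p = 0.4
# Binomial B(20, p): successes in the first 20 trials; mean 20 * p = 8.
binom = [sum(1 for _ in range(20) if rng.random() < p) for _ in range(100_000)]
# Geometric NB(1, p): failures before the 1st success; mean (1 - p) / p = 1.5.
geo = [waiting_failures(rng, p, 1) for _ in range(100_000)]
# Negative binomial NB(3, p): failures before the 3rd success; mean 3 * (1 - p) / p = 4.5.
nb3 = [waiting_failures(rng, p, 3) for _ in range(100_000)]
```

The sample means of these counts converge to the textbook means n·p, (1 − p)/p, and r(1 − p)/p respectively, which is how the waiting-time interpretation shows up numerically.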

Every variable Xi in the sequence is associated with a Bernoulli trial or experiment. They all have the same Bernoulli distribution. Much of what can be said about the Bernoulli process can also be generalized to more than two outcomes (such as the process for a six-sided die); this generalization is known as the Bernoulli scheme.

The problem of determining the process, given only a limited sample of Bernoulli trials, may be called the problem of checking whether a coin is fair.
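One standard approach to this question is a two-sided z-test of H0: p = 1/2 using the normal approximation to the binomial; this is just one common choice, not the only test. A minimal sketch:

```python
import math
import random

def fair_coin_z(trials):
    """z statistic for H0: p = 0.5, via the normal approximation.
    |z| > 1.96 suggests rejecting fairness at roughly the 5% level."""
    n = len(trials)
    k = sum(trials)  # number of successes
    # Under H0, k has mean n/2 and standard deviation sqrt(n)/2.
    return (k - n * 0.5) / math.sqrt(n * 0.25)

rng = random.Random(3)
fair = [1 if rng.random() < 0.5 else 0 for _ in range(10_000)]
biased = [1 if rng.random() < 0.55 else 0 for _ in range(10_000)]
```

With 10,000 trials, a bias as small as p = 0.55 produces a z statistic far outside the rejection threshold, while a fair coin's statistic stays near zero.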
