14.2. Estimates for expectation and variance

Lecture



Let $X$ be a random variable with mathematical expectation $m$ and variance $D$; both parameters are unknown. Over the quantity $X$, $n$ independent experiments were performed, giving the results $x_1, x_2, \ldots, x_n$. It is required to find consistent and unbiased estimates for the parameters $m$ and $D$.

As an estimate for the mathematical expectation, it is natural to propose the arithmetic mean of the observed values (the statistical mean introduced earlier):

$$\tilde{m} = \frac{1}{n}\sum_{i=1}^{n} x_i. \qquad (14.2.1)$$

It is easy to verify that this estimate is consistent: by the law of large numbers, as $n$ increases, the quantity $\tilde{m}$ converges in probability to $m$. The estimate $\tilde{m}$ is also unbiased, since

$$M[\tilde{m}] = \frac{1}{n}\sum_{i=1}^{n} M[x_i] = m. \qquad (14.2.2)$$

The variance of this estimate is:

$$D[\tilde{m}] = \frac{D}{n}. \qquad (14.2.3)$$
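As a quick illustration (not part of the original text), the following Python sketch simulates many series of $n$ observations and checks empirically that the sample mean is centered on $m$ and that its scatter behaves like $D/n$, in line with (14.2.2) and (14.2.3). The use of NumPy, the normal distribution and the particular values of $m$, $D$, $n$ and the number of trials are arbitrary choices made for the illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
m, D, n, trials = 2.0, 9.0, 50, 100_000   # illustrative values, not from the text

# 'trials' independent repetitions of a series of n experiments
samples = rng.normal(loc=m, scale=np.sqrt(D), size=(trials, n))
means = samples.mean(axis=1)              # one estimate m~ per series

print(means.mean())   # close to m       -> the estimate is unbiased, (14.2.2)
print(means.var())    # close to D / n   -> variance of the estimate, (14.2.3)
print(D / n)
```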

Whether or not the estimate is efficient depends on the form of the distribution law of the quantity $X$. It can be proved that if $X$ is distributed according to the normal law, the variance (14.2.3) is the minimum possible, i.e. the estimate $\tilde{m}$ is efficient. For other distribution laws this may not be the case.

We proceed to the estimate for the variance $D$. At first glance, the most natural estimate appears to be the statistical variance:

$$D^{*} = \frac{1}{n}\sum_{i=1}^{n} \left(x_i - \tilde{m}\right)^2, \qquad (14.2.4)$$

where

$$\tilde{m} = \frac{1}{n}\sum_{i=1}^{n} x_i. \qquad (14.2.5)$$

Let us check whether this estimate is consistent. Express it through the second initial moment (by formula (7.4.6) of Chapter 7):

$$D^{*} = \frac{1}{n}\sum_{i=1}^{n} x_i^2 - \tilde{m}^2. \qquad (14.2.6)$$

The first term on the right-hand side is the arithmetic mean of the $n$ observed values of the random variable $X^2$; it converges in probability to $\alpha_2 = M[X^2]$. The second term converges in probability to $m^2$; thus the whole quantity (14.2.6) converges in probability to

$$\alpha_2 - m^2 = D.$$

This means that the estimate (14.2.4) is consistent.
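To make the consistency claim concrete, here is a small simulation sketch (the distribution and the numbers are assumed for illustration only): the statistical variance $D^{*}$ computed by (14.2.4) settles down near the true $D$ as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(1)
m, D = 0.0, 4.0                           # illustrative true parameters

for n in (10, 100, 10_000, 1_000_000):
    x = rng.normal(m, np.sqrt(D), size=n)
    D_star = ((x - x.mean()) ** 2).mean()  # statistical variance, formula (14.2.4)
    print(n, D_star)                       # approaches D = 4 as n increases
```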

Let us check whether the estimate $D^{*}$ is also unbiased. Substitute into formula (14.2.6) the expression (14.2.5) for $\tilde{m}$ and carry out the indicated operations:

$$D^{*} = \frac{1}{n}\sum_{i=1}^{n} x_i^2 - \left(\frac{1}{n}\sum_{i=1}^{n} x_i\right)^2 = \frac{\sum_{i=1}^{n} x_i^2}{n} - \frac{\left(\sum_{i=1}^{n} x_i\right)^2}{n^2}$$

$$= \frac{n\sum_{i=1}^{n} x_i^2 - \left(\sum_{i=1}^{n} x_i\right)^2}{n^2}$$

$$= \frac{(n-1)\sum_{i=1}^{n} x_i^2 - 2\sum_{i<j} x_i x_j}{n^2}. \qquad (14.2.7)$$

Find the mathematical expectation of the quantity (14.2.7):

$$M[D^{*}] = \frac{(n-1)\sum_{i=1}^{n} M[x_i^2] - 2\sum_{i<j} M[x_i x_j]}{n^2}. \qquad (14.2.8)$$

Since the variance $D^{*}$ does not depend on the point chosen as the origin of coordinates, let us choose the origin at the point $m$. Then

$$M[x_i] = 0; \qquad M[x_i^2] = D[x_i] = D, \qquad (14.2.9)$$

$$M[x_i x_j] = M[x_i]\,M[x_j] = 0. \qquad (14.2.10)$$

The last equality follows from the fact that the experiments are independent.

Substituting (14.2.9) and (14.2.10) into (14.2.8), we get:

$$M[D^{*}] = \frac{(n-1)\,n D}{n^2} = \frac{n-1}{n}\,D. \qquad (14.2.11)$$

This shows that $D^{*}$ is not an unbiased estimate for $D$: its expectation is not equal to $D$ but somewhat less. Using the estimate $D^{*}$ in place of the variance $D$, we would commit a systematic error on the low side. To eliminate this bias, it is enough to introduce a correction, multiplying the quantity $D^{*}$ by $\frac{n}{n-1}$. We get:

$$\tilde{D} = \frac{n}{n-1}\,D^{*} = \frac{n}{n-1}\cdot\frac{1}{n}\sum_{i=1}^{n}\left(x_i - \tilde{m}\right)^2.$$

We will choose this "corrected" statistical variance as the estimate for $D$:

$$\tilde{D} = \frac{\sum_{i=1}^{n}\left(x_i - \tilde{m}\right)^2}{n-1}. \qquad (14.2.12)$$

Since the factor $\frac{n}{n-1}$ tends to one as $n \to \infty$, and the estimate $D^{*}$ is consistent, the estimate $\tilde{D}$ is also consistent.
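The following simulation sketch (the parameter values are my own choices, not from the text) checks (14.2.11) and the effect of the correction: averaged over many series, $D^{*}$ comes out close to $\frac{n-1}{n}D$, while the corrected estimate $\tilde{D}$ from (14.2.12) averages close to $D$ itself.

```python
import numpy as np

rng = np.random.default_rng(2)
m, D, n, trials = 1.0, 5.0, 10, 200_000   # illustrative values

samples = rng.normal(m, np.sqrt(D), size=(trials, n))
dev2 = (samples - samples.mean(axis=1, keepdims=True)) ** 2
D_star = dev2.mean(axis=1)                # biased estimate (14.2.4), one per series
D_tilde = dev2.sum(axis=1) / (n - 1)      # corrected estimate (14.2.12)

print(D_star.mean(), (n - 1) / n * D)     # both near 4.5, see (14.2.11)
print(D_tilde.mean(), D)                  # both near 5.0, the bias is removed
```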

In practice, instead of formula (14.2.12), it is often more convenient to use another, equivalent one, in which the statistical variance is expressed through the second initial moment:

$$\tilde{D} = \frac{n}{n-1}\left(\frac{1}{n}\sum_{i=1}^{n} x_i^2 - \tilde{m}^2\right). \qquad (14.2.13)$$
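A brief numerical check (the sample values below are made up) that formulas (14.2.12) and (14.2.13) indeed give the same number; for reference, NumPy's `var` with `ddof=1` computes the same corrected estimate.

```python
import numpy as np

x = np.array([3.1, 2.7, 4.0, 3.3, 2.9, 3.8])   # arbitrary sample for illustration
n = x.size
m_tilde = x.mean()

D_tilde_1 = ((x - m_tilde) ** 2).sum() / (n - 1)            # formula (14.2.12)
D_tilde_2 = n / (n - 1) * ((x ** 2).mean() - m_tilde ** 2)  # formula (14.2.13)

print(D_tilde_1, D_tilde_2, np.var(x, ddof=1))  # all three agree
```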

For large $n$, both estimates, the biased $D^{*}$ and the unbiased $\tilde{D}$, naturally differ very little, and introducing the correction factor loses its meaning.

Thus, we have arrived at the following rules for processing limited statistical material.

If the values $x_1, x_2, \ldots, x_n$ of a random variable $X$ with unknown expectation $m$ and variance $D$ have been obtained in $n$ independent experiments, then to determine these parameters one should use the approximate values (estimates):

$$\tilde{m} = \frac{1}{n}\sum_{i=1}^{n} x_i; \qquad \tilde{D} = \frac{\sum_{i=1}^{n}\left(x_i - \tilde{m}\right)^2}{n-1}. \qquad (14.2.14)$$
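A minimal sketch of these processing rules, assuming NumPy is available; the function and variable names are mine, not from the text.

```python
import numpy as np

def estimate_parameters(x):
    """Return the estimates (m~, D~) for expectation and variance, as in (14.2.14)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    m_tilde = x.sum() / n                            # sample mean, (14.2.1)
    D_tilde = ((x - m_tilde) ** 2).sum() / (n - 1)   # corrected variance, (14.2.12)
    return m_tilde, D_tilde

# usage with a small made-up sample
print(estimate_parameters([10.2, 9.8, 10.5, 10.1, 9.9]))
```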

