UNIT-5

Point Estimation and Central Limit Theorem


Given that P(Z > 1.55) = 0.0606 or P(Z < 1.55) = 0.9394 or P(0 < Z < …
Q2. If X₁, X₂, …, Xₙ are independent Poisson variates with parameter 2, Sₙ = X₁ + X₂ + ⋯ + Xₙ, and n = 75, use the Central Limit Theorem to estimate P(120 ≤ Sₙ ≤ 160).
Given that P(0 < Z < 0.82) = 0.2939 and P(0 < Z < 2.45) = 0.4927.
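By the CLT, Sₙ is approximately N(nλ, nλ) = N(150, 150), so P(120 ≤ Sₙ ≤ 160) ≈ Φ(z₂) − Φ(z₁) with z = (x − 150)/√150. A minimal numerical sketch using only the standard library (expressing the normal CDF through the error function is an implementation choice of this illustration):

```python
from math import erf, sqrt

def phi(z):
    # Standard normal CDF expressed through the error function
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

n, lam = 75, 2
mu, var = n * lam, n * lam            # S_n ~ approx N(150, 150) by the CLT
z1 = (120 - mu) / sqrt(var)           # ~ -2.45
z2 = (160 - mu) / sqrt(var)           # ~  0.82
p = phi(z2) - phi(z1)
print(round(p, 4))                    # close to the table answer 0.2939 + 0.4927 = 0.7866
```

The small difference from 0.7866 comes from the table rounding z to 0.82 and 2.45.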
Q3. The guaranteed average life of a certain type of electric light bulb is
1000 h with a standard deviation of 125 h. It is decided to sample the
output so as to ensure that 90% of the bulbs do not fall short of the
guaranteed average by more than 2.5%. Use the CLT to find the minimum
sample size.
NOTE: Since the guaranteed mean is 1000, we want the sample size n to be such
that the sample mean does not fall more than 2.5% of 1000 (i.e. 25) below 1000.
Given that P(Z < 1.96) = 0.975 or P(−1.96 < Z < 1.96) = 0.95
or P(0 < Z < 1.96) = 0.475 or P(Z > 1.96) = 0.025.
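Using the critical value 1.96 supplied with the problem data, the condition on the sample mean gives n ≥ (1.96·σ/25)². A minimal sketch of the arithmetic (the interpretation via the supplied z = 1.96 follows the given data):

```python
from math import ceil

sigma = 125                 # standard deviation of bulb life (hours)
shortfall = 0.025 * 1000    # allowed shortfall of the sample mean: 25 h
z = 1.96                    # critical value supplied with the problem data
n_min = (z * sigma / shortfall) ** 2
print(ceil(n_min))          # (1.96 * 125 / 25)^2 = 96.04 -> minimum sample size 97
```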
Practice problem
Basic Definitions
• Population: The group of individuals under study is called the population.
• Sample: A finite subset of statistical individuals in a population is called
a sample.
• Sample size: The number of individuals in a sample is called the sample size.
• Simple Random Sampling: A random sample is one in which each unit of the
population has an equal chance (say p) of being included in it, and this
probability is independent of the previous drawings.
• Parameter: A statistical constant/measure of the population is called a
parameter, e.g. population mean μ, population variance σ², etc.
• Statistic: A statistical constant/measure computed from the sample
observations, e.g. sample mean x̄, sample variance s², etc.
Q: In a normal distribution N(μ, σ²), show that the sample mean x̄ is an unbiased
estimator of the population mean μ, but the sample variance s² is a biased estimator
of the population variance σ².
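This claim can be checked empirically: over many samples, the average of x̄ settles at μ, while the average of s² (with divisor n) settles at (n − 1)σ²/n, not σ². A Monte Carlo sketch (the seed, sample size, and parameter values are illustrative assumptions):

```python
import random
random.seed(0)  # illustrative seed for reproducibility

mu, sigma2, n, reps = 5.0, 4.0, 10, 20000
mean_of_xbar = 0.0
mean_of_s2 = 0.0
for _ in range(reps):
    xs = [random.gauss(mu, sigma2 ** 0.5) for _ in range(n)]
    xbar = sum(xs) / n
    s2 = sum((x - xbar) ** 2 for x in xs) / n   # divisor n, not n - 1
    mean_of_xbar += xbar / reps
    mean_of_s2 += s2 / reps

print(round(mean_of_xbar, 1))   # close to mu = 5.0 (unbiased)
print(round(mean_of_s2, 1))     # close to (n-1)/n * sigma2 = 3.6, not 4.0 (biased)
```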
Q: If x₁, x₂, …, xₙ is a random sample from a normal population N(μ, 1),
show that t = (1/n) Σᵢ₌₁ⁿ xᵢ² is an unbiased estimator of μ² + 1.
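A quick simulation of this statistic (illustrative parameters): averaging t over many samples from N(μ, 1) gives μ² + 1, consistent with E(xᵢ²) = Var(xᵢ) + [E(xᵢ)]² = 1 + μ².

```python
import random
random.seed(1)  # illustrative seed

mu, n, reps = 2.0, 50, 20000
avg_t = 0.0
for _ in range(reps):
    xs = [random.gauss(mu, 1.0) for _ in range(n)]
    t = sum(x * x for x in xs) / n   # t = (1/n) * sum of x_i^2
    avg_t += t / reps

print(round(avg_t, 1))   # close to mu**2 + 1 = 5.0
```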
Practice Questions
Q: If 𝑇 is an unbiased estimator for 𝜃, show that 𝑇 2 is a biased estimator for 𝜃 2
Sufficient conditions for consistency
• Let Tₙ be a sequence of estimators such that, ∀ θ ∈ Θ,
(i) E(Tₙ) → γ(θ) as n → ∞,
(ii) V(Tₙ) → 0 as n → ∞.
Then Tₙ is a consistent estimator of γ(θ).
Q: Prove that in sampling from an N(μ, σ²) population, the sample
mean is a consistent estimator of μ.
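Both sufficient conditions hold for x̄: E(x̄) = μ and V(x̄) = σ²/n → 0 as n → ∞. A sketch that estimates V(x̄) empirically for growing n (the seed and parameters are illustrative):

```python
import random
random.seed(2)  # illustrative seed

def var_of_sample_mean(n, reps=5000, mu=0.0, sigma=1.0):
    # Empirical variance of x-bar across many samples of size n
    means = [sum(random.gauss(mu, sigma) for _ in range(n)) / n
             for _ in range(reps)]
    m = sum(means) / reps
    return sum((x - m) ** 2 for x in means) / reps

for n in (10, 100, 1000):
    print(n, round(var_of_sample_mean(n), 4))   # shrinks like sigma^2 / n
```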
Q: If X₁, X₂, …, Xₙ are random observations on a Bernoulli variate X taking the
value 1 with probability p and the value 0 with probability (1 − p), show that
(Σxᵢ/n)(1 − Σxᵢ/n) is a consistent estimator of p(1 − p).
Practice questions
Q: A random sample X₁, X₂, X₃ of size 3 is drawn from a normal population with unknown mean μ
and variance σ². T₁, T₂, T₃ are the estimators used to estimate the mean value:
T₁ = X₁ + X₂ − X₃, T₂ = 2X₁ + 3X₃ − 4X₂, T₃ = (λX₁ + X₂ + X₃)/3
(A) Are T₁ and T₂ unbiased estimators?
(B) For what value of λ is T₃ unbiased?
(C) With this value of λ, is T₃ a consistent estimator?
Efficient estimator
• If, of two consistent estimators T₁, T₂ of a certain parameter θ,
we have V(T₁) < V(T₂) for all n, then T₁ is more efficient than T₂ for
all sample sizes.
• Most efficient estimator: If, in a class of consistent estimators of a
parameter, there exists one whose sampling variance is less than that
of any other such estimator, it is called the most efficient estimator.
• Efficiency: If T₁ is the most efficient estimator with variance V₁ and T₂
is any other estimator with variance V₂, then the efficiency η of T₂ is
defined as: η = V₁/V₂, 0 < η ≤ 1.
Q: A random sample X₁, X₂, X₃, X₄, X₅ of size 5 is drawn from a normal population
with unknown mean μ. Consider the following estimators to estimate μ:
(i) t₁ = (X₁ + X₂ + X₃ + X₄ + X₅)/5 (ii) t₂ = (X₁ + X₂)/2 + X₃ (iii) t₃ = (2X₁ + X₂ + λX₃)/3,
where λ is such that t₃ is an unbiased estimator of μ. Find λ. Are t₁ and t₂ unbiased?
State, giving reasons, the estimator which is best among t₁, t₂ and t₃.
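For this question, unbiasedness of t₃ forces λ = 0 (since E(t₃) = (2 + 1 + λ)μ/3), and the variances work out to σ²/5, 3σ²/2 and 5σ²/9 for t₁, t₂, t₃ respectively, while E(t₂) = 2μ (biased). A simulation sketch confirming these values (μ, σ, the seed, and the replication count are illustrative assumptions):

```python
import random
random.seed(3)  # illustrative seed

mu, sigma, reps = 10.0, 2.0, 50000
lam = 0.0   # E(t3) = (2 + 1 + lam) * mu / 3, so lam = 0 makes t3 unbiased
samples = {"t1": [], "t2": [], "t3": []}
for _ in range(reps):
    x = [random.gauss(mu, sigma) for _ in range(5)]
    samples["t1"].append(sum(x) / 5)
    samples["t2"].append((x[0] + x[1]) / 2 + x[2])
    samples["t3"].append((2 * x[0] + x[1] + lam * x[2]) / 3)

results = {}
for name, vals in samples.items():
    m = sum(vals) / reps
    v = sum((t - m) ** 2 for t in vals) / reps
    results[name] = (round(m, 1), round(v, 1))
    print(name, results[name])
# t2 is biased (mean 2*mu = 20); t1 is unbiased with the smallest
# variance, sigma^2/5 = 0.8, so t1 is the best estimator here
```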
Q. Check whether T₂ = 2 Σᵢ₌₁ⁿ i·xᵢ / (n(n + 1)) is a consistent estimator of the
population mean, and also determine: of the estimators T₁ = X̄ and T₂, which is
the more efficient estimator?
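One can verify that E(T₂) = μ (since 2Σi/(n(n+1)) = 1) and V(T₂) = 2(2n+1)σ²/(3n(n+1)) → 0, so T₂ is consistent; but V(T₂)/V(T₁) = 2(2n+1)/(3(n+1)) → 4/3 > 1, so X̄ is more efficient. A simulation sketch of the variance ratio (parameters and seed are illustrative):

```python
import random
random.seed(4)  # illustrative seed

mu, sigma, reps = 5.0, 2.0, 20000

def variance(vals):
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

def var_ratio(n):
    # Empirical Var(T2) / Var(T1) over many samples of size n
    t1s, t2s = [], []
    for _ in range(reps):
        x = [random.gauss(mu, sigma) for _ in range(n)]
        t1s.append(sum(x) / n)                                   # T1 = x-bar
        t2s.append(2 * sum((i + 1) * xi for i, xi in enumerate(x))
                   / (n * (n + 1)))                              # T2
    return variance(t2s) / variance(t1s)

for n in (5, 50):
    print(n, round(var_ratio(n), 2))   # > 1, tending to 4/3: T1 is more efficient
```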
Practice question
Q: If X₁, X₂ and X₃ is a random sample of size 3 from a population with mean value μ and
variance σ², and t₁, t₂ and t₃ are the estimators used to estimate the mean value μ, where
t₁ = X₁ + X₂ − X₃, t₂ = 2X₁ + 3X₃ − 4X₂, and t₃ = (λX₁ + X₂ + X₃)/3,
(i) Are t₁ and t₂ unbiased estimators?
(ii) Find the value of λ such that t₃ is an unbiased estimator for μ.
(iii) With this value of λ, is t₃ a consistent estimator?
(iv) Which is the best estimator?
Sufficiency
An estimator is said to be sufficient for a parameter if it contains all the
information in the sample regarding the parameter.
Sufficient estimator: If T = t(x₁, x₂, …, xₙ) is an estimator of a
parameter θ, based on a sample of size n from the population with
density f(x, θ), such that the conditional distribution of x₁, x₂, …, xₙ
given T is independent of θ, then T is a sufficient estimator of θ.
i.e. if P(x₁ ∩ x₂ ∩ ⋯ ∩ xₙ | T) is independent of the parameter θ, then T is a
sufficient estimator of θ.
Necessary and Sufficient Condition for sufficient estimator
Q: Let X₁, X₂, …, Xₙ be a random sample from the population with pdf
f(x, θ) = θx^(θ−1); 0 < x < 1, θ > 0. Show that ∏ᵢ₌₁ⁿ Xᵢ is sufficient for θ.
Fisher–Neyman Criterion: A statistic t₁ = t(x₁, x₂, …, xₙ) is a sufficient
estimator of the parameter θ if and only if the likelihood function can be
expressed as:
L = ∏ᵢ₌₁ⁿ f(xᵢ, θ) = g(t₁, θ) · h(x₁, x₂, …, xₙ)
where g(t₁, θ) is the pdf of the statistic t₁ and h(x₁, x₂, …, xₙ)
is a function of the sample observations only, independent of θ.
Practice question
Q. Let 𝑥1 , 𝑥2 , … , 𝑥𝑛 be a random sample from a Bernoulli population with parameter 𝑝,
0 < 𝑝 < 1. Show that 𝑇 = σ𝑛𝑖=1 𝑥𝑖 is sufficient estimator for 𝑝.
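The definition can be illustrated numerically: conditional on T = Σxᵢ = t, every 0/1 arrangement of the sample is equally likely, whatever p is, so the conditional law carries no information about p. A simulation sketch (n = 3, t = 2, the seed, and the replication count are illustrative choices):

```python
import random
from collections import Counter

def cond_dist(p, n=3, t=2, reps=200000):
    # Empirical distribution of the sample pattern, conditional on sum = t
    random.seed(5)  # illustrative seed
    counts, total = Counter(), 0
    for _ in range(reps):
        xs = tuple(1 if random.random() < p else 0 for _ in range(n))
        if sum(xs) == t:
            counts[xs] += 1
            total += 1
    return {k: v / total for k, v in counts.items()}

# Given T = 2, each of the 3 arrangements has conditional probability 1/3,
# regardless of p -- the conditional law does not involve p, so T is sufficient.
for p in (0.3, 0.7):
    print(p, {k: round(v, 2) for k, v in cond_dist(p).items()})
```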

Q. Let x₁, x₂, …, xₙ be a random sample from an N(μ, σ²) population. Find the sufficient
estimators for μ and σ².
Q: Let x₁, x₂, …, xₙ be a random sample from a uniform
population on [0, θ]. Find a sufficient estimator for θ.
Q: Let x₁, x₂, …, xₙ be a random sample from a distribution
with p.d.f. f(x) = e^(−(x−θ)), θ < x < ∞, −∞ < θ < ∞. Obtain a
sufficient statistic for θ.
Invariance Property of Sufficient estimator
If T is a sufficient estimator of the parameter θ and ψ(T) is a one-to-one
function of T, then ψ(T) is a sufficient estimator of ψ(θ).
Maximum Likelihood Estimator (MLE)
Properties of MLE
Q: In random sampling from a normal population N(μ, σ²), find the maximum
likelihood estimators for (i) μ when σ² is known, (ii) σ² when μ is known, and (iii)
the simultaneous estimation of μ and σ².
Q: Find the maximum likelihood estimate of the parameter λ of a Poisson distribution
on the basis of a sample of size n. Also find its variance.
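The MLE works out to λ̂ = x̄ with Var(λ̂) = λ/n. A simulation sketch checking this on generated data (Knuth's multiplication method for Poisson variates is an implementation choice; λ, n and the seed are illustrative):

```python
import math
import random
random.seed(6)  # illustrative seed

def poisson_variate(lam):
    # Knuth's multiplication method for Poisson random variates
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

lam, n = 4.0, 5000
xs = [poisson_variate(lam) for _ in range(n)]
lam_hat = sum(xs) / n        # MLE: lambda-hat = x-bar
print(round(lam_hat, 1))     # close to lam = 4.0; Var(lambda-hat) = lam / n
```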
Practice question
Q. Prove that the maximum likelihood estimate of the parameter α of a population having
density function f(x) = 2(α − x)/α², 0 < x < α, for a sample of unit size is 2x, x being
the sample value. Show also that the estimate is biased.
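The bias can be checked by simulation: since E(x) = α/3 here, the single-observation MLE 2x has expectation 2α/3 ≠ α. The sketch below samples x by the inverse-CDF method, using F(x) = 1 − (1 − x/α)² for this density (α, the seed, and the replication count are illustrative):

```python
import random
random.seed(7)  # illustrative seed

alpha, reps = 3.0, 200000
total = 0.0
for _ in range(reps):
    u = random.random()
    # Inverse-CDF draw from f(x) = 2(alpha - x)/alpha**2 on (0, alpha)
    x = alpha * (1 - (1 - u) ** 0.5)
    total += 2 * x                     # the single-observation MLE
avg_mle = total / reps
print(round(avg_mle, 2))   # close to 2*alpha/3 = 2.0, not alpha = 3.0 -> biased
```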

Q: Suppose 10 rats are used in a biomedical study where they are injected with cancer cells and then
given a cancer drug that is designed to increase their survival rate. The survival times, in months, are
14, 17, 27, 18, 12, 8, 22, 13, 19, and 12. Assume that the exponential distribution applies. Give a
maximum likelihood estimate of the mean survival time.
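For an exponential population, the MLE of the mean is the sample mean, so the estimate here is x̄ = 162/10 = 16.2 months. As a sketch:

```python
# Survival times in months; the exponential MLE of the mean is the sample mean
times = [14, 17, 27, 18, 12, 8, 22, 13, 19, 12]
theta_hat = sum(times) / len(times)
print(theta_hat)   # 16.2 months
```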
