Statistics Formula Tables
Tables and Formulas for Sullivan, Statistics: Informed Decisions Using Data ©2017 Pearson Education, Inc.
Chapter 2 Organizing and Summarizing Data

• Relative frequency = frequency / (sum of all frequencies)
• Class midpoint: the sum of consecutive lower class limits divided by 2.

Chapter 3 Numerically Summarizing Data

• Population Mean: μ = Σxᵢ / N
• Sample Mean: x̄ = Σxᵢ / n
• Population Mean from Grouped Data: μ = Σ(xᵢfᵢ) / Σfᵢ
• Sample Mean from Grouped Data: x̄ = Σ(xᵢfᵢ) / Σfᵢ
• Weighted Mean: x̄_w = Σ(wᵢxᵢ) / Σwᵢ
• Range = Largest Data Value − Smallest Data Value
• Population Standard Deviation: σ = √[Σ(xᵢ − μ)² / N] = √[(Σxᵢ² − (Σxᵢ)²/N) / N]
• Sample Standard Deviation: s = √[Σ(xᵢ − x̄)² / (n − 1)] = √[(Σxᵢ² − (Σxᵢ)²/n) / (n − 1)]
• Population Standard Deviation from Grouped Data: σ = √[Σ(xᵢ − μ)²fᵢ / Σfᵢ] = √[(Σxᵢ²fᵢ − (Σxᵢfᵢ)²/Σfᵢ) / Σfᵢ]
• Sample Standard Deviation from Grouped Data: s = √[Σ(xᵢ − x̄)²fᵢ / (Σfᵢ − 1)] = √[(Σxᵢ²fᵢ − (Σxᵢfᵢ)²/Σfᵢ) / (Σfᵢ − 1)]

Chapter 4 Describing the Relation between Two Variables

• Correlation Coefficient: r = Σ[((xᵢ − x̄)/sₓ)·((yᵢ − ȳ)/s_y)] / (n − 1)
• The equation of the least-squares regression line is ŷ = b₁x + b₀, where ŷ is the predicted value, b₁ = r·(s_y/sₓ) is the slope, and b₀ = ȳ − b₁x̄ is the intercept.
• Residual = observed y − predicted y = y − ŷ
• R² = r² for the least-squares regression model ŷ = b₁x + b₀
• The coefficient of determination, R², measures the proportion of total variation in the response variable that is explained by the least-squares regression line.

Chapter 5 Probability

• Empirical Probability: P(E) ≈ (frequency of E) / (number of trials of experiment)
• Classical Probability: P(E) = N(E) / N(S)
• Addition Rule for Disjoint Events: P(E or F) = P(E) + P(F)
• Addition Rule for n Disjoint Events: P(E or F or G or …) = P(E) + P(F) + P(G) + …
• Complement Rule: P(Eᶜ) = 1 − P(E)
• Multiplication Rule for Independent Events: P(E and F) = P(E)·P(F)
• Multiplication Rule for n Independent Events: P(E and F and G and …) = P(E)·P(F)·P(G)·…
• Conditional Probability Rule: P(F | E) = P(E and F) / P(E) = N(E and F) / N(E)
• General Multiplication Rule: P(E and F) = P(E)·P(F | E)
• Factorial: n! = n·(n − 1)·(n − 2)·…·3·2·1
• Permutation of n objects taken r at a time: ₙPᵣ = n! / (n − r)!
• Combination of n objects taken r at a time: ₙCᵣ = n! / [r!(n − r)!]
• Permutations with Repetition: n! / (n₁!·n₂!·…·n_k!)

Chapter 6 Discrete Probability Distributions

• Mean (Expected Value) of a Discrete Random Variable: μ_X = Σ[x·P(x)]
• Standard Deviation of a Discrete Random Variable: σ_X = √(Σ[(x − μ_X)²·P(x)]) = √(Σ[x²·P(x)] − μ_X²)
• Mean and Standard Deviation of a Binomial Random Variable: μ_X = np and σ_X = √(np(1 − p))
• Poisson Probability Distribution Function: P(x) = (μˣ·e^(−μ)) / x!, x = 0, 1, 2, …

Chapter 8 Sampling Distributions

• Mean and Standard Deviation of the Sampling Distribution of x̄: μ_x̄ = μ and σ_x̄ = σ/√n
• Sample Proportion: p̂ = x/n
• Mean and Standard Deviation of the Sampling Distribution of p̂: μ_p̂ = p and σ_p̂ = √[p(1 − p)/n]

Chapter 9 Estimating the Value of a Parameter

Confidence Intervals
• A (1 − α)·100% confidence interval about p is p̂ ± z_(α/2)·√[p̂(1 − p̂)/n]
• A (1 − α)·100% confidence interval about μ is x̄ ± t_(α/2)·s/√n
  Note: t_(α/2) is computed using n − 1 degrees of freedom.
• A (1 − α)·100% confidence interval about σ is (√[(n − 1)s²/χ²_(α/2)], √[(n − 1)s²/χ²_(1−α/2)])

Sample Size
• To estimate the population proportion with a margin of error E at a (1 − α)·100% level of confidence: n = p̂(1 − p̂)·(z_(α/2)/E)² rounded up to the next integer, where p̂ is a prior estimate of the population proportion, or n = 0.25·(z_(α/2)/E)² rounded up to the next integer when no prior estimate of p is available.
• To estimate the population mean with a margin of error E at a (1 − α)·100% level of confidence: n = (z_(α/2)·σ/E)² rounded up to the next integer.

Chapter 10 Hypothesis Tests Regarding a Parameter

Test Statistics
• z₀ = (p̂ − p₀) / √[p₀(1 − p₀)/n]
• t₀ = (x̄ − μ₀) / (s/√n)
• χ₀² = (n − 1)s² / σ₀²

Chapter 11 Inferences on Two Samples

• Test Statistic Comparing Two Population Proportions (Independent Samples):
  z₀ = [p̂₁ − p̂₂ − (p₁ − p₂)] / {√[p̂(1 − p̂)]·√(1/n₁ + 1/n₂)}, where p̂ = (x₁ + x₂)/(n₁ + n₂)
• Confidence Interval for the Difference of Two Proportions (Independent Samples):
  (p̂₁ − p̂₂) ± z_(α/2)·√[p̂₁(1 − p̂₁)/n₁ + p̂₂(1 − p̂₂)/n₂]
• Test Statistic Comparing Two Means (Independent Sampling):
  t₀ = [(x̄₁ − x̄₂) − (μ₁ − μ₂)] / √(s₁²/n₁ + s₂²/n₂)
• Confidence Interval for the Difference of Two Means (Independent Samples):
  (x̄₁ − x̄₂) ± t_(α/2)·√(s₁²/n₁ + s₂²/n₂)
  Note: t_(α/2) is found using the smaller of n₁ − 1 or n₂ − 1 degrees of freedom.
• Test Statistic for Comparing Two Proportions (Dependent Samples): z₀ = (f₁₂ − f₂₁) / √(f₁₂ + f₂₁)

Chapter 12 Inference on Categorical Data

• Expected Counts (when testing for goodness of fit): Eᵢ = μᵢ = npᵢ for i = 1, 2, …, k
• Expected Frequencies (when testing for independence or homogeneity of proportions):
  Expected frequency = (row total)·(column total) / (table total)
• Test Statistic: χ₀² = Σ(observed − expected)²/expected = Σ(Oᵢ − Eᵢ)²/Eᵢ, i = 1, 2, …, k
  All Eᵢ ≥ 1 and no more than 20% of the Eᵢ less than 5.

Chapter 13 Comparing Three or More Means

• Test Statistic for One-Way ANOVA: F = (mean square due to treatment) / (mean square due to error) = MST / MSE
• Test Statistic for Tukey's Test after One-Way ANOVA:
  q = [(x̄₂ − x̄₁) − (μ₂ − μ₁)] / √[(s²/2)·(1/n₁ + 1/n₂)] = (x̄₂ − x̄₁) / √[(s²/2)·(1/n₁ + 1/n₂)]

Chapter 14 Inference on the Least-Squares Regression Model and Multiple Regression

• Standard Error of the Estimate: sₑ = √[Σ(yᵢ − ŷᵢ)² / (n − 2)] = √[Σresiduals² / (n − 2)]
• Standard Error of b₁: s_b₁ = sₑ / √[Σ(xᵢ − x̄)²]
• Test Statistic for the Slope of the Least-Squares Regression Line:
  t₀ = (b₁ − β₁) / s_b₁ = (b₁ − β₁) / {sₑ / √[Σ(xᵢ − x̄)²]}
• Confidence Interval for the Slope of the Regression Line: b₁ ± t_(α/2)·sₑ / √[Σ(xᵢ − x̄)²], where t_(α/2) is computed with n − 2 degrees of freedom.
• Confidence Interval about the Mean Response of y, ŷ: ŷ ± t_(α/2)·sₑ·√[1/n + (x* − x̄)²/Σ(xᵢ − x̄)²], where x* is the given value of the explanatory variable and t_(α/2) is the critical value with n − 2 degrees of freedom.
• Prediction Interval about an Individual Response, ŷ: ŷ ± t_(α/2)·sₑ·√[1 + 1/n + (x* − x̄)²/Σ(xᵢ − x̄)²], where x* is the given value of the explanatory variable and t_(α/2) is the critical value with n − 2 degrees of freedom.

Chapter 15 Nonparametric Statistics

• Test Statistic for a One-Sample Sign Test
  Small-Sample Case (n ≤ 25):
    Two-Tailed: the test statistic, k, is the smaller of the number of minus signs or plus signs.
    Left-Tailed: the test statistic, k, is the number of plus signs.
    Right-Tailed: the test statistic, k, is the number of minus signs.
  Large-Sample Case (n > 25): z₀ = [(k + 0.5) − n/2] / (√n/2), where n is the number of minus and plus signs and k is obtained as described in the small-sample case.
• Test Statistic for the Wilcoxon Matched-Pairs Signed-Ranks Test
  Small-Sample Case (n ≤ 30):
    Two-Tailed, H₀: M_D = 0: T is the smaller of T₊ and |T₋|.
    Left-Tailed, H₀: M_D = 0: T = T₊.
    Right-Tailed, H₀: M_D = 0: T = |T₋|.
• Test Statistic for the Mann–Whitney Test
  Small-Sample Case (n₁ ≤ 20 and n₂ ≤ 20): T = S − n₁(n₁ + 1)/2, where S is the sum of the ranks obtained from the sample data that correspond to M_X in the hypothesis.
  Large-Sample Case (n₁ > 20 or n₂ > 20): z₀ = (T − n₁n₂/2) / √[n₁n₂(n₁ + n₂ + 1)/12]
• Test Statistic for Spearman's Rank Correlation Test: r_s = 1 − 6Σdᵢ² / [n(n² − 1)], where dᵢ = the difference in the ranks of the two observations in the ith ordered pair.
• Test Statistic for the Kruskal–Wallis Test:
  H = [12 / (N(N + 1))]·Σ(1/nᵢ)·[Rᵢ − nᵢ(N + 1)/2]² = [12 / (N(N + 1))]·(R₁²/n₁ + R₂²/n₂ + … + R_k²/n_k) − 3(N + 1)
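The Chapter 3 grouped-data formulas translate directly into code. A minimal Python sketch, using made-up class midpoints and frequencies (not from the text), computing the grouped sample mean x̄ = Σ(xᵢfᵢ)/Σfᵢ and the grouped sample standard deviation:

```python
from math import sqrt

# Hypothetical grouped data: class midpoints x_i and frequencies f_i
x = [5, 15, 25, 35, 45]
f = [2, 7, 13, 6, 2]

n = sum(f)                                          # Σf_i
mean = sum(xi * fi for xi, fi in zip(x, f)) / n     # x̄ = Σ(x_i f_i) / Σf_i

# Sample standard deviation from grouped data:
# s = sqrt( Σ(x_i − x̄)² f_i / (Σf_i − 1) )
s = sqrt(sum((xi - mean) ** 2 * fi for xi, fi in zip(x, f)) / (n - 1))

print(mean, s)
```

The midpoint-times-frequency form is the definitional version; for hand calculation the computational form with Σxᵢ²fᵢ gives the same value.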
Chapter 2 Organizing and Summarizing Data • Complement Rule • Factorial Chapter 10 Hypothesis Tests Regarding a Parameter Chapter 14 Inference on the Least-Squares Regression Model and Multiple Regression
frequency • Class midpoint: The sum of consecutive lower class limits
c
P1E 2 = 1 - P1E2 n! = n # 1n - 12 # 1n - 22 # g # 3 # 2 # 1 Test Statistics • Standard Error of the Estimate • Confidence Interval about the Mean Response of y, yn
g 1yi - yn i 2 g residuals
• Relative frequency = divided by 2.
sum of all frequencies • Multiplication Rule for Independent Events pn - p0
2 2
1x* - x2 2
C n g 1xi - x2 2
yn { ta>2 # se
• Permutation of n objects taken r at a time: x - m0 1
• z0 = • t0 = se = = +
P1E and F 2 = P1E2 # P1F 2 p0 11 - p0 2 s 1n C n- 2 C n- 2
Chapter 3 Numerically Summarizing Data
n!
gxi gxi fi
n Pr = C n • Standard error of b1
• Multiplication Rule for n Independent Events 1n - r2! 1n - 12s2 where x* is the given value of the explanatory variable and
gfi
• x20 =
P1E and F and G g 2 = P1E2 # P1F2 # P1G2 # g
• Population Mean: m = • Population Mean from Grouped Data: m = s20 se ta>2 is the critical value with n - 2 degrees of freedom.
N • Combination of n objects taken r at a time: sb1 =
gxi gxi fi
Chapter 11 Inferences on Two Samples 2g 1xi - x2 2 • Prediction Interval about an Individual Response, yn
• Conditional Probability Rule
gfi
n!
• Sample Mean: x = • Sample Mean from Grouped Data: x = nC r = • Test Statistic for the Slope of the Least-Squares Regression Line 1x* - x2 2
g 1xi - x2 2
yn { ta>2 # se
n r!1n - r2! • Test Statistic Comparing Two Population Proportions • Test Statistic Comparing Two Means (Independent Sampling) 1
P1E and F 2 N1E and F 2 1+ +
gwi xi
P1F E2 = (Independent Samples) C
= b1 - b1 b1 - b1 n
• Range = Largest Data Value - Smallest Data Value P1E2 N1E2 1x1 - x2 2 - 1m1 - m2 2 t0 = =
gwi
• Permutations with Repetition: sb1
• Weighted Mean: xw = pn 1 - pn 2 - 1p1 - p2 2 x1 + x2 t0 =
se n 2g 1xi - x2 2 here x* is the given value of the explanatory variable and
w
1 gxi 2 2
n1 + n2 + ta>2 is the critical value with n - 2 degrees of freedom.
n1! # n2! # g # nk!
1 1
P1E and F2 = P1E2 # P1F E2
gx 2i
• Population Standard Deviation from Grouped Data: 2pn 11 - pn 2 + C n1 n2 • Confidence Interval for the Slope of the Regression Line
g 1xi - m2 2 1 gxi fi 2 2
B n1 n2
gx 2i fi -
- se
gfi
b1 { ta>2 #
g 1xi - m2 fi
N Chapter 6 Discrete Probability Distributions • Confidence Interval for the Difference of Two Means
s = = 2 • Confidence Interval for the Difference of Two Proportions (Independent Samples)
gfi gfi
C N S N 2g 1xi - x2 2
s = = • Mean (Expected Value) of a Discrete Random Variable • Mean and Standard Deviation of a Binomial Random Variable (Independent Samples)
mX = gx # P1x2
B R s21 s22
• Sample Standard Deviation 1x1 - x2 2 { ta>2 where ta>2 is computed with n - 2 degrees of freedom.
1 gxi 2
pn 1 11 - pn 1 2 pn 2 11 - pn 2 2 +
mX = np sX = 2np11 - p2 C n1
gx 2i -
• Sample Standard Deviation from Grouped Data: 1pn 1 - pn 2 2 { za>2 + n2
1 gxi fi 2 2
2
g 1xi - x 2
C n1 n2 Chapter 15 Nonparametric Statistics
gx 2i fi
• Standard Deviation of a Discrete Random Variable • Poisson Probability Distribution Function
gfi
Note: ta>2 is found using the smaller of n1 - 1 or n2 - 1
g 1xi - m2 2fi
2
x20 = a = a
Chapter 4 Describing the Relation between Two Variables s p11 - p2 Ei = mi = npi for i = 1, 2, c, k 1observed - expected2 2 1Oi - Ei 2 2 The test statistic, k, is the The test statistic, The test statistic, the sample data that correspond to MX in the hypothesis.
mx = m and sx = m pn = p and s pn = smaller of the number of k, is the number k, is the number Large-Sample Case 1 n1 + 202 or 1 n2 + 202
B n
aa s ba s b
xi - x yi - y • Residual = observed y - predicted y = y - yn 2n • Expected Frequencies (when testing for independence or expected Ei
minus signs or plus signs. of plus signs. of minus signs.
homogeneity of proportions) i = 1, 2, c, k n1n2
x y x T -
• Correlation Coefficient: r = • R2 = r 2 for the least-squares regression model • Sample Proportion: pn = Large-Sample Case 1n + 252 The test statistic, z0, is 2
n-1 yn = b1x + b0 n 1row total21column total2 All Ei Ú 1 and no more than 20% less than 5. z0 =
Expected frequency = n n1n2 1n1 + n2 + 12
• The equation of the least-squares regression line is 2 Chapter 9 Estimating the Value of a Parameter table total • Test Statistic for Comparing Two Proportions 1k + 0.52 -
• The coefficient of determination, R , measures the 2 B 12
proportion of total variation in the response variable that is (Dependent Samples) z0 =
sy
yn = b1x + b0, where yn is the predicted value, b1 = r # Confidence Intervals 1n • Test Statistic for Spearman’s Rank Correlation Test
explained by the least-squares regression line.
• A 11 - a2 # 100% confidence interval about p is
sx Sample Size 1f12 - f21 2 2 2
is the slope, and b0 = y - b1x is the intercept. • To estimate the population proportion with a margin of x20 = 6gd 2i
pn 11 - pn 2 error E at a 11 - a2 # 100% level of confidence: f12 + f21 where n is the number of minus and plus signs and k is obtained rs = 1 -
pn { za>2 # as described in the small sample case. n1n2 - 12
za>2 2 Chapter 13 Comparing Three or More Means
Chapter 5 Probability B n n = pn 11 - pn 2 a b rounded up to the next integer,
E • Test Statistic for the Wilcoxon Matched-Pairs where di = the difference in the ranks of the two
• Empirical Probability • Addition Rule for Disjoint Events • A 11 - a2 # 100% confidence interval about m is where pn is a prior estimate of the population proportion,
• Test Statistic for One-Way ANOVA • Test Statistic for Tukey’s Test after One-Way ANOVA
Signed-Ranks Test observations in the i th ordered pair.
za>2 2 Mean square due to treatment 1x2 - x1 2 - 1m2 - m1 2 x2 - x1 • Test Statistic for the Kruskal–Wallis Test
x { ta>2 #
s MST Small-Sample Case 1n " 302
frequency of E P1E or F 2 = P1E2 + P1F 2 or n = 0.25 a b rounded up to the next integer when no F = = q = =
a
E
s # 12
1 s # 12
1 Two-Tailed Left-Tailed Right-Tailed 12 1 ni 1N + 12 2
number of trials of experiment prior estimate of p is available. a + b a + b H = c Ri - d
• Addition Rule for n Disjoint Events B2 n1 n2 B2 n1 n2 N1N + 12 ni 2
Note: ta>2 is computed using n - 1 degrees of freedom. where H0: MD = 0 H0: MD = 0 H0: MD = 0
• Classical Probability • To estimate the population mean with a margin of error E R21 R22 R2k
P1E or F or G or g 2 = P1E2 + P1F 2 + P1G2 + g
za>2 # s 2
12
• A 11 - a2 # 100% confidence interval about s is
2 2 2
n- k
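The Chapter 9 confidence interval about p, p̂ ± z_(α/2)·√[p̂(1 − p̂)/n], can be checked numerically with the standard library alone. The counts below are hypothetical; the critical value z_(α/2) comes from the standard normal inverse CDF:

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical sample: x successes in n trials
x, n = 540, 1000
p_hat = x / n                                  # sample proportion p̂ = x/n

alpha = 0.05
z = NormalDist().inv_cdf(1 - alpha / 2)        # z_{α/2} critical value (≈ 1.96)
margin = z * sqrt(p_hat * (1 - p_hat) / n)     # z_{α/2}·√(p̂(1 − p̂)/n)

lower, upper = p_hat - margin, p_hat + margin
print(f"95% CI for p: ({lower:.4f}, {upper:.4f})")
```

Using `NormalDist` avoids a table lookup; any z-table value for α = 0.05 should match to two decimals.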
Chapter 2 Organizing and Summarizing Data • Complement Rule • Factorial Chapter 10 Hypothesis Tests Regarding a Parameter Chapter 14 Inference on the Least-Squares Regression Model and Multiple Regression
frequency • Class midpoint: The sum of consecutive lower class limits
c
P1E 2 = 1 - P1E2 n! = n # 1n - 12 # 1n - 22 # g # 3 # 2 # 1 Test Statistics • Standard Error of the Estimate • Confidence Interval about the Mean Response of y, yn
g 1yi - yn i 2 g residuals
• Relative frequency = divided by 2.
sum of all frequencies • Multiplication Rule for Independent Events pn - p0
2 2
1x* - x2 2
C n g 1xi - x2 2
yn { ta>2 # se
• Permutation of n objects taken r at a time: x - m0 1
• z0 = • t0 = se = = +
P1E and F 2 = P1E2 # P1F 2 p0 11 - p0 2 s 1n C n- 2 C n- 2
Chapter 3 Numerically Summarizing Data
n!
gxi gxi fi
n Pr = C n • Standard error of b1
• Multiplication Rule for n Independent Events 1n - r2! 1n - 12s2 where x* is the given value of the explanatory variable and
gfi
• x20 =
P1E and F and G g 2 = P1E2 # P1F2 # P1G2 # g
• Population Mean: m = • Population Mean from Grouped Data: m = s20 se ta>2 is the critical value with n - 2 degrees of freedom.
N • Combination of n objects taken r at a time: sb1 =
gxi gxi fi
Chapter 11 Inferences on Two Samples 2g 1xi - x2 2 • Prediction Interval about an Individual Response, yn
• Conditional Probability Rule
gfi
n!
• Sample Mean: x = • Sample Mean from Grouped Data: x = nC r = • Test Statistic for the Slope of the Least-Squares Regression Line 1x* - x2 2
g 1xi - x2 2
yn { ta>2 # se
n r!1n - r2! • Test Statistic Comparing Two Population Proportions • Test Statistic Comparing Two Means (Independent Sampling) 1
P1E and F 2 N1E and F 2 1+ +
gwi xi
P1F E2 = (Independent Samples) C
= b1 - b1 b1 - b1 n
• Range = Largest Data Value - Smallest Data Value P1E2 N1E2 1x1 - x2 2 - 1m1 - m2 2 t0 = =
gwi
• Permutations with Repetition: sb1
• Weighted Mean: xw = pn 1 - pn 2 - 1p1 - p2 2 x1 + x2 t0 =
se n 2g 1xi - x2 2 here x* is the given value of the explanatory variable and
w
1 gxi 2 2
n1 + n2 + ta>2 is the critical value with n - 2 degrees of freedom.
n1! # n2! # g # nk!
1 1
P1E and F2 = P1E2 # P1F E2
gx 2i
• Population Standard Deviation from Grouped Data: 2pn 11 - pn 2 + C n1 n2 • Confidence Interval for the Slope of the Regression Line
g 1xi - m2 2 1 gxi fi 2 2
B n1 n2
gx 2i fi -
- se
gfi
b1 { ta>2 #
g 1xi - m2 fi
N Chapter 6 Discrete Probability Distributions • Confidence Interval for the Difference of Two Means
s = = 2 • Confidence Interval for the Difference of Two Proportions (Independent Samples)
gfi gfi
C N S N 2g 1xi - x2 2
s = = • Mean (Expected Value) of a Discrete Random Variable • Mean and Standard Deviation of a Binomial Random Variable (Independent Samples)
mX = gx # P1x2
B R s21 s22
• Sample Standard Deviation 1x1 - x2 2 { ta>2 where ta>2 is computed with n - 2 degrees of freedom.
1 gxi 2
pn 1 11 - pn 1 2 pn 2 11 - pn 2 2 +
mX = np sX = 2np11 - p2 C n1
gx 2i -
• Sample Standard Deviation from Grouped Data: 1pn 1 - pn 2 2 { za>2 + n2
1 gxi fi 2 2
2
g 1xi - x 2
C n1 n2 Chapter 15 Nonparametric Statistics
gx 2i fi
• Standard Deviation of a Discrete Random Variable • Poisson Probability Distribution Function
gfi
Note: ta>2 is found using the smaller of n1 - 1 or n2 - 1
g 1xi - m2 2fi
2
x20 = a = a
Chapter 4 Describing the Relation between Two Variables s p11 - p2 Ei = mi = npi for i = 1, 2, c, k 1observed - expected2 2 1Oi - Ei 2 2 The test statistic, k, is the The test statistic, The test statistic, the sample data that correspond to MX in the hypothesis.
mx = m and sx = m pn = p and s pn = smaller of the number of k, is the number k, is the number Large-Sample Case 1 n1 + 202 or 1 n2 + 202
B n
aa s ba s b
xi - x yi - y • Residual = observed y - predicted y = y - yn 2n • Expected Frequencies (when testing for independence or expected Ei
minus signs or plus signs. of plus signs. of minus signs.
homogeneity of proportions) i = 1, 2, c, k n1n2
x y x T -
• Correlation Coefficient: r = • R2 = r 2 for the least-squares regression model • Sample Proportion: pn = Large-Sample Case 1n + 252 The test statistic, z0, is 2
n-1 yn = b1x + b0 n 1row total21column total2 All Ei Ú 1 and no more than 20% less than 5. z0 =
Expected frequency = n n1n2 1n1 + n2 + 12
• The equation of the least-squares regression line is 2 Chapter 9 Estimating the Value of a Parameter table total • Test Statistic for Comparing Two Proportions 1k + 0.52 -
• The coefficient of determination, R , measures the 2 B 12
proportion of total variation in the response variable that is (Dependent Samples) z0 =
sy
yn = b1x + b0, where yn is the predicted value, b1 = r # Confidence Intervals 1n • Test Statistic for Spearman’s Rank Correlation Test
explained by the least-squares regression line.
• A 11 - a2 # 100% confidence interval about p is
sx Sample Size 1f12 - f21 2 2 2
is the slope, and b0 = y - b1x is the intercept. • To estimate the population proportion with a margin of x20 = 6gd 2i
pn 11 - pn 2 error E at a 11 - a2 # 100% level of confidence: f12 + f21 where n is the number of minus and plus signs and k is obtained rs = 1 -
pn { za>2 # as described in the small sample case. n1n2 - 12
za>2 2 Chapter 13 Comparing Three or More Means
Chapter 5 Probability B n n = pn 11 - pn 2 a b rounded up to the next integer,
E • Test Statistic for the Wilcoxon Matched-Pairs where di = the difference in the ranks of the two
• Empirical Probability • Addition Rule for Disjoint Events • A 11 - a2 # 100% confidence interval about m is where pn is a prior estimate of the population proportion,
• Test Statistic for One-Way ANOVA • Test Statistic for Tukey’s Test after One-Way ANOVA
Signed-Ranks Test observations in the i th ordered pair.
za>2 2 Mean square due to treatment 1x2 - x1 2 - 1m2 - m1 2 x2 - x1 • Test Statistic for the Kruskal–Wallis Test
x { ta>2 #
s MST Small-Sample Case 1n " 302
frequency of E P1E or F 2 = P1E2 + P1F 2 or n = 0.25 a b rounded up to the next integer when no F = = q = =
a
E
s # 12
1 s # 12
1 Two-Tailed Left-Tailed Right-Tailed 12 1 ni 1N + 12 2
number of trials of experiment prior estimate of p is available. a + b a + b H = c Ri - d
• Addition Rule for n Disjoint Events B2 n1 n2 B2 n1 n2 N1N + 12 ni 2
Note: ta>2 is computed using n - 1 degrees of freedom. where H0: MD = 0 H0: MD = 0 H0: MD = 0
• Classical Probability • To estimate the population mean with a margin of error E R21 R22 R2k
P1E or F or G or g 2 = P1E2 + P1F 2 + P1G2 + g
za>2 # s 2
12
• A 11 - a2 # 100% confidence interval about s is
2 2 2
n- k
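The Chapter 4 least-squares formulas (correlation r, slope b₁ = r·s_y/sₓ, intercept b₀ = ȳ − b₁x̄) can be verified on a toy data set. The paired values below are invented for illustration:

```python
from math import sqrt

# Hypothetical paired data (x = explanatory, y = response)
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]
n = len(xs)

x_bar = sum(xs) / n
y_bar = sum(ys) / n
sx = sqrt(sum((x - x_bar) ** 2 for x in xs) / (n - 1))   # sample sd of x
sy = sqrt(sum((y - y_bar) ** 2 for y in ys) / (n - 1))   # sample sd of y

# Correlation coefficient: r = Σ[((x_i − x̄)/s_x)·((y_i − ȳ)/s_y)] / (n − 1)
r = sum(((x - x_bar) / sx) * ((y - y_bar) / sy) for x, y in zip(xs, ys)) / (n - 1)

b1 = r * sy / sx            # slope of the least-squares line
b0 = y_bar - b1 * x_bar     # intercept

print(f"y-hat = {b1:.3f}x + {b0:.3f}, r = {r:.4f}")
```

Computing b₁ as Σ(xᵢ − x̄)(yᵢ − ȳ)/Σ(xᵢ − x̄)² gives the same slope; the r·s_y/sₓ form mirrors the card.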
Chapter 2 Organizing and Summarizing Data • Complement Rule • Factorial Chapter 10 Hypothesis Tests Regarding a Parameter Chapter 14 Inference on the Least-Squares Regression Model and Multiple Regression
frequency • Class midpoint: The sum of consecutive lower class limits
c
P1E 2 = 1 - P1E2 n! = n # 1n - 12 # 1n - 22 # g # 3 # 2 # 1 Test Statistics • Standard Error of the Estimate • Confidence Interval about the Mean Response of y, yn
g 1yi - yn i 2 g residuals
• Relative frequency = divided by 2.
sum of all frequencies • Multiplication Rule for Independent Events pn - p0
2 2
1x* - x2 2
C n g 1xi - x2 2
yn { ta>2 # se
• Permutation of n objects taken r at a time: x - m0 1
• z0 = • t0 = se = = +
P1E and F 2 = P1E2 # P1F 2 p0 11 - p0 2 s 1n C n- 2 C n- 2
Chapter 3 Numerically Summarizing Data
n!
gxi gxi fi
n Pr = C n • Standard error of b1
• Multiplication Rule for n Independent Events 1n - r2! 1n - 12s2 where x* is the given value of the explanatory variable and
gfi
• x20 =
P1E and F and G g 2 = P1E2 # P1F2 # P1G2 # g
• Population Mean: m = • Population Mean from Grouped Data: m = s20 se ta>2 is the critical value with n - 2 degrees of freedom.
N • Combination of n objects taken r at a time: sb1 =
gxi gxi fi
Chapter 11 Inferences on Two Samples 2g 1xi - x2 2 • Prediction Interval about an Individual Response, yn
• Conditional Probability Rule
gfi
n!
• Sample Mean: x = • Sample Mean from Grouped Data: x = nC r = • Test Statistic for the Slope of the Least-Squares Regression Line 1x* - x2 2
g 1xi - x2 2
yn { ta>2 # se
n r!1n - r2! • Test Statistic Comparing Two Population Proportions • Test Statistic Comparing Two Means (Independent Sampling) 1
P1E and F 2 N1E and F 2 1+ +
gwi xi
P1F E2 = (Independent Samples) C
= b1 - b1 b1 - b1 n
• Range = Largest Data Value - Smallest Data Value P1E2 N1E2 1x1 - x2 2 - 1m1 - m2 2 t0 = =
gwi
• Permutations with Repetition: sb1
• Weighted Mean: xw = pn 1 - pn 2 - 1p1 - p2 2 x1 + x2 t0 =
se n 2g 1xi - x2 2 here x* is the given value of the explanatory variable and
w
1 gxi 2 2
n1 + n2 + ta>2 is the critical value with n - 2 degrees of freedom.
n1! # n2! # g # nk!
1 1
P1E and F2 = P1E2 # P1F E2
gx 2i
• Population Standard Deviation from Grouped Data: 2pn 11 - pn 2 + C n1 n2 • Confidence Interval for the Slope of the Regression Line
g 1xi - m2 2 1 gxi fi 2 2
B n1 n2
gx 2i fi -
- se
gfi
b1 { ta>2 #
g 1xi - m2 fi
N Chapter 6 Discrete Probability Distributions • Confidence Interval for the Difference of Two Means
s = = 2 • Confidence Interval for the Difference of Two Proportions (Independent Samples)
gfi gfi
C N S N 2g 1xi - x2 2
s = = • Mean (Expected Value) of a Discrete Random Variable • Mean and Standard Deviation of a Binomial Random Variable (Independent Samples)
mX = gx # P1x2
B R s21 s22
• Sample Standard Deviation 1x1 - x2 2 { ta>2 where ta>2 is computed with n - 2 degrees of freedom.
1 gxi 2
pn 1 11 - pn 1 2 pn 2 11 - pn 2 2 +
mX = np sX = 2np11 - p2 C n1
gx 2i -
• Sample Standard Deviation from Grouped Data: 1pn 1 - pn 2 2 { za>2 + n2
1 gxi fi 2 2
2
g 1xi - x 2
C n1 n2 Chapter 15 Nonparametric Statistics
gx 2i fi
• Standard Deviation of a Discrete Random Variable • Poisson Probability Distribution Function
gfi
Note: ta>2 is found using the smaller of n1 - 1 or n2 - 1
g 1xi - m2 2fi
2
x20 = a = a
Chapter 4 Describing the Relation between Two Variables s p11 - p2 Ei = mi = npi for i = 1, 2, c, k 1observed - expected2 2 1Oi - Ei 2 2 The test statistic, k, is the The test statistic, The test statistic, the sample data that correspond to MX in the hypothesis.
mx = m and sx = m pn = p and s pn = smaller of the number of k, is the number k, is the number Large-Sample Case 1 n1 + 202 or 1 n2 + 202
B n
aa s ba s b
xi - x yi - y • Residual = observed y - predicted y = y - yn 2n • Expected Frequencies (when testing for independence or expected Ei
minus signs or plus signs. of plus signs. of minus signs.
homogeneity of proportions) i = 1, 2, c, k n1n2
x y x T -
• Correlation Coefficient: r = • R2 = r 2 for the least-squares regression model • Sample Proportion: pn = Large-Sample Case 1n + 252 The test statistic, z0, is 2
n-1 yn = b1x + b0 n 1row total21column total2 All Ei Ú 1 and no more than 20% less than 5. z0 =
Expected frequency = n n1n2 1n1 + n2 + 12
• The equation of the least-squares regression line is 2 Chapter 9 Estimating the Value of a Parameter table total • Test Statistic for Comparing Two Proportions 1k + 0.52 -
• The coefficient of determination, R , measures the 2 B 12
proportion of total variation in the response variable that is (Dependent Samples) z0 =
sy
yn = b1x + b0, where yn is the predicted value, b1 = r # Confidence Intervals 1n • Test Statistic for Spearman’s Rank Correlation Test
explained by the least-squares regression line.
• A 11 - a2 # 100% confidence interval about p is
sx Sample Size 1f12 - f21 2 2 2
is the slope, and b0 = y - b1x is the intercept. • To estimate the population proportion with a margin of x20 = 6gd 2i
pn 11 - pn 2 error E at a 11 - a2 # 100% level of confidence: f12 + f21 where n is the number of minus and plus signs and k is obtained rs = 1 -
pn { za>2 # as described in the small sample case. n1n2 - 12
za>2 2 Chapter 13 Comparing Three or More Means
Chapter 5 Probability B n n = pn 11 - pn 2 a b rounded up to the next integer,
E • Test Statistic for the Wilcoxon Matched-Pairs where di = the difference in the ranks of the two
• Empirical Probability • Addition Rule for Disjoint Events • A 11 - a2 # 100% confidence interval about m is where pn is a prior estimate of the population proportion,
• Test Statistic for One-Way ANOVA • Test Statistic for Tukey’s Test after One-Way ANOVA
Signed-Ranks Test observations in the i th ordered pair.
za>2 2 Mean square due to treatment 1x2 - x1 2 - 1m2 - m1 2 x2 - x1 • Test Statistic for the Kruskal–Wallis Test
x { ta>2 #
s MST Small-Sample Case 1n " 302
frequency of E P1E or F 2 = P1E2 + P1F 2 or n = 0.25 a b rounded up to the next integer when no F = = q = =
a
E
s # 12
1 s # 12
1 Two-Tailed Left-Tailed Right-Tailed 12 1 ni 1N + 12 2
number of trials of experiment prior estimate of p is available. a + b a + b H = c Ri - d
• Addition Rule for n Disjoint Events B2 n1 n2 B2 n1 n2 N1N + 12 ni 2
Note: ta>2 is computed using n - 1 degrees of freedom. where H0: MD = 0 H0: MD = 0 H0: MD = 0
• Classical Probability • To estimate the population mean with a margin of error E R21 R22 R2k
P1E or F or G or g 2 = P1E2 + P1F 2 + P1G2 + g
za>2 # s 2
12
• A 11 - a2 # 100% confidence interval about s is
2 2 2
n- k
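The Chapter 10 test statistic for a population proportion, z₀ = (p̂ − p₀)/√[p₀(1 − p₀)/n], can be sketched as follows; the counts and null hypothesis are hypothetical, and the two-tailed P-value is read from the standard normal distribution:

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical test of H0: p = 0.5 versus H1: p ≠ 0.5
x, n, p0 = 560, 1000, 0.5
p_hat = x / n

# Test statistic: z0 = (p̂ − p0) / √(p0(1 − p0)/n)
z0 = (p_hat - p0) / sqrt(p0 * (1 - p0) / n)

# Two-tailed P-value: 2·P(Z > |z0|)
p_value = 2 * (1 - NormalDist().cdf(abs(z0)))

print(f"z0 = {z0:.3f}, P-value = {p_value:.5f}")
```

For a one-tailed alternative, drop the factor of 2 and use the appropriate tail.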