A recent journal article reported that college students watch an average of 12 hours of television per week. Candice, a college senior, believes that students at her school watch a different amount of television per week than 12 hours. To test her theory, she randomly selects 79 students from her college and asks each student to record the average number of hours he/she spends watching television each week. Candice finds that her sample of students watches an average of 9.49 hours of television per week, with a 95% confidence interval from 7.60 hours to 11.37 hours. Based on this interval, should Candice reject or fail to reject the hypothesis that students at her school watch an average of 12 hours of television per week?

Answer
A. Candice should reject the null hypothesis as the 95% confidence interval does not contain 12 hours
B. Candice should fail to reject the null hypothesis as the 95% confidence interval does not contain 12 hours
C. Candice should fail to reject the null hypothesis as the 95% confidence interval contains 12 hours
D. Candice should reject the null hypothesis as there is no p-value calculated
A. Candice should reject the null hypothesis as the 95% confidence interval does not contain 12 hours
Step-by-step explanation:
Given that:
Hypothesized population mean = 12 hours per week
Sample size = 79
Sample mean = 9.49 hours per week
The 95% confidence interval runs from 7.60 to 11.37 hours.
Since the hypothesized mean of 12 hours does not lie within the interval from 7.60 to 11.37, Candice should reject the null hypothesis at the 5% significance level.
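The decision rule above can be sketched in code: for a two-sided test, reject the null hypothesis exactly when the hypothesized mean falls outside the confidence interval. The sample standard deviation is not given in the problem, so this sketch works directly from the reported interval endpoints rather than recomputing the interval.

```python
def reject_null(ci_low, ci_high, hypothesized_mean):
    """Two-sided CI decision rule: reject H0 at the interval's
    significance level iff the hypothesized mean lies outside
    the interval."""
    return not (ci_low <= hypothesized_mean <= ci_high)

# Candice's interval (7.60, 11.37) versus the hypothesized 12 hours:
print(reject_null(7.60, 11.37, 12))  # True -> reject the null hypothesis
```

Because the interval was built at 95% confidence, this check is equivalent to a two-sided test at the 5% significance level.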