A grocery store will only accept yellow onions that are at least 2.75 in. in diameter. A grower has a crop of onions with diameters that are normally distributed, with a mean diameter of 3.25 in. and a standard deviation of 0.25 in. What percent of the onions will be accepted by the grocery store?
Imagine (or draw) a normal curve. Draw a vertical line through the center of this curve and label it 3.25 inches; this is the mean diameter. Next, draw vertical lines at 3.25 + 0.25 = 3.50 inches; 3.25 − 0.25 = 3.00 inches; and 3.25 − 0.50 = 2.75 inches.
Note that 2.75 inches is 2 standard deviations below the mean: 2.75 = 3.25 − 2(0.25) inches.
Onions with diameters less than 2.75 inches are rejected; onions with diameters of 2.75 inches or more are accepted.
Find the area to the LEFT of 2.75 inches. In other words, find the area to the left of 2 standard deviations below the mean.
I would use the Empirical Rule here: "68% of data is within 1 standard deviation of the mean; 95% is within 2 standard deviations of the mean."
This tells us that the area to the left of 2.75 inches, i.e. −2 standard deviations, is about 0.025: the 0.050 lying outside ±2 standard deviations, split equally between the two tails. Subtract this from 1.000 to get 0.975. This means that about 97.5% of the onions will prove to be acceptable, whereas about 2.5% will be rejected because they are less than 2.75 inches in diameter.
You could also use a table of z-scores to do this: determine from the table the area under the standard normal curve to the left of z = −2. A table gives about 0.0228, so the area to the right is about 0.9772, consistent with the Empirical Rule's 97.5% approximation.
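If you'd rather not read a table, the same calculation can be sketched in a few lines of Python using the standard library's error function (no onion-specific library here, just the usual formula relating the normal CDF to `math.erf`):

```python
import math

def normal_cdf(x, mu, sigma):
    """P(X <= x) for X ~ Normal(mu, sigma), via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

mu, sigma, cutoff = 3.25, 0.25, 2.75
z = (cutoff - mu) / sigma              # -2.0: the cutoff is 2 SDs below the mean
p_reject = normal_cdf(cutoff, mu, sigma)   # area to the LEFT of 2.75 in.
p_accept = 1.0 - p_reject                  # area to the RIGHT of 2.75 in.
print(z, round(p_reject, 4), round(p_accept, 4))
```

This prints a rejection probability of about 0.0228 and an acceptance probability of about 0.9772, the exact values behind the Empirical Rule's 2.5% / 97.5% approximation.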
4x + 20 + 2x + 6 is what you get when you distribute. Just group the like terms together to get 6x + 26; that's the answer: <em><u>6x + 26</u></em>
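The simplification above can be spot-checked numerically: if 4x + 20 + 2x + 6 really equals 6x + 26, the two expressions must agree for every value of x. A quick sketch in plain Python:

```python
# Spot-check that 4x + 20 + 2x + 6 simplifies to 6x + 26
# by comparing both expressions at several values of x.
for x in [-3, 0, 1, 10]:
    assert 4*x + 20 + 2*x + 6 == 6*x + 26
print("simplification checks out")
```

Agreement at a handful of points isn't a proof, but two linear expressions that match at even two distinct x-values are identical, so this check is conclusive here.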