I thought it was 5 at first too. The two zeros past the decimal are significant because they express that the value is exact to that point. It's sort of confusing. Say you measured a pencil with a ruler and it was roughly 5" long. Scientifically, you can only write down that the pencil is 5 inches, not 5.0000, because your ruler doesn't measure to that many decimal places. Only when something is absolute can you add .0000000000000000000000 to it and still be correct. An example of something absolute is having 10 fingers: 10 is exact, so 10.0000000000000000 = 10 fingers, no more and no less.

Those two zeros are included after the decimal, so they are significant. The rule is that all final zeros after the decimal are significant.

Since there is a decimal point present, start with the leftmost nonzero digit. Count that digit and every other digit to the right of it.
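The counting rule above (decimal point present: start at the leftmost nonzero digit, count it and everything to its right) can be sketched in a few lines of Python. This is a minimal illustration for numbers written as strings with a decimal point; the function name `sig_figs_with_decimal` is just made up for this example, and it doesn't try to handle scientific notation or numbers without a decimal point.

```python
def sig_figs_with_decimal(s):
    """Count significant figures in a number written with a decimal point.

    Rule: start at the leftmost nonzero digit and count it plus every
    digit to its right (so final zeros after the decimal are significant).
    """
    digits = s.lstrip("+-").replace(".", "")  # keep only the digit sequence
    stripped = digits.lstrip("0")             # leading zeros are not significant
    return len(stripped)

print(sig_figs_with_decimal("5.00"))    # 3 (the two final zeros count)
print(sig_figs_with_decimal("0.0050"))  # 2 (leading zeros don't count)
```

So 5.00 has three significant figures, matching the answer above.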