The range of a dataset is calculated as:
\[
\text{Range} = \max(x_i) - \min(x_i)
\]
For the given dataset, the maximum value is \(100\) and the minimum value is \(10\):
\[
\text{Range} = 100 - 10 = 90
\]
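As a quick sanity check, the range can be reproduced in a few lines of Python. The individual data values are not listed in this section, so the sketch below assumes the values 10, 20, ..., 100, chosen only because they match the stated maximum, minimum, and count of 10.

```python
# Minimal sketch: the dataset below is an assumption (values 10, 20, ..., 100),
# consistent with the stated max (100), min (10), and n = 10.
data = list(range(10, 101, 10))

range_value = max(data) - min(data)
print(range_value)  # 90
```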
The mean \( \mu \) of the dataset is calculated using the formula:
\[
\mu = \frac{\sum x_i}{n}
\]
where \( n \) is the number of data points. For the given dataset, the values sum to \( 550 \) and \( n = 10 \), so:
\[
\mu = \frac{550}{10} = 55.0
\]
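Under the same assumed dataset of 10, 20, ..., 100, the mean follows directly from the formula above:

```python
# Minimal sketch, reusing the assumed dataset of 10, 20, ..., 100.
data = list(range(10, 101, 10))

mu = sum(data) / len(data)
print(mu)  # 55.0
```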
The variance \( \sigma^2 \) is calculated using the formula:
\[
\sigma^2 = \frac{\sum (x_i - \mu)^2}{n}
\]
Substituting the values, the sum of squared deviations from the mean is \( \sum (x_i - \mu)^2 = 8250 \), so:
\[
\sigma^2 = \frac{8250}{10} = 825.0
\]
The standard deviation \( \sigma \) is the square root of the variance:
\[
\sigma = \sqrt{825.0} \approx 28.7
\]
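The variance and standard deviation can be verified the same way. Note that the formula above divides by \( n \), i.e. it is the population variance, and the sketch mirrors that, again using the assumed dataset of 10, 20, ..., 100.

```python
import math

# Minimal sketch, reusing the assumed dataset of 10, 20, ..., 100.
data = list(range(10, 101, 10))
n = len(data)

mu = sum(data) / n                                # mean: 55.0
variance = sum((x - mu) ** 2 for x in data) / n   # population variance: 825.0
std_dev = math.sqrt(variance)                     # 28.7228...

print(variance)           # 825.0
print(round(std_dev, 1))  # 28.7
```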
- The range is \( \boxed{90} \).
- The variance is \( \boxed{825.0} \).
- The standard deviation is approximately \( \boxed{28.7} \).