What is the difference between an instrument's resolution and accuracy, and how do repeated measurements and statistics relate to them?
Nothing is perfect, and this comes down to our definitions of concepts (our modeling and simplification of reality) and to our communication (whether we are talking about the same things).

Resolution is the precision of the output scale: the smallest increment to which we are able to "resolve" an instrument reading. On a digital scale, it is the last digit, and the instrument rounds the reading to the nearest displayed digit for us. On an analog scale, it is the smallest fraction by which we can visually resolve the indicator's position against the scale divisions, whether those divisions are marked or we imagine them (in which case it is somewhat subjective).

Accuracy of an instrument is separate from its display resolution and depends on the quality of the instrument's components and on its calibration. For example, a scale may have a display resolution of one gram yet be systematically off by, say, 10 grams; that offset is an accuracy (systematic) error, and no amount of careful reading of the display will reveal it.

In addition to resolution and accuracy, when we repeat a measurement we usually see variability (scatter) in the repeated readings. This is where statistics comes in: the mean of many repeats estimates the indicated value more precisely than a single reading, and the standard deviation characterizes the random scatter. Note that averaging reduces only the random component; it cannot remove a systematic accuracy error.
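To make the three effects concrete, here is a small simulation sketch (the numbers are made up for illustration): a scale with a hypothetical 10 g systematic bias (accuracy error), random gaussian scatter between repeats, and a display that rounds to a 1 g resolution step. Averaging many readings shrinks the random scatter but leaves the bias untouched.

```python
import random
import statistics

# Illustrative, assumed values -- not from any real instrument.
TRUE_MASS = 503.4   # grams; unknown to the experimenter
BIAS = 10.0         # systematic offset -> accuracy error
NOISE_SD = 2.0      # random scatter between repeated readings
RESOLUTION = 1.0    # display resolves to the nearest gram

random.seed(42)

def read_scale():
    raw = TRUE_MASS + BIAS + random.gauss(0, NOISE_SD)
    # The display rounds the reading to its resolution step.
    return round(raw / RESOLUTION) * RESOLUTION

readings = [read_scale() for _ in range(100)]
mean = statistics.mean(readings)
sd = statistics.stdev(readings)

# The mean converges near TRUE_MASS + BIAS: averaging cannot remove the bias.
print(f"mean = {mean:.1f} g")
# The standard deviation estimates the random scatter (about NOISE_SD).
print(f"sd   = {sd:.1f} g")
```

Running this, the mean lands near 513.4 g (true value plus bias) rather than 503.4 g, while the standard deviation reflects only the repeat-to-repeat scatter, illustrating why statistics on repeated readings quantify precision but say nothing about a calibration offset.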