
What is the difference between an instrument's resolution and accuracy, and how do repeated measurements and statistics relate to them?

Nothing is perfect, and much of this comes down to how we define our concepts (our modeling and simplification of reality) and how we communicate (whether we are talking about the same things).

Resolution is the fineness of the output scale: the smallest increment to which we are able to "resolve" an instrument reading. On a digital scale it is the last digit; the instrument rounds the reading to the closest displayable value for us. On an analog scale it is the smallest fraction of a division by which we can visually resolve the indicator's position, whether the divisions are marked or we imagine them (which makes it somewhat subjective).

Accuracy of an instrument is separate from its display resolution and depends on the quality of the instrument's components and its calibration. For example, a scale may display readings with 1-gram resolution and yet be systematically off by, say, 10 grams; that systematic offset is an accuracy error.

In addition to resolution and accuracy, when we repeat a measurement we usually see variability (scatter) in the repeated readings. This is where statistics come in: the mean of many readings estimates the measured value, the standard deviation quantifies the random scatter, and the standard error of the mean shrinks as the number of readings grows. Averaging therefore reduces random error, but it cannot remove a systematic accuracy error; only calibration can.
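A minimal Python sketch may make the distinction concrete (all numbers here are illustrative assumptions, not properties of any real scale): it simulates a scale with 1 g display resolution, a 10 g calibration bias, and random read-to-read noise, then computes statistics over repeated readings.

```python
import math
import random
import statistics

def read_scale(true_mass_g, bias_g=10.0, noise_sd_g=0.7, resolution_g=1.0):
    """Simulate one reading from an imperfect digital scale.

    bias_g       -- systematic calibration error (limits accuracy)
    noise_sd_g   -- random scatter between repeated readings
    resolution_g -- display step; the reading is rounded to it (resolution)
    """
    raw = true_mass_g + bias_g + random.gauss(0.0, noise_sd_g)
    return round(raw / resolution_g) * resolution_g  # quantize to the display step

true_mass = 500.0
readings = [read_scale(true_mass) for _ in range(30)]

mean = statistics.mean(readings)
sd = statistics.stdev(readings)
sem = sd / math.sqrt(len(readings))  # standard error of the mean

print(f"mean of 30 readings: {mean:.2f} g")
print(f"standard deviation : {sd:.2f} g  (random scatter)")
print(f"standard error     : {sem:.2f} g  (shrinks as n grows)")
print(f"mean - true mass   : {mean - true_mass:.2f} g  (systematic bias remains)")
```

Running this shows the mean settling near 510 g rather than 500 g: averaging tames the random noise and the 1 g quantization, but the 10 g calibration bias survives no matter how many readings we take.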
