Validation and Calibration
Summary
People normally use approximations to estimate quantities. At some point a standard for those quantities became necessary, and those standards now are, or are being, shifted to fundamental physical properties (the speed of light, atomic frequencies, etc.). There are two ways to calibrate equipment: against fundamental constants (still far off for most of us, since we don't have the tools to do that ourselves), or against a tool/device that has itself been calibrated (which is what is normally done).
In the session we discussed how calibration should be done often, so that users a) know how the calibration is drifting over time, b) can trust their data, and c) can demonstrate the reliability of their data to people who might not trust results that do not come from "traditional" contexts. It would be good to have calibration protocols that accompany hardware blueprints, and ideally they would be simple enough to run before each use.
Notes
Is certification useful for the community?
How do we calibrate things?
Julian: four years at NIST (National Institute of Standards and Technology) working on calibration
Andre: interested in general
Fernan: interested in general
Clarissa: interested in how it is coordinated within the community, and in learning as much as possible
Pierre: interested in the process of calibrating things for his projects – how do you show that it works well?
Harold: interested in citizen science; many academic scientists don't trust that data. How do you make sure the data you are collecting are up to standard?
Prayush: how are sensors calibrated
Joel: OSH for science, trying to make a sustainable business. Always interested in knowing how to create tools that are up to standard, and how to avoid wasting time on protocols and paths that do not meet standards.
Julian: traditional calibration started with approximations – feet, inches, etc. – used to give an estimate. At some point someone created a system (the metric system) to have standards. This is used everywhere (even in the USA: they calibrate using metric and convert to imperial). There are seven base units: the mole, candela (light/energy through a certain space), metre, kilogram, ampere, kelvin, and second.
Julian introduced the history of the standards: how the second is calibrated against the frequency of atoms, and how the distance light travels over a certain time is the calibration for length. So the calibration is done against fundamental elements. This is important because these calibrations won't change over time. If your calibration is tied to an object, like the kilogram artefact, the object will change over time (oxidise, etc.) and all values have to be adapted/updated over time.
- two routes to calibration:
- take something that is already calibrated and do measurements against that;
- find a way to measure something that can be linked to a natural constant (e.g. the speed of light) – this is better because we don't need an intermediate step, but it will take a long time before we have the tools to measure these constants ourselves.
Ben: Would it be useful to use old ICs to do calibrations?
Julian: timing ICs, for instance, get their timing from crystal oscillators; they are precise, but not accurate. They change depending on how they were prepared/cut. Better would be to use atomic clocks via satellite/GPS signals.
Voltage controllers for instance wander all over the place. I worked with some and they needed to be calibrated every three months.
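A minimal sketch of the GPS idea Julian mentions: if a board exposes a GPS pulse-per-second (PPS) signal, you can capture a local tick counter at each PPS edge and estimate the crystal oscillator's frequency error in parts per million. The function name, nominal frequency, and capture values below are assumptions for illustration, not from the session.

```python
# Sketch: estimate a crystal oscillator's frequency error against GPS PPS.
# Assumes you have logged the local tick-counter value captured at each
# GPS pulse-per-second edge (e.g. via a timer-capture pin); names are illustrative.

NOMINAL_HZ = 16_000_000  # assumed nominal crystal frequency (16 MHz)

def ppm_error(tick_counts_at_pps):
    """Mean frequency error in ppm from counter captures at successive PPS edges."""
    intervals = [b - a for a, b in zip(tick_counts_at_pps, tick_counts_at_pps[1:])]
    mean_ticks = sum(intervals) / len(intervals)
    return (mean_ticks - NOMINAL_HZ) / NOMINAL_HZ * 1e6

# Example: oscillator running ~80 ticks fast per second -> about +5 ppm
captures = [0, 16_000_080, 32_000_161, 48_000_240]
print(f"Oscillator error: {ppm_error(captures):+.1f} ppm")
```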
Andre: maybe we can use practical examples? How do people calibrate things now, maybe we can get examples of what people in the room did
Julian: this comes back to the idea of getting calibrated tools and comparing against them. For the OpenFlexure, we are using a Thorlabs slide with markers that has been calibrated somewhere else.
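As an illustration of comparing against a calibrated artefact (not the actual OpenFlexure calibration code): given a slide whose markings have a certified pitch, the image scale in micrometres per pixel follows from the measured pixel spacing of those markings. The pitch and pixel positions below are invented.

```python
# Sketch: derive image scale from a calibrated slide with known line pitch.
# The pitch and measured pixel positions are illustrative, not real data.

KNOWN_PITCH_UM = 10.0  # certified spacing between slide markings, in micrometres

def um_per_pixel(line_positions_px, pitch_um=KNOWN_PITCH_UM):
    """Scale factor from pixel positions of successive calibrated lines."""
    spacings = [b - a for a, b in zip(line_positions_px, line_positions_px[1:])]
    mean_spacing_px = sum(spacings) / len(spacings)
    return pitch_um / mean_spacing_px

positions = [102.4, 150.9, 199.6, 248.1]  # detected line centres, in pixels
print(f"Scale: {um_per_pixel(positions):.3f} um/pixel")
```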
Harold: I have a thermistor that I need to calibrate. They have big part-to-part variation. They are analog sensors, so you could get infinite resolution with one, if your device allowed that. I want a device that has two digits of precision in the sub-degree range (a hundredth of a degree).
Julian: With temperature, thermistors are never going to give you that, unless you get platinum ones. We should come up with a list of enabling technologies for calibration.
Harold: right now I use a thermometer that is better than mine and do a calibration against that.
Julian: you need to do a linearity calculation.
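A hedged sketch of the calibration Harold and Julian describe: record the thermistor reading and a better-calibrated reference thermometer at several temperatures, then fit a correction. A straight-line (slope + offset) fit is the simplest linearity check; thermistors are nonlinear over wide ranges, so this only holds over a narrow span. All numbers below are invented.

```python
# Sketch: least-squares linear calibration of a thermistor against a
# reference thermometer. Data points are invented for illustration.

def fit_linear(raw, reference):
    """Least-squares fit: reference = slope * raw + offset."""
    n = len(raw)
    mean_x = sum(raw) / n
    mean_y = sum(reference) / n
    sxx = sum((x - mean_x) ** 2 for x in raw)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(raw, reference))
    slope = sxy / sxx
    offset = mean_y - slope * mean_x
    return slope, offset

# Readings taken side by side with a better thermometer (degrees C)
thermistor_c = [24.80, 30.10, 35.55, 40.90]
reference_c  = [25.00, 30.25, 35.61, 40.95]

slope, offset = fit_linear(thermistor_c, reference_c)
print(f"correction: T_ref ~= {slope:.4f} * T_raw + {offset:+.3f}")
```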
Pierre: this is important, because when we build thermocyclers we have to figure out where to invest time: will I get better outcomes by investing in hardware or in software?
Harold: would it make sense not to calibrate things at all, and just do it when you need it?
Julian: not really, because calibrations change over time. Taking data now and trying to calibrate later is a world of pain.
Andre: maybe a good idea is to calibrate it before every use – setting up a system that makes calibration super easy and straightforward, so that you can do it daily.
Joel: OpenBCIs take analog data and the system outputs digital data. We use a Texas Instruments chip that does all the heavy lifting. The chip has 24-bit depth and gives us counts. We have to rely heavily on the datasheet to do all the calculations that come afterwards. So we had to create a hardware system to check whether the chip is doing what it is supposed to do. Working with a client that had the necessary tools to make those checks allowed us to build that system. They could generate very small analog signals to cross-check the Texas Instruments chip.
We also validated against the gold standard in the scientific community: our system was used by a researcher who had top-of-the-line equipment. He ran experiments with the two devices simultaneously and compared the results.
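For context on "the chip gives us counts": on a typical 24-bit delta-sigma front end such as TI's ADS1299 family (which OpenBCI documents using), the datasheet conversion from signed counts to input voltage uses the reference voltage and the programmable gain. The sketch below just applies that standard scale factor; the reference and gain values are assumed example settings, so check the datasheet for your actual configuration.

```python
# Sketch: convert signed 24-bit ADC counts to microvolts, using the
# datasheet-style scale factor V_in = counts * (Vref / gain) / (2**23 - 1).
# Vref and gain are assumed example settings; verify against the datasheet.

VREF_V = 4.5   # assumed reference voltage
GAIN = 24      # assumed programmable gain setting

def counts_to_microvolts(counts, vref=VREF_V, gain=GAIN):
    """Signed 24-bit ADC counts -> input-referred microvolts."""
    lsb_volts = (vref / gain) / (2**23 - 1)
    return counts * lsb_volts * 1e6

print(f"{counts_to_microvolts(1000):.3f} uV per 1000 counts")
```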
Andre:
How many copies did you send to the client?
Do you need to send things out for calibration every time a new run of chips is made?
Joel:
We sent four copies to the client.
And yes, you should send things out for calibration every time. But at the moment we don't have the resources to do it, so we are trusting Texas Instruments, coupled with a few tests we run internally every time we build a piece of equipment, to make sure it works as well as the others.
Julian: Depending on what you are using your equipment for, you have legal obligations to do certain types of calibration.
Last comments:
Ben: Within the community, some people are not interested in being super precise and accurate. Take a hobby example from Instructables, for instance: the project doesn't care about calibration. I wonder if it would be useful to think of ways of making these issues relevant to the whole community.
Pierre: in citizen science there are issues not only with calibration, but also with how people collect data. Protocols are important in that sense.
Valerian:
Ananda: I saw students collecting a lot of data on a farm where I worked; they said they did it because they were trying to do calibration, and this way they wouldn't have to ship the equipment out.
Harold: Most citizen science projects are related to observation, so if the projects are done properly, volunteers are given a lot of training, and the dropout rate is high.
Julian, short wrap-up: two paths to calibration. One is to measure against something already calibrated: send your equipment to someone/a company to calibrate it for you, or buy a calibrated reference device (e.g. a microscope slide with calibrated markings) and compare against it. The other path is to do calibration against natural constants.
Fernan: There are good platforms for documenting protocols, such as protocols.io; it might be a good platform for the calibration procedures. Protocols get a DOI, and you can upload images, videos, whatever you want.