Jumping back to validation. Now that we know how that's working, I want to take a closer look at this. Depending on the requirements of your product, you may or may not want to do this, but this is a step that would happen during the product validation test. You've got your product, you put it in a thermal chamber, and you set the temperature to, say, zero degrees, and you've already done your calibration on it. But what you've also done is, right where your product's own temperature sensor is, you've placed a high-precision thermocouple to measure the temperature, and that runs out to a high-precision temperature measuring device. What you want to see is that the ambient temperature, and therefore the temperature right at the sensor, matches the chamber setting. Say we're starting at zero degrees: the thermocouple should report zero degrees, then you run your system, take your own measurement, and hopefully it tells you it's zero degrees too. Then you bump the chamber up to 10 degrees and do it again, calculate an error, and see if there's an offset. You can keep going all the way up to whatever your maximum rated temperature is, and your reading may track the reference or it may not. So you may have to take the data from this validation step and work it into the software calculation for what the real temperature is, if you happen to see a delta develop at the low end, the high end, or someplace in the middle. Again, how accurate that needs to be is a function of the requirements of your product.

We had a discussion at work a couple of weeks ago about this. As a drive runs it gets hotter and hotter, and for drives that are field replaceable there are specifications in place from UL, Underwriters Laboratories. Field replaceable means a human being can touch it while it's running, and these drives are hot-swappable: while the system is running, a technician can just walk in, grab a drive, and pull it out. We as manufacturers can't allow the case temperature to get any higher than 70 degrees C, because that's the maximum touch-point temperature set by UL, and there are similar specs for the European Union and elsewhere around the world. That got us into this conversation: what happens at a 70 degrees C case temperature? We can't let the case get any hotter. The drive can report its temperature to the host system, and the host can come in and ask the drive for its temperature; there's some mechanism that enables the drive to tell the host, "hey, we're getting really hot." So what does the host do when the case temperature gets to 70 degrees C? Well, the host has fans, and you've probably heard them in your laptop: you've been using it for a while, you're really downloading and cranking, or you're running a simulation or something, and you start to hear the fan spin up, because the drives and the components inside are telling the thermal-monitoring software that they're getting hot and it needs to pull more air through. So the gist of the conversation was: in our market, we really care about the temperature when the case temperature is, say, 50 to 70 degrees C, or 50 to 80 degrees C. That's where we really care. That's our area of interest.
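Going back to the validation sweep for a second: here's a minimal sketch of how you might fold that chamber data back into the firmware's temperature calculation if a delta does develop. This isn't from any particular product; the table values and names are made up, and it just illustrates the idea of a piecewise-linear correction built from the (reported, reference) pairs you collect at each setpoint.

```c
#include <stdio.h>

typedef struct {
    float reported_c;   /* what our sensor/firmware reported at the setpoint */
    float reference_c;  /* what the precision thermocouple read */
} cal_point_t;

/* Hypothetical data from the chamber sweep (values invented for illustration). */
static const cal_point_t cal_table[] = {
    {  0.8f,  0.0f },
    { 10.5f, 10.0f },
    { 20.3f, 20.0f },
    { 40.1f, 40.0f },
    { 60.4f, 60.0f },
    { 71.0f, 70.0f },
};
#define CAL_POINTS (sizeof cal_table / sizeof cal_table[0])

/* Map a raw reported temperature onto the reference scale using
 * piecewise-linear interpolation between validation points. */
static float corrected_temp_c(float reported)
{
    if (reported <= cal_table[0].reported_c)
        return cal_table[0].reference_c;
    for (size_t i = 1; i < CAL_POINTS; i++) {
        if (reported <= cal_table[i].reported_c) {
            const cal_point_t *lo = &cal_table[i - 1];
            const cal_point_t *hi = &cal_table[i];
            float t = (reported - lo->reported_c) /
                      (hi->reported_c - lo->reported_c);
            return lo->reference_c + t * (hi->reference_c - lo->reference_c);
        }
    }
    /* Above the last calibration point: clamp to the top entry. */
    return cal_table[CAL_POINTS - 1].reference_c;
}

int main(void)
{
    printf("raw 35.0 C -> corrected %.1f C\n", corrected_temp_c(35.0f));
    printf("raw 71.0 C -> corrected %.1f C\n", corrected_temp_c(71.0f));
    return 0;
}
```

If the validation data showed nothing more than a constant offset you could collapse this to a single subtraction, but keeping the table lets you handle a delta that only shows up at one end of the range.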
So, in that range we want our temperature reading to be very accurate, because as we get up toward 70 degrees C the drive can actually start slowing down; there are software routines in there that will slow the drive's rate of operation in an attempt to cool it off, in addition to reporting the temperature to the host, which will hopefully increase the airflow across the drive to keep that temperature down. But below 50 degrees C, or say 40 degrees C, we really don't care how accurate our temperature reading is, because the host isn't going to do anything with it, and it's not an issue for the technician going into the rack and grabbing the drive. So who cares if we're plus or minus five or plus or minus ten degrees when the case temperature is really 40 degrees C? We don't really care. We believe we can be fairly inaccurate at low temperature; especially when you get down to 10 degrees C, or zero degrees C, or minus 10 degrees C, we just don't care about the accuracy there. But as the temperature increases, we get more and more interested in the accuracy, so we want the reading to be within some tolerance above a certain temperature. We spent an enormous amount of time talking about this: how do we measure the temperature? What are we going to do? What is the software going to do? What do we really care about? It's a big deal for products. I just thought I'd share some real-world experience with you. But here's a setup that would validate your whole calibration process.
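As a rough illustration of how that range-dependent requirement could feed into the pass/fail criteria for a validation setup like this, here's a small sketch where the allowed error tightens as the reference temperature approaches the 70 degrees C touch limit. The tolerance numbers are invented for the example; in practice they'd come out of your own product requirements.

```c
#include <math.h>
#include <stdbool.h>
#include <stdio.h>

/* Allowed error (degrees C) as a function of the reference temperature:
 * tight in the area of interest, loose where nobody acts on the reading. */
static float allowed_error_c(float reference_c)
{
    if (reference_c >= 50.0f)
        return 2.0f;    /* 50 C and up: the host and the drive act on this */
    if (reference_c >= 40.0f)
        return 5.0f;
    return 10.0f;       /* low temperature: accuracy doesn't matter much */
}

static bool setpoint_passes(float reference_c, float reported_c)
{
    return fabsf(reported_c - reference_c) <= allowed_error_c(reference_c);
}

int main(void)
{
    /* Hypothetical results from a chamber sweep: {reference, reported}. */
    const float sweep[][2] = {
        {  0.0f,  6.0f },   /* 6 C off, but below 40 C, so it still passes */
        { 55.0f, 56.1f },
        { 70.0f, 73.5f },   /* 3.5 C off in the critical range: fails */
    };
    for (int i = 0; i < 3; i++) {
        printf("ref %5.1f C, reported %5.1f C -> %s\n",
               sweep[i][0], sweep[i][1],
               setpoint_passes(sweep[i][0], sweep[i][1]) ? "PASS" : "FAIL");
    }
    return 0;
}
```

The point is just that the accuracy spec, and therefore the validation criteria, can be a function of temperature rather than a single number across the whole operating range.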