LTI 20-20 UltraLyte 100 Calibration checks


boyse7en

6,723 posts

165 months

Friday 27th November 2015
schmunk said:
Ah, I see, the New Hall (Tab) / St Hilda's (Ox) thing...
Thanks for that, it makes it soooo much clearer...


anonymous-user

54 months

Friday 27th November 2015
schmunk said:
Breadvan72 said:
Well, if you haven't sampled the delights, I am far too much of a (completely fake and pretendy) gentleman to tell you!

Go Vassar go!

(OK, Seven Sisters, not Ivy league, strictly speaking, but who's checking?)
Ah, I see, the New Hall (Tab) / St Hilda's (Ox) thing...
Somerville, surely?

Rovinghawk

13,300 posts

158 months

Friday 27th November 2015
Breadvan72 said:
I have lost about three stone since then - I kid you not!

In other words, you would.
As this is a motoring forum-

"Bend over & I'll drive you home"

Toltec

7,159 posts

223 months

Friday 27th November 2015
Devil2575 said:
Toltec said:
In the sense that the definition of a metre is the distance light travels in 1/299792458th of a second, by measuring the time of the pulse reflection you are also fundamentally measuring distance.
I'm bored of this back and forth now. You guys can think you're measuring distance if you want.

What great point were you trying to prove with this meaningless argument?

Edited by Devil2575 on Friday 27th November 08:17
Just a technical point, I used to design, build and calibrate optical, pressure and flow test systems so I tend to have a thing about metrology.

As others have pointed out, there is no need to make intermediate calculations of distance for each pulse return; the speed can be calculated directly from the recorded times. I was supporting this by pointing out that we now define the unit of distance in terms of time and the speed of light, so timing a pulse reflection implicitly measures distance.

An investigation of the unit would need to look at the stability and uncertainty of readings in a laboratory environment, as well as field tests to characterize what usage procedures are required to attain readings within a given uncertainty.

From what I have seen of the calibration procedure for the LTI units (it can be found on the web if you search), adjustments can be made if necessary to bring the unit's readings within specification. There does not appear to be a requirement to record the pre-adjustment measurements or to maintain a history for a given unit. That history is vital: the change in correction over a series of calibrations is how you establish the stability of the instrument, and it is what allows you to state the accuracy of any measurement to a given uncertainty. The upshot is that simply stating an instrument has been calibrated is meaningless; like an MOT, it only tells you what state the unit was in at a particular point in time. At the very least you need the results of the calibration prior to a measurement, and the pre-adjustment results of the next calibration, to make any meaningful statement about the likely accuracy of any measurement made by the unit.
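
To illustrate the point about speed dropping straight out of the recorded times, here is a minimal sketch (not the LTI's actual firmware, just the underlying arithmetic): fit a straight line to round-trip time against pulse emission time and scale the slope by c/2. The pulse schedule and target figures below are made up purely for the example.

```python
# Minimal sketch: speed directly from recorded pulse times, with distance
# only appearing implicitly through the defined speed of light.
import numpy as np

C = 299_792_458.0  # m/s, exact by the definition of the metre

def speed_from_pulse_times(emit_times_s, round_trip_times_s):
    """Least-squares slope of round-trip time vs emission time, scaled by c/2.

    A closing target shortens successive round trips, so the raw slope is
    negative; the sign convention here returns closing speed as positive.
    """
    slope, _intercept = np.polyfit(np.asarray(emit_times_s),
                                   np.asarray(round_trip_times_s), 1)
    return -0.5 * C * slope  # m/s towards the instrument

# Synthetic data: target at 300 m closing at 31.3 m/s (~70 mph),
# 60 pulses spread over 0.3 s.
emit = np.linspace(0.0, 0.3, 60)
rtt = 2.0 * (300.0 - 31.3 * emit) / C
print(f"{speed_from_pulse_times(emit, rtt):.2f} m/s")  # ~31.30
```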

Devil2575

13,400 posts

188 months

Friday 27th November 2015
Toltec said:
Just a technical point, I used to design, build and calibrate optical, pressure and flow test systems so I tend to have a thing about metrology.

As others have pointed out, there is no need to make intermediate calculations of distance for each pulse return; the speed can be calculated directly from the recorded times. I was supporting this by pointing out that we now define the unit of distance in terms of time and the speed of light, so timing a pulse reflection implicitly measures distance.

An investigation of the unit would need to look at the stability and uncertainty of readings in a laboratory environment, as well as field tests to characterize what usage procedures are required to attain readings within a given uncertainty.

From what I have seen of the calibration procedure for the LTI units (it can be found on the web if you search), adjustments can be made if necessary to bring the unit's readings within specification. There does not appear to be a requirement to record the pre-adjustment measurements or to maintain a history for a given unit. That history is vital: the change in correction over a series of calibrations is how you establish the stability of the instrument, and it is what allows you to state the accuracy of any measurement to a given uncertainty. The upshot is that simply stating an instrument has been calibrated is meaningless; like an MOT, it only tells you what state the unit was in at a particular point in time. At the very least you need the results of the calibration prior to a measurement, and the pre-adjustment results of the next calibration, to make any meaningful statement about the likely accuracy of any measurement made by the unit.
What exactly can be adjusted on this device?

anonymous-user

54 months

Friday 27th November 2015
boyse7en said:
schmunk said:
Ah, I see, the New Hall (Tab) / St Hilda's (Ox) thing...
Thanks for that, it makes it soooo much clearer...
I was talking about American posh college girls with very particular skill sets. schmunk may have extended the concept slightly to dirty posh college girls more generally.

Toltec

7,159 posts

223 months

Friday 27th November 2015
Devil2575 said:
What exactly can be adjusted on this device?
There is some optical alignment of the output and receiver as well as the sighting scope; I don't know the specific mechanics involved as this was just from handling and using one. The key part for measurements is going to be the stability of the clocks: crystal sources can drift over time, with temperature, and due to changes in drive circuitry components. If there is on-board temperature compensation then the stability of the thermometer circuitry is important. The receiver circuitry will have an analogue stage, and component changes could cause phase changes. Without knowing the specifics of the design I can think of quite a few things that would need to be considered if I were to design one*. I would imagine that, barring damage or component failure, the adjustments would be in programmed parameters rather than physical.

Short answer, I don't know exactly, but I know they will be there.

* Not that I would personally try now, not up to date enough in electronics design.
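
For a sense of scale (my own back-of-envelope figures, not anything from the LTI manual): crystal error is usually quoted in parts per million, and a pure frequency offset scales a time-of-flight range reading by the same fraction.

```python
# Rough scale of a crystal frequency offset's effect on a time-of-flight
# range reading. The ppm values are illustrative, not LTI specifications.
def range_error_m(true_range_m: float, clock_error_ppm: float) -> float:
    """A fractional clock error scales the measured range by the same fraction."""
    return true_range_m * clock_error_ppm * 1e-6

for ppm in (1, 20, 100):
    print(f"{ppm:>3} ppm offset at 300 m -> {range_error_m(300.0, ppm) * 1000:.2f} mm")
# Even 100 ppm amounts to only 30 mm at 300 m, well inside the 0.1 m
# resolution mentioned further down the thread.
```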

tapereel

1,860 posts

116 months

Saturday 28th November 2015
Toltec said:
Just a technical point, I used to design, build and calibrate optical, pressure and flow test systems so I tend to have a thing about metrology.

As others have pointed out, there is no need to make intermediate calculations of distance for each pulse return; the speed can be calculated directly from the recorded times. I was supporting this by pointing out that we now define the unit of distance in terms of time and the speed of light, so timing a pulse reflection implicitly measures distance.

An investigation of the unit would need to look at the stability and uncertainty of readings in a laboratory environment, as well as field tests to characterize what usage procedures are required to attain readings within a given uncertainty.

From what I have seen of the calibration procedure for the LTI units (it can be found on the web if you search), adjustments can be made if necessary to bring the unit's readings within specification. There does not appear to be a requirement to record the pre-adjustment measurements or to maintain a history for a given unit. That history is vital: the change in correction over a series of calibrations is how you establish the stability of the instrument, and it is what allows you to state the accuracy of any measurement to a given uncertainty. The upshot is that simply stating an instrument has been calibrated is meaningless; like an MOT, it only tells you what state the unit was in at a particular point in time. At the very least you need the results of the calibration prior to a measurement, and the pre-adjustment results of the next calibration, to make any meaningful statement about the likely accuracy of any measurement made by the unit.
You only need stability in the clock during the measurement period. The device monitors this itself. If the clock is not stable in the measurement period, an alarm is raised and no measurement is made. The clock can drift over the life of the device, though not by much because that is monitored too; it just can't drift in the 0.3 s to 0.4 s of the measurement period.
Drift in the clock does not affect the speed reading, only the distance reading. The alarm for the clock drift is set so that it will trigger before it allows the distance error to reach one increment of the smallest distance resolution, typically 0.1 m.
It does seem daft that members here are suggesting checks that they think will never have been thought of before. Maybe they can get together and develop a device themselves. smile
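
tapereel's claim about drift hitting distance but not speed is easy to demonstrate under one simplifying assumption: the same (slightly fast or slow) clock times both the round trips and the spacing between pulses, so a constant frequency error cancels in the speed estimate while still scaling the distance. A toy sketch, with illustrative numbers only:

```python
# Toy model: what a time-of-flight gun would report if every time it records
# is stretched by `clock_scale` (1.0 = perfect clock). Assumes one clock for
# both round trips and pulse spacing; numbers are illustrative only.
import numpy as np

C = 299_792_458.0  # m/s

def readings(clock_scale):
    true_emit = np.linspace(0.0, 0.3, 60)             # 60 pulses over 0.3 s
    true_rtt = 2.0 * (300.0 - 31.3 * true_emit) / C   # 300 m, closing at 31.3 m/s
    emit, rtt = true_emit * clock_scale, true_rtt * clock_scale
    ranges = 0.5 * C * rtt                            # ranges as the device sees them
    speed = -np.polyfit(emit, ranges, 1)[0]           # closing speed from the slope
    return ranges[0], speed

for scale in (1.0, 1.01):                             # perfect clock vs a 1% error
    d, v = readings(scale)
    print(f"clock scale {scale}: first range {d:.1f} m, speed {v:.2f} m/s")
# The 1% error shows up in the range (303 m vs 300 m) but the speed is unchanged.
```

Drift *during* the 0.3 s window is a different matter, which is presumably why the unit monitors it and aborts, as described above.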

V8LM

5,174 posts

209 months

Saturday 28th November 2015
tapereel said:
You only need stability in the clock during the measurement period. The device monitors this itself. If the clock is not stable in the measurement period, an alarm is raised and no measurement is made. The clock can drift over the life of the device, though not by much because that is monitored too; it just can't drift in the 0.3 s to 0.4 s of the measurement period.
Drift in the clock does not affect the speed reading, only the distance reading. The alarm for the clock drift is set so that it will trigger before it allows the distance error to reach one increment of the smallest distance resolution, typically 0.1 m.
It does seem daft that members here are suggesting checks that they think will never have been thought of before. Maybe they can get together and develop a device themselves. smile
True, there can be no drift within the measurement period for it to remain accurate. I understand the LTI decides if a received ping is an outlier by the residual from the least squares fit to the 60 or so received pulses, and calls a fail if more than 25% of these are outside of 1.3 ns (a certain number of cycles). I haven't done the calculation (yet*) but presumably anything above a certain level of acceleration or braking in the 300 ms would cause the LTI to call an abort.

I'm interested to know how it can check itself for drift in the distance measurement. Does it have two clocks?


ETA: I have now, it's about 6 G.


Edited by V8LM on Saturday 28th November 17:04
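
The abort logic described here is easy to play with numerically. The sketch below is not LTI firmware, just a toy model using the figures quoted in the post (roughly 60 pulses over 0.3 s, a 1.3 ns residual limit, fail if more than 25% of returns exceed it); the target range, speed and the accelerations swept are my own illustrative choices.

```python
# Toy version of the described check: fit a straight line to the return times,
# flag the reading if more than 25% of residuals exceed 1.3 ns.
import numpy as np

C = 299_792_458.0  # m/s

def aborts(accel_ms2, n_pulses=60, window_s=0.3, limit_s=1.3e-9, max_frac=0.25):
    t = np.linspace(0.0, window_s, n_pulses)
    rng = 300.0 - 31.3 * t - 0.5 * accel_ms2 * t**2   # target braking at accel_ms2
    rtt = 2.0 * rng / C
    fit = np.polyval(np.polyfit(t, rtt, 1), t)        # least-squares straight line
    residuals = np.abs(rtt - fit)
    return np.mean(residuals > limit_s) > max_frac

for a in (0.0, 20.0, 40.0, 60.0):                     # m/s^2 of braking
    print(f"{a:5.1f} m/s^2 ({a / 9.81:.1f} g): abort = {aborts(a)}")
```

With these made-up numbers the abort starts to trigger somewhere around 50-60 m/s², i.e. broadly the 6 g figure in the ETA, though the exact threshold obviously depends on how the real unit weights and rejects returns.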

pinchmeimdreamin

9,951 posts

218 months

Saturday 28th November 2015
Has anybody seen a Red Panda ?

I left one in here and he appears to have got lost in amongst all the Willy waving.

V8LM

5,174 posts

209 months

Saturday 28th November 2015
pinchmeimdreamin said:
Has anybody seen a Red Panda ?

I left one in here and he appears to have got lost in amongst all the Willy waving.
How fast was it going?

pinchmeimdreamin

9,951 posts

218 months

Saturday 28th November 2015
V8LM said:
pinchmeimdreamin said:
Has anybody seen a Red Panda ?

I left one in here and he appears to have got lost in amongst all the Willy waving.
How fast was it going?
I don't know it was quite far away.

V8LM

5,174 posts

209 months

Saturday 28th November 2015
pinchmeimdreamin said:
V8LM said:
pinchmeimdreamin said:
Has anybody seen a Red Panda ?

I left one in here and he appears to have got lost in amongst all the Willy waving.
How fast was it going?
I don't know it was quite far away.
If you could measure by how much it was either more or less red than ....

sorry.

Toltec

7,159 posts

223 months

Saturday 28th November 2015
tapereel said:
You only need stability in the clock during the measurement period. The device monitors this itself. If the clock is not stable in the measurement period, an alarm is raised and no measurement is made. The clock can drift over the life of the device, though not by much because that is monitored too; it just can't drift in the 0.3 s to 0.4 s of the measurement period.
Drift in the clock does not affect the speed reading, only the distance reading. The alarm for the clock drift is set so that it will trigger before it allows the distance error to reach one increment of the smallest distance resolution, typically 0.1 m.
It does seem daft that members here are suggesting checks that they think will never have been thought of before. Maybe they can get together and develop a device themselves. smile
Given that speed is the change in distance readings over time, the clock accuracy is fundamental to the speed reading. The drift between calibrations would need to be quite large, something approaching 1%, to matter much; however, any corrections applied should be recorded in a calibration certificate.

Fundamentally the device is quite simple; the main challenges are processing/filtering the return pulse to get consistent results and making a field unit 'squaddie proof'. Not to say it would be trivial, and when they were first designed it would have been much harder than it is now; I suspect interleaved ADCs were used to get the sample rate. A quick Google finds ADC dev boards from Farnell that can handle 3.6 GSPS...

I have only had a quick look at this, though.

Trivia:

If the sample clock is 3 GHz, the minimum for 0.1 m resolution*, then the 18 ns pulse will last for over 50 samples and be about 18 feet long.

* Technically 1.5 GSPS would give you +/- 0.1 m; it depends what tapereel means by resolution.
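
Those trivia numbers check out on the back of an envelope. The sketch below just reproduces the arithmetic with the figures quoted above; it is not a claim about the real unit's actual sample rate or pulse width.

```python
# Back-of-envelope check of the sample-rate and pulse-length trivia.
C = 299_792_458.0   # m/s
M_PER_FOOT = 0.3048

for f_sample in (1.5e9, 3.0e9):
    # One sample of round-trip time corresponds to this much one-way range:
    range_per_sample_m = C / (2.0 * f_sample)
    print(f"{f_sample / 1e9:.1f} GSPS -> {range_per_sample_m * 100:.1f} cm of range per sample")

pulse_s = 18e-9
print(f"18 ns pulse = {pulse_s * 3.0e9:.0f} samples at 3 GSPS and "
      f"{C * pulse_s:.1f} m ({C * pulse_s / M_PER_FOOT:.0f} ft) long in flight")
```

That gives roughly 10 cm of range per sample at 1.5 GSPS, 5 cm at 3 GSPS, and a 54-sample, ~18 ft pulse, in line with the figures above.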