What is the resolution and/or accuracy of subscan timing? (Pipeline mode, CR1000 or CR6.)
For the curious, I received the following information from technical support:
The resolution of the SubScan interval is 100 usec in pipeline mode; in sequential mode it is the ticker resolution (1 msec for a CR6, 10 msec otherwise).
The issue with subscan timing is accuracy, not precision. At compile time we add up all the op codes, the time each one takes, and the length of the delay at the bottom of the sub scan that occurs before the start of the next scan. There are some inaccuracies in some of the measurement times, hence an error. But once the program is running, that error is constant: there is very little jitter (perhaps 10 usec) from one scan to the next, although the constant offset can be quite a bit larger than that.
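
For context, the interval that the 100 usec figure applies to is the first parameter of CRBasic's SubScan instruction. A minimal sketch of where it sits in a program follows; the 2500 usec interval, the 100 msec main scan, and the PanelTemp placeholder measurement with its 250 usec integration code are illustrative assumptions on my part, not from the support note, and the compiler will reject a sub scan interval too short for the instructions inside it.

'Minimal sketch: four panel-temperature readings taken 2500 usec apart
'inside each 100 msec main scan. Per the note above, pipeline mode honors
'the 2500 usec interval at 100 usec resolution; sequential mode is limited
'to the ticker (1 msec on a CR6, 10 msec otherwise).
Public PTemp

BeginProg
  Scan(100,mSec,0,0)
    SubScan(2500,uSec,4)       'interval, units, number of sub scans per main scan
      PanelTemp(PTemp,250)     'placeholder measurement, 250 usec integration
    NextSubScan
  NextScan
EndProg

With these numbers, the 2500 usec spacing lands exactly on the 100 usec pipeline grid, but it could not be hit by the 1 msec or 10 msec sequential-mode ticker.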