Pipe Adjust Update Interval to Reduce Errors Option
Adjust Update Interval To Reduce Errors
With this option selected, Planimate makes a small (up to 0.1%) adjustment to the pipe update interval time to reduce round-off error in each sample transferred. This is important when pipes transfer small quantities into bins containing large quantities.
Pipes move values from one place to another by generating events at regular intervals, subtracting from the source and adding to the target according to a specified hourly rate.
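The mechanism can be sketched in a few lines of Python. This is a hypothetical illustration of the sampling scheme described above, not Planimate's actual internals:

```python
def pipe_step(source, target, rate_per_hour, interval_s):
    """One pipe event: move (hourly rate / 3600) * interval units
    from source to target.  Hypothetical sketch, not PL's real code."""
    amount = rate_per_hour / 3600.0 * interval_s
    return source - amount, target + amount

# One 10-second sample at 1750 units/hour moves about 4.861 units.
src, dst = pipe_step(10_000.0, 0.0, 1750.0, 10.0)
print(src, dst)
```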
Numbers in PL are represented using C double precision values (64 bit).
These have a precision of about 14 decimal digits.
This precision limit becomes a problem when the pipe is transferring in small amounts to end points which contain a large value.
For example a pipe may be set up to transfer at a rate of 1750 units per hour, at a sampling resolution of 10 seconds to a target already containing 2 billion units.
1750 / 3600 * 10 = 4.86111111111111...
(the number recurs, but in memory it is cut off after 14 digits or so)
When the pipe dumps the first quantity into the target it computes:

    2,000,000,000.000 000 000 000 00
  +             4.861 111 111 111 11
                     ^

but due to the 14-digit limit, the result loses the smaller fractions of the value being added: the target is a large value and precision runs out at about the position marked ^.
Over time these errors accumulate, leading to a noticeable imbalance in the system and to comparisons against expected limits failing.
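The loss in a single addition, and its accumulation, can be seen directly in Python (whose floats are the same 64-bit doubles). `Fraction` provides an exact reference value; the sample count here is reduced from the example above:

```python
from fractions import Fraction

level = 2_000_000_000.0
amount = 1750.0 / 3600.0 * 10.0   # ~4.86111 units per 10 s sample

# A single addition already rounds away the smaller fractions:
lost = float(Fraction(level) + Fraction(amount) - Fraction(level + amount))
print(lost)                       # small but non-zero

# Repeated additions let these per-sample losses accumulate:
for _ in range(200_000):
    level += amount
drift = float(Fraction(level) - Fraction(2_000_000_000)
              - 200_000 * Fraction(amount))
print(drift)
```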
Ways to alleviate this problem include:
- Keep sources and targets as small as possible (e.g. keep a separate accumulator for your "entire" stockpile)
- Don't sample more often than necessary (the more frequently the pipe samples, the smaller each transferred value becomes)
- Don't code for, or expect, exact quantities with pipes; allow a margin of error either way
An additional and significant improvement can be gained by using this new option.
WHAT THIS OPTION DOES
Given that the hourly rate for the pipe is known, PL adjusts the transfer amount per sample to a value which takes less precision to represent.
To compensate for this, the update interval is adjusted slightly, in order to keep the hourly rate the same.
In effect, it trades off accuracy in the amount transferred against accuracy in the sampling interval.
This all assumes that a "real world" model does not get troubled by the sampling interval not being exact.
In the case above, the amount per sample becomes 4.859375 (it looks strange because the truncation is done in base-2 arithmetic) and the sampling interval becomes 9.996428571429 seconds (expect up to 0.1% variation from the interval specified).
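Planimate's exact rounding rule is not documented here, but both numbers quoted above can be reproduced by snapping the per-sample amount to the coarsest power-of-two grid that changes it by at most 0.1%, then stretching the interval to preserve the hourly rate. A sketch under that assumption (the search strategy is a guess, not PL's confirmed algorithm):

```python
import math

def snap_amount(q, tol=0.001):
    """Round q to the coarsest power-of-two grid that keeps the relative
    change within tol (0.1%).  Assumption: this mirrors, but is not
    necessarily identical to, Planimate's internal rule."""
    grid = 2.0 ** math.floor(math.log2(q))   # start with a coarse grid
    while True:
        snapped = round(q / grid) * grid
        if snapped > 0 and abs(snapped - q) / q <= tol:
            return snapped
        grid /= 2.0                          # refine until within tolerance

rate_per_hour = 1750.0
interval_s = 10.0
q = rate_per_hour / 3600.0 * interval_s      # 4.8611... units per sample

amount = snap_amount(q)                      # 4.859375: needs few mantissa bits
new_interval = interval_s * amount / q       # stretched to keep the hourly rate
print(amount, new_interval)
```

Note that the adjusted pair still realises the same hourly rate: amount / new_interval * 3600 comes back to 1750.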
Test case: set up a pipe to transfer 100 million units to a destination containing 2 billion, at the rate and sampling resolution specified above.
With this option off, you end up with a surplus of 1.124 units.
This would cause matching problems if you were testing the result against a hard-coded number, would mess up material balances, etc.
With this option on, the values are accurate to the display resolution.
The precision limits are still there and will still manifest themselves when the ratio of bin level to transfer amount per sample is high.
Initial testing suggests that a ratio of up to 100 million to 1 should be OK.
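A reduced-scale version of this test case can be reproduced in Python (whose floats are the same 64-bit doubles). The snapped amount 4.859375 equals 311/64, so every addition to a bin level below 2^47 is exact and no error can accumulate, whereas the exact per-sample amount needs all 53 mantissa bits and must round on every add. A sketch under those assumptions:

```python
from fractions import Fraction

def transfer(level, amount, samples):
    """Repeatedly add `amount` to `level`, as the pipe does each sample."""
    for _ in range(samples):
        level += amount
    return level

def drift(start, amount, samples):
    """Difference between what actually arrived and the exact total requested."""
    moved = Fraction(transfer(start, amount, samples)) - Fraction(start)
    return float(moved - samples * Fraction(amount))

start = 2_000_000_000.0
exact_q = 1750.0 / 3600.0 * 10.0   # full 53-bit mantissa: adds must round
snapped = 4.859375                 # = 311/64, lowest bit at 2**-6: adds stay exact

print(drift(start, exact_q, 200_000))   # rounding drift (compare with the line below)
print(drift(start, snapped, 200_000))   # exactly 0.0
```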