[Css-csts] Buffered DP question
Ray, Timothy J. (GSFC-5830)
timothy.j.ray at nasa.gov
Mon Apr 15 11:13:04 EDT 2013
Hi,
The DP procedure specifies that the size of the Input Queue for incoming PROCESS-DATA operations is defined by the service using this procedure, possibly delegating the definition to service management.
The BDP procedure provides the User with two mechanisms for controlling the maximum latency of buffered PROCESS-DATA invocations. The maximum latency can be controlled precisely (and explicitly) by setting the processing-latency-limit. It can also be controlled imprecisely (and implicitly) via the mechanism that, in timely transfer mode, discards the oldest buffered PROCESS-DATA invocations as needed when incoming PROCESS-DATA invocations overflow the Input Queue.
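To make the second mechanism concrete, here is a rough sketch of how I picture the timely-mode overflow behavior (purely illustrative on my part; the class and names below are mine, not anything taken from the Recommended Standard):

from collections import deque

class TimelyInputQueue:
    # Illustrative sketch only -- not from the BDP specification.
    # Buffers PROCESS-DATA invocations up to 'capacity'; in timely
    # transfer mode an overflow discards the oldest buffered invocation.
    def __init__(self, capacity):
        self.capacity = capacity
        self.buffer = deque()
        self.discarded = 0  # invocations dropped due to overflow

    def enqueue(self, invocation):
        if len(self.buffer) >= self.capacity:
            self.buffer.popleft()   # discard the oldest buffered invocation
            self.discarded += 1
        self.buffer.append(invocation)

    def dequeue(self):
        # Provider takes the next buffered invocation for processing.
        return self.buffer.popleft() if self.buffer else None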
It's the second mechanism that I want to discuss. Suppose the size of the Input Queue is defined as 100 PROCESS-DATA operations. Is the Provider required to implement exactly that size (and no bigger)? Based on my experience, an implementer who sees a buffering requirement may interpret it as a minimum size rather than an exact size. From an implementer's point of view, having a larger buffer than required is acceptable (and usually desirable).
Returning to the example, suppose the User is expecting the (small) buffer size to cause older PROCESS-DATA invocations to be replaced by newer ones. But suppose the implemented Input Queue size is, say, 10000 PROCESS-DATA invocations. The User will not see the expected behavior; older invocations will simply remain buffered rather than being discarded.
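Using the sketch above, the difference in observed behavior is easy to see (again, just an illustration, with numbers I picked for this example):

# Queue sized as specified: a backlog of 150 invocations discards the 50 oldest.
specified = TimelyInputQueue(capacity=100)
for i in range(150):
    specified.enqueue("PD-%d" % i)
print(specified.discarded)   # 50 -- old invocations replaced, as the User expects

# Oversized implementation: nothing is discarded, stale invocations stay buffered.
oversized = TimelyInputQueue(capacity=10000)
for i in range(150):
    oversized.enqueue("PD-%d" % i)
print(oversized.discarded)   # 0 -- the User does not see the expected replacement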
What do you think?
Best regards,
Tim