[Cesg-all] Results of CESG Polls closing 14 December 2016

Stafford, Laura (HQ-CG000)[Arctic Slope Technical Services, Inc.] laura.stafford at nasa.gov
Wed Jan 4 16:53:09 UTC 2017


CESG E-Poll Identifier: CESG-P-2016-11-001 Approval to publish CCSDS 921.1-B-1, Cross Support Transfer Service-Specification Framework (Blue Book, Issue 1) Results of CESG poll beginning 30 November 2016 and ending 14 December 2016:

                  Abstain:  1 (16.67%) (Calzolari)
  Approve Unconditionally:  2 (33.33%) (Barkley, Shames)
  Approve with Conditions:  3 (50%) (Merri, Behal, Scott)
  Disapprove with Comment:  0 (0%)

CONDITIONS/COMMENTS:

Mario Merri (Approve with Conditions): MOIMS has over the past years reported that CCSDS has wrongly embarked on the development of a multiplicity of frameworks to define services:
the MAL, this one, and possibly the EDS are examples. Having said that, I do not put any condition on this book in the light of the "London Agreement".

Brigitte Behal (Approve with Conditions): cf. MOIMS AD conditions

Keith Scott (Approve with Conditions): =====================================

General comments (NOT conditions on moving the book forward):
=====================================
The framework does not specify implementations, and therefore can't ensure interoperability. I see its utility as the basis (framework) for OTHER blue books that do specify protocol, but am surprised that the book in itself is Blue.

Requirements such as the following don't seem to have been tested in the Yellow Book:

3.2.1.1 "All invokers of confirmed operations shall implement a timer for the acknowledgement in case of three-phase operations or the return in case of two-phase operations. In case the timer expires, the service user may issue a PEER-ABORT."

3.2.1.2 "On reception of the acknowledgement in case of three-phase operations or the return in case of two-phase operations, the invoker of the corresponding confirmed operations shall stop the timer."

4.5.3.2.7.1 At the time of insertion of a TRANSFER-DATA or NOTIFY invocation into an empty return buffer, the service provider shall start a timer called the release timer.

Am I in fact mistaken about the above tests? Is there something that maps the tests in the yellow book to individual requirement numbers in the draft blue book?

=====================================
CONDITIONS that must be met (or where I need to be convinced I'm wrong):
=====================================
Section 1.5.1
From:
section 0 presents the purpose, scope, applicability, ...
To:
section 1 presents the purpose, scope, applicability, ...


==END==


Total Respondents: 6
No response was received from the following Area(s):

SOIS

SECRETARIAT INTERPRETATION OF RESULTS:  Approved with Conditions
PROPOSED SECRETARIAT ACTION:            Generate CMC poll after conditions have been addressed

* * * * * * * * * * * * * * * * * * * * * * * *

CESG E-Poll Identifier: CESG-P-2016-11-002 Approval to publish CCSDS 922.1-B-1, Cross Support Transfer Services-Monitored Data Service (Blue Book, Issue 1) Results of CESG poll beginning 30 November 2016 and ending 14 December 2016:

                  Abstain:  0 (0%)
  Approve Unconditionally:  3 (50%) (Barkley, Shames, Scott)
  Approve with Conditions:  3 (50%) (Merri, Behal, Calzolari)
  Disapprove with Comment:  0 (0%)

CONDITIONS/COMMENTS:

Mario Merri (Approve with Conditions): MOIMS has over the past years reported that CCSDS has wrongly embarked on the development of a multiplicity of services to monitor parameters:
the MO M&C and this one are examples (it is curious that they are to be published in parallel!). Having said that, I do not put any condition on this book in the light of the "London Agreement".

Brigitte Behal (Approve with Conditions): cf. MOIMS AD conditions

Gian Paolo Calzolari (Approve with Conditions): File of PIDs attached


Total Respondents: 6
No response was received from the following Area(s):

SOIS

SECRETARIAT INTERPRETATION OF RESULTS:  Approved with Conditions
PROPOSED SECRETARIAT ACTION:            Generate CMC poll after conditions have been addressed

* * * * * * * * * * * * * * * * * * * * * * * *

CESG E-Poll Identifier: CESG-P-2016-11-003 Approval to publish CCSDS 522.1-B-1, Mission Operations Monitor & Control Services (Blue Book, Issue 1) Results of CESG poll beginning 30 November 2016 and ending 14 December 2016:

                  Abstain:  1 (16.67%) (Calzolari)
  Approve Unconditionally:  3 (50%) (Merri, Behal, Scott)
  Approve with Conditions:  2 (33.33%) (Barkley, Shames)
  Disapprove with Comment:  0 (0%)

CONDITIONS/COMMENTS:

Erik Barkley (Approve with Conditions): 1) The "MAL::" nomenclature is not identified/defined.
The conventions section seems ideal for this. Please add a description.

2) Re the statistics service, the Nyquist theorem seems to be ignored; are there to be sampling "guarantees"? If so, suggest adding a requirement in accordance with the established Nyquist theorem. Otherwise the behavior seems rather indeterminate. Please amend or provide rationale.

3) Re "...3.6.2.2 The statistics service may support the evaluation of the following set of statistic functions: a) the maximum value; b) the minimum value; c) the mean value; d) the standard deviation...."

Why not others? Perhaps kurtosis is of interest?
What if the distribution of samples is not a normal distribution? If objects are related, perhaps there should be regression tests? Suggest a fuller list, or perhaps registering the well-recognized methods of this service. This may help to manage extensibility of the service in the future.

4) Re the PICS proforma: How is interoperability achieved if everything is optional? There is in fact no core capability to be implemented? It seems that there are in fact implied capabilities and layering: for example, it seems rather odd to implement the statistics service without the parameter service, yet this is okay per the PICS proforma. Are there conditions that should be noted here? Request adding conditions in the PICS proforma as needed.

5) Re the test report: This appears to be an API test. I cannot see interoperability truly being demonstrated. DLR and ESA are listed as client/provider partners; presumably in testing the parameter service there would be a set of values indicated such that the parameter service interoperability could be verified/validated as faithfully delivering the values. The test of the statistics service could similarly indicate that the set of statistics matches an a priori statistical characterization of the service -- this kind of test should help to verify that the service behavior is as expected. Please clarify.

Peter Shames (Approve with Conditions): There are numerous issues with this document and with the related test report. I note the following:

1) The restated scope is now so broad as to encompass everything: MO systems, spacecraft, ground stations, and "any system component that provides a control interface." This exceeds by a considerable amount the originally stated scope for this work.

2) The descriptions at many points are vague and in some cases read as circular. In many cases they redefine (often vaguely) common terms already well specified elsewhere, in the ISO BRM, the SCCS-ADD, or the SLE Reference Model.

3) At various points the stated scope includes things like ground stations, even though the specification of cross-supported and interoperable interfaces within CCSDS is clearly within the remit of the CSS Area. There is no mention at all of the CSS Area, nor of the existing cross support services and service management. This is unacceptable "scope creep".

4) In addition to the vague definitions of those terms that are provided, many terms are used that are not defined at all, nor referenced from other documents, either SM&C or other. This makes much of the text unreadable without reference to other documents.

5) Many of the concepts are described with lengthy paragraphs where a few well-chosen (and well-constructed) diagrams could be used to make the concepts clear. Much of Sec 2 suffers from this deficiency. Furthermore, many of the diagrams that are provided include the rather meaningless relationship type "related". Does that mean "part of", "married to", "uses", "friend of"? Everything is "related"; how things are related is what provides meaning.

6) There are a number of sections where speculation, "definition by negation", and vague references are used instead of solid, unimpeachable definitions. See in particular the notes in secs 2.4, 2.6.1, 2.6.3, and 2.6.9. These do not really appear to be acceptable in what is intended to be published as a CCSDS Blue Book, even in the description section.

7) The rationale, Sec 1.4, says this:

"The primary goal of CCSDS is to increase the level of interoperability among agencies. This Recommended Practice furthers that goal by providing a standard service specification for the basic monitor and control of a remote entity." The first part of this is true: that is a CCSDS primary goal. The second part, however, is not correct, because this is proposed as a "standard", not a "practice", and only after the whole stack of this spec, MAL, COM, and a technology binding and encoding are sandwiched together do we get to the possibility of interoperability. Further, the multitude of planned bindings and encodings makes achieving interoperability, as we usually understand it, possible only if constraint is used in selecting among all of the possible options. It would be more accurate to state this clearly instead of asserting that it is globally the case.

8) Along this same thread, the test report is incomplete in that it does not state which of the many possible bindings, encodings, APIs, etc were used in the test. For these upper layer service tests we agreed to allow leveraging of these lower level components. This does not mean that they can be ignored. It does mean that they must be carefully documented so that the "provenance"
of the test string(s) can be verified.

At the same time, the test report is far too complete in that it contains details that reviewers could not possibly care about. A nice, concise table in an early section, summarizing the results topic by topic with pass/fail indicators, would be a big help.


Total Respondents: 6
No response was received from the following Area(s):

SOIS

SECRETARIAT INTERPRETATION OF RESULTS:  Approved with Conditions
PROPOSED SECRETARIAT ACTION:            Generate CMC poll after conditions have been addressed

* * * * * * * * * * * * * * * * * * * * * * * *



