CCB Review of DC3 Software Quality Assurance Needs

from: CCB

I am preparing the framework for the LSST Software Quality Assurance, LSST Configuration Management, and LSST Validation and Verification (aka Testing) Plans to more formally define these aspects of quality software development, which we have, to date, handled in an ad hoc fashion. The framework will then be specialized to DC3 requirements.

The evolving frameworks are found at:

Below are some DC3 tasks which need CCB approval in advance of the completion of the Plans.

Standards and Procedures

Build Configuration Management

Our configuration control procedures currently:

  • manage versioning of individual source files via revision numbers in 'svn';
  • manage versioning of packages via tags in 'svn';
  • manage dependencies between packages via 'eups'.

We need to add versioning of builds tied to the archiving of the build's provenance so that we can routinely recover and execute a previous build environment.

  • pro: the current driving force for the capability is enabling bug repair in the same environment in which it was detected.
  • con: none - the ability to recover a previous environment including its data and tools is a DM requirement; it might be a stretch target for DC3.
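
As a first step, the provenance archive could be as simple as a per-build manifest of the packages and versions set up in the build environment. Below is a minimal sketch, assuming 'eups list --setup' prints one currently set-up product and its version per line; the exact option and output format should be verified against our eups release.

    import subprocess
    import time

    def capture_build_manifest(path):
        """Record the eups-managed packages in the current build environment."""
        # Assumes 'eups list --setup' prints 'product version ...' per line;
        # adjust the parsing to match the installed eups release.
        out = subprocess.check_output(["eups", "list", "--setup"], text=True)
        with open(path, "w") as manifest:
            manifest.write("# build manifest captured %s\n" % time.ctime())
            for line in out.splitlines():
                fields = line.split()
                if len(fields) >= 2:
                    manifest.write("%s %s\n" % (fields[0], fields[1]))

    capture_build_manifest("build.manifest")

Archiving such a manifest with each build, together with the 'svn' tags it names, would let us re-run the corresponding 'eups' setup commands to reconstruct the environment later.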

Coverage Analysis Metric

SQA generally requires a quantitative measure of the thoroughness of unit test coverage of branches and statements. Coverage of 80% generally 'finds' 95% of the bugs in the unit being tested. I recommend a unit test coverage requirement of >60% for DC3, >70% for DC4, and 80% for DC5/Construction.

In the future we hope to automatically generate unit test stubs by post-processing the controllers defined in EA/ICONIX robustness diagrams; however, this feature has not yet been fully implemented by EA.

  • pro: Starting with a low coverage requirement allows us to learn how to define adequate tests.
  • con: Just get it over with; if we know we're targeting 80% coverage, require it now so retrofit is minimized.
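
To make the metric concrete, the gate itself is just a ratio of executed to executable statements compared against a per-challenge floor. A minimal sketch with hypothetical counts; the totals would come from whichever coverage tool we adopt.

    # Hypothetical coverage gate; the statement counts would be supplied
    # by the coverage tool adopted for the build.
    THRESHOLDS = {"DC3": 0.60, "DC4": 0.70, "DC5": 0.80}

    def coverage_ok(executed, executable, challenge):
        """Return True when statement coverage meets the challenge's floor."""
        if executable == 0:
            return True  # nothing to cover
        return float(executed) / executable >= THRESHOLDS[challenge]

    # Example: 130 of 200 statements executed -> 65%; passes DC3, fails DC4.
    print(coverage_ok(130, 200, "DC3"))   # True
    print(coverage_ok(130, 200, "DC4"))   # False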

Refine Validation Testing Process

Our current build scenario needs to define more thorough validation testing, and that scenario then needs to be implemented in the DC3 build procedure. As a simple first step, pass/fail status should be refined so that it depends on the product's development stage (e.g. new, debug, production). For example (a policy sketch follows this list):

  • within Ticket branch development
    • build warnings if coding standards check fails
    • build warnings if unittest doesn't satisfy DC3 coverage requirement
  • within Trunk branch development (generally merge from Ticket branch)
    • build failure if unittest doesn't satisfy DC3 coverage requirement
    • build failure if coding standards check fails
  • Nightly build of development trunk
    • any failures cause automatic 'Failed Nightly Build' ticket at high priority
  • Production Release sanity check
    • any failure causes automatic 'Failed Production sanity check' ticket at high priority
    • auxiliary automatic notification to highest authority on failure (separate from ticket notification)
  • pro: This is an evolutionary step in the software quality assurance process.
  • con: none.
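
A minimal sketch of the stage-dependent policy described above; the stage names and check identifiers are illustrative, not a settled interface.

    # Illustrative stage-dependent validation policy; names are hypothetical.
    WARN, FAIL = "warning", "failure"

    POLICY = {
        # (stage, check) -> severity when the check does not pass
        ("ticket", "coding_standards"): WARN,
        ("ticket", "coverage"):         WARN,
        ("trunk",  "coding_standards"): FAIL,
        ("trunk",  "coverage"):         FAIL,
    }

    def classify(stage, check, passed):
        """Map a failed check to a build warning or failure by stage."""
        if passed:
            return "ok"
        return POLICY.get((stage, check), FAIL)  # unknown checks fail hard

    print(classify("ticket", "coverage", passed=False))  # 'warning'
    print(classify("trunk", "coverage", passed=False))   # 'failure'

Defaulting unknown combinations to failure keeps the policy conservative as new checks are added.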

Chain from Requirements to Validation

In order to provide satisfactory Software Quality Assurance, we need to demonstrate the chain from individual DC3 software requirements to the Domain objects supporting those requirements, then to the software modules implementing those Domain objects, and finally to the validation tests on those modules.

The procedure to establish this chain needs to be defined.

  • pro: DM Software Quality Assurance requirement
  • con: none.
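
However the procedure ends up being defined, the underlying record is a set of links that can be audited for gaps. A minimal sketch, with hypothetical requirement, domain object, module, and test names:

    # Hypothetical traceability records:
    # requirement -> domain object -> implementing module -> validation test.
    CHAIN = [
        ("DC3-REQ-012", "Exposure",    "afw.image", "test_exposure.py"),
        ("DC3-REQ-047", "SourceMatch", "ap.match",  None),  # no test yet
    ]

    def unvalidated(chain):
        """List requirements whose chain stops short of a validation test."""
        return [req for (req, obj, module, test) in chain if test is None]

    print(unvalidated(CHAIN))   # ['DC3-REQ-047']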

Tools and Support

Remove Pacman

Replace Pacman's package extraction functionality with a combination of 'eups', 'scons', and a build directive file.

  • pro: removes 3rd party tool; groups all package configuration files within package's 'svn' directory.
  • con: unknown to me how the build script reacts to error situations.
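
Since 'scons' build directive files are themselves Python, a package's build configuration can live alongside its source in 'svn'. A minimal SConstruct sketch, assuming 'eups' setup exports a <PRODUCT>_DIR environment variable per dependency; the package and dependency names are illustrative.

    # Minimal SConstruct sketch; package and dependency names are illustrative.
    import os

    env = Environment()

    # eups setup conventionally exports <PRODUCT>_DIR for each dependency;
    # point the compiler at those trees instead of Pacman-extracted copies.
    for product in ("BOOST", "WCSLIB"):
        product_dir = os.environ.get("%s_DIR" % product)
        if product_dir:
            env.Append(CPPPATH=[os.path.join(product_dir, "include")],
                       LIBPATH=[os.path.join(product_dir, "lib")])

    env.SharedLibrary("mypackage", Glob("src/*.cc"))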

Add Virtual Machine Support

Add virtual machine support as an LSST development system distribution option. The VMs would be configured to match generic workstation environments (MacOS, Fedora Core, etc.) and would include the entire LSST software stack prebuilt.

  • pro: Removes need for all users to build and install complete LSST software trees.
  • con: unknown to me whether a VM will accurately map the low-level distributed communications interfaces.

Add 64 bit support

  • pro: evolutionary step in the Data Challenge plan

Coding Standards Compliance Checker

Add a coding standards compliance checker. Benchmarking of the two candidate tools is in progress.

  • pro: doing the compliance checking 'by hand' resulted in uneven compliance enforcement.
  • con: none, but the project needs to allocate FTE:
    • to perform compliance checking until a tool is acquired; or
    • if an adequate 3rd party tool is not found, to implement a parser for the LSST Language Standards.
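
Whatever tool is selected, the checks reduce to rules applied per line or per construct. A minimal sketch with two illustrative rules; the actual LSST Language Standards define many more, and the limits shown are placeholders.

    import re

    # Two illustrative rules; limits are placeholders, not the LSST standards.
    RULES = [
        (re.compile(r"\t"),      "tab character (use spaces)"),
        (re.compile(r".{111,}"), "line exceeds 110 characters"),
    ]

    def check_file(path):
        """Yield (line number, message) for each standards violation."""
        with open(path) as src:
            for lineno, line in enumerate(src, start=1):
                for pattern, message in RULES:
                    if pattern.search(line.rstrip("\n")):
                        yield lineno, message

    for lineno, message in check_file("src/Exposure.cc"):  # hypothetical file
        print("%4d: %s" % (lineno, message))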

Add boost Unittest for C++

Add boost/test/unit_test as the C++ unit test framework; we currently use python's unittest for the same purpose.

  • pro: we already use many boost libraries; the boost libraries are good quality implementations; Serge's trial use of it in DC2/associate was positive.
  • con: there are a variety of other candidates which support similar functionality.
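
For reference, the functionality sought on the C++ side is what python's unittest already gives us: per-case registration, assertions, and a runner. A minimal example of the current python pattern (the function under test is hypothetical); boost/test/unit_test provides the analogous machinery for C++.

    import unittest

    class QuadratureTestCase(unittest.TestCase):
        """Illustrative test case in the python unittest style used now."""

        def testAddInQuadrature(self):
            def addInQuadrature(a, b):      # hypothetical function under test
                return (a * a + b * b) ** 0.5
            self.assertAlmostEqual(addInQuadrature(3.0, 4.0), 5.0)

    if __name__ == "__main__":
        unittest.main()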

Add Coverage Analysis Checker

Add a coverage analysis checker to the C++ unit test builds. Benchmarking of coverage analysis tools such as 'gcov' and 'ggcov' needs to be done.

  • pro: validation checking requires confirmation that the unit test provides adequate coverage of all branches and statements.
  • con: none
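
As an indication of the integration effort: after an instrumented test binary has run, 'gcov' reports per-file line coverage on stdout. A minimal sketch that extracts the percentage, assuming output of the form 'Lines executed:86.67% of 15'; verify the format against the local gcc/gcov release.

    import re
    import subprocess

    def gcov_line_coverage(source):
        """Run gcov on one source file; return its line-coverage fraction."""
        # Assumes the binary was built with -fprofile-arcs -ftest-coverage
        # and has already been executed.
        out = subprocess.check_output(["gcov", source], text=True)
        match = re.search(r"Lines executed:\s*([\d.]+)%", out)
        return float(match.group(1)) / 100.0 if match else None

    print(gcov_line_coverage("src/Exposure.cc"))  # hypothetical file

The returned fraction could feed directly into the coverage gate sketched under "Coverage Analysis Metric" above.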

Glossary

Software Quality Assurance

Software Quality Assurance is a planned review procedure to provide adequate confidence that the item or product produced conforms to established technical requirements. SQA does this by checking that:

  • plans are defined according to standards;
  • procedures are performed according to plans;
  • products are implemented according to standards.

(Extracted from: ESA PSS-05-11 Issue 1 Revision 1 Software Quality Assurance)

Software Verification and Validation

Software Verification and Validation checks products against their specifications. This is done by:

  • checking that each software item meets specified requirements;
  • checking each software item before it is used as an input to another activity;
  • ensuring that the amount of verification and validation effort is adequate to show each software item is suitable for operational use.

(Extracted from: ESA PSS-05-10 Issue 1 Revision 1 Software Verification and Validation)

As such, Software Quality Assurance complements Software Verification and Validation: the former reviews the processes used to create the product, and the latter checks that the instantiation meets requirements.

Software Configuration Management

Software Configuration Management provides the means of tracking software development over time. Software Configuration Management ensures that:

  • software components can be identified;
  • software is built from a consistent set of components;
  • software components are available and accessible;
  • software components never get lost;
  • every change to software is approved and documented;
  • changes do not get lost;
  • it is possible to go back to a previous version;
  • a history of changes is kept so that it is always possible to discover who did what and when.

(Extracted from: ESA PSS-05-09 Issue 1 Revision 1 Software Configuration Management)