wiki:DC3/DC3aIntegrationTestPlan

From the Baseline Software Management Plans

DRAFT Last revised 16 Mar 2009

DC3a Subsystem Integration Test (IT) Plan

1. Test Plan Identification

  • Baseline Identifier
    • DM DC3a
  • Testing Level
    • DM Subsystem Integration

2. References

3. Use Case Implementations being Validated

Following are the Use Cases and/or Activities being exercised in DC3a. The cycling of the Production through various pipelines and stages is not represented in the text below. Refer to the Alert Production activity diagram attached to this document.

The full set of DC3a Use Cases/Activities, links to the Use Case/Activity text, and Use Case/Activity input and output requirements are in the Appendix. In that enumeration, the Middleware 'Run a Pipeline' Use Case has been interspersed within the Applications Use Cases to provide a more realistic (though perhaps not accurate) end-to-end scenario of the Alert Production.

The Use Case model components may not match one-to-one the actual pipeline stages implemented to satisfy them.

  • Alert Production
    • Night Mops Pipeline (Francesco, JonM)
    • Image Processing Pipeline
      • ISR (Nicole)
        • Split FPA Exposure (Andy, Nicole)
        • Validate Exposure Type (Nicole)
        • Remove Bias (Nicole)
        • Subtract Overscan (Nicole)
        • Linearize (Nicole)
        • Correct for Dark Current (Nicole)
        • Flatten (Nicole)
        • Defringe (Nicole)
        • Correct Saturation (Nicole)
        • Remove Cosmic Rays (Nicole)
      • Image Characterization (renamed from WCS Pipeline) (Robert, Fergal)
        • Detect Sources (Robert)
        • Measure Simple Sources (Robert)
        • PSF Determination (Robert, AndyB)
        • Match Sources to Astrometric Standards Catalog (Robert, Fergal)
        • Generate WCS from Match (Fergal)
        • Calculate Rough Photometric Zeropoint (SteveB, Robert)
      • Difference Image (AndyB)
        • The following are subsumed into one of the primary stages listed below:
          • MW:Retrieve Template/Co-Add covering an area (KT)
          • Warp Exposure (renamed from WCS Match Exposure) (RussellO)
        • PSF Match Masked Image (AndyB)
        • Subtract Template from Image (renamed from Subtract Image from Template) (AndyB)
      • Visit Processing (Robert)
        • Add Difference Exposure (Robert)
        • Detect Sources (Robert)
      • Measure Simple Sources (Robert)
      • Source Characterization (Serge)
    • Association Pipeline (Serge)
  • Middleware
    • Persist Data from Pipeline
      • Define persistence policies (KT)
      • Execute persistence (KT)
        • Obtain and configure Persistence object (KT)
        • Obtain and configure Storage object(s) for persistence (KT)
        • Specify additional object metadata (KT)
        • Format and send Persistable object to Storage(s) (KT)
      • Execute retrieval (KT)
        • Obtain and configure Persistence object (KT)
        • Obtain and configure Storage object(s) for persistence (KT)
        • Specify additional object metadata (KT)
        • Retrieve Persistable object from Storage(s) (KT)
    • Prepare Data Access for Pipeline (KT)
      • Configure Object Catalog for association (KT, Serge)
    • Initialize Catalogs (KT)
    • Stage Input Data (KT, SteveP, AndyB)
      • Setup access to Image Collection (KT, SteveP, AndyB)
      • Setup access to Co-add/Template Collection (KT, SteveP, AndyB)
      • Retrieve Template/Co-Add covering an area (KT, AndyB)
      • Retrieve Image from Image Collection (KT, AndyB)
      • Retrieve Image Fragments from Image Collection (DC3b)
    • Record Pipeline Provenance (KT)
    • Event Handling (SteveP)
      • Record Event (SteveP)
        • Establish Event Transmitter In Event System (SteveP)
          • Create Event Transmitter (SteveP)
        • Publish Event Using Event System (SteveP)
          • Publish Event (SteveP)
      • Subscribe to an Event Topic (SteveP)
        • Establish Event Receiver In Event System (SteveP)
          • Create Event Receiver (SteveP)
        • Retrieve Event Using Event System (SteveP)
          • Receive Event (SteveP)
        • Retrieve Matching Event Using Event System (SteveP)
          • Matching Receive Event (SteveP)
            • Receive Event (SteveP)
    • Run Event Monitor (SteveP)
      • Initialize Event Monitor (SteveP)
      • Create Timer (SteveP)
      • Process Incoming Event (SteveP)
      • Publish Event Using Event System (SteveP)
    • Run a Pipeline (SteveP)
      • Configure Pipeline (SteveP)
        • Select Input Sources and Output Sinks (SteveP)
        • Set Monitoring and Control Parameters (SteveP)
          • Retrieve Default Pipeline Policies (SteveP)
        • Distribute Programs to Processing Nodes (SteveP)
        • Initialize Processing Nodes (SteveP)
      • Stage Input Data (SteveP, KT)
      • Pipeline Execution and Monitoring (Greg)
        • Load Pipeline Policy (Greg)
        • Record Pipeline Provenance (KT)
        • Initialize Pipeline Events (Greg)
        • Create Slices (Greg)
          • Create Slice Intracommunicator (Greg)
        • Execute Processing Stage (Greg)
      • Record Pipeline Execution Status (Greg)
        • Record Event (Greg)
      • Monitor Pipeline Execution (SteveP)
        • Detect Failure (SteveP, Greg)
        • Display Pipeline Status (SteveP)
      • Stop Pipeline Execution (SteveP, Greg)
        • Shutdown Slices (SteveP, Greg)
      • Clean Up after Execution (SteveP)

3.1. Composition of the Pipelines

DC3a: Alert Production

Notes:

  • Francesco is handling MOPS completely; the actual stages may be different than described below.
  • A separate, run-once script will run to prep the Ephemeris catalog.
  • Serge is handling Association completely; the actual stages may be different than described below.
Pipelines and stages (policy file in parentheses; person responsible after "--"):

Pipeline: Night MOPS -- Francesco
  * Input Stage for MOPS (NA) -- Francesco
  * Perform MOPS (nightmops/NightMopsStagePolicy.paf) -- Francesco
  * Output Stage for MOPS (nightmops/output_policy.paf) -- Francesco
  * Issue event (nightmops/output_policy.paf) -- Francesco

Pipeline: Image Processing/Subtraction/Detection -- K-T
  * Input Stage for MaskedImage, triggered by Exposure1Event -- K-T (AndyB)
    * Transform slice id into amp, CCD, and other identifiers (IPSD/Sliceinfo_policy.paf)
    * Link input files into input directory (IPSD/SymLink_policy.paf)
    * Load input image (exposure 0) (IPSD/ImageInput0_policy.paf)
  * Output Stage for Visit metadata -- K-T
    * Transform input event into visit metadata (exposure 0) (IPSD/visitMetadata0.policy.paf)
    * Transform image metadata into LSST standard (exposure 0) (IPSD/TransformMetadata0_policy.paf)
    * Validate that metadata is in LSST standard form (exposure 0) (IPSD/ValidateMetadata0_policy.paf)
    * Persist the per-visit metadata (exposure 0) (IPSD/visitMetadataOutput0_policy.paf)
    * Persist the per-exposure metadata and the raw image (exposure 0) (IPSD/rawImageAndMetadataOutput0_policy.paf)
  * Input Stage for calibration products
    * Determine which calibration data products to load (IPSD/identifyCalibrationProducts_policy.paf)
    * Load the calibration data products (IPSD/calibrationInput_policy.paf)
  * Perform ISR (exposure 0) (IPSD/isr0_policy.paf) -- AndyB
  * Image characterization (per segment) (exposure 1) -- Martin/RHL
    * Detect sources for WCS (exposure 0) (IPSD/sourceDetection0_policy.paf)
    * Measure sources for WCS (exposure 0) (IPSD/sourceMeasurement0_policy.paf)
    * Determine PSF (exposure 0) (IPSD/psfDetermination0policy.paf)
  * Persist WCS sources and PSF (exposure 0) (IPSD/wcsSourcesAndPsfOutput0_policy.paf) -- K-T (RHL)
  * Input Stage for (Masked?)Image (exposure 2) -- K-T
    * Triggered by Exposure2Event
    * Same as for exposure 0
  * Perform ISR (exposure 0) (IPSD/isr1_policy.paf) -- AndyB
  * Image characterization (per segment) (exposure 2) (NA) -- Martin/RHL
  * Output Stage for WCSSources and PSF (exposure 2) (NA) -- K-T (RHL)
  * Load WCS sources from entire CCD (IPSD/wcsSourcesInput_Policy.paf) -- K-T (RHL)
  * Image characterization (per CCD) -- Martin/RHL
    * Determine WCS based on CCD's WCS sources (IPSD/wcsDetermination_policy.paf)
  * Persist calibrated science exposures (IPSD/calibratedExposuresOutput_policy.paf) -- K-T (RHL)
  * Load WCS from template image (IPSD/templateMetadataInput_policy.paf) -- K-T (RHL)
  * Determine bounding box of exposures within template image (IPSD/templateBbox_policy.paf) -- AndyB
  * Input Stage for template MaskedImage -- K-T
    * Load subimage of template corresponding to exposure (IPSD/templateSubimageInput_policy.paf)
  * Image Subtraction -- AndyB
    * Subtract template from exposure (exposure 0) (IPSD/imageDifference0_policy.paf)
    * Subtract template from exposure (exposure 1) (IPSD/imageDifference1_policy.paf)
  * Persist difference images and kernels (IPSD/differenceImageAndKernelOutput_policy.paf) -- K-T (AndyB)
  * Co-add difference images and detect DIASources (IPSD/addAndDetect_policy.paf) -- Martin/RHL
  * Source Classification -- SMM
    * Measure DIASources in both exposures (IPSD/diasourceMeasurement_policy.paf)
    * Convert the Sources to Dia sources (IPSD/sourceToDiaSource_policy.paf)
    * Classify DIASources (IPSD/sourceClassification_policy.paf)
  * Persist DIASources (IPSD/diaSourceOutput_policy.paf) -- K-T (Martin)
  * Persist SDQA ratings (IPSD/sdqaOutput_policy.paf) -- K-T (Russ)
  * Issue event for association indicating new detections available (IPSD/associationEvent_policy.paf)

Pipeline: Association -- Serge
  * loading -- Serge
  * input -- Serge
  * DIA Source Matching -- Serge
  * output -- Serge
  * input -- Serge
  * MOPS Matching -- Serge
  * output -- Serge
  * storing -- Serge

The dependency relationships between the underlying packages can be seen in wiki:DC3/PackageDependencies.
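Since every stage in the table above is driven by a named policy file, a useful integration-test pre-flight check is to confirm that each listed policy actually exists in the run area before the pipeline is launched. The sketch below is illustrative only: the run-area root is an assumption, and the manifest is a partial excerpt of the paths in the table above.

{{{
#!python
# Pre-flight check (sketch): verify that the stage policy files listed in the
# DC3a pipeline table are present before launching an integration run.
# POLICY_ROOT is a hypothetical run-area root; extend STAGE_POLICIES from the
# table above as needed.
import os
import sys

POLICY_ROOT = "/lsst/DC3a/policies"

STAGE_POLICIES = [
    "nightmops/NightMopsStagePolicy.paf",
    "nightmops/output_policy.paf",
    "IPSD/Sliceinfo_policy.paf",
    "IPSD/isr0_policy.paf",
    "IPSD/wcsDetermination_policy.paf",
    "IPSD/diaSourceOutput_policy.paf",
    # ...remaining entries from the table above
]

def missing_policies(root, paths):
    """Return the policy paths that are not present under root."""
    return [p for p in paths if not os.path.isfile(os.path.join(root, p))]

if __name__ == "__main__":
    missing = missing_policies(POLICY_ROOT, STAGE_POLICIES)
    for p in missing:
        print("missing policy file: " + p)
    sys.exit(1 if missing else 0)
}}}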

4. Features to be tested

The DC3a goals are to:

  • characterize performance of all pipelines within DC3a Alert Production;
  • characterize the difference in performance from DC2 to DC3a equivalent components (image subtraction through detection);
  • characterize the scalability of the algorithms over changing number of processors for all DC3a pipelines.

5. Approach

5.1 Testing Strategy

Unit tests for all Domain objects accessed during the Integration Testing should have been successfully run prior to the start of DC3a Integration Testing.

Integration Testing will proceed using a layered approach to adding new subsystems to the baseline. For composite items, innermost member items will be added to the validated baseline and tested, one at a time, until the entire composite item is incorporated. Then the composite item will be validated.

During DC3a Integration and Test, the basic course scenario for use cases and the non-exception scenario for activity flows will be tested. The objectives of DC3a are throughput measures, not reliability measures. DC3b's objectives include performance metrics such as reliability, so both the basic course and alternate courses will be exercised there.

5.2 Test Types

Subsystem Integration tests verify that the major software components work correctly with the rest of the system, and as specified in the architectural design. The common tests performed include:

  • White-box Tests
  • Black-box Tests
  • Performance Tests

5.3 Overall Test Sequencing

5.3.1 Middleware Integration and Test Sequence

The Middleware components' integration and tests should all occur prior to the Applications components' integration and tests. Due to dependencies, the integration testing should follow the order listed below. Test harnesses that create the appropriate external operating environment will be used for the middleware integration tests.

  1. Persist Data from Pipeline
  2. Prepare Data Access for Pipeline
  3. Stage Input Data
  4. Record Pipeline Provenance
  5. Event Handling
  6. Run Event Monitor
  7. Run a Pipeline
    7.1 An accurately deployed pipeline (i.e., stages, events, and communications framework) but using dummy stages (a minimal dummy-stage sketch follows this list)
    7.2 An accurately deployed pipeline using stages implementing algorithms
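Step 7.1 exercises the deployed pipeline framework using dummy stages. The following is a self-contained stand-in, not the pex_harness API: it only illustrates the pre-process / process / post-process pattern and the clipboard hand-off named in the 'Run a Pipeline' use cases, so the framework plumbing can be validated before real algorithm stages are swapped in (step 7.2).

{{{
#!python
# Minimal stand-in for a pipeline stage, for framework-only testing.
# This is NOT the pex_harness interface; it is a self-contained sketch of the
# pre-process / process / post-process pattern and clipboard hand-off.

class Clipboard(object):
    """Shared data passed between stages (a thin dict wrapper)."""
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key, default=None):
        return self._data.get(key, default)

class DummyStage(object):
    """Pass-through stage that only records that it ran."""
    def __init__(self, name, policy=None):
        self.name = name
        self.policy = policy or {}
    def preprocess(self, clipboard):
        pass                          # a real stage would validate its inputs here
    def process(self, clipboard):
        trace = clipboard.get("trace", [])
        trace.append(self.name)       # prove the stage was reached
        clipboard.put("trace", trace)
    def postprocess(self, clipboard):
        pass                          # a real stage might publish status events here

def run_pipeline(stages):
    """Drive a list of stages over a single clipboard, in order."""
    clipboard = Clipboard()
    for stage in stages:
        stage.preprocess(clipboard)
        stage.process(clipboard)
        stage.postprocess(clipboard)
    return clipboard

if __name__ == "__main__":
    cb = run_pipeline([DummyStage("input"), DummyStage("isr"), DummyStage("output")])
    assert cb.get("trace") == ["input", "isr", "output"]
    print("dummy pipeline ran all stages in order")
}}}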

5.3.2 Applications Integration and Test Sequence

Each pipeline will have a test harness which creates the appropriate external operating environment necessary for testing the pipeline's progress through its stages. This allows pipeline testing and validation in isolation.

When integrating the various pipelines into the Alert Production, it is notable that "Run Night Mops Pipeline" executes in parallel with a group of the other pipelines. "Run Night Mops Pipeline" integration testing may therefore occur independently and in parallel with the integration of that group. The order and independence of the production integration test is shown below, followed by a sketch of a driver that launches the two legs concurrently. Note: the Image Processing pipeline might be split into multiple pipelines depending on data partitioning factors.

1. Run the Image Processing Pipeline <-- in parallel with --> 3. Run Night Mops
2. Run the Association Pipeline
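Because the Night MOPS leg is independent of the Image Processing / Association leg, a production-level integration test driver can launch the two legs concurrently. The sketch below assumes hypothetical per-pipeline harness scripts (runImageProc.py, runAssociation.py, runNightMops.py); only the ordering it enforces comes from this plan.

{{{
#!python
# Launch the two legs of the Alert Production integration test in parallel:
#   leg A: Image Processing harness, then Association harness (sequential)
#   leg B: Night MOPS harness
# The script names are hypothetical placeholders for the real test harnesses;
# a real driver would also collect and report failures from both legs.
import subprocess
import threading

def run(cmd):
    """Run one harness and raise if it exits non-zero."""
    print("starting: " + " ".join(cmd))
    subprocess.check_call(cmd)

def leg_a():
    run(["python", "runImageProc.py"])    # step 1
    run(["python", "runAssociation.py"])  # step 2, only after step 1

def leg_b():
    run(["python", "runNightMops.py"])    # step 3, independent of leg A

if __name__ == "__main__":
    t = threading.Thread(target=leg_b)
    t.start()
    leg_a()
    t.join()
    print("Alert Production integration legs completed")
}}}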

5.3.2.1 Testing Sequence for Image Processing Pipeline

  • Test each of the following algorithms individually.
    • ISR
    • Image Characterization
    • Image Subtraction
    • Add Difference Exposures
    • Detect Sources
    • Measure Simple Sources
    • Source Characterization
  • Test a skeleton of the Image Processing Pipeline using dummy stages.
  • In the sequence shown, exchange a dummy stage for the appropriate algorithm's stage and test the new composite pipeline (a sketch of this incremental swap follows this list).
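A minimal sketch of the dummy-to-real exchange described above, with stages modelled as plain callables passing a clipboard dictionary. The stage order is taken from the list above, while the real-stage callables are hypothetical placeholders supplied by the algorithm developers.

{{{
#!python
# Incrementally swap dummy stages for real algorithm stages and re-run the
# composite pipeline after each swap. Stages here are plain callables
# (clipboard dict in, clipboard dict out); the real-stage callables passed to
# incremental_integration() are hypothetical.

STAGE_ORDER = [
    "ISR",
    "Image Characterization",
    "Image Subtraction",
    "Add Difference Exposures",
    "Detect Sources",
    "Measure Simple Sources",
    "Source Characterization",
]

def dummy_stage(name):
    """A pass-through stage that only records that it ran."""
    def run(clipboard):
        clipboard.setdefault("trace", []).append(name)
        return clipboard
    return run

def run_pipeline(stages):
    clipboard = {}
    for stage in stages:
        clipboard = stage(clipboard)
    return clipboard

def incremental_integration(real_stages):
    """real_stages: dict mapping stage name -> callable. Swap each one in,
    in the order above, and re-run the composite pipeline every time."""
    active = {}
    for name in STAGE_ORDER:
        active[name] = real_stages[name]
        composite = [active.get(n, dummy_stage(n)) for n in STAGE_ORDER]
        clipboard = run_pipeline(composite)
        # ...validate the clipboard / persisted outputs for this composite here
        print("composite pipeline OK with real stages: " + ", ".join(sorted(active)))
}}}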

5.3.2.2 Testing Sequence for Association Pipeline

  • Test the Association algorithm.
  • Test a skeleton of the Association Pipeline using dummy stages.
  • Test the Association pipeline using the Association algorithm stage.

5.3.2.3 Testing Sequence for Night MOPS

  • Test the Night MOPS algorithm;
  • Test a skeleton of the Night MOPS pipeline using a dummy stage;
  • Test the Night MOPS pipeline using the Night MOPS stage.

5.3.2.4 Testing Sequence for Alert Production

The Alert Production will be composed incrementally from the previously tested sub-productions.

  • Test a skeleton of the Alert Production framework
  • In the sequence shown below, exchange a dummy sub-production for the appropriate sub-production's specification, and test the new composite Alert Production.
    • Night Mops Pipeline
    • Image Processing Pipeline
    • Association Pipeline

5.4 Test Coverage Metric

The DM Unit Testing Standard requires unit testing of each module prior to inclusion within the baseline software; it also supports the developer's use of coverage analysis tools to verify each unit test suite's adequacy. A minimum test coverage metric is not defined for DC3.

6. Test Criteria

  • Item Pass/Fail Criteria
  • Entry & Exit Criteria
  • Suspension & Resumption Criteria

7. Test Deliverables

7.1 Prior to Testing

Tagged versions of every LSST software package to be compiled and/or linked within the product deliverable under test must be provided prior to testing. By definition, these tagged packages have already been successfully unit tested.

Each Pipeline Developer should prepare a test harness which steps through the entire pipeline process for a single slice of an image. The optimum test harness would use the pipeline harness (pex_harness) to set up a single-node pipeline that processes a single slice using appropriately configured stages. Alternatively, the test harness may be script-based, where the script sets up the data and processes each eventual stage in the appropriate sequence. The test harness and its dataset should be available from the SVN repository.
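As a concrete illustration of the script-based alternative, the sketch below stages the input data for one slice (one amp of one CCD) and pushes it through the stage sequence in order. Everything in it is an assumption: the directory names, slice identifiers, file names, and stage callables stand in for whatever the pipeline developer actually provides.

{{{
#!python
# Script-based single-slice test harness (sketch): stage the input data for
# one amp/CCD slice, then run each stage callable in sequence on it.
# All paths, identifiers, and stage callables below are hypothetical.
import os
import shutil

RAW_DIR  = "/lsst/DC3a/raw"        # assumed location of raw + calibration data
WORK_DIR = "/tmp/dc3a_slice_test"  # scratch area for this harness run
SLICE    = {"visit": 0, "ccd": 12, "amp": 3}

def stage_input_data(files):
    """Link the slice's input files into a clean working directory."""
    if os.path.exists(WORK_DIR):
        shutil.rmtree(WORK_DIR)
    os.makedirs(WORK_DIR)
    for f in files:
        os.symlink(os.path.join(RAW_DIR, f), os.path.join(WORK_DIR, f))

def run_slice(stage_callables, data):
    """Run each (name, stage) pair in order, passing the evolving data dict."""
    for name, stage in stage_callables:
        print("running stage: " + name)
        data = stage(data)
    return data

# Usage (with hypothetical stage functions isr, characterize, subtract, ...):
#   stage_input_data(["raw_v0_c12_a3.fits", "bias_c12_a3.fits", "flat_r_c12_a3.fits"])
#   result = run_slice([("ISR", isr), ("ImageChar", characterize)], {"slice": SLICE})
}}}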

The DC3a Test Plan (this document) should be available and include the following:

  • Descriptions of Test Cases
  • Description of Test Procedures
  • Description of Test Data

7.2 On Completion

On completion of the Integration Test period, the following items should be available:

  • Test reports including the build and test logs;
  • Record of Problem Report Tickets arising during the integration testing; and
  • Summary of the integration test results.

8. Testing Tasks

  • identify the set of tasks necessary to prepare for and perform testing;
  • identify all inter-task dependencies.

9. Environmental Needs

See DC3 Platforms, Section "LSST Development Cluster at NCSA" for the hardware and operating system specifications of the NCSA Cluster to be used for DC3a benchmarking runs.

The add-on third party software requirements are maintained within the LSST Build.

10. Schedule

See DC3Schedule for the project schedule and milestones.

11. Planning Risks and Contingencies

DC3a Alert Production is an enhancement of the DC2 Nightly Pipeline. The non-negotiable objective is to increase the speed of the original DC2 Nightly Pipeline components. Desirable objectives include the incorporation of additional pipeline functionality for ISR processing, WCS creation, and paired-exposure processing, and the use of more accurate simulated LSST exposures.

11.1 Contingency: Simulated LSST Exposure Dataset

The optimum configuration for the DC3a benchmark includes the use of the new ISR pipeline and the new simulated LSST exposures. The datasets required for the ISR Pipeline may need to be enhanced to include all the calibration artifacts required.

Should the simulated LSST exposure dataset not include the calibration artifacts required by the ISR pipeline, the CFHT-LS dataset may be used as the alternate dataset. The simulated LSST calibration artifacts should be available for DC3b.

11.2 Contingency: Calibration Artifacts

If neither the simulated LSST exposure dataset nor the CFHT-LS dataset includes the calibration artifacts required by ISR, the ISR Pipeline will be postponed until DC3b, by which time the calibration artifacts will have been generated.

11.3 Contingency: ISR Pipeline

Should the new ISR Pipeline be unavailable, that pipeline will be dropped. The ISR Pipeline was not included in the DC2 benchmarking.

11.4 Contingency: WCS Pipeline

Should the new WCS Pipeline be unavailable, the WCS embedded in the original source CFHT images will be used, as was done in the DC2 benchmarking.

11.5 Contingency: Visit Processing Pipeline

Should paired exposures be unavailable for the selected dataset, only a single exposure will be processed through the Alert Production input stream as was done in the DC2 benchmarking.

12. Test Case Specification

Test specifications will be generated using the EA TestGen capability. Until that is delivered, the layout and the actual text of the Use Cases described in the Appendix below indicate the general flow of test case operation.

Appendix

Use Cases and I/O Requirements

NOTE: Missing Use Case ==> no EA Use Case object defined
NOTE: Missing Use Case text ==> no Note text within EA Use Case object
NOTE: Either add Use Case text or ensure maps to Domain object

This section is transitioning into the DC3a Pipeline & Stage structure. Many of the use cases represent computational steps within a single pipeline stage.

  • Alert Production a sequence of pipelines
    • Night MOPS pipeline, a single computational stage
    • Image Processing Pipeline, a sequence of computational stages
      • Input
        • event
        • raw CFHT science images in r & i filters
        • master calibration images for
          • bias
          • darks (if these don't exist, skip 'Dark Current Correction')
          • flat(s)
          • bad pixel mask
          • fringe images
          • pupil images (DC3b)
          • scattered light images
          • illumination correction images (DC3b)
      • Policy
        • default Data Release Policy
      • Setup Image Processing Pipeline
      • Run ISR (per sub-CCD unit) Pipeline, a sequence of computational stages
        • Split FPA Exposure (missing Use Case text)
          • Input
          • Output
        • Validate Exposure Type
          • Input
          • Output
        • Remove Bias
          • Input
            • Raw Exposure
            • Master Bias Exposure
            • ISR Stage Policy
          • Output
            • bias corrected Raw Exposure
            • SDQA metrics
            • Provenance
        • Subtract Overscan
          • Input
            • Raw Exposure
            • ISR Stage Policy
          • Output
            • overscan corrected Raw Exposure
            • SDQA metrics
            • Provenance
        • Linearize (missing Use Case)
          • Input
            • raw Exposure
            • lookup table (optional)
            • ISR Stage Policy
          • Output
        • Correct for Dark Current (missing Use Case)
          • Input
          • Output
        • Flatten
          • Input
            • Chunk Exposure
            • Master Flat Field Chunk Exposure, in appropriate filter/bandpass and type (dome, twilight, night sky)
            • ISR Stage Policy
              • ISR Flat Field Correction Policy File
          • Output
            • flat field corrected Chunk Exposure
            • SDQA metrics
        • Defringe
          • Input
            • Chunk Exposure
            • Master Fringe Chunk Exposure
            • ISR Defringe Policy
            • ISR Stage Policy
          • Output
            • fringe corrected Chunk Exposure
            • SDQA metrics
        • Correct Saturation
          • Input
            • Chunk Exposure
            • ISR Saturation Correction Policy
            • ISR Stage Policy
          • Output
            • saturation corrected Chunk Exposure
            • SDQA metrics
        • Remove Cosmic Rays
          • Input
            • Chunk Exposure
            • ISR Cosmic Ray Detection Policy
            • ISR Stage Policy
          • Output
            • cosmic ray cleaned Chunk Exposure
            • SDQA metrics
        • Trim (missing Use Case text)
          • Input
          • Output
      • Run Image Characterization Pipeline (per sub-CCD unit) a single stage with multiple computational steps
      • Run Image Characterization (per CCD), a single stage with multiple computational steps
      • Run Image Subtraction, a single stage consisting of a sequence of computational steps
        • Get Template Exposure
          • Input
            • science chunk Exposure
            • template Exposure
            • Type and size of WCS matching kernel
          • Output
            • template chunk Exposure
        • WCS Match Exposure
          • Input
            • original Exposure
            • remappedExposure
            • ?desired sky coordinates of center?
            • desired size (col, row) in pixels
            • type and size (col, row) of geometry remapping kernel (from Policy)
          • Output
            • remapped Exposure
        • PSF Match Masked Image
        • Subtract Image from Template (missing Use Case text)
          • Input
          • Output
      • Run the Detection Algorithm
        • Input for Detection Pipeline
        • Run Visit Processing Stages, a single stage consisting of multiple computational steps
        • Run Measure Simple Sources Pipeline - a single computational stage
          • Measure Simple Sources
            • Input
              • WCS extracted from Exposure
              • footprint for positive sources
              • footprint for negative sources
            • Output
              • DIA Source list
      • Run Source Characterization, a single stage consisting of multiple computational steps
        • .... call-out steps when known
      • Tear down Image Processing Pipeline
      • Output of Image Processing Pipeline
    • Run the Association Pipeline (missing Use Case text)
  • Run Association, a single stage consisting of multiple computational steps (a zone-matching sketch follows this outline)
    • Preprocess AstroObject Catalog
      • Input
        • Object catalog from deep detection
        • Height of stripes (H)
        • Width of chunks.
      • Output
        • Chunk files containing object attributes necessary for association
    • Calculate Object Zone Indices for Visit
      • Input
        • visitId of current visit
        • RA of the current telescope pointing
        • Dec of the current telescope pointing.
        • MJD of current exposure
        • Name of filter of current exposure ('u', 'g', 'r', 'i', 'z', or 'y').
      • Policy
        • Size of LSST FOV (radius)
        • Sky partitioning parameter
        • Location of chunk files/chunk delta files
      • Output
        • zone index for objects in the FOV
    • Match DIA Sources to AstroObjects
      • Input
        • (optional) Match radius R for difference source to object matching
        • Difference sources for visit from detection pipeline
        • Moving object predictions for visit from moving object pipeline
        • Zone index for objects from "Prepare for Visit"
      • Policy
        • Default match radius R for difference source to object matching.
        • Maximum semi-major axis length for moving object prediction error ellipses
        • Clamp values for semi-major and semi-minor axis lengths
      • Output
        • List of difference source to object matches
        • List of moving object prediction to difference source matches
        • List of difference sources to create new objects from
    • Update AstroObject and DIA Source Catalogs
      • Input
        • Difference sources for current visit from detection pipeline
        • Moving object predictions for current visit from moving object pipeline
        • List of difference source to object matches (from "Match DIA Sources to AstroObjects") for current visit
        • List of moving object prediction to difference source matches (from "Process Visit") for current visit.
        • List of ids for difference sources to create new objects from (from "Process Visit") for current visit
      • Policy
        • Sky partitioning parameters.
        • Location of chunk delta files.
      • Output
        • Updated historical Object and DIASource catalog
        • Updated chunk delta files
  • Tear down Association Pipeline
  • Output of Association Pipeline
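For concreteness, the following is a minimal sketch of the zone-index matching described in "Calculate Object Zone Indices for Visit" and "Match DIA Sources to AstroObjects" above. It is illustrative only: the stripe height, match radius, and in-memory data structures are placeholders, not the Association Pipeline's actual implementation or policies.

{{{
#!python
# Sketch of zone-based spatial matching: assign objects to declination zones,
# then match each DIA source against objects in its own and neighbouring zones
# within a radius R. Illustrative only; all parameters are placeholders.
import math

ZONE_HEIGHT_DEG = 0.05          # assumed stripe height H (policy-driven in DC3a)

def zone_index(dec_deg, zone_height=ZONE_HEIGHT_DEG):
    """Assign a declination zone index to a position."""
    return int(math.floor((dec_deg + 90.0) / zone_height))

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Approximate angular separation in degrees (adequate for small radii)."""
    dra = (ra1 - ra2) * math.cos(math.radians(0.5 * (dec1 + dec2)))
    ddec = dec1 - dec2
    return math.hypot(dra, ddec)

def build_zone_index(objects):
    """objects: list of (objectId, ra, dec). Returns {zone: [entries]}."""
    zones = {}
    for obj_id, ra, dec in objects:
        zones.setdefault(zone_index(dec), []).append((obj_id, ra, dec))
    return zones

def match_sources(dia_sources, zones, radius_deg):
    """Match each DIA source (srcId, ra, dec) to objects within radius_deg."""
    matches = []
    for src_id, ra, dec in dia_sources:
        z = zone_index(dec)
        # a source near a zone boundary can match objects in the adjacent
        # zones, so search z-1 .. z+1 (assumes radius_deg <= zone height)
        for zz in (z - 1, z, z + 1):
            for obj_id, ora, odec in zones.get(zz, []):
                if angular_sep_deg(ra, dec, ora, odec) <= radius_deg:
                    matches.append((src_id, obj_id))
    return matches
}}}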

Middleware Use Cases

Pipeline Control
Run a Pipeline
.. -- Configure Pipeline
...... -- Set Monitoring and Control Parameters
.......... -- Retrieve Default Pipeline Policies
...... -- Select Input Sources and Output Sinks
...... -- Distribute Programs to Processing Nodes
...... -- Initialize Processing Nodes

.. -- Stage Input Data
...... -- Setup access to Co-add/Template Collection
...... -- Setup access to Image Collection
...... -- Retrieve Image from Image Collection
...... -- Retrieve Template/Co-Add covering an area
...... -- Retrieve Image Fragments from Image Collection
...... -- Stage a Named Collection
...... -- Preload a Database

.. -- Pipeline Execution and Monitoring
...... -- Load Pipeline Policy
...... -- Execute Processing Stage
.......... -- Initialize Processing Stage
.......... -- Process a Data Input Through A Stage
.............. -- Pre-Process
.............. -- Perform InterSlice Communication
.................. -- Retrieve Shared Data from Clipboard
.................. -- Transmit Data between Slices
.................. -- Post Received Data to Clipboard
.............. -- Process
.............. -- Post Process
.......... -- Terminate Processing Stage
...... -- Record Pipeline Provenance
...... -- Initialize Pipeline Events
.......... -- Subscribe to an Event Topic
.............. -- Establish Event Receiver In Event System
.................. -- Create Event Receiver
.............. -- Retrieve Event Using Event System
.................. -- Receive Event
.............. -- Retrieve Matching Event Using Event System
.................. -- Matching Receive Event
...................... -- Receive Event
.......... -- Create Event Transmitter
...... -- Create Slices
.......... -- Create SliceIntracommunicator
.......... -- Load Pipeline Policy
.......... -- Execute Processing Stage
.............. -- Initialize Processing Stage
.............. -- Process a Data Input Through A Stage
.................. -- Pre-Process
.................. -- Perform InterSlice Communication
...................... -- Retrieve Shared Data from Clipboard
...................... -- Transmit Data between Slices
...................... -- Post Received Data to Clipboard
.................. -- Process
.................. -- Post Process
.............. -- Terminate Processing Stage

.. -- Monitor Pipeline Execution
...... -- Recover from Software Failure
...... -- Recover from Hardware Failure
...... -- Display Pipeline Status

.. -- Clean Up after Execution

.. -- Stop Pipeline Execution
...... -- Shutdown Slices

Event Handling (a minimal publish/subscribe stand-in is sketched after this block)
.. -- Run Event Monitor
.... -- Initialize Event Monitor
.... -- Publish Event Using Event System
.... -- Process Incoming Event
.... -- Publish Event
.... -- Create Timer

.. -- Subscribe to an Event Topic
.... -- Establish Event Receiver In Event System
........ -- Create Event Receiver
.... -- Retrieve Event Using Event System
........ -- Receive Event
.... -- Retrieve Matching Event Using Event System
........ -- Matching Receive Event
............ -- Receive Event

.. -- Record Event
.... -- Establish Event Transmitter In Event System
........ -- Create Event Transmitter
.... -- Publish Event Using Event System
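The Event Handling use cases above can be illustrated with a self-contained stand-in. This is not the LSST event middleware API, just an in-process sketch of the topic-based transmitter/receiver flow (Create Event Transmitter, Publish Event, Create Event Receiver, Receive Event, Matching Receive Event) that the integration tests exercise.

{{{
#!python
# Toy in-process event broker: one FIFO list of event dicts per topic.
# NOT the LSST events middleware; a stand-in for the use cases listed above.
from collections import defaultdict

class EventSystem(object):
    def __init__(self):
        self._topics = defaultdict(list)
    def create_transmitter(self, topic):
        return EventTransmitter(self._topics[topic])
    def create_receiver(self, topic):
        return EventReceiver(self._topics[topic])

class EventTransmitter(object):
    def __init__(self, queue):
        self._queue = queue
    def publish(self, event):
        """Publish Event: append a copy of the event to the topic queue."""
        self._queue.append(dict(event))

class EventReceiver(object):
    def __init__(self, queue):
        self._queue = queue
    def receive(self):
        """Receive Event: next event on the topic, or None if none pending."""
        return self._queue.pop(0) if self._queue else None
    def matching_receive(self, **criteria):
        """Matching Receive Event: next event whose fields match all criteria."""
        for i, event in enumerate(self._queue):
            if all(event.get(k) == v for k, v in criteria.items()):
                return self._queue.pop(i)
        return None

if __name__ == "__main__":
    system = EventSystem()
    tx = system.create_transmitter("triggerImageprocEvent")  # topic name assumed
    rx = system.create_receiver("triggerImageprocEvent")
    tx.publish({"visitId": 42, "exposureId": 0})
    print(rx.matching_receive(visitId=42))
}}}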

Prepare Data Access for Pipeline
.. -- Generate list of Sky Patches
.. -- Prepare Data Access for Pipeline
.... -- Configure Object Catalog for association
.... -- Initialize Catalogs

.. -- Retrieve Image Fragments from Image Collection
.. -- Retrieve Template/Co-Add covering an area
.. -- Setup access to Co-add/Template Collection
.. -- Setup access to Image Collection

Persist Data from Pipeline (missing Use Case text; a minimal persistence-flow sketch follows this block)
.. Define persistence policies

.. Execute persistence
.... -- Obtain and configure Persistence object
.... -- Obtain and configure Storage object(s) for persistence
.... -- Specify additional object metadata
.... -- Format and send Persistable object to Storage(s)

.. Execute retrieval
.... -- Obtain and configure Persistence object
.... -- Obtain and configure Storage object(s) for persistence
.... -- Specify additional object metadata
.... -- Retrieve Persistable object from Storage(s)

.. Persist Data from Pipeline
.... -- Define persistence policies
.... -- Execute persistence
........ -- Obtain and configure Persistence object
........ -- Obtain and configure Storage object(s) for persistence
........ -- Specify additional object metadata
........ -- Format and send Persistable object to Storage(s)
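A minimal sketch of the "Persist Data from Pipeline" flow above, using JSON files as the storage. This is not the LSST persistence middleware API; it only mirrors the steps named in the use cases (obtain and configure a Persistence object, configure Storage objects, attach additional metadata, format and send or retrieve a Persistable).

{{{
#!python
# Self-contained stand-in for the persistence flow above. A Persistable is
# modelled as a plain dict, and each Storage is a directory of JSON files.
import json
import os

class Storage(object):
    """One storage target; here, a directory of JSON files."""
    def __init__(self, root):
        self.root = root
        os.makedirs(root, exist_ok=True)
    def path(self, name):
        return os.path.join(self.root, name + ".json")

class Persistence(object):
    """Formats a persistable and sends it to the configured storages."""
    def __init__(self, storages):
        self.storages = list(storages)

    def persist(self, persistable, name, extra_metadata=None):
        record = {"metadata": extra_metadata or {}, "data": persistable}
        for storage in self.storages:                 # format and send
            with open(storage.path(name), "w") as f:
                json.dump(record, f)

    def retrieve(self, name):
        with open(self.storages[0].path(name)) as f:  # retrieve from storage
            return json.load(f)["data"]

if __name__ == "__main__":
    persistence = Persistence([Storage("/tmp/dc3a_persist_test")])
    persistence.persist({"diaSourceId": 1, "ra": 10.0, "dec": -5.0},
                        "diaSource0001", extra_metadata={"visitId": 42})
    print(persistence.retrieve("diaSource0001"))
}}}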

Simulated Input Data

Initial Release

Date: Thu, 22 Jan 2009 12:55:19 -0800

We are happy to announce the availability of the initial image simulation data (10% of the full DC3a sample - see http://hypernews.slac.stanford.edu/HyperNews/LSST/get/ImageSim/27.html for details of the DC3 requirements).

While we arrange making them accessible from NCSA they can be downloaded from http://www.astro.washington.edu/ajc/LSST/ImageSims/images/v0.1

There are three sets of images and associated catalogs: Deep (19 images of the same field with variable seeing and airmass), Wide (162 images covering the focal plane - 3 rafts - at different airmass and seeing), Ideal (27 images with 0.1 arcsec seeing, airmass of unity and no sky background which can be used as a "truth" image). Details of the parameters for each image can be found in the file dc3.txt.

The associated catalogs are imageName.stars.dat and imageName.galaxy.dat which contain all sources to r=28. All images are in the r band. We also provide catalogs of the stars used in defining the WCS for each image. Details for these catalogs can be found in the README at the url above.

All images have an associated WCS (fit independently as a linear relation for each CCD) and each image represents a full CCD (4096x4096).

These images are meant for evaluation by DM prior to the full DC3 run. We would greatly appreciate feedback on these images (including formats, header info and any enhancement requests for DC3a) before we move to generating the full data set. We'd like to get this feedback in the next 2 weeks to allow for time to make any required adjustments.

cheers,
ImageSim Team

Proposal to Upgrade for DC3a ISR Needs

Compliments of Andy Connolly (12 Feb 2009):

Thought this through a bit and have a strawman proposal. We post
process the simulated images to get the wcs so during this stage we
could:

1. Apply the inverse of an appropriate linearity table - someone would
have to point us to the appropriate values.

2. Extend the size of the image to include an overscan region and add
datasec to the fits header

3. Multiply by an appropriate flat field (with minimal noise)

4. Add a mean zero bias with appropriate noise to each image (and set
the readout noise in the header)

5. Add bad pixels/columns from a bad column file for each detector



We would also

1. Generate a zero mean master bias with some reduced amount of noise
that could be subtracted in ISR

2. Create a master flat that has mean unity and some reduced amount of
noise. This would initially be the same for each image but if we get
time we will generate something which maps the vignetting function.

3. Create a bad pixel mask - someone would have to explain the format
required for this
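For concreteness, here is a rough numpy sketch of the kind of post-processing Andy proposes above; every numeric value, array shape, and detail of step ordering is a placeholder, not the actual DC3a simulation parameters.

{{{
#!python
# Rough sketch of the proposed simulated-image post-processing: apply an
# inverse linearity, multiply by a flat, add a noisy zero-mean bias, mark bad
# pixels, and append an overscan region. Values and shapes are placeholders.
import numpy as np

OVERSCAN_COLS = 32        # assumed overscan width
READ_NOISE    = 5.0       # assumed read noise (ADU); also recorded in the header
rng = np.random.default_rng(0)

def degrade(image, flat, bad_pixels, lut=None):
    """image, flat: 2-D arrays; bad_pixels: list of (row, col) pairs;
    lut: optional Nx2 inverse-linearity lookup table (input, output)."""
    img = image.astype(float)

    # 1. apply the inverse of a linearity table (identity if none supplied)
    if lut is not None:
        img = np.interp(img, lut[:, 0], lut[:, 1])

    # 3. multiply by a (nearly noiseless) flat field
    img *= flat

    # 4. add a zero-mean bias with read noise
    img += rng.normal(0.0, READ_NOISE, size=img.shape)

    # 5. flag bad pixels/columns by clobbering their values
    for r, c in bad_pixels:
        img[r, c] = 0.0

    # 2. extend the image with an overscan region (the real script would also
    #    set DATASEC and the read noise in the FITS header)
    overscan = rng.normal(0.0, READ_NOISE, size=(img.shape[0], OVERSCAN_COLS))
    return np.hstack([img, overscan])
}}}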

CFHT Input Data

Compliments of Nicole (12 Feb 2009):

>Is the existing CFHT++ dataset collected and/or manufactured for DC2 
>adequate for ISR needs?  
Probably not.  It is my understanding that DC2 ran on calibrated CFHT
data.  That data was processed through CFHT's Elixir pipeline which is
essentially CFHT's version of the ISR.  The ISR will need the CFHT raw
data and will then need to be 'chopped-up' appropriately into whatever
size images we'll be working on for DC3.  We decided to use the CFHT D4
data for DC3 which we felt most closely resembled LSST data for testing
purposes.  I do not know which of the CFHT data sets were used for DC2.

Dave Gehrig downloaded the CFHT D4 raw data to ds33.ncsa.uiuc.edu at
NCSA this summer.   The science data are in the subdirectories under
/lsst_ibrix/scratch/D4/.  Images ending with an 'o' are raw and images
ending in a 'p' are processed (calibrated) with Elixir.  We opted to
download only the r and i filter images as these have the most objects
and best signal-to-noise of all the filters.  These are all gzipped MEF
files.

The corresponding master calibration images are in
/lsst_ibrix/scratch/calib/.  It takes some archeology of the calibrated
image headers to determine which calibration image goes with which raw
science image.  I have not verified that each science image has all of
its corresponding calibration images in this directory so that will need
to be checked at some point. The only calibration images that were
downloaded are the master bias', master flats (in r and i filter), and
master fringe images (in r and i filter).    These are all gzipped MEF
files.
>Are there calibration products (i.e. input) which need to 
>be fabricated or acquired for any stages?  If so, what's missing and has 
>someone already been assigned to fabricate/collect them

To run all current DC3a stages of the ISR, which are as follows:
 * Saturation Correction
 * Bias Correction
 * Dark Current Correction
 * Overscan Correct and Trim
 * Linearization
 * Flat Field Correction
   * Illumination Correction for DR
   * Illumination Correction for Nightly
 * Fringe Correction
 * Cosmic Ray Detection (a rough first pass)

we need:
* raw CFHT science images in r & i filters.

All of these images then need to be unzipped, split from their MEF
format, headers updated to LSST standard, and converted to LSST
Exposures (I have python scripts to do this and I think someone wrote
one for DC2 that can probably be used.)

* master CFHT calibration images (with the same massaging as above) for
the following:
>    * bias
>  
These have been downloaded.  Need to verify that we have one for each night.

>    * darks
>  
There are no master darks in the calibration images directory.  This
could be for two reasons:
1) Dave missed them
2) CFHT did not take them for the deep images (which is the likely
explanation).  I recall poking around on the CFHT site for these a while
ago and didn't find any either but I didn't do a thorough search.

If 2) we can either setup the main ISR policy to skip the 'Dark Current
Correction' stage of the ISR or we can create a zero-valued image to
subtract off so we have something to run on for this stage.  We will
probably want to create the dummy dark image as we'll likely be
interested in timing the ISR at some point and will want to know how
long it takes to run each stage...but that's Tim or Robert's call.

>    * flat(s)
>  
These have been downloaded in both r and i filters.  Need to verify that
we have one for each filter for each night.

>    * bad pixel mask
>  
The bad pixel masks for these images will need to be downloaded. They do
exist (I found one for the test raw D4 image I have been using on the
CFHT site)  so we'll need to grab these from CFHT.

>    * fringe images
>  
These have been downloaded in both r and i filters.  Need to verify that
we have one for each filter for each night.

>    * pupil images
>  
These exist, but the pupil correction will require inter-slice
communication so we're skipping this stage for DC3a.  The implementation
of this stage is slated for DC3b.

>    * scattered light images 
>  
These exist as well but we need to determine exactly what these images
are.  I had assumed that by 'scattered light' the CFHT folks meant
'illumination correction' images.  But that may not be what these images
are.

The ISR will eventually need illumination correction (IC) images but it
is setup to run two different ways.  For nightly processing, the ISR
computes an estimate of the IC for the current images based on the
previous nights illumination correction.  For Data Release, we'll need
the full illumination correction for the current data.

So we need to verify that the scattered light images are indeed
illumination correction images.  I'll check on that. If they are, they
will need to be downloaded from CFHT for each night in each filter.
