wiki:DC3bReport
Last modified on 05/02/2012 04:28:49 PM

DC3b Final Report

The outline and intended content for the final report on Data Challenge 3b (through the Summer 2011 data release and the Winter 2012 software release) are given below. The content of this report differs somewhat from that of the DC3a final report in that it will not include an explicit description of the following:

  • the software per se, its design, and organization
  • the processes used to create the software

These topics may be touched on throughout the report. Rather, the focus of the report will be on:

  • What was accomplished in DC3b development
  • What design and implementation choices were made, and why
  • What was learned in DC3b that should carry forward to the next DC
  • What known aspects of the Data Release Production (DRP) remain to be addressed

The updates to the software for the Winter 2012 release will be described at the end of relevant sections. These subsections should describe:

  • What the known issues were for the Summer 2011 DRP that were addressed in Winter 2012
  • What solution was selected, and why

The target date for completing this report is 25 May 2012.

Contents

Front Matter (50% Complete)

Title

LSST Data Challenge 3b: Data Release Production and Science Data Quality

Author List

  • Those who contribute materially to the report contents, followed by
  • Remaining Data Management System Team members, listed alphabetically by surname

Table of Contents

  • First- and second-level topics
  • (No lists of figures or tables)

Executive Summary (0% Complete)

1. Introduction (80% Complete)

Primary Authors: Shaw, (Freemon, Lim, Juric, others?)

  • Scope of DC3b
    • Description of the Data Release Production
    • What parts were prototyped?
  • Science and Technical Goals for DC3b
    • Data Simulation goals
    • Algorithm development goals
    • Production Goals
    • Quality Assessment goals
    • Science Engagement goals

2. Input Data (0% Complete)

Primary Authors: Krughoff, Peterson, Shaw

  • Image Simulations: which effects are simulated (and which are not), what types of objects are included, how are environmental effects included, etc.
  • Input Image Characteristics: how many images were processed, clouds or not, range of airmass, ambient seeing, sky brightness, distribution over area and filters, etc.
  • Calibration Data: which kinds of calibration are performed, how are calibration data prepared, etc.
  • Data Generation

3. Algorithm Development (0% Complete)

Primary authors: Lupton, Price, Becker, Lang, Owen (others?)

The subsections should describe what the various pipeline stages do (to the data) and, if there were significant design/implementation choices, what reasoning supported the choice.

  • Applications code in DRP context (Owen)
  • ISR (Becker, Krughoff)
    • How gain correction is applied, how backgrounds are determined, how CR detection on single frames works, why we process only one exposure per visit, etc.
  • Image characterization (Lang, Price, Lupton)
    • Lengthier descriptions of how objects are detected, how reference stars are identified, how WCS calibration works, how aperture corrections are determined, and how photometric zero-points are derived
  • Photometric Measurements (Price?)
    • Discussion of how the various types of photometric measurements work
  • Source Identification and Association (Monkewitz)
  • Third-party software that is employed (Lupton??)
  • Other topics?
  • Updates to Apps for W12 release
    • What were the problems?
    • Which ones were solved and how?
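The ISR steps listed above (gain correction and background determination) can be sketched in outline. The following is an illustrative NumPy sketch only, not the actual LSST stack API; the function names `gain_correct` and `estimate_background` and the box-median background scheme are hypothetical stand-ins for whatever the report ultimately describes.

```python
import numpy as np

def gain_correct(raw, gain):
    """Convert an amplifier image from ADU to electrons.

    `raw` is a 2-D array in ADU; `gain` is in e-/ADU.
    """
    return raw * gain

def estimate_background(image, box=64):
    """Crude background estimate: median over coarse boxes,
    broadcast back to full resolution (illustrative only)."""
    ny, nx = image.shape
    bg = np.zeros_like(image, dtype=float)
    for y0 in range(0, ny, box):
        for x0 in range(0, nx, box):
            tile = image[y0:y0 + box, x0:x0 + box]
            bg[y0:y0 + box, x0:x0 + box] = np.median(tile)
    return bg

# Toy example: a flat 100 ADU sky with a gain of 1.5 e-/ADU
raw = np.full((128, 128), 100.0)
corrected = gain_correct(raw, 1.5)
sky = estimate_background(corrected)
print(corrected[0, 0], sky[0, 0])  # 150.0 150.0
```

A production ISR stage would of course also handle bias/dark/flat frames, defect masks, and per-amplifier geometry; the sketch only fixes the shape of the two operations named in the bullet.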

4. Production Management (0% Complete)

Primary authors: Freemon, Lim, Allsman, (Juric)

  • Software Configuration and Continuous Integration (Allsman)
  • Hardware Context (Freemon)
    • Compute Platforms
    • Storage Resources
  • Data Wrangling (Butler, etc.) (Lim, others?)
  • [ Other middleware topics? ]
  • Updates to Middleware for W12 release (Lim)
    • What were the problems?
    • Which ones were solved and how?

5. Data Products and Quality Assessment (0% Complete)

Primary authors: Shaw, Juric, Monkewitz, Becker, Price, Becla, (Lupton, Monet?)

  • Data Product Description (Shaw)
  • Catalogs: discuss the DB schema and its mapping to the representation in Gator catalogs (Becla, Monkewitz)
  • Data Quality Artifacts: output from pipeQA (Price, Becker)
  • Quality Assessment
    • Completeness
    • World Coordinate Solutions
    • PSF shapes
    • Photometry
    • Astrometry
    • ImSim Fidelity
  • Updates to pipeQA for W12 release (Price, Becker)
    • What were the problems?
    • Which ones were solved and how?
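One concrete form the photometry assessment above can take is a comparison of pipeline magnitudes against the (simulated) reference catalog, yielding a zero-point offset and a robust scatter. This is a hedged sketch under assumed inputs; `photometric_residuals` is a hypothetical helper, not a pipeQA function, and the MAD-based scatter estimate is one reasonable choice among several.

```python
import numpy as np

def photometric_residuals(inst_mag, ref_mag):
    """Zero-point and robust scatter from matched star magnitudes.

    `inst_mag`: instrumental magnitudes from the pipeline;
    `ref_mag`:  catalog magnitudes of the same matched stars.
    """
    delta = ref_mag - inst_mag
    zeropoint = np.median(delta)                      # robust zero-point
    residuals = delta - zeropoint
    scatter = 1.4826 * np.median(np.abs(residuals))   # MAD scaled to ~sigma
    return zeropoint, scatter

# Toy matched catalog: an exact 25.0 mag zero-point, no noise
inst = np.array([10.0, 12.0, 14.0, 16.0])
ref = inst + 25.0
zp, sig = photometric_residuals(inst, ref)
print(zp, sig)  # 25.0 0.0
```

The same residual machinery applies, with different inputs, to the astrometry check (positional offsets against the reference catalog) and the completeness check (matched fraction versus magnitude).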

6. User Engagement (0% Complete)

Primary Authors: Shaw, Van Dyk, Juric, Kantor

  • Workshops (Shaw)
  • Documentation (Shaw)
  • Note on access to data by SciCollab members, their use of the data so far, etc. (Van Dyk, Shaw, Freemon)
  • User contributions to QA (Juric, Lupton?)
  • Requirements discovery & prototyping for SUI (Van Dyk)

7. Recommendations for Future Data Challenges (0% Complete)

Primary Authors: ???

References

Glossary