Changes between Version 5 and Version 6 of ProposalNDarray


Timestamp: 09/24/2009 05:47:44 PM
Author: jbosch

  • ProposalNDarray

    v5 v6  
    19 19 * include both strong-owning and non-owning variants, in which the owning version is convertible to the non-owning version ({{{boost::gil::image}}} and {{{boost::gil::view}}}).
    20 20 Existing LSST code seems to prefer the former, and since {{{ndarray}}} supports both, I'll assume that usage.  Right now, LSST has:
    21  * Two main one-dimensional array classes in use ({{{std::vector<double>}}}, {{{Eigen::VectorXd}}}).  Neither of these has shared ownership, and the most common one, {{{std::vector}}}, does not support views (STL iterators, of course, can substitute for views in many cases, but often require templates). 
    22  * Two main two-dimensional array classes in use ({{{afw::Image}}}, {{{Eigen::MatrixXd}}}). 
    23  * No three-dimensional array classes.  In our case, 3D arrays are almost always stacks of images, and while {{{std::vector<afw::image::Image>}}} is a possible workaround sometimes, it is no more the appropriate replacement for a true strided 3D array than a vector-of-vectors is a suitable replacement for an image. 
     21 * Two main one-dimensional array classes in use ({{{std::vector<double>}}}, {{{Eigen::VectorXd}}}).  Neither of these has shared ownership. 
     22 * Two main two-dimensional array classes in use ({{{afw::Image}}}, {{{Eigen::MatrixXd}}}).  The former has shared ownership, but only the latter supports optimized linear algebra operations.
     23 * No three-dimensional array classes.  In our case, 3D arrays are almost always stacks of images, and while {{{std::vector<afw::image::Image>}}} is sometimes a possible workaround, it is no more an appropriate replacement for a contiguous strided 3D array (see the indexing sketch just below this list) than a vector-of-vectors is a suitable replacement for an image.
    24 24
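To illustrate what a contiguous strided 3D array buys over a vector of separately allocated images, here is a minimal sketch (illustrative only, not project code; the function name and layout are made up for the example): element access is plain stride arithmetic over one block of memory, so a stack of images can be sliced along any axis without copying.

{{{
#!cpp
#include <cstddef>

// Illustrative only: element (i, j, k) of a strided 3D array is reached by
// stride arithmetic over a single allocation.  A std::vector of separately
// allocated images cannot offer this single-block, any-axis view.
inline double & element3d(double * data, std::ptrdiff_t const strides[3],
                          std::ptrdiff_t i, std::ptrdiff_t j, std::ptrdiff_t k) {
    return data[i * strides[0] + j * strides[1] + k * strides[2]];
}
}}}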
    25 The situation is somewhat worse, however, because Eigen has different compile-time types for dealing with arrays that own their data, blocks of arrays that own their data, and arrays that reference external data ({{{Matrix}}}, {{{Block}}}, and {{{Map}}}, respectively).  This makes it necessary to use templates to support any operation on Eigen-based arrays that doesn't care about how they were allocated, which in turn makes it impossible to write such an operation as a virtual member function.  Meanwhile, {{{std::vector}}} has no support for views or shared data, while {{{afw::Image}}} only allows sharing between images, making it impossible to construct, for instance, an {{{Eigen::Map}}} that references data in an {{{afw::Image}}}, or to construct an {{{afw::Image}}} or {{{std::vector}}} view into a block of an {{{Eigen::VectorXd}}}. 
     25The situation is somewhat worse, however, because Eigen has different compile-time types for dealing with arrays that own their data, blocks of arrays that own their data, and arrays that reference external data ({{{Matrix}}}, {{{Block}}}, and {{{Map}}}, respectively).  This makes it necessary to use templates to support any operation on Eigen-based arrays that doesn't care about how they were allocated, which in turn makes it impossible to write such an operation as a virtual member function.  Meanwhile, {{{std::vector}}} has no support for views or shared data, while {{{afw::Image}}} only allows sharing between images, making it impossible to construct, for instance, an {{{Eigen::Map}}} that references data in an {{{afw::Image}}}, or to construct an {{{afw::Image}}} (or {{{std::vector}}}) view into a row of an {{{Eigen::MatrixXd}}}. 
    26 26
    27 27 Clearly none of these types should go away; they all have their specific uses, and not every interoperability case listed above is strictly necessary.  However, algorithm code that operates on a simple 1-, 2-, or 3-dimensional strided array concept should be built around that bare concept, not around the details of how the memory behind the array was allocated or how its lifetime is managed, and every object that can support the concept should somehow be adaptable to it.
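As a concrete (and purely hypothetical) sketch of that bare concept, the view below erases the allocation details at run time, so an {{{Eigen::MatrixXd}}}, an {{{afw::Image}}} row block, or a {{{std::vector}}} buffer could all be adapted to it and passed to ordinary, even virtual, functions instead of templates.  None of these names are real {{{ndarray}}} API.

{{{
#!cpp
#include <cstddef>

// Hypothetical illustration only, not ndarray's interface: a non-owning view
// of strided 2D data.  How the memory was allocated (Eigen, afw::Image,
// std::vector, ...) is invisible here, so a function taking StridedView2d can
// be an ordinary (even virtual) function rather than a template.
struct StridedView2d {
    double * data;                        // address of element (0, 0)
    std::size_t rows, cols;               // shape
    std::ptrdiff_t rowStride, colStride;  // strides in elements, not bytes

    double & operator()(std::size_t i, std::size_t j) const {
        return data[static_cast<std::ptrdiff_t>(i) * rowStride +
                    static_cast<std::ptrdiff_t>(j) * colStride];
    }
};

// Algorithm code written against the bare strided-array concept.
double sum(StridedView2d const & view) {
    double total = 0.0;
    for (std::size_t i = 0; i < view.rows; ++i) {
        for (std::size_t j = 0; j < view.cols; ++j) {
            total += view(i, j);
        }
    }
    return total;
}
}}}

For a default (column-major) {{{Eigen::MatrixXd}}}, for example, the row stride would be 1 and the column stride would equal the number of rows.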
     
    31 31 For now, we can limit {{{ndarray}}} usage to the {{{multifit}}} package, and simply copy {{{afw::Image}}} objects into 2D {{{ndarray}}} objects on the boundary between {{{multifit}}} and other code.  In the future, we hope that either:
    32 32 * {{{ndarray}}} will be used more widely throughout the project, and integrated closely with {{{afw::Image}}} so that both {{{ndarray}}} and {{{afw::Image}}} can share references to the same memory (this will require changes to the internals of {{{afw::Image}}}, but crucially it need not change the ownership semantics or other external behavior of {{{afw::Image}}}); ''or''
    33  * {{{ndarray}}} will be replaced in multifit by a custom set of 1-, 2-, and 3-dimensional array classes that provide the {{{ndarray}}} functionality needed by multifit and are similarly integrated with {{{afw::Image}}}. 
     33 * {{{ndarray}}} will be replaced in {{{multifit}}} by a custom set of 1- and 3-dimensional array classes that provide the {{{ndarray}}} functionality needed by {{{multifit}}} and are similarly integrated with {{{afw::Image}}}, along with additional updates to {{{afw::Image}}} to allow it to fulfill the 2D {{{ndarray}}} role.
    34 34
    35 We intend to use {{{Eigen}}} mostly via {{{Map}}} objects which will be constructed from {{{ndarray}}}-owned data; this will allow us to benefit from optimized {{{Eigen}}} operations while avoiding constructing {{{ndarray}}} objects (or wanting to construct {{{afw::Image}}} objects) that reference memory that is not reference-counted. 
     35 We intend to use {{{Eigen}}} mostly via temporary {{{Map}}} objects constructed from {{{ndarray}}}-owned data; this lets us benefit from optimized {{{Eigen}}} operations without creating {{{ndarray}}} objects (or wanting to create {{{afw::Image}}} objects) that reference memory that is not reference-counted.
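As a minimal sketch of that pattern (the shared buffer below merely stands in for {{{ndarray}}}-owned storage; no real {{{ndarray}}} accessors are shown), an {{{Eigen::Map}}} is pointed at the shared memory only for the duration of a computation and then discarded.

{{{
#!cpp
#include <memory>
#include <Eigen/Dense>

int main() {
    // Stand-in for ndarray-owned storage: a reference-counted 3x4 buffer
    // (C++17 array form of shared_ptr).  Real code would take the pointer
    // and shape from an ndarray object instead.
    std::size_t const rows = 3, cols = 4;
    std::shared_ptr<double[]> buffer(new double[rows * cols]());

    // Temporary Eigen view over the shared memory: no copy, no ownership.
    Eigen::Map<Eigen::MatrixXd> m(buffer.get(), rows, cols);

    m.setRandom();                                // optimized Eigen operations
    Eigen::VectorXd rowSums = m.rowwise().sum();  // act directly on shared data

    // The Map goes out of scope here, so no long-lived object is left
    // referencing memory whose lifetime it does not share.
    return rowSums.size() == 3 ? 0 : 1;
}
}}}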
    36 36
    37 37 == Additional Materials ==