 
 
 
 
 
   
than will the shadows and highlights.
This intuition is made formal by the so-called certainty functions of Mann and Wyckoff [3],
which are the derivatives of the response functions.
The certainty functions may therefore be used to weight the columns of A, and the
corresponding entries of the vector K, when solving for F:

  A_w = A ∘ (w 1^T)

  A_w^T A_w F − A_w^T K_w = 0
  
where ∘ denotes Hadamard (elementwise) multiplication, w is a column vector in which
each entry is made up of c(f_i), where c is the certainty function and i is the column
index in which the quantity f_i appears for that row; K_w = K ∘ w denotes the
correspondingly weighted entries of K.
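To make the weighting concrete, the following numpy sketch solves the weighted normal
equations above. It assumes the system A F ≈ K has already been assembled from the image
data (not shown here); the names solve_weighted, A, K, and w are illustrative rather than
taken from the text.

  import numpy as np

  def solve_weighted(A, K, w):
      # Solve A_w^T A_w F = A_w^T K_w, where A_w = A ∘ (w 1^T) and
      # K_w holds the correspondingly weighted entries of K.
      A_w = A * w[:, None]   # Hadamard product with the outer product w 1^T
      K_w = K * w            # weight the corresponding entries of K
      # Least-squares solution; equivalent to solving the normal equations above
      F, *_ = np.linalg.lstsq(A_w, K_w, rcond=None)
      return F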
In practice, we do not know the response function (this is the very entity we are trying to estimate), so we also do not know, a priori, its derivative, the certainty function, which we need to use in the weighting.
Fortunately, the estimate of F is not particularly sensitive to the shape of the certainty function.
A Gaussian weighting is generally used.
If desired, once F is found, it can be differentiated and used as the certainty function
to weight the columns of A to re-estimate F.  This procedure can be repeated again
(e.g., giving rise to an iterative estimation of F).
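A minimal sketch of this iterative scheme is given below. The assumptions are labelled in
the comments: the response is sampled at n_levels evenly spaced pixel levels, row_levels
gives the (integer) pixel level associated with each row of A, the initial certainty is a
Gaussian centred on the mid-range, and np.gradient stands in for differentiating the
estimated response; none of these names come from the text.

  import numpy as np

  def estimate_response(A, K, row_levels, n_levels=256, n_iter=3, sigma=0.2):
      # A, K        : the linear system relating samples of the response (assumed given)
      # row_levels  : integer pixel level in [0, n_levels) whose certainty weights each row
      levels = np.linspace(0.0, 1.0, n_levels)
      # initial Gaussian weighting, centred on the mid-range
      certainty = np.exp(-0.5 * ((levels - 0.5) / sigma) ** 2)
      F = None
      for _ in range(n_iter):
          w = certainty[row_levels]      # per-row weights c(f_i)
          A_w = A * w[:, None]           # A ∘ (w 1^T)
          K_w = K * w
          F, *_ = np.linalg.lstsq(A_w, K_w, rcond=None)
          # the derivative of the estimated response becomes the new certainty
          certainty = np.clip(np.gradient(F, levels), 0.0, None)
      return F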
 
 
 
 
