Decon in the log domain with variable gain
We adopt the convention that the components of a vector are indexed over time; likewise for other vectors.
Given the gradient direction, we need to know the corresponding change in the residual and a distance to go, so that both the solution and the residual can be stepped that distance along their respective directions.
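The step just described follows the standard line-search pattern; in generic notation (the symbols $m$, $r$, $\Delta m$, $\Delta r$, and $\alpha$ are ours, not necessarily the paper's):

```latex
% Given the gradient direction \Delta m, the resulting residual
% change \Delta r, and a step distance \alpha, the solution and
% the residual move together:
m \leftarrow m + \alpha \, \Delta m , \qquad
r \leftarrow r + \alpha \, \Delta r .
```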
A two-term example demonstrates a required linearization.
With that background, neglecting higher-order terms, and knowing the gradient, let us work out the forward operator that finds the change in the residual.
Let ``*'' denote convolution.
[Equations (22)-(28): the forward operator applied step by step to obtain the change in the residual; the mathematics did not survive conversion.]
It is pleasing that the change in the residual is proportional to the gain.
This might mean we can deal with a wide dynamic range within the data.
The convolution, a physical process, occurs in the physical domain and is only later gained to the statistical domain.
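The two-stage process just described, convolution in the physical domain followed by a variable gain into the statistical domain, might be sketched as follows (the function and variable names here are illustrative, not the paper's):

```python
import numpy as np

def residual_change(delta_a, x, gain):
    """Linearized forward step (illustrative): perturb the filter
    by delta_a, convolve with the input x in the physical domain,
    then apply the variable gain to reach the statistical domain."""
    physical = np.convolve(delta_a, x)   # convolution in the physical domain
    return gain * physical[: len(gain)]  # gained to the statistical domain

# Toy usage with arbitrary example values.
x = np.array([1.0, 0.5, 0.25, 0.125])    # input trace
delta_a = np.array([0.1, -0.05])         # filter perturbation
gain = np.ones(4)                        # unit gain for illustration
dr = residual_change(delta_a, x, gain)
```

Because the gain multiplies the result, the residual change scales linearly with it, which is the proportionality noted above.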
Naturally, the convolution may be done as a product in the frequency domain.
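That equivalence can be checked numerically; a minimal sketch with NumPy's FFT (array contents are arbitrary):

```python
import numpy as np

a = np.array([1.0, -0.5, 0.25])      # filter (arbitrary example values)
x = np.array([2.0, 1.0, 0.0, -1.0])  # input trace

# Time-domain convolution.
time_domain = np.convolve(a, x)

# Same result as a product in the frequency domain,
# zero-padding both signals to the full output length.
n = len(a) + len(x) - 1
freq_domain = np.fft.irfft(np.fft.rfft(a, n) * np.fft.rfft(x, n), n)

assert np.allclose(time_domain, freq_domain)
```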
To minimize the objective function with respect to the step size, express it as a Taylor series approximation to quadratic order.
Minimizing yields the optimum step size.
Update the solution and the residual, and optionally (Newton method) iterate, because the locations of the many Taylor series have changed slightly with the change in the solution.
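A sketch of that quadratic minimization, assuming the familiar least-squares form Q(alpha) = |r + alpha*dr|^2 (our notation, not necessarily the paper's), whose minimizer follows from setting dQ/dalpha = 0:

```python
import numpy as np

def step_size(r, dr):
    """Minimize Q(alpha) = |r + alpha*dr|^2 over the scalar alpha.
    Setting dQ/dalpha = 0 gives alpha = -(r . dr) / (dr . dr)."""
    return -np.dot(r, dr) / np.dot(dr, dr)

r = np.array([3.0, -1.0, 2.0])   # current residual (toy values)
dr = np.array([1.0, 0.0, -1.0])  # residual change along the gradient
alpha = step_size(r, dr)
r_new = r + alpha * dr           # updated residual
```

At the minimizing step size, the updated residual is orthogonal to the residual-change direction, which is one quick way to check the formula.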
2012-05-10