Gradient of the objective function

I plan to solve the optimization problem defined in 11 with a gradient-based optimization algorithm. An algorithm that efficiently computes the gradient of the objective function with respect to slowness is therefore essential to make the method practical. In this section I introduce the basic methodology for computing the gradients, and I leave some of the details to Appendix A.

The gradients of both the local objective function 3 and the global one 7 are computed using the chain rule. The first terms of the chains are the derivatives of the objective function with respect to the moveout parameters. The second terms are the derivatives of the moveout parameters with respect to slowness; they are computed from the fitting objective functions 4 and 8.
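The chain-rule structure described above can be sketched in code. The following is a minimal illustration, not the paper's actual operators: the mapping from slowness to moveout parameters (`moveout_params`) and the misfit objective are hypothetical toy functions chosen only to show how the two terms of the chain combine, with a finite-difference check of the result.

```python
import numpy as np

# Toy stand-in for the mapping from slowness s to moveout parameters p.
# (Hypothetical; the paper's mapping comes from fitting objectives 4 and 8.)
def moveout_params(s):
    return np.array([s[0] * s[1], s[0] + 2.0 * s[1]])

# Second term of the chain: dp/ds, the Jacobian of the toy mapping above.
def moveout_jacobian(s):
    return np.array([[s[1], s[0]],
                     [1.0, 2.0]])

# Toy least-squares objective measuring misfit in moveout parameters.
def objective(p, p_obs):
    r = p - p_obs
    return 0.5 * np.dot(r, r)

# First term of the chain: dJ/dp.
def objective_grad_p(p, p_obs):
    return p - p_obs

# Chain rule: dJ/ds = (dp/ds)^T (dJ/dp).
def gradient_wrt_slowness(s, p_obs):
    p = moveout_params(s)
    return moveout_jacobian(s).T @ objective_grad_p(p, p_obs)

# Verify the analytic gradient against central finite differences.
s = np.array([1.5, 0.7])
p_obs = np.array([1.0, 2.0])
g = gradient_wrt_slowness(s, p_obs)

eps = 1e-6
g_fd = np.zeros_like(s)
for i in range(len(s)):
    sp, sm = s.copy(), s.copy()
    sp[i] += eps
    sm[i] -= eps
    g_fd[i] = (objective(moveout_params(sp), p_obs)
               - objective(moveout_params(sm), p_obs)) / (2.0 * eps)

print(np.allclose(g, g_fd, atol=1e-6))
```

The finite-difference comparison is the standard sanity check for a chain-rule gradient implementation: any error in either term of the chain breaks the agreement.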