Gradient Vector, Jacobian Matrix and Hessian Matrix
The Gradient (or Gradient Vector) of a Multi-Variable Function is the Vector of all Partial Derivatives of the Function with respect to its Variables.
For example, given a Function \(F\) of \(N\) Variables \(x_1, x_2, \cdots, x_N\), the Gradient of Function \(F\) is given as
\[
\nabla F = \begin{bmatrix} \dfrac{\partial F}{\partial x_1} \\ \dfrac{\partial F}{\partial x_2} \\ \vdots \\ \dfrac{\partial F}{\partial x_N} \end{bmatrix}
\]
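As a quick numerical check of this definition, the sketch below approximates the Gradient with central finite differences. The function `numerical_gradient` and the example \(F(x, y) = x^2 + 3y\) are illustrative choices, not taken from the text above.

```python
import numpy as np

def numerical_gradient(f, x, h=1e-6):
    """Approximate the gradient of scalar f at x via central differences."""
    x = np.asarray(x, dtype=float)
    grad = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        # i-th partial derivative: (f(x + h*e_i) - f(x - h*e_i)) / (2h)
        grad[i] = (f(x + e) - f(x - e)) / (2 * h)
    return grad

# Example: F(x, y) = x^2 + 3y, so the gradient is [2x, 3].
f = lambda v: v[0] ** 2 + 3 * v[1]
print(numerical_gradient(f, [2.0, 1.0]))  # approximately [4.0, 3.0]
```

Each entry of the returned array corresponds to one row of the Gradient Vector above.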
Given a set of \(K\) Multi-Variable Functions \(F_1, F_2, F_3, \cdots, F_K\), each dependent on the same set of \(N\) Variables \(x_1, x_2, x_3, \cdots, x_N\),
the \(K \times N\) Jacobian Matrix \(J\) is given by the Partial Derivatives of the Functions as follows
\[
J = \begin{bmatrix}
\dfrac{\partial F_1}{\partial x_1} & \dfrac{\partial F_1}{\partial x_2} & \cdots & \dfrac{\partial F_1}{\partial x_N} \\
\dfrac{\partial F_2}{\partial x_1} & \dfrac{\partial F_2}{\partial x_2} & \cdots & \dfrac{\partial F_2}{\partial x_N} \\
\vdots & \vdots & \ddots & \vdots \\
\dfrac{\partial F_K}{\partial x_1} & \dfrac{\partial F_K}{\partial x_2} & \cdots & \dfrac{\partial F_K}{\partial x_N}
\end{bmatrix}
\]
The Columns of the Transpose of a Jacobian Matrix are the Gradients of the contributing Functions.
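The Jacobian and its relationship to the Gradients can be sketched numerically as below; `numerical_jacobian` and the two example functions are assumptions chosen for illustration, not part of the text above.

```python
import numpy as np

def numerical_jacobian(F, x, h=1e-6):
    """Approximate the K x N Jacobian of vector-valued F at x."""
    x = np.asarray(x, dtype=float)
    f0 = np.asarray(F(x), dtype=float)
    J = np.zeros((f0.size, x.size))
    for j in range(x.size):
        e = np.zeros_like(x)
        e[j] = h
        # Column j holds the partial derivatives of all F_k w.r.t. x_j.
        J[:, j] = (np.asarray(F(x + e)) - np.asarray(F(x - e))) / (2 * h)
    return J

# Example: F1 = x*y and F2 = x + y^2, so J = [[y, x], [1, 2y]].
F = lambda v: np.array([v[0] * v[1], v[0] + v[1] ** 2])
J = numerical_jacobian(F, [2.0, 3.0])
print(J)  # approximately [[3, 2], [1, 6]]
```

Note that row \(k\) of `J` (equivalently, column \(k\) of `J.T`) is the Gradient of \(F_k\), matching the statement above.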
A Hessian Matrix is a Square Matrix of Second-Order Partial Derivatives of a Multi-Variable Scalar Function.
For example, given a Function \(F\) of \(N\) Variables \(x_1, x_2, x_3, \cdots, x_N\), the \(N \times N\) Hessian Matrix for the Function is given as
\[
H = \begin{bmatrix}
\dfrac{\partial^2 F}{\partial x_1^2} & \dfrac{\partial^2 F}{\partial x_1 \partial x_2} & \cdots & \dfrac{\partial^2 F}{\partial x_1 \partial x_N} \\
\dfrac{\partial^2 F}{\partial x_2 \partial x_1} & \dfrac{\partial^2 F}{\partial x_2^2} & \cdots & \dfrac{\partial^2 F}{\partial x_2 \partial x_N} \\
\vdots & \vdots & \ddots & \vdots \\
\dfrac{\partial^2 F}{\partial x_N \partial x_1} & \dfrac{\partial^2 F}{\partial x_N \partial x_2} & \cdots & \dfrac{\partial^2 F}{\partial x_N^2}
\end{bmatrix}
\]
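A Hessian can likewise be approximated numerically; the sketch below uses a second-order central-difference formula for each mixed Partial Derivative. `numerical_hessian` and the example \(F = x^2 y + y^3\) are illustrative assumptions.

```python
import numpy as np

def numerical_hessian(f, x, h=1e-4):
    """Approximate the N x N Hessian of scalar f at x."""
    x = np.asarray(x, dtype=float)
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            # Central difference for d^2 f / (dx_i dx_j).
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)
    return H

# Example: F = x^2*y + y^3, so H = [[2y, 2x], [2x, 6y]].
f = lambda v: v[0] ** 2 * v[1] + v[1] ** 3
print(numerical_hessian(f, [1.0, 2.0]))  # approximately [[4, 2], [2, 12]]
```

For a twice continuously differentiable Function the Hessian is symmetric, as the output reflects.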