I wrote counterparts to the existing `finite_difference_jacobian` for the case where the function of interest is analytically differentiable. Happy to submit a PR if there is interest; I would just need to know the best place to put it.
function analytic_jacobian(fdot::Vector{Function}, x::Vector{T}) where {T<:Number}
    # Build the Jacobian column by column: fdot[i] is the partial
    # derivative of f with respect to x[i], evaluated at x.
    f_x = fdot[1](x)
    J = Matrix{Float64}(undef, length(f_x), length(fdot))
    J[:, 1] = f_x
    for i in 2:length(fdot)
        J[:, i] = fdot[i](x)
    end
    return J
end
"""
`jacobian(fdot::Vector{Function}) -> g(x::Vector)`
Given a function `f` whose partial derivatives are `fdot`, return a function
`g(x)` which itself returns the Jacobian of `f` at `x`.
"""
function jacobian(fdot::Vector{Function})
g(x::Vector) = analytic_jacobian(fdot, x)
return g
end
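For reference, here is a quick usage sketch. The example function and its partials are invented for illustration, and the definitions from above are repeated (in current Julia syntax) so the snippet runs standalone:

```julia
# Definitions from above, repeated so this snippet is self-contained.
function analytic_jacobian(fdot::Vector{Function}, x::Vector{T}) where {T<:Number}
    f_x = fdot[1](x)
    J = Matrix{Float64}(undef, length(f_x), length(fdot))
    J[:, 1] = f_x
    for i in 2:length(fdot)
        J[:, i] = fdot[i](x)
    end
    return J
end

function jacobian(fdot::Vector{Function})
    g(x::Vector) = analytic_jacobian(fdot, x)
    return g
end

# Example: f(x) = [x1^2 + x2, sin(x1)], so the partials are
#   df/dx1 = [2*x1, cos(x1)]  and  df/dx2 = [1, 0].
fdot = Function[x -> [2 * x[1], cos(x[1])], x -> [1.0, 0.0]]

g = jacobian(fdot)
J = g([1.0, 2.0])   # 2x2 Jacobian of f at x = (1, 2)
```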
This packages up the derivatives into a single function that returns the Jacobian matrix. Since it uses the same interface as its finite-difference counterpart, it's easy to switch between the two. For example, the code structure of the analytic and finite-difference versions of a Levenberg-Marquardt least-squares fit can be nearly identical with the addition of these two functions.
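To illustrate the interchangeability claim, here is a minimal sketch. The names are illustrative, not the package's actual API: `fd_jacobian` is a simple hand-rolled central-difference generator standing in for `finite_difference_jacobian`, and the solver is a plain Newton iteration rather than Levenberg-Marquardt, but the point is the same: any routine that accepts a Jacobian generator `g(x)` works with either version unchanged.

```julia
# Simple central-difference Jacobian generator (illustrative stand-in,
# not the package's finite_difference_jacobian implementation).
function fd_jacobian(f::Function; h = 1e-6)
    return function (x::Vector)
        f_x = f(x)
        J = Matrix{Float64}(undef, length(f_x), length(x))
        for i in 1:length(x)
            xp = copy(x); xp[i] += h
            xm = copy(x); xm[i] -= h
            J[:, i] = (f(xp) - f(xm)) / (2h)
        end
        return J
    end
end

# A Newton solver for a square system f(x) = 0, parameterized
# by the Jacobian generator g -- analytic or finite-difference.
function solve(f, g, x0; iters = 8)
    x = copy(x0)
    for _ in 1:iters
        x = x - g(x) \ f(x)   # Newton step using whichever Jacobian g provides
    end
    return x
end

f(x) = [x[1]^2 - 2.0, x[2] - 1.0]        # roots at (sqrt(2), 1)
g_analytic(x) = [2 * x[1] 0.0; 0.0 1.0]  # hand-written Jacobian of f
g_fd = fd_jacobian(f)

x_a  = solve(f, g_analytic, [1.5, 0.5])  # analytic Jacobian
x_fd = solve(f, g_fd, [1.5, 0.5])        # finite-difference Jacobian, same call
```

The solver never needs to know which kind of Jacobian it was handed, which is exactly what makes swapping between the analytic and finite-difference versions a one-line change.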