Problems with autograd.hessian_vector_product and scipy.optimize

I am trying to run a minimization problem using scipy.optimize, including a NonlinearConstraint. I really don't want to hand-code the derivatives myself, so I'm using autograd to do it. However, even though I follow exactly the same procedure for the arguments to minimize and to NonlinearConstraint, the first seems to work and the second doesn't.

Here is my MWE:

useconstraint = False

import autograd
import autograd.numpy as np
from scipy import optimize

def function(x):
    return x[0]**2 + x[1]**2

functionjacobian = autograd.jacobian(function)
functionhvp = autograd.hessian_vector_product(function)

def constraint(x):
    return np.array([x[0]**2 - x[1]**2])

constraintjacobian = autograd.jacobian(constraint)
constrainthvp = autograd.hessian_vector_product(constraint)

constraint = optimize.NonlinearConstraint(constraint, 1, np.inf, constraintjacobian, constrainthvp)

startpoint = [1, 2]
bounds = optimize.Bounds([-np.inf, -np.inf], [np.inf, np.inf])

print(optimize.minimize(
  function,
  startpoint,
  method='trust-constr',
  jac=functionjacobian,
  hessp=functionhvp,
  constraints=[constraint] if useconstraint else [],
  bounds=bounds,
))

When I turn useconstraint off (at the top), it works fine and minimizes at (0, 0) as expected. When I turn it on, I get the following error:

Traceback (most recent call last):
  File "test.py", line 29, in <module>
    bounds=bounds,
  File "/home/heshy/.local/lib/python2.7/site-packages/scipy/optimize/_minimize.py", line 613, in minimize
    callback=callback, **options)
  File "/home/heshy/.local/lib/python2.7/site-packages/scipy/optimize/_trustregion_constr/minimize_trustregion_constr.py", line 336, in _minimize_trustregion_constr
    for c in constraints]
  File "/home/heshy/.local/lib/python2.7/site-packages/scipy/optimize/_constraints.py", line 213, in __init__
    finite_diff_bounds, sparse_jacobian)
  File "/home/heshy/.local/lib/python2.7/site-packages/scipy/optimize/_differentiable_functions.py", line 343, in __init__
    self.H = hess(self.x, self.v)
  File "/home/heshy/.local/lib/python2.7/site-packages/autograd/wrap_util.py", line 20, in nary_f
    return unary_operator(unary_f, x, *nary_op_args, **nary_op_kwargs)
  File "/home/heshy/.local/lib/python2.7/site-packages/autograd/differential_operators.py", line 24, in grad
    vjp, ans = _make_vjp(fun, x)
  File "/home/heshy/.local/lib/python2.7/site-packages/autograd/core.py", line 10, in make_vjp
    end_value, end_node =  trace(start_node, fun, x)
  File "/home/heshy/.local/lib/python2.7/site-packages/autograd/tracer.py", line 10, in trace
    end_box = fun(start_box)
  File "/home/heshy/.local/lib/python2.7/site-packages/autograd/wrap_util.py", line 15, in unary_f
    return fun(*subargs, **kwargs)
  File "/home/heshy/.local/lib/python2.7/site-packages/autograd/differential_operators.py", line 88, in vector_dot_grad
    return np.tensordot(fun_grad(*args, **kwargs), vector, np.ndim(vector))
  File "/home/heshy/.local/lib/python2.7/site-packages/autograd/tracer.py", line 44, in f_wrapped
    ans = f_wrapped(*argvals, **kwargs)
  File "/home/heshy/.local/lib/python2.7/site-packages/autograd/tracer.py", line 48, in f_wrapped
    return f_raw(*args, **kwargs)
  File "/home/heshy/.local/lib/python2.7/site-packages/numpy/core/numeric.py", line 1371, in tensordot
    raise ValueError("shape-mismatch for sum")
ValueError: shape-mismatch for sum

What am I doing wrong? I think the issue is in hessian_vector_product, because I see hess in the error message, but I'm not sure.

Ok, I found the answer. It is very confusing.

The hessp argument to minimize expects a function that returns the "Hessian of objective function times an arbitrary vector p" (source). By contrast, the hess argument to NonlinearConstraint expects "A callable [that] must return the Hessian matrix of dot(fun, v)" (source).

If you parse the first quote the way I did, as "Hessian of (objective function times an arbitrary vector p)", it means almost the same thing as "Hessian matrix of dot(fun, v)". I therefore assumed you could use the same autograd function for both.

However, the correct parsing is "(Hessian of objective function) times an arbitrary vector p", which is completely different. The hessian_vector_product function in autograd gives the correct result for the first, but you need a different function for the second.
