Just defined a new variable, and now the program is stuck in an infinite loop



I am a beginner programmer, and I wrote a script that implements an optimization algorithm. At first it worked well; but then I tried to make it faster by defining a new variable, and now for some reason it seems to be stuck in an infinite loop. Here is the first version (I have marked in the comments where the change will be made):

# This program uses the Steepest Descent Method to
# minimize the Rosenbrock function
import numpy as np
import time

# Define the Rosenbrock Function
def f(x_k):
    x, y = x_k[0, 0], x_k[0, 1]
    return 100 * (y - x**2)**2 + (1 - x)**2

# Gradient of f
def gradient(x_k):
    x, y = x_k[0, 0], x_k[0, 1]
    return np.array([[-400*x*(y - x**2) - 2*(1 - x), 200*(y - x**2)]])


def main():
    start = time.time()
    # Define the starting guess
    x_k = np.array([[2, 2]])
    # Define counter for number of steps
    numSteps = 0
    # Keep iterating until both components of the gradient are less than 0.1 in absolute value
    while abs(gradient(x_k)[0, 0]) > 0.1 or abs(gradient(x_k)[0, 1]) > 0.1:
        numSteps = numSteps + 1
        # Step direction
        p_k = -gradient(x_k)
        gradTrans = -p_k.T
        # Now we use a backtracking algorithm to find a step length
        alpha = 1.0
        ratio = 0.8
        c = 0.01  # This is just a constant that is used in the algorithm
        # This loop selects an alpha which satisfies the Armijo condition
        #####################################
        ###### CHANGE WILL HAPPEN HERE ######
        #####################################
        while f(x_k + alpha * p_k) > f(x_k) + (alpha * c * (gradTrans @ p_k))[0, 0]:
            alpha = ratio * alpha
        x_k = x_k + alpha * p_k
    end = time.time()
    print("The number of steps is: ", numSteps)
    print("The final step is:", x_k)
    print("The gradient is: ", gradient(x_k))
    print("The elapsed time is:", round(end - start, 2), "seconds.")

main()

Now, this program is very inefficient, because on every iteration of the second while loop the quantity f(x_k) + (alpha * c * (gradTrans @ p_k))[0, 0] is computed, even though it is constant. So I decided to give this quantity a name, RHS = f(x_k) + (alpha * c * (gradTrans @ p_k))[0, 0], and move it outside the while loop. The new code is below. All I did was define this quantity as a variable, and now the program is stuck in an infinite loop. Any help is greatly appreciated.

# This program uses the Steepest Descent Method to
# minimize the Rosenbrock function
import numpy as np
import time

# Define the Rosenbrock Function
def f(x_k):
    x, y = x_k[0, 0], x_k[0, 1]
    return 100 * (y - x**2)**2 + (1 - x)**2

# Gradient of f
def gradient(x_k):
    x, y = x_k[0, 0], x_k[0, 1]
    return np.array([[-400*x*(y - x**2) - 2*(1 - x), 200*(y - x**2)]])


def main():
    start = time.time()
    # Define the starting guess
    x_k = np.array([[2, 2]])
    # Define counter for number of steps
    numSteps = 0
    # Keep iterating until both components of the gradient are less than 0.1 in absolute value
    while abs(gradient(x_k)[0, 0]) > 0.1 or abs(gradient(x_k)[0, 1]) > 0.1:
        numSteps = numSteps + 1
        # Step direction
        p_k = -gradient(x_k)
        gradTrans = -p_k.T
        # Now we use a backtracking algorithm to find a step length
        alpha = 1.0
        ratio = 0.8
        c = 0.01  # This is just a constant that is used in the algorithm
        # This loop selects an alpha which satisfies the Armijo condition
        RHS = f(x_k) + (alpha * c * (gradTrans @ p_k))[0, 0]
        #####################################
        ###### CHANGE HAS OCCURRED ##########
        #####################################
        while f(x_k + alpha * p_k) > RHS:
            alpha = ratio * alpha
        x_k = x_k + alpha * p_k
    end = time.time()
    print("The number of steps is: ", numSteps)
    print("The final step is:", x_k)
    print("The gradient is: ", gradient(x_k))
    print("The elapsed time is:", round(end - start), "seconds.")

main()
RHS needs to be recomputed inside the loop, using the new value of alpha on each iteration. (I don't see how hoisting it would have sped things up anyway.)
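To make the point concrete, here is a sketch of a corrected backtracking loop (the helper names fxk and slope are mine, not from the original code). In the Armijo condition f(x_k + alpha * p_k) <= f(x_k) + alpha * c * gradient(x_k) @ p_k.T, the right-hand side depends on alpha, so the whole expression cannot be hoisted; only the alpha-independent pieces f(x_k) and (c * (gradTrans @ p_k))[0, 0] can be cached:

# Cache only the parts of the Armijo condition that do not change
# while backtracking (fxk and slope are hypothetical helper names)
fxk = f(x_k)
slope = (c * (gradTrans @ p_k))[0, 0]
# The right-hand side still depends on alpha, so rebuild it
# (cheaply) from the cached pieces on every test
while f(x_k + alpha * p_k) > fxk + alpha * slope:
    alpha = ratio * alpha
x_k = x_k + alpha * p_k

This gives the speedup the new variable was meant to provide: f(x_k) and the matrix product are evaluated once per outer step instead of once per backtracking test, while the condition itself is still checked with the current alpha.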
