When I solved a large MIP model over several datasets in a Jupyter notebook on WSL on Windows, memory usage kept increasing; eventually the task failed because it exhausted the available memory (8 GB). But if I run one notebook per dataset, each solving just a single dataset, everything works.
I suspect the memory is not completely released when a model finishes solving.
If you need more information, tell me how to obtain it.
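One useful piece of diagnostic information here is the process's resident memory after each solve. A minimal sketch for collecting it (standard library only; `solve_one_dataset` is a hypothetical placeholder for the real solve call, and `ru_maxrss` being in kilobytes assumes Linux/WSL):

```python
import resource

def peak_rss_mb():
    """Return this process's peak resident set size in MiB.

    On Linux (and therefore WSL), ru_maxrss is reported in kilobytes.
    """
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024

def solve_one_dataset(i):
    # Hypothetical stand-in workload; replace with the real model solve.
    return [0] * 1000

# Log memory after each solve to see whether usage keeps growing.
for i in range(3):
    solve_one_dataset(i)
    print(f"after dataset {i}: peak RSS = {peak_rss_mb():.1f} MiB")
```

If the printed value climbs steadily across otherwise identical solves, that supports the theory that memory is not released between models.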
Sorry for the late reply; I have been seriously ill recently. I ran another experiment: in one notebook I repeatedly called the following function to solve n-queens 40 times, and memory usage grew by about 1 GB, even though the memory used by the function should be freed after each call.
import highspy
import numpy as np

def nqueens(N):
    h = highspy.Highs()
    h.silent()
    x = h.addBinaries(N, N)
    h.addConstrs(x.sum(axis=0) == 1)  # each row has exactly one queen
    h.addConstrs(x.sum(axis=1) == 1)  # each col has exactly one queen
    y = np.fliplr(x)
    h.addConstrs(x.diagonal(k).sum() <= 1 for k in range(-N + 1, N))  # each diagonal has at most one queen
    h.addConstrs(y.diagonal(k).sum() <= 1 for k in range(-N + 1, N))  # each 'reverse' diagonal has at most one queen
    h.solve()
    sol = h.vals(x)
    return sol
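One thing worth trying as a workaround (a sketch, assuming the growth comes from solver objects being kept alive between calls rather than from a leak inside HiGHS itself) is to explicitly drop the solver reference and force a garbage-collection pass after each solve, so finalizers for the native object run before the next model is built. `FakeSolver` below is a hypothetical stand-in for `highspy.Highs` so the sketch runs anywhere:

```python
import gc

class FakeSolver:
    """Hypothetical stand-in for highspy.Highs so the sketch is self-contained."""
    def __init__(self):
        self.buf = bytearray(10_000_000)  # simulate the solver's native allocations
    def solve(self):
        return sum(self.buf[:10])

def solve_one(solver_factory=FakeSolver):
    h = solver_factory()
    answer = h.solve()   # extract only the values you need, not the solver
    del h                # drop the last reference to the solver object
    gc.collect()         # force a collection so finalizers run before the next solve
    return answer

print(solve_one())  # → 0
```

If memory still grows after forcing collection between solves, that would point at allocations outside Python's control and make the issue easier for the maintainers to localize.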