Hi there, I'm new to this field and I'm trying to use my own dataset, structured like the CAR dataset. I'm running on Windows 10 and couldn't install faiss, so I used auxiliaries_nofaiss.py for all the imports instead; is there a solution to this? The model runs for one epoch, but the evaluation step takes a very long time and then fails while trying to allocate 119 GiB, giving the error below:
Any help would be appreciated. Thanks a lot.
GPU:0, dataset:rp2k, arch:resnet50, embed_dim:128, embed_init:default
loss:fastap, sampling:None, samples_per_class:0, resize256:False
bs:32, lr:1e-05, fc_lr_mul:0, decay:0.0004, gamma:0.3, tau:[20], bnft:False
Running with learning rates 1e-05...
Epoch (Train) 0: Mean Loss [0.0136]: 100%|█████████████████████████████████████████| 4605/4605 [27:19<00:00, 2.92it/s]
Computing Evaluation Metrics...: 100%|█████████████████████████████████████████████| 5585/5585 [07:55<00:00, 13.33it/s]
Traceback (most recent call last):
  File "Standard_Training.py", line 365, in <module>
    main()
  File "Standard_Training.py", line 344, in main
    eval.evaluate(opt.dataset, LOG, save=True, **eval_params)
  File "C:\Users\user\Documents\DeepMetricLearningBaselines\evaluate.py", line 57, in evaluate
    ret = evaluate_one_dataset(LOG, **kwargs)
  File "C:\Users\user\Documents\DeepMetricLearningBaselines\evaluate.py", line 279, in evaluate_one_dataset
    F1, NMI, recall_at_ks, feature_matrix_all = aux.eval_metrics_one_dataset(model, dataloader, device=opt.device, k_vals=opt.k_vals, opt=opt)
  File "C:\Users\user\Documents\DeepMetricLearningBaselines\auxiliaries_nofaiss.py", line 239, in eval_metrics_one_dataset
    k_closest_points = squareform(pdist(feature_coll)).argsort(1)[:, :int(np.max(k_vals)+1)]
  File "C:\Users\user\anaconda3\envs\kunhe\lib\site-packages\scipy\spatial\distance.py", line 1985, in pdist
    dm = np.empty((m * (m - 1)) // 2, dtype=np.double)
MemoryError: Unable to allocate 119. GiB for an array with shape (15967113051,) and data type float64
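For context on where the 119 GiB comes from: pdist allocates all N(N-1)/2 pairwise distances as float64, and the shape (15967113051,) in the traceback corresponds to N = 178,702 embeddings, i.e. roughly 16 billion doubles at 8 bytes each ≈ 119 GiB. Below is a minimal sketch of the kind of chunked top-k search that could replace the squareform(pdist(...)) line. To be clear, top_k_neighbors_chunked is a hypothetical helper, not part of this repo, and it assumes feature_coll is an (N, d) float array as built in eval_metrics_one_dataset.

```python
import numpy as np

def top_k_neighbors_chunked(features, k, chunk_size=1024):
    """For each row of `features`, return the indices of its k+1 nearest
    rows by Euclidean distance (the +1 covers the point itself, matching
    the np.max(k_vals)+1 slice in eval_metrics_one_dataset).

    Works through the queries in blocks of `chunk_size` rows, so peak
    memory is O(chunk_size * N) floats instead of the O(N^2 / 2) float64
    buffer that squareform(pdist(...)) needs.
    """
    features = np.asarray(features, dtype=np.float32)  # float32: half the memory of float64
    n = features.shape[0]
    sq_norms = (features ** 2).sum(axis=1)  # precompute ||x||^2 for every row
    neighbors = np.empty((n, k + 1), dtype=np.int64)

    for start in range(0, n, chunk_size):
        end = min(start + chunk_size, n)
        chunk = features[start:end]
        # Squared distances via ||a - b||^2 = ||a||^2 - 2 a.b + ||b||^2,
        # giving a (chunk, n) block instead of the full (n, n) matrix.
        d2 = sq_norms[start:end, None] - 2.0 * (chunk @ features.T) + sq_norms[None, :]
        # argpartition pulls out the k+1 smallest entries without a full sort ...
        candidates = np.argpartition(d2, k + 1, axis=1)[:, : k + 1]
        # ... then only those candidates get sorted by their actual distance.
        order = np.argsort(np.take_along_axis(d2, candidates, axis=1), axis=1)
        neighbors[start:end] = np.take_along_axis(candidates, order, axis=1)
    return neighbors
```

With that, the failing line could in principle become k_closest_points = top_k_neighbors_chunked(feature_coll, int(np.max(k_vals))), which for ~179k embeddings needs on the order of 1024 × 179k float32s (~0.7 GB) per chunk instead of one 119 GiB buffer.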