
update weights optimization #1791

Merged
merged 13 commits into from
Nov 26, 2024

Conversation

clement-pages
Copy link
Collaborator

BREAKING: remove the finetune_wavlm attribute from the PixIT task and replace it with wavlm_frozen in the ToTaToNet separation model, to be consistent with what is done for the segmentation model.
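A rough sketch of the API change described above (import paths and constructor signatures are assumptions for illustration; only the finetune_wavlm and wavlm_frozen flag names come from this PR):

```python
# Hypothetical before/after sketch of the breaking change, not actual pyannote code.
from pyannote.audio.models.separation import ToTaToNet
from pyannote.audio.tasks import PixIT

# Before: freezing WavLM was controlled on the task side
# task = PixIT(protocol, finetune_wavlm=False)

# After: the flag lives on the separation model,
# matching the segmentation model's behaviour
model = ToTaToNet(wavlm_frozen=True)  # WavLM weights stay frozen
task = PixIT(protocol)                # task no longer carries the flag
model.task = task
```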

@clement-pages
Copy link
Collaborator Author

Merging this PR should fix #1789

@hbredin
Copy link
Member

hbredin commented Nov 20, 2024

@Lebourdais suggested that we set wavlm.requires_grad to False in __init__ instead of using the with torch.no_grad logic. I think it is indeed a better idea, as it gives a better estimate of the number of trainable parameters (i.e. those for which requires_grad is set to True).

@clement-pages can you update your PR to make this change and also update the SSeRiouSS model and diarization task accordingly?
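A minimal sketch of the suggested approach (not the actual pyannote code; class and helper names here are hypothetical): freeze the WavLM backbone once in __init__ via requires_grad instead of wrapping the forward pass in torch.no_grad(), so the frozen weights are excluded from trainable-parameter counts.

```python
import torch
import torch.nn as nn


class Model(nn.Module):
    def __init__(self, wavlm: nn.Module, wavlm_frozen: bool = True):
        super().__init__()
        self.wavlm = wavlm
        if wavlm_frozen:
            # mark every WavLM weight as non-trainable up front
            for param in self.wavlm.parameters():
                param.requires_grad = False

    def forward(self, waveform: torch.Tensor) -> torch.Tensor:
        # no `with torch.no_grad():` wrapper needed here anymore
        return self.wavlm(waveform)


def num_trainable(model: nn.Module) -> int:
    # now reflects the frozen backbone correctly
    return sum(p.numel() for p in model.parameters() if p.requires_grad)
```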

@clement-pages
Copy link
Collaborator Author

> @clement-pages can you update your PR to make this change and also update the SSeRiouSS model and diarization task accordingly?

OK, I will update it.

clement-pages added 2 commits November 20, 2024 13:29
In the same way as is done for the speaker diarization task
@clement-pages clement-pages changed the title fix weights update issue when training separation model update weights optimization Nov 20, 2024
pyannote/audio/tasks/segmentation/speaker_diarization.py (review thread, outdated, resolved)
pyannote/audio/tasks/separation/PixIT.py (review thread, outdated, resolved)
@hbredin hbredin merged commit 9a2368b into pyannote:develop Nov 26, 2024
4 checks passed