Question about the pretrained .pth #14
Comments
Hi @Dorothy-h357h, feel free to comment here if you run into any problems during reproduction.
Thanks for your reply.
We use the ST-Adapter in ViT and only finetune the ST-Adapter during pre-training, i.e., all other parameters are frozen.
Can you provide a piece of code for pretraining the ST-Adapter? Thank you.
You may refer to this file of ST-Adapter: only the adapters in ViT are trained, just like:
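The snippet referenced above is not reproduced in the thread. Below is a minimal sketch (not the authors' actual code) of how adapter-only pre-training is typically set up in PyTorch, assuming the ST-Adapter modules can be identified by a name containing "adapter"; the function names and the naming convention are illustrative, not the repository's API.

```python
import torch

def freeze_all_but_adapters(model: torch.nn.Module):
    """Freeze the ViT backbone and keep only adapter parameters trainable."""
    for name, param in model.named_parameters():
        # Hypothetical convention: ST-Adapter parameters contain "adapter" in their name.
        param.requires_grad = "adapter" in name.lower()

def build_optimizer(model: torch.nn.Module, lr: float = 1e-4):
    # Only the trainable (adapter) parameters are passed to the optimizer.
    trainable = [p for p in model.parameters() if p.requires_grad]
    return torch.optim.AdamW(trainable, lr=lr)

# Usage (assuming `vit_with_adapters` is a ViT whose blocks contain ST-Adapter modules):
# freeze_all_but_adapters(vit_with_adapters)
# optimizer = build_optimizer(vit_with_adapters)
# ...standard pre-training loop follows; only the adapters receive gradient updates...
```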
Sorry, but I still don't understand how you compare with methods like ST-Adapter and VCL. I also found that ProViCo and CORP do not have open-source code. Can you please explain in detail?
Can you provide the weights for all the comparison methods trained on your dataset? Thank you~