
Brilliant-B/Vim_Research


Vision Mamba Experiments

Based on Vision Mamba: Efficient Visual Representation Learning with Bidirectional State Space Model

Lianghui Zhu1*, Bencheng Liao1*, Qian Zhang2, Xinlong Wang3, Wenyu Liu1, Xinggang Wang1 📧

1 Huazhong University of Science and Technology, 2 Horizon Robotics, 3 Beijing Academy of Artificial Intelligence

(*) equal contribution, (📧) corresponding author.

arXiv Preprint (arXiv:2401.09417)

Abstract

Try Hilbert Indexing (HI) on the patch serialization: the flattened image patches are reordered along a Hilbert curve before being fed to the model.
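For intuition, here is a minimal sketch of how a Hilbert-curve serialization can be built and applied to a sequence of patch tokens. It is not this repository's implementation: the function names are ours, and the classic construction below assumes the patch grid side is a power of two (a 14×14 ViT grid would need padding or a generalized curve).

```python
import torch

def hilbert_d2xy(n: int, d: int) -> tuple[int, int]:
    """Map distance d along a Hilbert curve to (x, y) on an n x n grid (n a power of two)."""
    x = y = 0
    t = d
    s = 1
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                      # rotate/reflect the sub-square
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx                      # step into the correct quadrant
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def hilbert_order(n: int) -> torch.Tensor:
    """Permutation taking row-major patch indices to Hilbert-curve order."""
    return torch.tensor([y * n + x
                         for x, y in (hilbert_d2xy(n, d) for d in range(n * n))])

# Example: Hilbert-serialize a (batch, seq, dim) patch-token tensor on an 8x8 grid.
tokens = torch.randn(2, 64, 192)   # 8 * 8 = 64 row-major patch tokens
tokens_hi = tokens[:, hilbert_order(8), :]
```

Unlike a raster scan, consecutive positions along a Hilbert curve are always spatially adjacent in the image, which is presumably the locality property these serialization experiments probe.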

Environments for Pretraining & Finetuning with HI

  • Python 3.10.13

    • conda create -n your_env_name python=3.10.13
  • torch 2.1.1 + cu118

    • pip install torch==2.1.1 torchvision==0.16.1 torchaudio==2.1.1 --index-url https://download.pytorch.org/whl/cu118
  • Requirements: vim_requirements.txt

    • pip install -r vim/vim_requirements.txt
  • Install causal_conv1d and mamba (run each command from the repository root; a quick import check follows this list)

    • cd causal_conv1d; pip install -e .
    • cd mamba; pip install -e .
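After installation, a quick import check helps confirm the CUDA builds are wired up. The module names below (causal_conv1d, mamba_ssm) are taken from the upstream packages and are an assumption about this fork's local copies.

```python
# Sanity check for the environment, assuming the upstream module names
# (causal_conv1d, mamba_ssm); adjust if the local packages differ.
import torch
print("torch", torch.__version__, "| CUDA available:", torch.cuda.is_available())

import causal_conv1d  # compiled CUDA kernel for causal 1-D convolution
import mamba_ssm      # selective state space (Mamba) implementation
print("causal_conv1d and mamba_ssm import OK")
```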

Train Your Vim

To train Vim-Ti on ImageNet-1K, run:

bash vim/scripts/vim-train.sh

To finetune Vim-Ti on ImageNet-1K from the published checkpoint, run:

bash vim/scripts/vim-finetune.sh

Evaluation on Provided Weights

To evaluate Vim-Ti on ImageNet-1K, run:

bash vim/scripts/vim-eval.sh

Model Weights

| Model    | #Param. | Top-1 Acc. (%) | Top-5 Acc. (%) | Hugging Face Repo                       |
|----------|---------|----------------|----------------|-----------------------------------------|
| Vim-tiny | 7M      | 73.1           | 91.1           | https://huggingface.co/hustvl/Vim-tiny  |
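A minimal sketch for fetching these weights from the Hugging Face Hub, assuming a single PyTorch checkpoint file; the filename below is hypothetical (list the repo's files to confirm), and constructing the matching Vim model is left to the repository's own code.

```python
# Hypothetical example: download the published Vim-tiny weights.
# "vim_tiny.pth" is an assumed filename — inspect the Hub repo to confirm.
import torch
from huggingface_hub import hf_hub_download

ckpt_path = hf_hub_download(repo_id="hustvl/Vim-tiny", filename="vim_tiny.pth")
checkpoint = torch.load(ckpt_path, map_location="cpu")
# Checkpoints of this kind often nest the weights under a "model" key.
state_dict = checkpoint.get("model", checkpoint)
print(f"{len(state_dict)} tensors loaded from {ckpt_path}")
```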

Acknowledgement ❤️

This project is based on Mamba (paper, code), Causal-Conv1d (code), and DeiT (paper, code). Thanks for their wonderful work.

Citation

If you find Vim useful in your research or applications, please consider giving us a star 🌟 and citing it with the following BibTeX entry.

 @article{vim,
  title={Vision Mamba: Efficient Visual Representation Learning with Bidirectional State Space Model},
  author={Lianghui Zhu and Bencheng Liao and Qian Zhang and Xinlong Wang and Wenyu Liu and Xinggang Wang},
  journal={arXiv preprint arXiv:2401.09417},
  year={2024}
}
