Fix circular import in ds_transformer.py #5804
Conversation
sznmelvin
commented
Jul 28, 2024
- Removed conditional imports of TritonMLP and TritonSelfAttention from the module level
- Implemented lazy imports for the Triton modules inside the __init__ method
- This change aims to resolve circular dependency issues with DeepSpeed Transformer inference
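The lazy-import pattern described above can be sketched as follows. This is a minimal, self-contained illustration, not DeepSpeed's actual code: the class and module names here are hypothetical stand-ins for the real TritonMLP/TritonSelfAttention imports.

```python
class DeepSpeedTransformerInference:
    """Illustrative sketch: the Triton import is deferred to __init__.

    A module-level `if deepspeed.HAS_TRITON: import ...` can form an import
    cycle if the Triton modules import this module in turn. Moving the
    import inside __init__ means this module is fully initialized before
    the dependency is resolved, breaking the cycle.
    """

    def __init__(self, use_triton: bool = False):
        self.mlp = None
        self.attention = None
        if use_triton:
            # Hypothetical module path, for illustration only; the real
            # PR imports TritonMLP and TritonSelfAttention here instead.
            from importlib import import_module
            triton_mod = import_module("my_triton_ops")
            self.mlp = triton_mod.TritonMLP()
            self.attention = triton_mod.TritonSelfAttention()
```

With `use_triton=False` (or when Triton is unavailable), the import never executes, so the module can be loaded even in environments without Triton installed.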
@@ -13,9 +13,6 @@
 from deepspeed.accelerator import get_accelerator
 from deepspeed.ops.op_builder import InferenceBuilder
 import deepspeed
-if deepspeed.HAS_TRITON:
@eonsparks - could you please run the pre-commit formatter (pre-commit run --all-files) to resolve the formatting errors?
@eonsparks - there appears to be a merge artifact here, since this file should already be in the repo, and the same goes for the nv-mii yml file. Could you take a look at this, or synchronize your branch with the latest in DeepSpeed?