fix ci import error #2876
Conversation
See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/ao/2876
✅ No failures as of commit 422a9d3 with merge base 9f1e32b. (This comment was automatically generated by Dr. CI and updates every 15 minutes.)
@@ -45,6 +45,12 @@
     "torchtitan not installed, skipping MoE tests.", allow_module_level=True
 )

+if torch.version.hip is not None:
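The guard being added follows pytest's module-level skip pattern: `torch.version.hip` is a version string on ROCm builds of PyTorch and `None` on CUDA/CPU builds. The sketch below models that contract with a stand-in object so it runs without a PyTorch install; `_FakeVersion` and `should_skip_on_rocm` are illustrative names, not part of the PR.

```python
# Hedged sketch of the module-level ROCm skip pattern from the diff above.
# On ROCm wheels, torch.version.hip is a version string (e.g. "6.0");
# on CUDA/CPU wheels it is None.  A stand-in object models that contract
# here so the example does not require torch to be installed.

class _FakeVersion:
    hip = None  # None -> not a ROCm build; a string -> ROCm build


def should_skip_on_rocm(version=_FakeVersion) -> bool:
    """Return True when running on a ROCm (HIP) build of PyTorch."""
    return version.hip is not None


# In the real test module the guard reads roughly:
#   if torch.version.hip is not None:
#       pytest.skip("ROCm MoE tests not enabled", allow_module_level=True)
```

With `allow_module_level=True`, `pytest.skip` may be called at import time, so the entire test module is skipped during collection rather than raising an error inside a test.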
Can you add this to test_tp.py and test_fsdp_tp.py as well?
Thanks for adding this! Please make sure all CI is green for merging
I see a module-level skip here. Any plan to enable the test suite in the future?
We would love to have AMD support in the future, but have no concrete plan/timeline for it. We are currently focused on B200 and H100 related workflows for PTC in October. If you'd like to contribute AMD support for MoE training code, it would be welcome!
No description provided.