NEW: Migrate NVIDIA redist feedstocks to use cf-nvidia-tools #3580
base: main
Conversation
Why is check_solvable turned to false?
@beckermr, I guessed that this parameter means that the bot will try to update the nodes in a certain order, but this migrator can be applied in any order. Am I interpreting this parameter incorrectly?
check_solvable controls whether or not the bot ensures that the feedstock environments are solvable before it tries to issue a PR.
This migrator shouldn't be affecting the solvability of a recipe's environments because it only adds one package to the build environment. If an environment is unsolvable, I will want the PR opened anyways, so that a human can fix the feedstock.
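For reference, a minimal sketch of how a migrator might opt out of that check; the class name and the assumption that check_solvable can be passed through the constructor are illustrative, not the code in this PR:

```python
# Minimal sketch, not the PR code: the class name and the assumption that the
# Migrator constructor accepts a check_solvable keyword are illustrative.
from conda_forge_tick.migrators import Migrator


class NvidiaRedistMigrator(Migrator):  # hypothetical name
    def __init__(self, *args, **kwargs):
        # Open the PR even if the feedstock environments don't solve,
        # so a human can fix the feedstock.
        kwargs.setdefault("check_solvable", False)
        super().__init__(*args, **kwargs)
```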
I have tested my migration function locally, but not the filtering logic. Is there a sandbox for developing migrators, or a way to rate limit the migration?
@jakirkham, please review
Do not merge. Just actively looking for feedback at this point.
Thanks Daniel! 🙏
Appreciate you putting this together so quickly 😄
Had a couple questions below
def _insert_requirements_build(meta):
    insert_before: bool = True
Think we can add this after the compilers and the like (assuming I'm understanding the intent of this parameter correctly)
-    insert_before: bool = True
+    insert_before: bool = False
This is not a function parameter. It is a variable which holds the state of "whether we are currently inserting lines into the variable named `before`".
We're not going to do that, because trying to locate the compiler packages and insert the new dependency after them is technically challenging and not necessary for a working migration. Adding the complexity of inserting after compiler packages, instead of just inserting into requirements.build, makes additional assumptions, including (1) that the package has compiler packages listed in the top-level requirements.build section, and (2) that the compiler packages are already sorted to the top of the requirements.build section.
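For context, the simpler approach reads roughly like the sketch below (illustrative names, not the exact PR code): split the recipe's lines at the requirements.build header and insert the new dependency at the split point.

```python
# Illustrative sketch of inserting into requirements.build by splitting the
# meta.yaml lines at the build: header; names are not the exact PR code.
def _insert_requirements_build(lines: list[str]) -> list[str]:
    before: list[str] = []
    after: list[str] = []
    insert_before = True
    requirements_found = False
    build_found = False
    for line in lines:
        (before if insert_before else after).append(line)
        if requirements_found and line.lstrip().startswith("build:"):
            build_found = True
            insert_before = False  # subsequent lines go after the new entry
        elif line.startswith("requirements:"):
            requirements_found = True
    if not (requirements_found and build_found):
        # No requirements.build section; nothing we can safely edit.
        return lines
    return before + ["    - cf-nvidia-tools  # [linux]\n"] + after
```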
if build_found:
    insert_before = False
elif line.startswith("requirements"):
-if build_found:
-    insert_before = False
-elif line.startswith("requirements"):
+if line.startswith("requirements"):
This was intentional. Once you find the requirements.build section, you don't need to do any more searching for it because you already know where to split the text. Maybe I could refactor to not execute insert_before=False, but I'm not sure it's worth the effort?
if not (build_found or requirements_found):
    return
So is the idea of this check that we are skipping metapackages? Or are there other cases this would be used for?
May help to have a comment above clarifying intent
If we didn't find the requirements.build section, then we cannot add our new dependency to the file.
f.writelines(before + [" - cf-nvidia-tools # [linux]\n"] + after)
def _insert_build_script(meta): |
What do you think about adding this to the recipe's test/commands section?
No. There's no reason to wait until the test phase to check whether the binaries have the expected glibc symbols. We can save everyone time by failing as early as possible. Additionally, I think that the environment variable c_stdlib_version is not set during the test phase, and it's required for this check.
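As a rough sketch of what that could look like (not the exact PR code; the check-glibc command name and its arguments are assumptions for illustration), a migrator could append the check to the recipe's build script so the failure happens during the build phase:

```python
# Rough sketch, not the PR code: append a glibc-symbol check to build.sh so
# that it fails during the build phase.  The "check-glibc" command name and
# its arguments are assumptions for illustration.
import os


def _insert_build_script(recipe_dir: str) -> None:
    build_script = os.path.join(recipe_dir, "build.sh")
    if not os.path.isfile(build_script):
        return
    with open(build_script, "a") as f:
        # Relies on c_stdlib_version being exported during the build phase.
        f.write('\ncheck-glibc "$PREFIX"/lib/*.so* "$PREFIX"/bin/*\n')
```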
# STEP 3: Remove os_version keys from conda-forge.yml
config = os.path.join(recipe_dir, "..", "conda-forge.yml")
with open(config) as f:
    y = yaml_safe_load(f)
y_orig = copy.deepcopy(y)
y.pop("os_version", None)
if y_orig != y:
    with open(config, "w") as f:
        yaml_safe_dump(y, f)
So we are dropping the os_version specification of the Docker image to use?
Wonder if it makes sense to add/confirm {{ stdlib("c") }} is used and c_stdlib_version is set correctly
Yes. It's not necessary to force the use of an old container.
The tool fails if c_stdlib_version is not set, so indirectly this is already checked?
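If an explicit confirmation were wanted, a hypothetical check (not part of this PR) could be as simple as grepping the recipe text for the stdlib jinja call:

```python
# Hypothetical helper, not part of this PR: warn if the recipe does not
# appear to call {{ stdlib("c") }}, which is what sets c_stdlib_version.
import os


def _warn_if_no_stdlib(recipe_dir: str) -> None:
    meta_path = os.path.join(recipe_dir, "meta.yaml")
    with open(meta_path) as f:
        text = f.read()
    if 'stdlib("c")' not in text and "stdlib('c')" not in text:
        print(f"WARNING: {meta_path} does not appear to use the c stdlib jinja function")
```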
"schedule another one. If you do not have permissions to add this " | ||
"label, you can use the phrase " | ||
"<code>@<space/>conda-forge-admin, please rerun bot</code> " | ||
"in a PR comment to have the `conda-forge-admin` add it for you.\n\n" |
Wonder if we should add some text on how to handle issues, or how to ask for help in resolving them
Please look at line 113. The description directs users (us) to ping me for questions.
@jakirkham, I have refactored the item insertion logic. Hopefully it is now more easily grokked.
@carterbox Let me know when you want review on this. I am not sure what the status is currently.
import os.path
from typing import Any

import conda_forge_tick.migrators
This is probably the circular import.
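If so, one common way to break such a cycle (a sketch, not necessarily the right fix here) is to defer the import to call time:

```python
# Sketch of one common fix for a circular import: import inside the function
# that needs the module instead of at module import time.
def _make_migrators():
    import conda_forge_tick.migrators as migrators  # deferred import

    return migrators
```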