NEW: Migrate NVIDIA redist feedstocks to use cf-nvidia-tools #3580

Open
wants to merge 4 commits into main
Conversation

carterbox

Description:

Checklist:

  • Pydantic model updated or no update needed

Cross-refs, links to issues, etc:

Contributor

@beckermr beckermr left a comment

Why is check_solvable turned to false?

@carterbox
Author

@beckermr, I guessed that this parameter means the bot will try to update the nodes in a certain order, but this migrator can be applied in any order. Am I interpreting this parameter incorrectly?

@beckermr
Contributor

check_solvable controls whether or not the bot ensures that the feedstock environments are solvable before it tries to issue a PR.
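The gate described here can be sketched as follows. This is a hypothetical illustration only: `is_solvable` and `open_pr` are stand-in callables, not real conda-forge-tick APIs.

```python
# Hypothetical sketch of the check_solvable gate; is_solvable and open_pr
# are stand-in callables, not real conda-forge-tick APIs.
def maybe_issue_pr(envs, check_solvable, is_solvable, open_pr):
    """Open a migration PR unless solvability checking is on and fails."""
    if check_solvable and not all(is_solvable(env) for env in envs):
        return None  # skip the PR: at least one environment cannot be solved
    return open_pr()  # with check_solvable=False, the PR is always opened
```

With `check_solvable=False`, the solvability check is skipped entirely and the PR is issued regardless.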

@carterbox
Author

This migrator shouldn't affect the solvability of a recipe's environments, because it only adds one package to the build environment. If an environment is unsolvable, I still want the PR opened anyway, so that a human can fix the feedstock.

@carterbox
Author

I have tested my migration function locally, but not the filtering logic. Is there a sandbox for developing migrators, or a way to rate limit the migration?

@carterbox carterbox marked this pull request as ready for review January 17, 2025 18:02
@carterbox
Author

@jakirkham, please review

@carterbox
Author

Do not merge. Just actively looking for feedback at this point.

Contributor

@jakirkham jakirkham left a comment

Thanks Daniel! 🙏

Appreciate you putting this together so quickly 😄

Had a couple questions below



def _insert_requirements_build(meta):
    insert_before: bool = True
Contributor

Think we can add this after the compilers and the like (assuming I'm understanding the intent of this parameter correctly)

Suggested change
-insert_before: bool = True
+insert_before: bool = False

Author

This is not a function parameter. It is a variable which holds the state of "whether we are currently inserting lines into the variable named 'before'".

We're not going to do that, because locating the compiler packages and inserting the new dependency after them is technically challenging and not necessary for a working migration. Inserting after the compiler packages, instead of simply inserting into requirements.build, makes additional assumptions:

1. the package lists its compiler packages in the top-level requirements.build section, and
2. the compiler packages are already sorted to the top of the requirements.build section.
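The simpler approach defended here — insert at the top of requirements.build with no compiler-ordering assumptions — could look roughly like this line-based sketch. The helper name, default indentation, and return convention are illustrative, not the migrator's actual code:

```python
def insert_into_requirements_build(lines, dep="    - cf-nvidia-tools  # [linux]"):
    """Insert dep as the first entry under requirements.build.

    Illustrative sketch only: scan the meta.yaml lines, split the file
    right after the build: subsection header, and splice the new
    dependency in. No assumptions about compiler packages are needed.
    """
    before, after = [], []
    in_requirements = False
    build_found = False
    for line in lines:
        if build_found:
            after.append(line)
            continue
        if line.startswith("requirements"):
            in_requirements = True
        elif in_requirements and line.strip().startswith("build:"):
            build_found = True
        before.append(line)
    if not build_found:
        return lines  # no requirements.build section; nothing to change
    return before + [dep] + after
```

If no requirements.build section exists, the input is returned unchanged, which matches the early-return behavior discussed later in this thread.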

Comment on lines 29 to 31
if build_found:
    insert_before = False
elif line.startswith("requirements"):
Contributor

Suggested change
-if build_found:
-    insert_before = False
-elif line.startswith("requirements"):
+if line.startswith("requirements"):

Author

This was intentional. Once you find the requirements.build section, you don't need to keep searching for it because you already know where to split the text. Maybe I could refactor so that insert_before = False is never executed, but I'm not sure it's worth the effort?

Comment on lines 41 to 42
if not (build_found or requirements_found):
    return
Contributor

So is the idea of this check that we are skipping metapackages? Or are there other cases this would be used for?

May help to have a comment above clarifying intent

Author

If we didn't find the requirements.build section, then we cannot add our new dependency to the file.

f.writelines(before + [" - cf-nvidia-tools # [linux]\n"] + after)


def _insert_build_script(meta):
Contributor

What do you think about adding this to the recipe's test/commands section?

Author

No. There's no reason to wait until the test phase to check whether the binaries have the expected glibc symbols. We can save everyone time by failing as early as possible. Additionally, I think that the environment variable c_stdlib_version is not set during the test phase, and it's required for this check.
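The fail-early idea can be sketched as appending the check to the end of the build script, so it runs while `c_stdlib_version` is still exported. Here `check_line` is a placeholder for the actual cf-nvidia-tools invocation, which is not shown in this thread:

```python
def append_build_script_check(script_lines, check_line):
    """Append a binary-check command to the end of a build script.

    Sketch only: running the check in the build phase (not test/commands)
    fails the job as early as possible, while c_stdlib_version is still
    set in the environment. check_line is a placeholder command.
    """
    out = [l if l.endswith("\n") else l + "\n" for l in script_lines]
    out.append(check_line if check_line.endswith("\n") else check_line + "\n")
    return out
```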

Comment on lines +198 to +206
# STEP 3: Remove os_version keys from conda-forge.yml
config = os.path.join(recipe_dir, "..", "conda-forge.yml")
with open(config) as f:
    y = yaml_safe_load(f)
y_orig = copy.deepcopy(y)
y.pop("os_version", None)
if y_orig != y:
    with open(config, "w") as f:
        yaml_safe_dump(y, f)
Contributor

So we are dropping the os_version specification of the Docker image to use?

Wonder if it makes sense to add/confirm `{{ stdlib("c") }}` is used and `c_stdlib_version` is set correctly

Author

Yes. It's not necessary to force the use of an old container.

The tool fails if c_stdlib_version is not set, so indirectly this is already checked?
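The indirect check described here could look roughly like this hypothetical sketch (the function name and error message are illustrative, not the tool's actual code): the tool errors out when the variable is missing, so a recipe without `{{ stdlib("c") }}` fails during the build rather than passing silently.

```python
import os


def require_c_stdlib_version():
    """Fail fast when c_stdlib_version is unset.

    Hypothetical sketch of the indirect check: if the recipe does not set
    c_stdlib_version (e.g. via stdlib("c")), the tool aborts the build.
    """
    version = os.environ.get("c_stdlib_version")
    if not version:
        raise SystemExit(
            "c_stdlib_version is not set; does the recipe use stdlib('c')?"
        )
    return version
```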

"schedule another one. If you do not have permissions to add this "
"label, you can use the phrase "
"<code>@<space/>conda-forge-admin, please rerun bot</code> "
"in a PR comment to have the `conda-forge-admin` add it for you.\n\n"
Contributor

Wonder if we should add some text on how to handle issues, or how to ask for help resolving them

Author

Please look at line 113. The description directs users (us) to ping me for questions.

@carterbox
Author

@jakirkham, I have refactored the item insertion logic. Hopefully it is now more easily grokked.

@carterbox carterbox requested a review from jakirkham January 21, 2025 20:09
@beckermr
Contributor

@carterbox Let me know when you want review on this. I am not sure what the status is currently.

import os.path
from typing import Any

import conda_forge_tick.migrators
Author

This is probably the circular import.
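One common way to break such a cycle is to defer the import to call time instead of module-load time. `lazy_attr` below is a generic sketch of that pattern, not code from this PR:

```python
import importlib


def lazy_attr(module_name, attr):
    """Resolve module_name.attr at call time instead of import time.

    Deferring the import like this is a standard way to break a circular
    import, such as the conda_forge_tick.migrators cycle suspected above:
    neither module needs the other to be fully initialized at load time.
    """
    return getattr(importlib.import_module(module_name), attr)
```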
