Fix race condition in config handling (main) #2937
Conversation
LGTM
}
	return err
}
// only delete old orphans which are not caught by the signal handler in WriteConfig
The condition check on the next line is there to skip the newer files, which is why this comment reads a little confusingly right next to it.
It would be better to say something like: "skip newer files created within 5 minutes".
Nevertheless, thank you @gururajsh so much for fixing this!
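For context, a minimal sketch of how the cleanup step discussed here could read with the suggested comment wording; the identifiers fileInfo and filePath are placeholders, and the five-minute threshold is taken from the suggestion above, not necessarily the PR's exact values:

    // only delete old orphans which are not caught by the signal handler in WriteConfig;
    // skip newer files created within the last 5 minutes
    if time.Since(fileInfo.ModTime()) < 5*time.Minute {
        continue
    }
    if err := os.Remove(filePath); err != nil && !os.IsNotExist(err) {
        return err
    }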
Description of the Change
The CLI encounters race conditions when handling config files if multiple commands are executed in parallel.
Currently, the CLI writes a new config for every command executed, due to the code in command_parser.go. It also deletes all temporary config files when reading the config, which leads to race conditions between multiple processes running in parallel. This PR introduces two changes to mitigate this problem:
cli/util/configv3/write_config.go, lines 63 to 66 in 04df8ae
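As a rough illustration of the orphan-cleanup approach described above (a sketch, not the PR's actual implementation), the idea is to remove leftover temporary config files only when they are old enough that no concurrently running command can still be using them. The function name, the "config.json.*" glob pattern, and the five-minute threshold are assumptions for illustration:

    package configv3

    import (
        "os"
        "path/filepath"
        "time"
    )

    // removeOldTempConfigFiles deletes leftover temporary config files, but only
    // those last modified more than five minutes ago, so files that a command
    // running in parallel may still be using are left alone.
    func removeOldTempConfigFiles(configDir string) error {
        matches, err := filepath.Glob(filepath.Join(configDir, "config.json.*"))
        if err != nil {
            return err
        }
        for _, path := range matches {
            info, err := os.Stat(path)
            if err != nil {
                // another process may already have removed the file
                continue
            }
            // skip newer files created within the last 5 minutes
            if time.Since(info.ModTime()) < 5*time.Minute {
                continue
            }
            if err := os.Remove(path); err != nil && !os.IsNotExist(err) {
                return err
            }
        }
        return nil
    }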
Why Is This PR Valuable?
Enables use of the CF CLI in scripts that execute commands in parallel
Applicable Issues
fixes #2232
How Urgent Is The Change?
Medium
Other Relevant Parties
Anyone using the CLI in scripts with parallel execution. Multiple users have encountered this issue, as discussed in #2232.
Related PRs
v7
v8