Create SOPS file with TF generated secrets #50
Hey @ztmr,

A sample use-case is to generate some data via Terraform and output it onto the file system encrypted. This can then be subsequently consumed by other tooling.
I don't like calling OS commands from Terraform, but this is a simplified version of what we have done as a workaround for now:

```hcl
resource "local_file" "environment-infra-yaml" {
  sensitive_content = yamlencode({ some = 1, data = 2 })
  filename          = "environment-infra.yaml"
  file_permission   = "0640"
}

resource "null_resource" "sops-encrypt-infra-secrets" {
  depends_on = [local_file.environment-infra-yaml]

  triggers = {
    # This is to handle updates
    environment_infra_yaml_updated = local_file.environment-infra-yaml.id
  }

  provisioner "local-exec" {
    command = "sops -e -i environment-infra.yaml"
  }
}
```

Definitely better than hooking the app deployment to Terraform directly: it keeps infra and app provisioning separated and allows for some debugging too.
@ztmr the problem with local-exec is that the file gets regenerated and re-encrypted on every run.

If the provider just had a resource type for a SOPS file, with a parameter for the content to be encrypted, it could decrypt the existing file in memory, compare it with the content parameter, and only regenerate when necessary. I tried a trigger on the content of the file, and this works but has its own problems: if you remove the encrypted file while the content has not changed, the file does not get regenerated.
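Until such a resource exists, the decrypt-and-compare idea can be approximated inside a `local-exec` guard. A minimal sketch, assuming `sops` is on the PATH and the content is YAML; `local.plaintext` and the `secrets.enc.yaml` file name are placeholders:

```hcl
resource "null_resource" "sops_encrypt_idempotent" {
  triggers = {
    # Run on every apply so a deleted output file gets recreated;
    # the guard in the command keeps the on-disk file stable otherwise.
    always = timestamp()
  }

  provisioner "local-exec" {
    # Skip re-encryption when the existing ciphertext already decrypts
    # to the desired plaintext; re-encrypt if the file is missing or stale.
    command = <<-EOT
      if [ ! -f secrets.enc.yaml ] || \
         [ "$(sops -d secrets.enc.yaml)" != "$PLAINTEXT" ]; then
        printf '%s' "$PLAINTEXT" | \
          sops -e --input-type yaml --output-type yaml /dev/stdin > secrets.enc.yaml
      fi
    EOT
    environment = {
      PLAINTEXT = local.plaintext
    }
  }
}
```

The `timestamp()` trigger does mean the null_resource is replaced on every apply, which is noisy in plans; it is a trade-off for making the file itself self-healing.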
@schollii you are absolutely right! Our use-case, however, is to run this in a GitOps-style pipeline, so there is no human interaction or any pre-existing files. That's why we chose not to use that approach. I am still hoping to see a native solution without any intermediate files at some point! 😉
BTW I use sops in https://registry.terraform.io/modules/schollii/gen-files/local/latest |
An issue was opened there |
I'd like to add a TF snippet that we are going to use; hopefully it's helpful to anyone who wants to encrypt files with SOPS via Terraform. As mentioned earlier, the SOPS Go package does not expose encryption functionality, so it is not possible to embed it in this provider. Calling the sops binary directly is what remains.

This code fills a template and passes the content via standard input to SOPS. It triggers on a hash of the content to decide whether the file needs to be regenerated: clean and simple.
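The snippet itself did not survive in this thread; a minimal reconstruction along the lines described above (the template file name, output path, and `db_password` variable are placeholders, and `sops` is assumed to be on the PATH):

```hcl
locals {
  # Render the template that holds the secret material
  secret_content = templatefile("${path.module}/secrets.tftpl", {
    db_password = var.db_password # hypothetical template variable
  })
}

resource "null_resource" "sops_encrypt" {
  triggers = {
    # Regenerate only when the rendered content changes
    content_hash = sha256(local.secret_content)
  }

  provisioner "local-exec" {
    # Feed the rendered template to sops on stdin and write the ciphertext
    command = <<-EOT
      printf '%s' "$CONTENT" | \
        sops -e --input-type yaml --output-type yaml /dev/stdin > secrets.enc.yaml
    EOT
    environment = {
      CONTENT = local.secret_content
    }
  }
}
```

Passing the content through the `environment` block avoids quoting problems that arise from interpolating the plaintext directly into the shell command.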
https://github.com/lokkersp/terraform-provider-sops, which appears to be a fork of this repo, actually adds writing of the file. Could we maybe merge these two workstreams so we can have the full functionality?
@iverberk the problem is that local-exec runs not at plan time but at apply time. Say I want to use the content of the encrypted file to put the value on S3 (e.g. if I use Terraform to template a Flux S3 source bucket). The problem is that local-exec cannot be used as an output, and the file gets updated only after the apply (I would have to apply twice to pick up the change; it is not consistent). It would be really great if the plugin could generate the SOPS file in memory at plan time.
For posterity, I have a solution that works with an "external" data source instead of a "null_resource", which allows calling SOPS at plan time.

```hcl
// Template all files locally
module "template_infrastructure_files" {
  source  = "hashicorp/dir/template"
  version = "1.0.2"

  base_dir = "${path.module}/s3-structure/infrastructure"
  template_vars = {
    # Pass in any values that you wish to use in your templates.
    traefik_user_secret_token = var.traefik_dashboard_secret
  }
}

// Keep only the files with "-secret" in the name, so we know which ones to run SOPS on
locals {
  // All files of the infra directory that do not contain "-secret"
  infrastructure_file_not_encrypted = { for k, v in module.template_infrastructure_files.files : k => v if length(regexall("-secret", k)) <= 0 }
  infrastructure_file_to_encrypt    = { for k, v in module.template_infrastructure_files.files : k => v if length(regexall("-secret", k)) > 0 }
}

// Execute a local bash script that calls SOPS
data "external" "encrypt_infra_secrets" {
  for_each = local.infrastructure_file_to_encrypt
  program  = ["bash", "${path.module}/encrypt_with_sops.sh"]
  query = {
    content         = each.value.content # Means each value must come from a template; if not, we could also pass the source path and update the script accordingly
    encrypted_regex = "^(data|stringData)$"
    bucket_path     = each.key
    kms_arn         = var.sops_kms_arn
  }
}

// Example usage with S3: upload the encrypted value to the bucket
resource "aws_s3_object" "infrastructure_bucket_encrypted" {
  for_each = data.external.encrypt_infra_secrets

  bucket      = var.flux_bucket_name
  key         = "infrastructure/${each.value.result.bucket_path}"
  content     = each.value.result.encrypted_content
  source_hash = each.value.result["md5_hash"] # Not mandatory
}
```
The bash script:

```bash
#!/bin/bash
# Requires jq
# Requires md5sum ("brew install md5sha1sum" on macOS; not mandatory)

# Read inputs from stdin
eval "$(jq -r '@sh "CONTENT=\(.content) ENCRYPTED_REGEX=\(.encrypted_regex) BUCKET_PATH=\(.bucket_path) SOPS_KMS_ARN=\(.kms_arn)"')"
export SOPS_KMS_ARN=$SOPS_KMS_ARN

# Encrypt the content
ENCRYPTED_CONTENT=$(echo "$CONTENT" | sops -e --encrypted-regex "$ENCRYPTED_REGEX" --input-type yaml --output-type yaml /dev/stdin)
MD5_HASH=$(echo -n "$CONTENT" | md5sum | awk '{ print $1 }')

# Return the encrypted content, bucket path and hash as JSON
jq -n --arg md5_hash "$MD5_HASH" --arg encrypted_content "$ENCRYPTED_CONTENT" --arg bucket_path "$BUCKET_PATH" '{"encrypted_content":$encrypted_content, "bucket_path":$bucket_path, "md5_hash":$md5_hash}'
```
The only drawback is that SOPS always produces different ciphertext for the same input, so subsequent runs end up with a different output, and this triggers a re-upload to S3.
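For anyone adapting this: the `external` data source contract is simply "read a JSON object on stdin, print a flat JSON object of strings on stdout". A minimal stand-in for `encrypt_with_sops.sh` that exercises the plumbing without calling `sops` (the input values here are made up; a real run receives the `query` block from Terraform, and the real script uses jq rather than sed for parsing):

```shell
# Fake `query` payload, as Terraform would send it on stdin
input='{"content":"some: data","bucket_path":"app-secret.yaml"}'

# Crude field extraction for the demo; use jq in real scripts
content=$(printf '%s' "$input" | sed -n 's/.*"content":"\([^"]*\)".*/\1/p')
bucket_path=$(printf '%s' "$input" | sed -n 's/.*"bucket_path":"\([^"]*\)".*/\1/p')

# Hash the plaintext, as the real script does for source_hash
md5_hash=$(printf '%s' "$content" | md5sum | awk '{ print $1 }')

# Emit the flat string-to-string JSON object Terraform expects back
printf '{"bucket_path":"%s","md5_hash":"%s"}\n' "$bucket_path" "$md5_hash"
```

Because the md5 hash is computed from the plaintext rather than the ciphertext, it stays stable across runs, which is exactly why `source_hash` in the `aws_s3_object` helps limit the churn described above.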
Is there any way to produce a SOPS-encrypted file from Terraform?

I can certainly do this by using a `local_file` module/resource and calling SOPS just after Terraform has dumped the plaintext. It would be much more convenient to do it directly, given we already have this SOPS provider.