
Commit

Updating docs and repo name
borkweb committed Feb 17, 2021
1 parent cac8ee0 commit 0165a1b
Showing 4 changed files with 123 additions and 22 deletions.
2 changes: 1 addition & 1 deletion Dockerfile
@@ -1,7 +1,7 @@
FROM python:3.7-alpine

LABEL version="1.0.0"
LABEL repository="https://github.com/the-events-calendar/action-s3-exists"
LABEL repository="https://github.com/the-events-calendar/action-s3-utility"
LABEL maintainer="The Events Calendar <support@theeventscalendar.com>"

# https://github.com/aws/aws-cli/blob/master/CHANGELOG.rst
133 changes: 116 additions & 17 deletions README.md
@@ -1,19 +1,40 @@
# GitHub Action to check for file existence in an S3 bucket
# GitHub Action to perform various S3 commands

This action uses the AWS CLI to verify a file's existence to help prevent executing GitHub Workflows when they are unnecessary.
This action uses the AWS CLI to execute S3 commands (`ls`, `sync`, `rm`) and helper commands (`exists`). Helper commands are simplified, GitHub Action-specific wrappers around common S3 operations.

## Usage
## Configuration

This action relies on having an S3 bucket on which to make requests.
The following settings must be passed as environment variables for all
of the commands provided by this GitHub Action. Sensitive values, especially `S3_ACCESS_KEY_ID` and
`S3_SECRET_ACCESS_KEY`, should be stored as encrypted secrets; otherwise
they will be visible to anyone browsing your repository's source code and
CI logs.

| Key | Value | Suggested Type | Required | Default |
| ------------- | ------------- | ------------- | ------------- | ------------- |
| `COMMAND` | The action command (see below) you wish to execute | `env` | **Yes** | N/A |
| `S3_ACCESS_KEY_ID` | Your AWS Access Key. [More info here.](https://docs.aws.amazon.com/general/latest/gr/managing-aws-access-keys.html) | `secret env` | **Yes** | N/A |
| `S3_SECRET_ACCESS_KEY` | Your AWS Secret Access Key. [More info here.](https://docs.aws.amazon.com/general/latest/gr/managing-aws-access-keys.html) | `secret env` | **Yes** | N/A |
| `S3_BUCKET` | The name of the bucket the command operates on. For example, `jarv.is` or `my-app-releases`. | `secret env` | **Yes** | N/A |
| `S3_REGION` | The region where you created your bucket. Set to `us-east-1` by default. [Full list of regions here.](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-regions-availability-zones.html#concepts-available-regions) | `env` | No | `us-east-1` |
| `S3_ENDPOINT` | The endpoint URL of the bucket you're targeting. Can be used for [VPC scenarios](https://aws.amazon.com/blogs/aws/new-vpc-endpoint-for-amazon-s3/) or for non-AWS services using the S3 API, like [DigitalOcean Spaces](https://www.digitalocean.com/community/tools/adapting-an-existing-aws-s3-application-to-digitalocean-spaces). | `env` | No | Automatic (`s3.amazonaws.com` or AWS's region-specific equivalent) |
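
Under the hood, these settings feed the AWS CLI inside the action's container. A minimal sketch of how that wiring could look, assuming the entrypoint simply exports the standard AWS variables and dispatches to a script in `commands/` named after `COMMAND` (the action's real entrypoint may differ):

```
#!/bin/sh
# Sketch only, not the action's actual entrypoint.
set -e

# The AWS CLI reads credentials and region from these standard variables.
export AWS_ACCESS_KEY_ID="$S3_ACCESS_KEY_ID"
export AWS_SECRET_ACCESS_KEY="$S3_SECRET_ACCESS_KEY"
export AWS_DEFAULT_REGION="${S3_REGION:-us-east-1}"

# Each COMMAND value maps to a script in commands/, e.g. commands/ls.sh for COMMAND=ls.
sh "commands/${COMMAND}.sh"
```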

### `workflow.yml` Step Example

## Commands

### `exists`

This command performs `aws s3api head-object` to check whether a file exists in the bucket.

#### `workflow.yml` Step Example

You can add the following as a step in one of your workflows:

```
- name: S3 File Exists
  uses: the-events-calendar/action-s3-exists
- name: S3 exists
  uses: the-events-calendar/action-s3-utility@main
  env:
    COMMAND: exists
    S3_BUCKET: ${{ secrets.S3_BUCKET }}
    S3_ACCESS_KEY_ID: ${{ secrets.S3_ACCESS_KEY_ID }}
    S3_SECRET_ACCESS_KEY: ${{ secrets.S3_SECRET_ACCESS_KEY }}
@@ -24,17 +45,95 @@ You can add the following as a step in one of your workflows:

#### Configuration

The following settings must be passed as environment variables as shown
in the example. Sensitive information, especially `S3_ACCESS_KEY_ID` and
`S3_SECRET_ACCESS_KEY` should be set as encrypted secrets — otherwise,
they'll be public to anyone browsing your repository's source code and
CI logs.
Additional configuration for this command:

| Key | Value | Suggested Type | Required | Default |
| ------------- | ------------- | ------------- | ------------- | ------------- |
| `S3_ACCESS_KEY_ID` | Your AWS Access Key. [More info here.](https://docs.aws.amazon.com/general/latest/gr/managing-aws-access-keys.html) | `secret env` | **Yes** | N/A |
| `S3_SECRET_ACCESS_KEY` | Your AWS Secret Access Key. [More info here.](https://docs.aws.amazon.com/general/latest/gr/managing-aws-access-keys.html) | `secret env` | **Yes** | N/A |
| `S3_BUCKET` | The name of the bucket you're syncing to. For example, `jarv.is` or `my-app-releases`. | `secret env` | **Yes** | N/A |
| `S3_REGION` | The region where you created your bucket. Set to `us-east-1` by default. [Full list of regions here.](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-regions-availability-zones.html#concepts-available-regions) | `env` | No | `us-east-1` |
| `S3_ENDPOINT` | The endpoint URL of the bucket you're syncing to. Can be used for [VPC scenarios](https://aws.amazon.com/blogs/aws/new-vpc-endpoint-for-amazon-s3/) or for non-AWS services using the S3 API, like [DigitalOcean Spaces](https://www.digitalocean.com/community/tools/adapting-an-existing-aws-s3-application-to-digitalocean-spaces). | `env` | No | Automatic (`s3.amazonaws.com` or AWS's region-specific equivalent) |
| `FILE` | The file whose existence you want to check | `env` | No | `/` (root of bucket) |
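
For reference, the check can be reproduced with the AWS CLI directly. A minimal sketch, assuming the same environment variables and the `::set-output` convention used by the other command scripts; this is an illustration, not the action's exact script:

```
#!/bin/sh
# head-object exits non-zero when the key does not exist.
if aws s3api head-object --bucket "$S3_BUCKET" --key "$FILE" > /dev/null 2>&1; then
  exists=true
else
  exists=false
fi

echo "File exists: $exists"
echo "::set-output name=exists::${exists}"
```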

### `ls`

This command performs `aws s3 ls` and exposes the command's output as the `ls_output` action output.

#### `workflow.yml` Step Example

You can add the following as a step in one of your workflows:

```
- name: S3 ls
  uses: the-events-calendar/action-s3-utility@main
  env:
    COMMAND: ls
    S3_BUCKET: ${{ secrets.S3_BUCKET }}
    S3_ACCESS_KEY_ID: ${{ secrets.S3_ACCESS_KEY_ID }}
    S3_SECRET_ACCESS_KEY: ${{ secrets.S3_SECRET_ACCESS_KEY }}
    S3_REGION: ${{ secrets.S3_REGION }}
    S3_ENDPOINT: ${{ secrets.S3_ENDPOINT }}
    FILE: 'some-file-name.txt'
```

#### Configuration

Additional configuration for this command:

| Key | Value | Suggested Type | Required | Default |
| ------------- | ------------- | ------------- | ------------- | ------------- |
| `FILE` | The file or prefix you want to list | `env` | No | `/` (root of bucket) |
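
For context, the underlying call looks roughly like this, with `FILE` acting as the key or prefix to list (a sketch, not the verbatim `commands/ls.sh`):

```
#!/bin/sh
# List the given key/prefix and expose the listing as the ls_output output.
output=$(aws s3 ls "s3://${S3_BUCKET}/${FILE}")

echo "$output"
echo "::set-output name=ls_output::${output}"
```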

### `rm`

This command performs `aws s3 rm`.

#### `workflow.yml` Step Example

You can add the following as a step in one of your workflows:

```
- name: S3 rm
  uses: the-events-calendar/action-s3-utility@main
  env:
    COMMAND: rm
    S3_BUCKET: ${{ secrets.S3_BUCKET }}
    S3_ACCESS_KEY_ID: ${{ secrets.S3_ACCESS_KEY_ID }}
    S3_SECRET_ACCESS_KEY: ${{ secrets.S3_SECRET_ACCESS_KEY }}
    S3_REGION: ${{ secrets.S3_REGION }}
    S3_ENDPOINT: ${{ secrets.S3_ENDPOINT }}
    FILE: 'some-file-name.txt'
```

#### Configuration

Additional configuration for this command:

| Key | Value | Suggested Type | Required | Default |
| ------------- | ------------- | ------------- | ------------- | ------------- |
| `FILE` | The file you want to remove | `env` | No | `/` (root of bucket) |
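
Conceptually this maps onto a single CLI call. A minimal sketch, assuming `FILE` names the object key to delete; removing a whole prefix would additionally need `--recursive`:

```
#!/bin/sh
# Delete the given key from the bucket.
aws s3 rm "s3://${S3_BUCKET}/${FILE}"
```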

### `sync`

This command performs `aws s3 sync`.

#### `workflow.yml` Step Example

You can add the following as a step in one of your workflows:

```
- name: S3 sync
  uses: the-events-calendar/action-s3-utility@main
  env:
    COMMAND: sync
    S3_BUCKET: ${{ secrets.S3_BUCKET }}
    S3_ACCESS_KEY_ID: ${{ secrets.S3_ACCESS_KEY_ID }}
    S3_SECRET_ACCESS_KEY: ${{ secrets.S3_SECRET_ACCESS_KEY }}
    S3_REGION: ${{ secrets.S3_REGION }}
    S3_ENDPOINT: ${{ secrets.S3_ENDPOINT }}
    SOURCE_DIR: 'some-dir/'
```

#### Configuration

Additional configuration for this command:

| Key | Value | Suggested Type | Required | Default |
| ------------- | ------------- | ------------- | ------------- | ------------- |
| `SOURCE_DIR` | The file or directory you wish to sync | `env` | No | `/` (root of bucket) |
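
As with the other commands, this maps onto a single CLI call. A minimal sketch, assuming `SOURCE_DIR` is a local path being pushed to the bucket root; the action's script may pass additional flags:

```
#!/bin/sh
# Mirror the local directory into the bucket.
aws s3 sync "$SOURCE_DIR" "s3://${S3_BUCKET}/"
```
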
8 changes: 5 additions & 3 deletions action.yml
@@ -1,12 +1,14 @@
name: 'S3 Exists'
description: 'Check if a file or directory exists within an S3 bucket'
name: 'S3 Utility'
description: 'Execute various s3 (ls, sync, rm) and s3 helper commands (exists)'
author: 'The Events Calendar (support@theeventscalendar.com)'
branding:
icon: file
color: blue
outputs:
exists:
description: \'true\' if file or directory exists, \'false\' otherwise
description: With the exists s3 helper command - \'true\' if file or directory exists, \'false\' otherwise
ls_output:
description: Output from \'aws s3 ls\' command
runs:
using: 'docker'
image: 'Dockerfile'
2 changes: 1 addition & 1 deletion commands/ls.sh
@@ -16,4 +16,4 @@ output=$(sh -c "$the_command")

echo $output

echo "::set-output name=value::${output}"
echo "::set-output name=ls_output::${output}"
