Merge pull request #155 from jkaninda/nightly
 fix: the configuration file path is not being detected when it is enclosed in quotes
jkaninda authored Jan 12, 2025
2 parents 104c9b5 + 1297606 commit 2c9790c
Showing 19 changed files with 909 additions and 499 deletions.
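For context on the fix itself: in the Compose list syntax used throughout these docs, quoting a value after `=` passes the literal quote characters into the environment variable, so a configuration file path can reach the tool wrapped in quotes. The sketch below only illustrates that situation; it is not part of the diff, and the `BACKUP_CONFIG_FILE` variable name is an assumption borrowed from the project's multi-database docs rather than something confirmed by this commit.

```yaml
services:
  pg-bkup:
    image: jkaninda/pg-bkup
    command: backup
    environment:
      # Hypothetical illustration: with Compose list syntax, the value below is
      # received as "/backup/config.yaml" *including* the surrounding quotes.
      # Per the commit message, such quoted paths were previously not detected;
      # after this fix the path should be recognized (how the quotes are handled
      # internally is an assumption, not confirmed by the diff shown here).
      - BACKUP_CONFIG_FILE="/backup/config.yaml"
```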
6 changes: 3 additions & 3 deletions .github/workflows/build.yml
@@ -1,7 +1,7 @@
name: Build
on:
push:
branches: ['develop']
branches: ['nightly']
jobs:
docker:
runs-on: ubuntu-latest
@@ -26,7 +26,7 @@ jobs:
file: "./Dockerfile"
platforms: linux/amd64,linux/arm64,linux/arm/v7
build-args: |
appVersion=develop-${{ github.sha }}
appVersion=nightly
tags: |
"${{vars.BUILDKIT_IMAGE}}:develop-${{ github.sha }}"
"${{vars.BUILDKIT_IMAGE}}:nightly"
15 changes: 8 additions & 7 deletions README.md
@@ -15,6 +15,7 @@ It supports a variety of storage options and ensures data security through GPG e
- Local storage
- AWS S3 or any S3-compatible object storage
- FTP
- SFTP
- SSH-compatible storage
- Azure Blob storage

@@ -36,7 +37,7 @@ It supports a variety of storage options and ensures data security through GPG e
## Use Cases

- **Automated Recurring Backups:** Schedule regular backups for PostgreSQL databases.
- **Cross-Environment Migration:** Easily migrate your PostgreSQL databases across different environments using supported storage options.
- **Cross-Environment Migration:** Easily migrate PostgreSQL databases across different environments using supported storage options.
- **Secure Backup Management:** Protect your data with GPG encryption.


@@ -189,13 +190,13 @@ Documentation references Docker Hub, but all examples will work using ghcr.io ju
## References
We decided to publish this image as a simpler and more lightweight alternative because of the following requirements:
We created this image as a simpler and more lightweight alternative to existing solutions. Here’s why:
- **Lightweight:** Written in Go, the image is optimized for performance and minimal resource usage.
- **Multi-Architecture Support:** Supports `arm64` and `arm/v7` architectures.
- **Docker Swarm Support:** Fully compatible with Docker in Swarm mode.
- **Kubernetes Support:** Designed to work seamlessly with Kubernetes.
- The original image is based on `Alpine` and requires additional tools, making it heavy.
- This image is written in Go.
- `arm64` and `arm/v7` architectures are supported.
- Docker in Swarm mode is supported.
- Kubernetes is supported.
## License
52 changes: 40 additions & 12 deletions docs/how-tos/azure-blob.md
@@ -4,22 +4,43 @@ layout: default
parent: How Tos
nav_order: 5
---
# Azure Blob storage

{: .note }
As described on local backup section, to change the storage of your backup and use Azure Blob as storage. You need to add `--storage azure` (-s azure).
You can also specify a folder where you want to save you data by adding `--path my-custom-path` flag.
# Backup to Azure Blob Storage

To store your backups on Azure Blob Storage, you can configure the backup process to use the `--storage azure` option.

## Backup to Azure Blob storage
This section explains how to set up and configure Azure Blob-based backups.

```yml
---

## Configuration Steps

1. **Specify the Storage Type**
Add the `--storage azure` flag to your backup command.

2. **Set the Blob Path**
Optionally, use the `--path` flag to specify a custom folder within your Azure Blob container where backups will be stored.
Example: `--path my-custom-path`.

3. **Required Environment Variables**
The following environment variables are mandatory for Azure Blob-based backups:

- `AZURE_STORAGE_CONTAINER_NAME`: The name of the Azure Blob container where backups will be stored.
- `AZURE_STORAGE_ACCOUNT_NAME`: The name of your Azure Storage account.
- `AZURE_STORAGE_ACCOUNT_KEY`: The access key for your Azure Storage account.

---

## Example Configuration

Below is an example `docker-compose.yml` configuration for backing up to Azure Blob Storage:

```yaml
services:
pg-bkup:
# In production, it is advised to lock your image tag to a proper
# release version instead of using `latest`.
# Check https://github.com/jkaninda/pg-bkup/releases
# for a list of available releases.
# In production, lock your image tag to a specific release version
# instead of using `latest`. Check https://github.com/jkaninda/pg-bkup/releases
# for available releases.
image: jkaninda/pg-bkup
container_name: pg-bkup
command: backup --storage azure -d database --path my-custom-path
@@ -29,16 +50,23 @@ services:
- DB_NAME=database
- DB_USERNAME=username
- DB_PASSWORD=password
## Azure Blob configurations
## Azure Blob Configuration
- AZURE_STORAGE_CONTAINER_NAME=backup-container
- AZURE_STORAGE_ACCOUNT_NAME=account-name
- AZURE_STORAGE_ACCOUNT_KEY=Ppby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==
# pg-bkup container must be connected to the same network with your database

# Ensure the pg-bkup container is connected to the same network as your database
networks:
- web

networks:
web:
```
---
## Key Notes
- **Custom Path**: Use the `--path` flag to specify a folder within your Azure Blob container for organizing backups.
- **Security**: Ensure your `AZURE_STORAGE_ACCOUNT_KEY` is kept secure and not exposed in public repositories. One way to do this is sketched right after these notes.
- **Compatibility**: This configuration works with Azure Blob Storage and other compatible storage solutions.
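As mentioned in the security note above, one simple way to keep the account key out of the compose file is standard Docker Compose `env_file` support. This is a minimal sketch, assuming the Azure variables are moved into an untracked file named `azure.env` (the file name is a placeholder):

```yaml
services:
  pg-bkup:
    image: jkaninda/pg-bkup
    container_name: pg-bkup
    command: backup --storage azure -d database --path my-custom-path
    env_file:
      # azure.env holds AZURE_STORAGE_CONTAINER_NAME, AZURE_STORAGE_ACCOUNT_NAME,
      # and AZURE_STORAGE_ACCOUNT_KEY; keep it out of version control.
      - azure.env
    environment:
      - DB_PORT=5432
      - DB_HOST=postgres
      - DB_NAME=database
      - DB_USERNAME=username
      - DB_PASSWORD=password
    networks:
      - web

networks:
  web:
```

Keeping `azure.env` in `.gitignore` ensures the account key never lands in a public repository.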
55 changes: 42 additions & 13 deletions docs/how-tos/backup-to-ftp.md
@@ -4,22 +4,43 @@ layout: default
parent: How Tos
nav_order: 4
---
# Backup to FTP remote server

# Backup to FTP Remote Server

As described for s3 backup section, to change the storage of your backup and use FTP Remote server as storage. You need to add `--storage ftp`.
You need to add the full remote path by adding `--path /home/jkaninda/backups` flag or using `REMOTE_PATH` environment variable.
To store your backups on an FTP remote server, you can configure the backup process to use the `--storage ftp` option. This section explains how to set up and configure FTP-based backups.

{: .note }
These environment variables are required for SSH backup `FTP_HOST`, `FTP_USER`, `REMOTE_PATH`, `FTP_PORT` or `FTP_PASSWORD`.
---

## Configuration Steps

1. **Specify the Storage Type**
Add the `--storage ftp` flag to your backup command.

2. **Set the Remote Path**
Define the full remote path where backups will be stored using the `--path` flag or the `REMOTE_PATH` environment variable.
Example: `--path /home/jkaninda/backups`.

3. **Required Environment Variables**
The following environment variables are mandatory for FTP-based backups:

- `FTP_HOST`: The hostname or IP address of the FTP server.
- `FTP_PORT`: The FTP port (default is `21`).
- `FTP_USER`: The username for FTP authentication.
- `FTP_PASSWORD`: The password for FTP authentication.
- `REMOTE_PATH`: The directory on the FTP server where backups will be stored.

---

```yml
## Example Configuration

Below is an example `docker-compose.yml` configuration for backing up to an FTP remote server:

```yaml
services:
pg-bkup:
# In production, it is advised to lock your image tag to a proper
# release version instead of using `latest`.
# Check https://github.com/jkaninda/pg-bkup/releases
# for a list of available releases.
# In production, lock your image tag to a specific release version
# instead of using `latest`. Check https://github.com/jkaninda/pg-bkup/releases
# for available releases.
image: jkaninda/pg-bkup
container_name: pg-bkup
command: backup --storage ftp -d database
@@ -29,16 +50,24 @@ services:
- DB_NAME=database
- DB_USERNAME=username
- DB_PASSWORD=password
## FTP config
## FTP Configuration
- FTP_HOST="hostname"
- FTP_PORT=21
- FTP_USER=user
- FTP_PASSWORD=password
- REMOTE_PATH=/home/jkaninda/backups

# pg-bkup container must be connected to the same network with your database
# Ensure the pg-bkup container is connected to the same network as your database
networks:
- web

networks:
web:
```
```
---
## Key Notes
- **Security**: FTP transmits data, including passwords, in plaintext. For better security, consider using SFTP (SSH File Transfer Protocol) or FTPS (FTP Secure) if supported by your server.
- **Remote Path**: Ensure the `REMOTE_PATH` directory exists on the FTP server and is writable by the specified `FTP_USER`.
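The same FTP setup can also run on a schedule. The cron flag and `BACKUP_CRON_EXPRESSION` variable are documented for S3 later in this commit; the sketch below assumes they behave the same way with `--storage ftp`:

```yaml
services:
  pg-bkup:
    image: jkaninda/pg-bkup
    container_name: pg-bkup
    # Assumption: the scheduling flag documented for S3 also applies to FTP.
    # "0 1 * * *" = minute 0, hour 1, every day -> daily at 01:00.
    command: backup --storage ftp -d database --cron-expression "0 1 * * *"
    environment:
      - DB_PORT=5432
      - DB_HOST=postgres
      - DB_NAME=database
      - DB_USERNAME=username
      - DB_PASSWORD=password
      - FTP_HOST=hostname
      - FTP_PORT=21
      - FTP_USER=user
      - FTP_PASSWORD=password
      - REMOTE_PATH=/home/jkaninda/backups
    networks:
      - web

networks:
  web:
```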
102 changes: 70 additions & 32 deletions docs/how-tos/backup-to-s3.md
@@ -4,22 +4,44 @@ layout: default
parent: How Tos
nav_order: 2
---
# Backup to AWS S3
# Backup to AWS S3

{: .note }
As described on local backup section, to change the storage of you backup and use S3 as storage. You need to add `--storage s3` (-s s3).
You can also specify a specify folder where you want to save you data by adding `--path /my-custom-path` flag.
To store your backups on AWS S3, you can configure the backup process to use the `--storage s3` option. This section explains how to set up and configure S3-based backups.

---

## Configuration Steps

1. **Specify the Storage Type**
Add the `--storage s3` flag to your backup command.

2. **Set the S3 Path**
Optionally, use the `--path` flag to specify a custom folder within your S3 bucket where backups will be stored.
Example: `--path /my-custom-path`.

3. **Required Environment Variables**
The following environment variables are mandatory for S3-based backups:

## Backup to S3
- `AWS_S3_ENDPOINT`: The S3 endpoint URL (e.g., `https://s3.amazonaws.com`).
- `AWS_S3_BUCKET_NAME`: The name of the S3 bucket where backups will be stored.
- `AWS_REGION`: The AWS region where the bucket is located (e.g., `us-west-2`).
- `AWS_ACCESS_KEY`: Your AWS access key.
- `AWS_SECRET_KEY`: Your AWS secret key.
- `AWS_DISABLE_SSL`: Set to `"true"` if using an S3 alternative like Minio without SSL (default is `"false"`).
- `AWS_FORCE_PATH_STYLE`: Set to `"true"` if using an S3 alternative like Minio (default is `"false"`).

```yml
---

## Example Configuration

Below is an example `docker-compose.yml` configuration for backing up to AWS S3:

```yaml
services:
pg-bkup:
# In production, it is advised to lock your image tag to a proper
# release version instead of using `latest`.
# Check https://github.com/jkaninda/pg-bkup/releases
# for a list of available releases.
# In production, lock your image tag to a specific release version
# instead of using `latest`. Check https://github.com/jkaninda/pg-bkup/releases
# for available releases.
image: jkaninda/pg-bkup
container_name: pg-bkup
command: backup --storage s3 -d database --path /my-custom-path
@@ -29,60 +51,76 @@ services:
- DB_NAME=database
- DB_USERNAME=username
- DB_PASSWORD=password
## AWS configurations
## AWS Configuration
- AWS_S3_ENDPOINT=https://s3.amazonaws.com
- AWS_S3_BUCKET_NAME=backup
- AWS_REGION="us-west-2"
- AWS_REGION=us-west-2
- AWS_ACCESS_KEY=xxxx
- AWS_SECRET_KEY=xxxxx
## In case you are using S3 alternative such as Minio and your Minio instance is not secured, you change it to true
## Optional: Disable SSL for S3 alternatives like Minio
- AWS_DISABLE_SSL="false"
- AWS_FORCE_PATH_STYLE=false # true for S3 alternative such as Minio

# pg-bkup container must be connected to the same network with your database
## Optional: Enable path-style access for S3 alternatives like Minio
- AWS_FORCE_PATH_STYLE=false

# Ensure the pg-bkup container is connected to the same network as your database
networks:
- web

networks:
web:
```
### Recurring backups to S3
---
## Recurring Backups to S3
As explained above, you need just to add AWS environment variables and specify the storage type `--storage s3`.
In case you need to use recurring backups, you can use `--cron-expression "0 1 * * *"` flag or `BACKUP_CRON_EXPRESSION=0 1 * * *` as described below.
To schedule recurring backups to S3, use the `--cron-expression` flag or the `BACKUP_CRON_EXPRESSION` environment variable. This allows you to define a cron schedule for automated backups.

```yml
### Example: Recurring Backup Configuration

```yaml
services:
pg-bkup:
# In production, it is advised to lock your image tag to a proper
# release version instead of using `latest`.
# Check https://github.com/jkaninda/pg-bkup/releases
# for a list of available releases.
# In production, lock your image tag to a specific release version
# instead of using `latest`. Check https://github.com/jkaninda/pg-bkup/releases
# for available releases.
image: jkaninda/pg-bkup
container_name: pg-bkup
command: backup --storage s3 -d my-database"
command: backup --storage s3 -d database --cron-expression "0 1 * * *"
environment:
- DB_PORT=5432
- DB_HOST=postgres
- DB_NAME=database
- DB_USERNAME=username
- DB_PASSWORD=password
## AWS configurations
## AWS Configuration
- AWS_S3_ENDPOINT=https://s3.amazonaws.com
- AWS_S3_BUCKET_NAME=backup
- AWS_REGION="us-west-2"
- AWS_REGION=us-west-2
- AWS_ACCESS_KEY=xxxx
- AWS_SECRET_KEY=xxxxx
# - BACKUP_CRON_EXPRESSION=0 1 * * * # Optional
#Delete old backup created more than specified days ago
## Optional: Define a cron schedule for recurring backups
#- BACKUP_CRON_EXPRESSION=0 1 * * *
## Optional: Delete old backups after a specified number of days
#- BACKUP_RETENTION_DAYS=7
## In case you are using S3 alternative such as Minio and your Minio instance is not secured, you change it to true
## Optional: Disable SSL for S3 alternatives like Minio
- AWS_DISABLE_SSL="false"
- AWS_FORCE_PATH_STYLE=true # true for S3 alternative such as Minio
# pg-bkup container must be connected to the same network with your database
## Optional: Enable path-style access for S3 alternatives like Minio
- AWS_FORCE_PATH_STYLE=false

# Ensure the pg-bkup container is connected to the same network as your database
networks:
- web

networks:
web:
```
---
## Key Notes
- **Cron Expression**: Use the `--cron-expression` flag or `BACKUP_CRON_EXPRESSION` environment variable to define the backup schedule. For example, `0 1 * * *` runs the backup daily at 1:00 AM.
- **Backup Retention**: Optionally, use the `BACKUP_RETENTION_DAYS` environment variable to automatically delete backups older than a specified number of days.
- **S3 Alternatives**: If using an S3 alternative like Minio, set `AWS_DISABLE_SSL="true"` and `AWS_FORCE_PATH_STYLE="true"` as needed.
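To make the last note concrete, here is a minimal sketch of the environment overrides for a self-hosted Minio endpoint; the endpoint URL, bucket name, region, and credentials are placeholders:

```yaml
services:
  pg-bkup:
    image: jkaninda/pg-bkup
    container_name: pg-bkup
    command: backup --storage s3 -d database
    environment:
      - DB_PORT=5432
      - DB_HOST=postgres
      - DB_NAME=database
      - DB_USERNAME=username
      - DB_PASSWORD=password
      # Placeholder Minio endpoint; adjust host, port, bucket, and region to your setup.
      - AWS_S3_ENDPOINT=http://minio:9000
      - AWS_S3_BUCKET_NAME=backup
      - AWS_REGION=us-east-1
      - AWS_ACCESS_KEY=xxxx
      - AWS_SECRET_KEY=xxxxx
      # Per the note above: disable SSL and force path-style access for Minio.
      - AWS_DISABLE_SSL=true
      - AWS_FORCE_PATH_STYLE=true
    networks:
      - web

networks:
  web:
```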
