
Add Google Dataflow docs #3148

Open · wants to merge 17 commits into main

Conversation

BentsiLeviav
Contributor

Summary

These pages organize the knowledge around ClickHouse and Google Dataflow, including coverage of the Dataflow templates.

Checklist

@BentsiLeviav BentsiLeviav requested a review from a team as a code owner January 27, 2025 14:57
@BentsiLeviav BentsiLeviav requested review from saisrirampur, mzitnik, mshustov and laeg and removed request for a team January 27, 2025 14:57
Contributor

@gingerwizard gingerwizard left a comment


@@ -3452,4 +3452,24 @@ znode
znodes
zookeeperSessionUptime
zstd
DataFlow
Dataflow
DataflowTemplates
Contributor


Can we not exclude this globally? Add a file-level specific exclusion instead.

GoogleSQL
InputTableSpec
KMSEncryptionKey
clickHousePassword
Contributor


see above

clickHouseUsername
insertDeduplicate
insertDistributedSync
insertQuorum
Contributor


These are parameters. I don't want to exclude them globally. Put settings in backticks (``).

maxRetries
outputDeadletterTable
queryLocation
queryTempDataset
Contributor


Same for these; they aren't valid global exclusions.

Contributor


We use `images` for folder names; the scripts rely on this.

Contributor Author


Renamed the folder to `images`.


[Google Dataflow](https://cloud.google.com/dataflow) is a fully managed stream and batch data processing service. It supports pipelines written in Java or Python and is built on the Apache Beam SDK.
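
Below is a minimal, hedged sketch (not taken from this PR) of what a Java Beam pipeline writing to ClickHouse through `ClickHouseIO` can look like; the JDBC URL, table name, and row schema are placeholder assumptions.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.clickhouse.ClickHouseIO;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.Row;

public class ClickHouseWriteExample {
  public static void main(String[] args) {
    // Placeholder schema; it must match the columns of the target ClickHouse table.
    Schema schema = Schema.builder()
        .addInt64Field("id")
        .addStringField("name")
        .build();

    Pipeline pipeline = Pipeline.create();
    pipeline
        // A couple of in-memory rows stand in for a real source (BigQuery, GCS, Pub/Sub, ...).
        .apply(Create.of(
                Row.withSchema(schema).addValues(1L, "alice").build(),
                Row.withSchema(schema).addValues(2L, "bob").build())
            .withRowSchema(schema))
        // Placeholder JDBC URL and table name.
        .apply(ClickHouseIO.<Row>write("jdbc:clickhouse://localhost:8123/default", "my_table"));
    pipeline.run().waitUntilFinish();
  }
}
```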

There are two main ways to use Google Dataflow with ClickHouse, both of which leverage [`ClickHouseIO`](../../apache-beam):
Member


@gingerwizard do we have any guidelines on using relative vs absolute links?

Co-authored-by: Mikhail Shustov <restrry@gmail.com>

## List of ClickHouse Templates
* [BigQuery To ClickHouse](./templates/bigquery-to-clickhouse)
* GCS To ClickHouse (coming soon!)
Member


Are we going to work on these in the foreseeable future? If not, I'd recommend creating issues in https://github.com/ClickHouse/DataflowTemplates, linking them here with a CTA to upvote the issue and provide more details about the use case
@laeg, do you have something else in mind to track the signals from the field?

Contributor Author


Issues are disabled for this fork; I've contacted the relevant folks to enable them.

Contributor Author


I created these two feature request issues:

  1. [Feature Request]: Create Template for GCS to ClickHouse DataflowTemplates#3
  2. [Feature Request]: Create Template for Pub/Sub to ClickHouse DataflowTemplates#4

and also linked to them from the docs (List of ClickHouse Templates).

Co-authored-by: Mikhail Shustov <restrry@gmail.com>
| Parameter            | Description                                                                                | Required | Notes                    |
|----------------------|--------------------------------------------------------------------------------------------|----------|--------------------------|
| `clickHouseUsername` | The ClickHouse username to authenticate with.                                              | ✅       |                          |
| `clickHousePassword` | The ClickHouse password to authenticate with.                                              | ✅       |                          |
| `clickHouseTable`    | The target ClickHouse table name to insert the data into.                                  | ✅       |                          |
| `maxInsertBlockSize` | The maximum block size for insertion, if we control the creation of blocks for insertion. |          | A `ClickHouseIO` option. |
Member


Sadly, we don't document the options in the ClickHouseIO docs. Maybe we should move these to the ClickHouseIO page and link to them from here.

Contributor Author


In pure Beam code (without this template), the parameters are set differently: with `ClickHouseIO`, we set them within the code using setter functions such as `ClickHouseIO.Write.withMaxInsertBlockSize(long)`, whereas in this template we pass them as template options.

I'll create a section there and add a link from the template docs to ClickHouseIO's page.
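
For illustration, a hedged snippet (the JDBC URL, table name, and values are placeholders, not from this PR) of how the same knobs look when set directly on `ClickHouseIO.Write` in plain Beam code:

```java
import org.apache.beam.sdk.io.clickhouse.ClickHouseIO;
import org.apache.beam.sdk.values.Row;

class ClickHouseIoOptionsSketch {
  // Placeholder JDBC URL and table; each setter mirrors a template option of the same name.
  static ClickHouseIO.Write<Row> configuredWrite() {
    return ClickHouseIO.<Row>write("jdbc:clickhouse://localhost:8123/default", "my_table")
        .withMaxInsertBlockSize(1_000_000L) // template option: maxInsertBlockSize
        .withMaxRetries(5)                  // template option: maxRetries
        .withInsertDeduplicate(true);       // template option: insertDeduplicate
  }
}
```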

Contributor Author


Changes were also added to the Apache Beam documentation.


| BigQuery Type | ClickHouse Type | Notes |
|---------------|-----------------|-------|
| [**Array Type**](https://cloud.google.com/bigquery/docs/reference/standard-sql/data-types#array_type) | [**Array Type**](https://clickhouse.com/docs/en/sql-reference/data-types/array) | The inner type must be one of the supported primitive data types listed in this table. |
Member


So nested arrays aren't supported, are they?

Contributor Author

@BentsiLeviav BentsiLeviav Jan 28, 2025


Currently, they are not.
I added a note to take this into account in the apache/beam#33692 issue we opened about it.

…flow-docs

# Conflicts:
#	docs/en/integrations/index.mdx