Export from S3 to PostgreSQL

CloudQuery is an open-source data integration platform that allows you to export data from any source to any destination.

The CloudQuery S3 plugin allows you to sync data from S3 to any destination, including PostgreSQL. It takes only minutes to get started.

S3 (official, premium)

This plugin is in preview.

The CloudQuery S3 source plugin reads Parquet files from an S3 bucket and loads them into any supported CloudQuery destination (e.g. PostgreSQL, BigQuery, Snowflake, and more).

Publisher: cloudquery
Latest version: v1.1.6
Type: Source

PostgreSQL (official)

This destination plugin lets you sync data from a CloudQuery source to a PostgreSQL-compatible database.

Publisher: cloudquery
Repository: github.com
Latest version: v8.0.4
Type: Destination

macOS Setup

Step 1. Install CloudQuery

brew install cloudquery/tap/cloudquery

Step 2. Log in to CloudQuery CLI

Logging in is required to use premium plugins (such as the S3 source plugin used here) and premium tables in open-core plugins.

cloudquery login

Step 3. Configure S3 source plugin

You can find more information about the configuration in the plugin documentation.

kind: source
spec:
  name: s3
  path: cloudquery/s3
  registry: cloudquery
  version: "v1.1.6"
  tables: ["*"]
  destinations: ["postgresql"]

  spec:
    # TODO: replace the placeholders below with your bucket details
    bucket: "<BUCKET_NAME>"
    region: "<REGION>"
    # path_prefix: "" # optional. Only sync files with this prefix
    # concurrency: 50 # optional. Number of files to sync in parallel. Default: 50
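
For example, a filled-in source spec might look like the following sketch. The bucket name, region, and prefix are illustrative placeholders, not values from the plugin documentation:

kind: source
spec:
  name: s3
  path: cloudquery/s3
  registry: cloudquery
  version: "v1.1.6"
  tables: ["*"]
  destinations: ["postgresql"]

  spec:
    bucket: "my-data-lake"   # hypothetical bucket containing Parquet files
    region: "us-east-1"      # region the bucket lives in
    path_prefix: "exports/"  # only sync files under this prefix
    concurrency: 50          # sync up to 50 files in parallel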

Step 4. Configure PostgreSQL destination plugin

You can find more information about the configuration in the plugin documentation.

kind: destination
spec:
  name: "postgresql"
  path: "cloudquery/postgresql"
  registry: "cloudquery"
  version: "v8.0.4"

  spec:
    connection_string: "${POSTGRESQL_CONNECTION_STRING}" # set the environment variable in a format like postgres://postgres:pass@localhost:5432/postgres?sslmode=disable
    # you can also specify it in DSN format, which can hold special characters in the password field:
    # connection_string: "user=postgres password=pass+0-[word host=localhost port=5432 dbname=postgres"
    # Optional parameters:
    # pgx_log_level: error
    # batch_size: 10000 # 10K entries
    # batch_size_bytes: 100000000 # 100 MB
    # batch_timeout: 60s
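
The connection string above is read from an environment variable, so set it in your shell before running the sync. For example, using the local-instance credentials from the comment above (host, user, and password are placeholders for your own):

export POSTGRESQL_CONNECTION_STRING="postgres://postgres:pass@localhost:5432/postgres?sslmode=disable"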

Step 5. Run Sync

Save the source and destination configurations above as s3.yml and postgresql.yml, then run:

cloudquery sync s3.yml postgresql.yml
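
Once the sync completes, you can sanity-check the destination with psql, which accepts a connection URI as its first argument (this verification step is a suggestion, not part of the CloudQuery CLI):

psql "$POSTGRESQL_CONNECTION_STRING" -c "\dt"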