Export from Alibaba Cloud to Kafka

CloudQuery is an open-source data integration platform that allows you to export data from any source to any destination.

The CloudQuery Alibaba Cloud plugin allows you to sync data from Alibaba Cloud to any destination, including Kafka. It takes only minutes to get started.

Alibaba Cloud (alicloud)
Official, Premium

The Alibaba Cloud source plugin for CloudQuery extracts configuration from the Alibaba Cloud API and loads it into any supported CloudQuery destination.

Publisher: cloudquery
Repository: github.com
Latest version: v5.0.0
Type: Source

Kafka (kafka)
Official

This plugin is in preview.

This destination plugin lets you sync data from a CloudQuery source to Kafka in various formats such as CSV and JSON. Each table is pushed to a separate topic.

Publisher: cloudquery
Repository: github.com
Latest version: v3.3.9
Type: Destination

macOS Setup

Step 1. Install CloudQuery

brew install cloudquery/tap/cloudquery
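
To confirm the installation, you can print the CLI version; if your build does not support the flag, brew list --versions cloudquery shows what Homebrew installed.

cloudquery --version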

Step 2. Log in to CloudQuery CLI

Logging in is required to use premium plugins and premium tables in open-core plugins.

cloudquery login
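
The login command opens a browser-based flow. In non-interactive environments such as CI, CloudQuery can read an API key from an environment variable instead; the variable name below is an assumption, so check the CloudQuery documentation for the current name.

# Assumed variable name; the value is a placeholder for your own API key
export CLOUDQUERY_API_KEY="<your-api-key>"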

Step 3. Configure Alibaba Cloud source plugin

You can find more information about the configuration in the plugin documentation.

kind: source
spec:
  name: "alicloud"
  path: "cloudquery/alicloud"
  registry: "cloudquery"
  version: "v5.0.0"
  tables: ["*"]
  destinations: ["kafka"]
  spec:
    accounts:
      - name: my_account
        regions:
        - cn-hangzhou
        - cn-beijing
        - eu-west-1
        - us-west-1
        # ...
        access_key: ${ALICLOUD_ACCESS_KEY}
        secret_key: ${ALICLOUD_SECRET_KEY}
    # Optional parameters
    # concurrency: 50000
    # bill_history_months: 12
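
The ${ALICLOUD_ACCESS_KEY} and ${ALICLOUD_SECRET_KEY} references are expanded from environment variables at sync time, so export an access key pair with read permissions before running the sync. A minimal sketch with placeholder values:

# Placeholder values; use the access key pair of a RAM user with read access
export ALICLOUD_ACCESS_KEY="<your-access-key-id>"
export ALICLOUD_SECRET_KEY="<your-access-key-secret>"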

Step 4. Configure Kafka destination plugin

You can find more information about the configuration in the plugin documentation.

kind: destination
spec:
  name: "kafka"
  path: "cloudquery/kafka"
  registry: "cloudquery"
  version: "v3.3.9"
  write_mode: "append"
  spec:
    # required - list of brokers to connect to
    brokers: ["<broker-host>:<broker-port>"]
    # optional - if connecting via SASL/PLAIN, the username and password to use. If not set, no authentication will be used.
    sasl_username: "${KAFKA_SASL_USERNAME}"
    sasl_password: "${KAFKA_SASL_PASSWORD}"
    format: "json" # options: parquet, json, csv
    format_spec:
      # CSV-specific parameters:
      # delimiter: ","
      # skip_header: false

    # Optional parameters
    # compression: "" # options: gzip
    # client_id: cq-destination-kafka
    # verbose: false
    # batch_size: 1000
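
Similarly, if your brokers require SASL/PLAIN authentication, export the credentials referenced by ${KAFKA_SASL_USERNAME} and ${KAFKA_SASL_PASSWORD} before syncing; as the config notes, they can be omitted for unauthenticated clusters.

# Placeholder values; only needed when the brokers use SASL/PLAIN
export KAFKA_SASL_USERNAME="<sasl-username>"
export KAFKA_SASL_PASSWORD="<sasl-password>"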

Step 5. Run Sync

cloudquery sync alicloud.yml kafka.yml
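
Once the sync finishes, each synced table is written to its own topic on the configured brokers. As a spot check, you can read a few messages with the standard Kafka console consumer; the topic name below is only illustrative, since actual topic names follow the synced table names.

# Illustrative spot check; replace the topic with one of your synced table names
kafka-console-consumer.sh \
  --bootstrap-server <broker-host>:<broker-port> \
  --topic alicloud_ecs_instances \
  --from-beginning \
  --max-messages 5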