Running Embedded ELT with CloudQuery and Snowpark Container Services

This tutorial will guide you through running CloudQuery ELT connectors within your Snowflake account using Snowpark Container Services.
Thanks to CloudQuery's standalone architecture, it is just as straightforward to run CloudQuery in any other container service, or even inside Airflow or another orchestrator. Snowpark Container Services has its own advantages, though, especially if you are a heavy Snowflake user or have data requirements that call for running your ELT workloads close to your data warehouse.

Prerequisites #

You will need the following:
  • A Snowflake account with an active subscription (Snowpark Container Services is not available on trial accounts)
  • Permissions to create databases and warehouses in Snowflake
Caveats:
  • Snowpark Container Services is GA, but some features are not available on all clouds (this tutorial was tested on Snowflake on AWS)
Run the following in a Snowflake worksheet to create the database, warehouse, and compute pool where the CloudQuery job will be executed (you only need to do this once, and you can also reuse an existing setup):
USE ROLE ACCOUNTADMIN;

CREATE ROLE IF NOT EXISTS cq_role;

CREATE DATABASE IF NOT EXISTS cq_db;
GRANT OWNERSHIP ON DATABASE cq_db TO ROLE cq_role COPY CURRENT GRANTS;

CREATE OR REPLACE WAREHOUSE cq_warehouse WITH
 WAREHOUSE_SIZE='X-SMALL';
GRANT USAGE ON WAREHOUSE cq_warehouse TO ROLE cq_role;

GRANT BIND SERVICE ENDPOINT ON ACCOUNT TO ROLE cq_role;

CREATE COMPUTE POOL cq_compute_pool
 MIN_NODES = 1
 MAX_NODES = 1
 INSTANCE_FAMILY = CPU_X64_XS;
GRANT USAGE, MONITOR ON COMPUTE POOL cq_compute_pool TO ROLE cq_role;

GRANT ROLE cq_role TO USER <your_logged_in_username>;
Next, create the image repository where you will push the CloudQuery Docker image, and the stage where you will put the CloudQuery config files for different ELT (sync) jobs (you can reuse existing repositories and stages if you already have them):
USE ROLE cq_role;
USE DATABASE cq_db;
USE WAREHOUSE cq_warehouse;

CREATE SCHEMA IF NOT EXISTS cq_schema;
USE SCHEMA cq_schema;
CREATE IMAGE REPOSITORY IF NOT EXISTS cq_repository;
CREATE STAGE IF NOT EXISTS cq_stage
 ENCRYPTION = ( TYPE = 'SNOWFLAKE_SSE')
 DIRECTORY = ( ENABLE = true );
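When you push the Docker image in the next step you will need the registry hostname; one way to confirm the full repository URL is:

```sql
-- The repository_url column contains the full path, e.g.
-- <account_identifier>.registry.snowflakecomputing.com/cq_db/cq_schema/cq_repository
SHOW IMAGE REPOSITORIES IN SCHEMA cq_db.cq_schema;
```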

Upload the CloudQuery Docker Image #

You only need to perform this step once. Snowpark Container Services doesn't yet support pulling public Docker images directly, so you need to pull the CloudQuery image and push it to your Snowflake image repository. You may want to upload a new image when a new version of the CloudQuery CLI is released, but the plugin versions are downloaded at runtime and can be updated via configuration.
# Update the version when a new CloudQuery CLI release is available
docker pull ghcr.io/cloudquery/cloudquery:6.21-linux-amd64

# Authenticate with the Snowflake Docker repository. This will prompt you for a password
# and is only possible if the administrator has enabled username/password login;
# otherwise use the snow spcs command below
docker login <account_identifier>.registry.snowflakecomputing.com/cq_db/cq_schema/cq_repository -u <username>

# Use this if the previous command didn't work (e.g. because multi-factor auth is enforced)
snow spcs image-registry login --account <account_identifier> --user <username> --mfa-passcode <mfa-passcode>

docker tag ghcr.io/cloudquery/cloudquery:6.21-linux-amd64 <account_identifier>.registry.snowflakecomputing.com/cq_db/cq_schema/cq_repository/cloudquery:6.21

docker push <account_identifier>.registry.snowflakecomputing.com/cq_db/cq_schema/cq_repository/cloudquery:6.21
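To verify the push succeeded, you can list the images stored in the repository:

```sql
-- Lists the images and tags currently in the repository
SHOW IMAGES IN IMAGE REPOSITORY cq_db.cq_schema.cq_repository;
```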

Upload CloudQuery Config #

Next, you’ll need to upload the CloudQuery config. This will make it easy to mount the different configs required for the CloudQuery job to run.
You can easily do that via the UI by going to the stage and pressing “Add File”.
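Alternatively, if you use SnowSQL or the snow CLI, you can upload the file with a PUT command. A sketch, assuming your local config file is named hn_sync.yaml (the name referenced by the job later in this tutorial); adjust the local path to wherever your file lives:

```sql
-- AUTO_COMPRESS = FALSE keeps the file from being gzipped on upload
PUT file:///path/to/hn_sync.yaml @cq_db.cq_schema.cq_stage
  AUTO_COMPRESS = FALSE
  OVERWRITE = TRUE;
```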
The config file in this tutorial looks like this:
kind: source
spec:
 name: "hackernews"
 path: "cloudquery/hackernews"
 registry: "cloudquery"
 version: "v3.7.21"
 tables: ["*"]
 destinations:
   - "snowflake"
 # Learn more about the configuration options at https://cql.ink/hackernews_source
 spec:
   item_concurrency: 100
   start_time: "2025-06-28T00:00:00Z"
---
kind: destination
spec:
 name: snowflake
 path: cloudquery/snowflake
 version: "v4.5.1"
 write_mode: "append"
 # Learn more about the configuration options at https://cql.ink/snowflake_destination
 spec:
   connection_string: "${SNOWFLAKE_HOST}/${SNOWFLAKE_DATABASE}/${SNOWFLAKE_SCHEMA}?warehouse=${SNOWFLAKE_WAREHOUSE}&authenticator=oauth&token=token"

Running CloudQuery #

Execute the following from a Snowflake worksheet:
-- this needs to run as ACCOUNTADMIN
use database cq_db;
use schema public;
CREATE OR REPLACE NETWORK RULE allow_all_egress_rule
 MODE = EGRESS
 TYPE = HOST_PORT
 VALUE_LIST = ('0.0.0.0:80', '0.0.0.0:443');

CREATE EXTERNAL ACCESS INTEGRATION cloudquery_access_integration
 ALLOWED_NETWORK_RULES = (allow_all_egress_rule)
 ENABLED = true;
GRANT USAGE ON INTEGRATION cloudquery_access_integration TO ROLE cq_role;
use schema cq_schema;
Now you are finally ready to run the job!
EXECUTE JOB SERVICE
 IN COMPUTE POOL cq_compute_pool
 NAME=cq_sync
 EXTERNAL_ACCESS_INTEGRATIONS = (cloudquery_access_integration)
 FROM SPECIFICATION $$
 spec:
   containers:
   - name: main
     image: /cq_db/cq_schema/cq_repository/cloudquery:6.21
     env:
       SNOWFLAKE_WAREHOUSE: cq_warehouse
       CLOUDQUERY_API_KEY: ****
     volumeMounts:
       - name: configs
         mountPath: /tmp/configs
     args:
       - "sync"
       - "--log-console"
       - "/tmp/configs/hn_sync.yaml"
   volumes:
     - name: configs
       source: "@cq_stage"
 $$
That’s it! Now you should be able to query HackerNews comments in your Snowflake database!
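To monitor the job, you can fetch the container logs and, once the sync completes, spot-check the loaded data. The table name hackernews_items below is an assumption based on the plugin's table naming; list the tables in your database to see what was actually created:

```sql
-- Fetch the last 100 log lines from the job's main container
SELECT SYSTEM$GET_SERVICE_LOGS('cq_sync', '0', 'main', 100);

-- Spot-check the synced data (hackernews_items is an assumed table name;
-- run SHOW TABLES IN SCHEMA cq_db.cq_schema to list the real ones)
SELECT COUNT(*) FROM cq_db.cq_schema.hackernews_items;
```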
If you need to sync data from 60+ sources, check out CloudQuery Hub, or if you need to write a custom plugin, take a look at our developer guide!
© 2025 CloudQuery, Inc. All rights reserved.