In this blog post, we will walk you through how to set up CloudQuery to build your cloud asset inventory in PostgreSQL and connect it to Google Data Studio for visualization, monitoring, and reporting.
If it's your first time using CloudQuery, we suggest running it locally first to get familiar with the tool; take a look at our Getting Started with GCP guide.
If you are already familiar with CloudQuery, take a look at how to deploy it on GCP with Cloud SQL and GKE at https://github.com/cloudquery/terraform-gcp-cloudquery.
Data Studio can only connect to a publicly accessible PostgreSQL instance (for example, GCP Cloud SQL).
For security purposes, you should allow connections only from Google Data Studio's IP addresses.
See the full connection walkthrough.
Click Create a new data source and choose PostgreSQL (in this tutorial we connect to a publicly accessible Cloud SQL instance with Data Studio's IP addresses authorized), then fill in the connection details:
Choose the table you want to visualize; in this case we will choose the gcp_resources view.
💡 To create the gcp_resources view, run the following SQL before importing it into Data Studio.
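Below is a minimal sketch of such a view. The table and column names (gcp_compute_instances, gcp_storage_buckets, project_id, self_link) are illustrative assumptions and depend on the CloudQuery GCP provider version you are running, so adjust them to match your schema.

```sql
-- Illustrative sketch only: unify a few CloudQuery GCP tables into a single
-- gcp_resources view. Table and column names below are assumptions; adjust
-- them to the schema produced by your CloudQuery GCP provider version.
CREATE OR REPLACE VIEW gcp_resources AS
SELECT project_id,
       'compute.instance' AS resource_type,
       name,
       self_link
FROM gcp_compute_instances
UNION ALL
SELECT project_id,
       'storage.bucket' AS resource_type,
       name,
       self_link
FROM gcp_storage_buckets;
```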
Choose the table to visualize
Design your report
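As an example of the kind of chart you might build, the sketch below (which assumes the gcp_resources view defined above) counts resources per project and type; you can chart the view directly or use a query like this as a custom query in the Data Studio PostgreSQL connector.

```sql
-- Sketch: resource counts per project and type, suitable for a bar chart
-- or table in Data Studio. Assumes the gcp_resources view defined above.
SELECT project_id,
       resource_type,
       COUNT(*) AS resource_count
FROM gcp_resources
GROUP BY project_id, resource_type
ORDER BY resource_count DESC;
```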
You can also use Data Studio to export and share those reports!
In this post we showed you how to build an open-source cloud asset inventory with CloudQuery as the ETL (Extract-Transform-Load) / data-ingestion layer and Google Data Studio as the visualization platform. This approach eliminates yet-another-dashboard fatigue and gives you the ability to pick best-in-class visualization tools and/or reuse your current stack.