Integrations and deployment

Deploy

Simply run STRM as SaaS, or deploy it inside your public or private cloud or even on-prem - and make sure all your data stays where it should: with you.

GCP

Azure

Amazon Web Services

Through our Helm chart, you can run STRM on *any* Kubernetes cluster - from a local test run to a full-scale production deployment on-prem.

Integrations

Read/write to a wide range of standard sources and destinations, and even integrate natively with your cloud DWH.

STRM natively supports streaming architectures and event-driven systems, built on top of Kafka. Read/write to your existing Kafka cluster, and add safe data usage on top.
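As an illustration of the Kafka side, the sketch below reads a handful of records from an existing cluster with the standard Java client. The broker address, consumer group, and topic name are placeholders for your own setup, and the STRM-specific configuration is left out.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ReadEvents {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker-1:9092");   // assumption: your existing cluster
        props.put("group.id", "strm-demo");                 // hypothetical consumer group
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("events"));          // hypothetical topic name
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("%s -> %s%n", record.key(), record.value());
            }
        }
    }
}
```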

Natively read/write into PostgreSQL through JDBC as both destination and source.

Read and write safe data directly to an AWS S3 storage bucket in the convenient JSONL format.

Read and write safe data directly to a GCP storage bucket in the convenient JSONL format.

Read and write safe data directly to an Azure storage blob in the convenient JSONL format.
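To show what the JSONL format used for these storage integrations looks like in practice, here is a minimal Java sketch that writes two example records as newline-delimited JSON. The field names are invented for illustration, and the file is written locally rather than to a bucket to keep the example self-contained.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class WriteJsonl {
    public static void main(String[] args) throws IOException {
        // One JSON object per line - that is all JSONL is.
        List<String> lines = List.of(
                "{\"sessionId\":\"a1\",\"event\":\"view\",\"ts\":\"2023-01-01T12:00:00Z\"}",
                "{\"sessionId\":\"b2\",\"event\":\"click\",\"ts\":\"2023-01-01T12:00:05Z\"}"
        );
        // In practice the file would land in an S3 bucket, GCS bucket, or Azure blob;
        // writing locally keeps the sketch self-contained.
        Files.write(Path.of("events.jsonl"), lines);
        System.out.println("Wrote " + lines.size() + " JSONL records");
    }
}
```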

Bring your data to Databricks and enable purpose-based decryption and views with STRM key management.

Natively read/write into Google BigQuery through JDBC and enable purpose-based views by integrating with our KMS.

Bring your data to Snowflake and enable purpose-based decryption and views with STRM key management.

Natively read/write into MongoDB through JDBC as both destination and source.

Natively read/write into Apache Derby through JDBC as both destination and source.

Natively read/write into MariaDB through JDBC as both destination and source.

Natively read/write into Microsoft SQL Server through JDBC as both destination and source.

Natively read/write into Oracle databases through JDBC as both destination and source.

Natively read/write into SQLite through JDBC as both destination and source.
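All of the JDBC integrations above follow the same pattern on the client side. The sketch below reads rows over plain JDBC, assuming a hypothetical local PostgreSQL database and table; swap the connection URL and driver to target any of the other databases.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class JdbcRead {
    public static void main(String[] args) throws SQLException {
        // Hypothetical connection details and table; any JDBC database works the
        // same way once the matching driver is on the classpath.
        String url = "jdbc:postgresql://localhost:5432/events";
        try (Connection conn = DriverManager.getConnection(url, "user", "secret");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT id, payload FROM safe_events")) {
            while (rs.next()) {
                System.out.println(rs.getLong("id") + " " + rs.getString("payload"));
            }
        }
    }
}
```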

Missing an option? Let us know!

Read docs