September 12, 2022 · 6 min read

Decodable's Imply Polaris Connector: The Druid Easy Button

By Decodable Team

Decodable now includes a native sink connector to send data to Imply Polaris.

Imply Polaris® is a real-time database for modern analytics applications, built from Apache Druid® and delivered as a fully managed database as a service (DBaaS). It provides a complete, integrated experience that simplifies everything from ingesting data to using visualizations to extracting valuable insights from your data.

Imply Polaris & Druid

Imply Polaris can be used as an alternative to a self-managed Druid deployment. Likewise, Decodable's Imply Polaris connector is an alternative to Decodable's Druid sink connector. Unlike the Druid connector, no Kafka cluster is needed to work with Polaris.

Key benefits of Polaris include:

  • A fully managed cloud service. You do not have to configure and run your own Kafka data sources to ingest data to Polaris (as you would need to with Druid). Just point, click, and stream.
  • A single development experience, with push-based streaming built on Confluent Cloud.
  • Database optimization.
  • Scaling in seconds.
  • Resiliency and security.

Decodable + Imply Polaris

Decodable provides low-latency transport and transformation for ingesting data in a way that's matched to Imply Polaris' real-time analytics. After all, there's no point running real-time queries on stale data! Imply Polaris and Apache Druid perform much more efficiently if the data they ingest is pre-processed, and Decodable is the ideal tool to perform this transformation, as described in this blog post.
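
To make that concrete, here is a minimal sketch of the kind of pre-processing pipeline Decodable can run, written in the SQL used for Decodable pipelines. The stream and field names (http_events_raw, http_events_clean, and so on) are hypothetical stand-ins, not names from the connector documentation:

    -- A minimal sketch of a Decodable pre-processing pipeline.
    -- Stream and field names are hypothetical; substitute your own.
    INSERT INTO http_events_clean
    SELECT
      event_time,    -- event timestamp, kept for time-based queries
      http_method,
      url_path,
      status_code
    FROM http_events_raw
    WHERE user_agent NOT LIKE '%bot%'  -- drop noise before it reaches Polaris

Dropping rows and columns this early means Polaris ingests and indexes only the data your queries actually need.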

Getting Started With The Imply Polaris Connector

Imply Polaris is a sink connection, meaning that data flows only from Decodable to Polaris, after being processed in a pipeline using SQL.

Connecting Decodable to Polaris consists of 4 steps:

  1. Create a Polaris table.
  2. Create a Decodable stream and associated schema, which in turn will take data from one or more Decodable pipelines.
  3. In Polaris, create a push_streaming connection and streaming job with a schema matching that of the Decodable stream (see the sketch after this list).
  4. Select the Imply Polaris connector in the Decodable create connection dialog.
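
To make the schema-matching requirement in steps 2 and 3 concrete, here is a hedged sketch of a pipeline feeding the stream from step 2. All stream and column names are illustrative, and the exact type mapping between Decodable and Polaris is described in the connector documentation:

    -- Hypothetical pipeline writing to the stream created in step 2.
    -- Its output columns (a timestamp, two dimensions, one metric) are
    -- shaped to line up, column for column, with the Polaris table from step 1.
    INSERT INTO polaris_events
    SELECT
      CAST(event_time AS TIMESTAMP(3)) AS event_time,  -- becomes the Polaris/Druid time column
      user_id,
      event_type,
      CAST(amount AS DOUBLE) AS amount
    FROM raw_events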

Complete the Polaris connection configuration by specifying:

  • The name of the Polaris connection you created in step 3.
  • Your Polaris organization name.
  • Your Polaris API client ID.
  • The secret associated with your API client ID.

Hit "Next" and select the Decodable stream you created in step 2

Finally, you'll be asked to confirm that the schema is correct and to name the Decodable connection. That's it. Happy Decoding!

For a more thorough walkthrough of using the Polaris connector, please check out the documentation or watch the video demo.


You can get started with Decodable for free. Our developer account includes enough for you to build a useful pipeline and, unlike a trial, it never expires.

Learn more:

Join the community Slack
