Connectors
Connectors are part of our open source repository of samples, examples, and integrations.
Connectors help you connect Quix with other technologies and vendors, such as AWS and Kafka.
You can explore the connector README files here in Quix Docs. When you are ready to start using them, head over to the Quix Code Samples GitHub repository, or sign up and log in to the platform.
Sources
- Consume data from a Kafka topic in Confluent Cloud and publish it to a topic in Quix.
- Periodically query InfluxDB 2.0 and publish the results to a Kafka topic.
- Use the InfluxDB 3.0 query API to periodically query InfluxDB and publish the results to a Kafka topic.
- Install a Kafka Connect source connector in the Quix platform.
- Consume data from an MQTT broker and publish it to a Kafka topic.
- Capture changes to a Postgres database table and publish the change events to a Kafka topic.
- Periodically query a Redis database and publish the results to a Kafka topic.
- Capture changes to an SQL database table and publish the change events to a Kafka topic.
- Read event data from Segment and publish it to a Kafka topic.
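Most of the source connectors above share the same basic shape: read or poll data from an external system, then publish each record to a Kafka topic. The sketch below illustrates that pattern with the Quix Streams Python library under stated assumptions; the topic name and the `poll_external_system` helper are hypothetical placeholders that a real connector would replace with the vendor-specific client (for example, an InfluxDB query or an MQTT subscription).

```python
import json
import time

from quixstreams import Application


def poll_external_system():
    """Hypothetical placeholder for a vendor-specific query,
    e.g. an InfluxDB query or a Redis scan."""
    return [{"sensor": "demo", "value": 42, "ts": time.time()}]


# When deployed in Quix, broker settings are resolved from the environment;
# broker_address is shown here only to make the sketch runnable locally.
app = Application(broker_address="localhost:9092")
output_topic = app.topic("example-output-topic")  # hypothetical topic name

with app.get_producer() as producer:
    while True:
        for record in poll_external_system():
            # Publish each polled record to the output Kafka topic
            producer.produce(
                topic=output_topic.name,
                key=record["sensor"],
                value=json.dumps(record),
            )
        time.sleep(60)  # periodic polling, as in the InfluxDB and Redis sources
```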
Destinations
- Stream data from Quix to BigQuery.
- Consume data from a Kafka topic in Quix and publish it to a topic in Confluent Cloud.
- Consume data from a Kafka topic in Quix and persist the data to an InfluxDB 3.0 database.
- Install a Kafka Connect sink connector in the Quix platform.
- Consume data from a Kafka topic and publish it to an MQTT broker.
- Consume data from a Kafka topic and persist it to Redis.
- Consume data from a Kafka topic and send Slack notifications based on your matching criteria.
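The destination connectors above follow the reverse pattern: consume from a Kafka topic in Quix and write each record to an external system. Below is a minimal sketch of that pattern, again using the Quix Streams Python library; the topic name, consumer group, and the `write_to_external_system` function are hypothetical stand-ins for a vendor-specific client such as InfluxDB, Redis, or a Slack webhook.

```python
from quixstreams import Application


def write_to_external_system(row: dict):
    """Hypothetical placeholder for a vendor-specific write,
    e.g. an InfluxDB write, a Redis SET, or a Slack webhook call."""
    print("writing:", row)


app = Application(
    broker_address="localhost:9092",  # resolved from the environment when deployed in Quix
    consumer_group="example-sink",    # hypothetical consumer group
    auto_offset_reset="earliest",
)
input_topic = app.topic("example-input-topic")  # hypothetical topic name

sdf = app.dataframe(input_topic)            # streaming dataframe over the input topic
sdf = sdf.update(write_to_external_system)  # side effect per consumed record

if __name__ == "__main__":
    app.run(sdf)
```

In practice, each destination connector wraps this loop around its own client library and configuration (connection strings, tokens, matching criteria, and so on), which you supply as environment variables when deploying the connector in Quix.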