Airflow for streaming

Real-time data pipelines made simple

Quix is an Airflow alternative for streaming. Build event-driven pipelines — no sensors or scheduler needed. Work with your favourite Python libraries, a DAG interface and open source connectors without the hassle of setting up event streaming infrastructure. Run Quix in your own cloud or use our fully-managed service.

Easy user interface and pure Python for streaming pipelines


Develop in pure Python

No Java dependencies, no more hassle debugging your Spark or Flink jobs.

Use Python to create your own tasks and workflows. Import your favourite Python libraries for real-time processing and enjoy the ease of developing, testing and debugging your application in pure Python.
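
For example, a transformation task can be a handful of lines of ordinary Python. The sketch below assumes the open source Quix Streams library; the broker address, topic names and payload fields are illustrative placeholders, and exact API details may vary by library version.

```python
# Minimal sketch using the open source Quix Streams library (pip install quixstreams).
# Broker address, topic names and field names are illustrative placeholders.
from quixstreams import Application

app = Application(broker_address="localhost:9092", consumer_group="temperature-converter")

input_topic = app.topic("sensor-data", value_deserializer="json")
output_topic = app.topic("sensor-data-celsius", value_serializer="json")

# A streaming dataframe processes each incoming message as it arrives.
sdf = app.dataframe(input_topic)

# Any Python function (or imported library) can be used as the processing step.
sdf = sdf.apply(lambda row: {**row, "temp_c": round((row["temp_f"] - 32) * 5 / 9, 2)})

sdf = sdf.to_topic(output_topic)

if __name__ == "__main__":
    app.run(sdf)
```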

Quix is based on Docker so it's easy to add system dependencies or libraries to support your application.

Quickly get started using pre-configured transformation and connector templates. Integrate your data with our open source plug-and-play connectors or build your own. This makes Quix easy to apply to your current infrastructure on AWS, GCP, Microsoft Azure and many other third-party services.


Event-driven design

No need for clunky sensors or a scheduler. Quix is event-based at its core, which means your workflows respond to real-time events as soon as they happen.

Move from scheduled batches to real-time processing by building asynchronous workflows with event-based triggers. An event-driven approach is more efficient than traditional batch processing because event handlers react to newly published information in real time. This lets you model complex business workflows that are highly performant and scalable, with strong backup and recovery characteristics.
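
As a rough illustration of an event-based trigger (again assuming Quix Streams, with hypothetical topic names and threshold), each new message invokes the handler the moment it arrives, with no scheduler or polling sensor involved:

```python
# Illustrative sketch of an event-based trigger: the pipeline reacts per message,
# so nothing waits for a schedule. Topic names and the threshold are hypothetical.
from quixstreams import Application

app = Application(broker_address="localhost:9092", consumer_group="alerting")

readings = app.topic("machine-readings", value_deserializer="json")
alerts = app.topic("machine-alerts", value_serializer="json")

sdf = app.dataframe(readings)

# Keep only the events that matter and forward them downstream immediately.
sdf = sdf.filter(lambda event: event["vibration"] > 0.8)
sdf = sdf.apply(lambda event: {"machine_id": event["machine_id"], "alert": "high vibration"})
sdf = sdf.to_topic(alerts)

if __name__ == "__main__":
    app.run(sdf)
```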

Quix reinforces software engineering and microservices best practices like modularity and reusability, helping you cut development time and your infrastructure bill.

Each task in a Quix pipeline is an individual service in your Git repo. This means that you can build up a library of templates that teams can reuse. Quix provides a visual DAG interface and also defines your pipeline in a YAML file so you have versioned pipelines as code, which ensures consistency and repeatability across development and production environments.


Serverless event streaming infrastructure

Building real-time pipelines on Apache Kafka is notoriously hard because you also need to configure additional processing infrastructure like Spark or Flink, plus Kafka Connect for integrating with data sources and sinks.

Now you can forget about setting up all the individual components. Quix provides serverless Apache Kafka, Kafka Connect, Docker, CI/CD and connectors, all running as a single Kubernetes (K8s) package. You can self-host Quix on your own cloud or use a fully hosted and managed service built on Azure, AWS or GCP.

However you deploy your pipelines, you're abstracted away from the underlying infrastructure, so building a pipeline takes just a few clicks to configure dev, staging or prod environments. And if you need more processing power, you can scale each app horizontally with replicas, all without changing a line of code.

It’s free to get started

Sign up now and start building your first event streaming app with free credits for compute and streaming.