Dataflow Automation with Prefect and Dask

Our first #ScienceThursday was so much fun we can’t wait to do it again this week!

And we’re excited to announce that our good friends at Prefect will be joining us to show how they leverage Dask in their modern workflow orchestration system. Prefect was built to help you schedule, orchestrate, and monitor your data work, and it was designed to be maximally useful when things go wrong.

We’re doubly excited as Prefect was built to scale out on Dask from day one and lets users natively tap into Dask’s power via a familiar Pythonic interface. All you really need is a use case and some knowledge of Python to get started!
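
To give a flavor of that Pythonic interface, here's a minimal sketch using the Prefect Core API from around the time of this post (the task names and bodies are placeholders of our own):

```python
from prefect import task, Flow
from prefect.engine.executors import DaskExecutor

@task
def extract():
    # Placeholder: fetch some data
    return [1, 2, 3]

@task
def transform(data):
    # Placeholder: do some work on each record
    return [x * 10 for x in data]

# Calling tasks inside the Flow context builds the workflow graph
with Flow("etl") as flow:
    transform(extract())

# With no arguments, DaskExecutor spins up a temporary local Dask
# cluster for the run; pass it the address of an existing cluster
# and the same flow runs there instead.
flow.run(executor=DaskExecutor())
```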

Prefect logo with slogan "The new standard in dataflow automation."

If you know a bit of Python, you’ll learn how to use distributed computing with Dask and Prefect to define and debug your data workflows.

If you’re comfortable with Dask, you’ll learn how to leverage Prefect to define complex, distributed workflows.

Features we’ll cover:

  • defining your workflows and configuring them to run in multiple execution environments, from local development to different types of clusters;
  • scheduling work, including the full life cycle of a cluster;
  • parallelizing your task runs with Dask and scaling out to multiple machines via Coiled (sketched just below this list).
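
As a taste of that last point, here's a hedged sketch: Prefect's task mapping fans a task out over a sequence, the DaskExecutor runs the mapped children in parallel, and pointing the executor at a Coiled cluster's scheduler address scales the same flow out to multiple machines (the worker count below is an arbitrary example):

```python
import coiled
from prefect import task, Flow
from prefect.engine.executors import DaskExecutor

@task
def inc(x):
    return x + 1

@task
def total(xs):
    return sum(xs)

with Flow("map-reduce") as flow:
    mapped = inc.map(list(range(100)))  # 100 parallel task runs
    total(mapped)  # mapped results are gathered back into a list

# Local development: a temporary Dask cluster on this machine
flow.run(executor=DaskExecutor())

# Scaling out: the same flow, unchanged, on a Coiled cluster
cluster = coiled.Cluster(n_workers=10)  # example size
flow.run(executor=DaskExecutor(address=cluster.scheduler_address))
```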

Join us this Thursday, July 9th at 5pm US Eastern time on our YouTube channel to discover how to use Dask and Prefect for your modern data workflow orchestration.
