Using upstream Apache Airflow Hooks and Operators in Cloud Composer

Google Cloud Customer Engineer

For engineers or developers in charge of integrating, transforming, and loading a variety of data from an ever-growing collection of sources and systems, Cloud Composer has dramatically reduced the number of cycles spent on workflow logistics. Built on Apache Airflow, Cloud Composer makes it easy to author, schedule, and monitor data pipelines across multiple clouds and on-premises data centers.

Let’s walk through an example of how Cloud Composer makes building a pipeline across public clouds easier. As you design a new workflow that brings data from another cloud (Microsoft Azure Data Lake Storage (ADLS), for example) into Google Cloud, you notice that upstream Apache Airflow already has an ADLS hook you can use to copy data. You insert an import statement into your DAG file, save, and attempt to test your workflow, only to be greeted with “ImportError: No module named x.” Now what?
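
To make the failure concrete, the import in the DAG file might look something like the line below. The contrib module path follows the upstream Airflow 1.10 layout and is illustrative; check the path for your Airflow version.

    # Importing the upstream hook directly in a DAG file. This raises
    # "ImportError: No module named ..." when the Airflow version bundled
    # with Cloud Composer does not yet include the hook.
    from airflow.contrib.hooks.azure_data_lake_hook import AzureDataLakeHook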

As it turns out, functionality that has been committed upstream—such as brand new Hooks and Operators—might not have made its way into Cloud Composer just yet. Don’t worry, though: you can still use these upstream additions by leveraging the Apache Airflow Plugin interface.

Using the upstream AzureDataLakeHook as an example, all you have to do is the following:

  1. Copy the hook’s source code into a separate file (ensuring adherence to the Apache License)

  2. Import the AirflowPlugin class (from airflow.plugins_manager import AirflowPlugin)

  3. Add the snippet below to the bottom of the file:
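
A minimal version of that snippet, following the standard AirflowPlugin pattern, looks like this (the class name and the plugin name here are illustrative choices, not requirements):

    # Register the copied hook through the Airflow plugin interface.
    # In Airflow 1.x, hooks registered this way become importable from
    # airflow.hooks.<plugin name>.
    class AzureDataLakePlugin(AirflowPlugin):
        name = "azure_data_lake_plugin"
        hooks = [AzureDataLakeHook]

Once the plugin file is uploaded to your environment’s plugins/ folder (for example, with gcloud composer environments storage plugins import), a DAG can then import the hook through the plugin namespace:

    # DAG-side import; the module path mirrors the plugin's "name"
    # attribute from the registration snippet above.
    from airflow.hooks.azure_data_lake_plugin import AzureDataLakeHook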
