Comms As A Data Engineer

Comms as a Data Engineer can be tough. Should you email a group of people? Should you dump a message in a public Slack channel they frequent? Should you follow up daily, weekly, or at some other cadence? It’s a lot of manual labor, and I don’t like manual work.

Also, I hate email. This seems to be a common opinion. Slack’s better, but Slack comms still often go unread. By comms I mean indirect messages to general groups of people. It’s still very important to get information out to your stakeholders: you might have to tell them that Airflow will temporarily be down, warn them about table deprecations, or announce a new feature or product your team just released.

Comms are important. More often than not, projects and products are irrelevant if you can’t drive adoption or awareness. I’ve found a bit of success using the tools people already frequent to communicate with my stakeholders. One of the tools my team’s stakeholders use most often is Airflow.

How Do I Put My Message In Airflow?

Here are the Airflow docs on the topic. As always, they’re more thorough and cover additional features we won’t be discussing.

Get Airflow Up And Running

Before we start, if you don’t have your local environment set up, check out this blog post about getting started with Docker and Airflow.

We’re going to assume you already have your local environment set up.

Add Your Config Python File

We need a Python file in a folder named config that holds our alert info. So, in your local Airflow repo, inside your local_env folder, create a folder named config, and inside that folder add a file named `airflow_local_settings.py`.
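From your local_env folder, something like this should do it:

mkdir config
touch config/airflow_local_settings.py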

The airflow_local_settings.py file should contain the following:

from airflow.www.utils import UIAlert

# Alerts in this list are rendered as banners in the Airflow UI.
DASHBOARD_UIALERTS = [
    UIAlert("AIRFLOW WILL BE DOWN NEXT WEEK"),
]

UIAlert can also take HTML as input (via the html=True flag) to make your messages more vibrant or interesting.
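Per the Airflow docs, you can also pass a category (e.g. "warning") to change the banner’s styling. A quick sketch, with a made-up table name for the deprecation example:

from airflow.www.utils import UIAlert

DASHBOARD_UIALERTS = [
    # Default info-style banner.
    UIAlert("AIRFLOW WILL BE DOWN NEXT WEEK"),
    # Warning-styled banner; raw_events is a hypothetical table name.
    UIAlert("The raw_events table is deprecated", category="warning"),
    # HTML is only rendered when html=True is passed.
    UIAlert(
        'See the <a href="https://airflow.apache.org">Airflow site</a> for details',
        html=True,
    ),
]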

Update Your docker-compose.yaml

So, the Airflow docs just mention creating a config folder and adding a file named airflow_local_settings.py. Your file tree would probably look similar to this (assuming local_env is your project root, as in the local setup above):
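local_env/
├── config/
│   └── airflow_local_settings.py
├── dags/
├── logs/
├── plugins/
└── docker-compose.yaml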

But without having Docker mount the config folder, it won’t show up in your environment. So we’re going to have to tell Docker to mount our files. We can do that by editing the docker-compose.yaml file. You’ll find a volumes chunk in the file around line 60 if you haven’t made too many changes to the docker-compose.yaml from the Airflow docs.

That section will look like:

  volumes:
    - ./dags:/opt/airflow/dags
    - ./logs:/opt/airflow/logs
    - ./plugins:/opt/airflow/plugins

Update it to mount your local config folder into your Docker environment. Afterward, it should look something like this:

  volumes:
    - ./dags:/opt/airflow/dags
    - ./config:/opt/airflow/config
    - ./logs:/opt/airflow/logs
    - ./plugins:/opt/airflow/plugins

Notice the new config line.

Run Your Docker Commands To Rebuild Your Local Airflow Env

Now run the following in your local_env folder to rebuild your Airflow env.

docker-compose up airflow-init && docker-compose up
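If you want to confirm the config folder actually made it into the containers, one quick sanity check (assuming the webserver service is named airflow-webserver, as in the stock docker-compose.yaml) is:

docker-compose exec airflow-webserver ls /opt/airflow/config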

You should now see your alert rendered as a banner across the top of the Airflow UI.

Now your stakeholders will come across your comms whether they’d like to or not. They naturally check on ETLs or go into Airflow during pipeline dev, and it’s hard to ignore in-house banner ads.

Happy coding!