So your Apache Airflow DAG is failing silently. Are you running an ETL job on a huge dataset? Silent failures like this are often a symptom of an Airflow instance without sufficient memory.
Dig into your instance's logs and, if your workers run on Kubernetes, you'll probably find an evicted worker pod. Wherever you run your jobs, you'll see similar logs, just worded differently.
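If you have `kubectl` access to the cluster, a quick way to confirm an eviction is to look for failed pods directly. This is a sketch; the pod and namespace names are placeholders you'll need to substitute for your own cluster:

```shell
# List failed pods across namespaces; evicted workers show up here
kubectl get pods --all-namespaces --field-selector=status.phase=Failed

# Inspect a suspect worker pod; look for "Evicted" or "OOMKilled"
# in the status and events sections
kubectl describe pod <airflow-worker-pod> --namespace <composer-namespace>
```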
I'll assume you're using Cloud Composer, which gets you up and running in just a few clicks. Follow this
how-to under "Upgrading the machine type for GKE nodes," and pick a machine type with more memory to vertically scale your instance. In an ideal setup, your system would automatically scale up to handle the job and then scale back down to save money. I'll cover that in a future post. Happy coding! Check this post out to set up a local Airflow instance. Hopefully you now know what to do when an Apache Airflow DAG fails silently.
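P.S. For reference, the node-pool upgrade that guide walks through can be sketched with `gcloud` and `kubectl`. This is an illustrative sketch, not the official procedure verbatim: the cluster name, pool names, machine type, and zone below are all placeholders or assumptions, and it assumes a zonal GKE cluster:

```shell
# Create a new node pool with a higher-memory machine type
gcloud container node-pools create larger-pool \
  --cluster=<your-composer-gke-cluster> \
  --machine-type=n1-highmem-4 \
  --num-nodes=3 \
  --zone=us-central1-a

# Cordon and drain the old nodes so workloads reschedule onto the new pool
for node in $(kubectl get nodes -l cloud.google.com/gke-nodepool=default-pool -o name); do
  kubectl cordon "$node"
  kubectl drain "$node" --ignore-daemonsets --delete-emptydir-data
done

# Remove the old, smaller node pool
gcloud container node-pools delete default-pool \
  --cluster=<your-composer-gke-cluster> \
  --zone=us-central1-a
```

Draining before deleting the old pool gives Kubernetes a chance to move the Airflow workers gracefully rather than killing them mid-task.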