How to use SSHOperator to submit jobs from Airflow to Spark clusters running in Docker containers
This article shows how to use Airflow's SSHOperator to submit PySpark jobs to a standalone Spark cluster. Both Airflow and Spark run inside Docker containers.
Jan 01, 2023 — 13 minute read
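
To set the stage, here is a minimal sketch of the pattern this article builds toward: an Airflow DAG with a single SSHOperator task that opens an SSH session into the Spark master container and runs spark-submit there. The connection id (`spark_ssh`), the master URL (`spark://spark-master:7077`), and the script path are illustrative assumptions that depend on how your Docker containers are set up; the `apache-airflow-providers-ssh` package must be installed for the import to resolve.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.ssh.operators.ssh import SSHOperator  # requires apache-airflow-providers-ssh

with DAG(
    dag_id="spark_ssh_example",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,  # trigger manually
    catchup=False,
) as dag:
    # SSH into the Spark master container and run spark-submit there.
    submit_job = SSHOperator(
        task_id="submit_pyspark_job",
        ssh_conn_id="spark_ssh",  # assumed Airflow connection pointing at the Spark container's SSH server
        command=(
            "spark-submit "
            "--master spark://spark-master:7077 "  # standalone cluster master URL (placeholder)
            "/opt/spark/apps/example_job.py"  # path to the PySpark script inside the container (placeholder)
        ),
    )
```

With this shape in mind, the rest of the article walks through the moving parts: the Docker containers, the SSH connection, and the DAG itself.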