About the position
This is a remote position.
Join Our Data Engineering Crew as an Apache Airflow Optimization Specialist!
Are you a data wizard who thrives on optimizing workflows and conquering cloud challenges? We’re on the hunt for a rockstar Apache Airflow SME to turbocharge our Azure data platform and take our enterprise-scale data pipelines to the next level! If you’re passionate about orchestration, scalability, and Azure’s cutting-edge tools, this is your chance to shine!
What’s the Gig?
As our Airflow Optimization Specialist, you’ll be the maestro of our data orchestration, fine-tuning our Apache Airflow environment to deliver high-performance, reliable, and scalable solutions. You’ll collaborate with our data engineering and platform teams, wield Azure’s powerful tools, and craft CI/CD pipelines that make data flow smoother than a summer breeze. Ready to make an impact?
Here’s what you’ll be doing:
- Analyze and optimize the current Apache Airflow environment, identify performance bottlenecks, and implement best practices for orchestration and scheduling
- Design and implement scalable, modular, and reusable DAGs to support complex data workflows
- Collaborate with data engineers and platform teams to integrate Airflow with Azure Data Factory, Azure Databricks, and other Azure-native services
- Develop and maintain CI/CD pipelines using Azure DevOps for Airflow DAG deployment, testing, and version control
- Establish monitoring, alerting, and logging standards for Airflow jobs to ensure operational excellence and rapid incident response
- Provide architectural guidance and hands-on support for new data pipeline development using Airflow and Azure services
- Document Airflow configurations, deployment processes, and operational runbooks for internal teams
- Mentor engineers and contribute to knowledge sharing sessions on orchestration and workflow management.
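To make the orchestration work above concrete, here is a minimal sketch of the dependency-resolution idea behind a DAG, using only the Python standard library. The task names and graph are hypothetical illustrations, not our actual pipelines; in Airflow itself the same structure would be declared with operators and dependency chaining.

```python
from graphlib import TopologicalSorter

# Hypothetical DAG: each task maps to the set of tasks it depends on.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"transform", "validate"},
}

def run_order(dag):
    """Return a valid execution order for the task graph."""
    return list(TopologicalSorter(dag).static_order())

print(run_order(dag))  # ['extract', 'transform', 'validate', 'load']
```

An orchestrator like Airflow layers scheduling, retries, and distributed execution on top of exactly this kind of topological ordering.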
Required skills and qualifications
- Proven experience as an Apache Airflow SME or lead developer in a production-grade environment, with a strong understanding of Airflow internals, including the scheduler, executor types (Celery, Kubernetes), and plugin development
- Experience with workload orchestration and autoscaling using KEDA (Kubernetes Event-Driven Autoscaler), and familiarity with Celery for distributed task execution and background job processing, particularly in data pipeline or microservices environments
- Hands-on experience with Azure cloud services, especially Azure Data Factory, Azure Databricks, Azure Storage, and Azure Synapse
- Proficiency in designing and deploying CI/CD pipelines using Azure DevOps (YAML pipelines, release management, artifact handling)
- Solid programming skills in Python, with experience writing modular, testable, and reusable code
- Experience with containerization (Docker) and orchestration (Kubernetes) as they relate to Airflow deployment
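On the "modular, testable, reusable" point: in practice this usually means keeping task logic in plain Python functions that can be unit-tested without a running Airflow scheduler. A minimal sketch, where the function name and record fields are purely illustrative assumptions:

```python
def normalize_records(records, required_fields=("id", "value")):
    """Drop records missing required fields and coerce 'value' to float.

    Pure function: no Airflow context, no I/O, so it can be unit-tested
    directly and reused across DAGs (e.g. wrapped in a PythonOperator).
    """
    cleaned = []
    for rec in records:
        if all(field in rec for field in required_fields):
            cleaned.append({**rec, "value": float(rec["value"])})
    return cleaned

raw = [{"id": 1, "value": "3.5"}, {"id": 2}, {"id": 3, "value": 7}]
print(normalize_records(raw))  # the record missing 'value' is dropped
```

Keeping transformations pure like this is what makes DAG code testable in CI before it ever reaches a deployment pipeline.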
If you’re ready to optimize, orchestrate, and dominate the data pipeline game, we want YOU! Apply now and let’s make data magic together.
Let’s make data flow like never before!
Desired Skills:
- Apache Airflow
- Azure
- Azure Databricks