Airflow Data Engineer (Remote) | HonorVet Technologies


Job title: Airflow Data Engineer (Remote)

Company: HonorVet Technologies

Job description: Position: Airflow Data Engineer
Location: Denver (Fully Remote)
Duration: 06+ months, contract

Position Description

Required Skills: MUST have experience standing up an Apache Airflow environment from the ground up.
Required Tech Stack: Apache Airflow, Astronomer, Azure, Databricks

Job Summary:
We are seeking a skilled Data Engineer to join our data engineering team. The successful candidate will be responsible for enhancing and automating our data workflows using Astronomer Airflow. This role involves improving data processing capabilities, optimizing task scheduling, and reducing our dependency on Databricks for orchestration. The specialist will also manage and monitor our data pipelines to ensure efficient and reliable operations.

Key Responsibilities:

Astronomer Airflow Setup:

  • Install and configure Astronomer Airflow.
  • Establish necessary connections and ensure the Airflow web server and scheduler are operational.
  • Integrate CI/CD pipelines and GitHub for version control and automated deployment of workflows.

Workflow Migration:

  • Migrate existing workflows to the Astronomer Airflow platform.
  • Ensure workflows run error-free and meet performance benchmarks.

Team Training:

  • Conduct training sessions for the data engineering team.
  • Provide supporting materials and documentation.
  • Ensure team proficiency in creating and managing workflows and DAGs.
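As a training reference, a minimal DAG might look like the sketch below. This is an illustrative workflow-configuration example, not part of the job description: the DAG id, schedule, and task bodies are all placeholders, written against the Airflow 2.x TaskFlow API that Astronomer deployments use.

```python
# Minimal training DAG sketch; all names and schedules are illustrative.
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="example_training_pipeline",  # hypothetical DAG name
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["training"],
)
def example_training_pipeline():
    @task
    def extract() -> list[int]:
        # Placeholder extract step; a real task would pull from a source system.
        return [1, 2, 3]

    @task
    def transform(rows: list[int]) -> int:
        # Placeholder transform: aggregate the extracted rows.
        return sum(rows)

    @task
    def load(total: int) -> None:
        # Placeholder load step; a real task would write to a warehouse.
        print(f"loaded total={total}")

    # Chaining TaskFlow calls wires up the dependency graph:
    # extract -> transform -> load.
    load(transform(extract()))


# Calling the decorated function at module level registers the DAG
# with the scheduler when this file is parsed.
example_training_pipeline()
```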

Monitoring and Alerting:

  • Set up monitoring tools to track workflow performance.
  • Configure alerts for critical issues.
  • Establish regular reporting procedures for ongoing monitoring.

Optimization and Documentation:

  • Review and optimize workflows for performance.
  • Create detailed documentation covering workflows, configurations, and best practices.
  • Establish a feedback loop for continuous improvement.

Expand Orchestration Capabilities:

  • Integrate Airflow with Azure Functions, REST endpoints, and event-driven architectures.
  • Implement support for external services to extend orchestration beyond Databricks.
  • Create modular workflows for integrating Client/AI batch workloads.

Job and Data Dependency Management:

  • Develop a system to handle job dependencies.
  • Implement checks to ensure jobs run only if the underlying data has changed.

Required Qualifications:

  • Proven experience with Astronomer Airflow and data workflow orchestration.
  • Strong understanding of CI/CD pipelines and version control systems (e.g., GitHub).
  • Experience with cloud platforms (e.g., Azure) and integrating external services.
  • Proficiency in Python and SQL.
  • Excellent problem-solving skills and attention to detail.

Preferred Qualifications:

  • Experience with Databricks and Snowflake.
  • Familiarity with machine learning and artificial intelligence workflows.
  • Strong communication and training skills.
  • Ability to work collaboratively in a team environment.

Expected salary:

Location: Arkansas

Job date: Sat, 12 Oct 2024 05:23:55 GMT

Apply for the job now!
