Software Engineer 2 (Hybrid) - Linux/Bash/Apache Airflow/Apache Spark/Docker/Podman/Git/AWS

Captivation Software | Annapolis Junction, MD
$130,000 - $270,000 | Hybrid

About The Position

Captivation has built a reputation on providing customers exactly what is needed in a timely manner. Our engineers take pride in what they develop and constantly innovate to provide the best solution. Captivation is looking for software developers who can get stuff done while making a difference in support of the mission to protect our country. Captivation Software is seeking a mid-level software engineer who will be responsible for creating and maintaining data workflows and automation pipelines using Apache Airflow. This role focuses on building reliable, scalable, and observable workflow orchestration solutions that support data engineering, analytics, and operational use cases. The engineer will collaborate closely with data engineers, platform teams, and stakeholders to ensure workflows are efficient, secure, and production-ready.

Requirements

  • Must currently hold a Top Secret/SCI U.S. Government security clearance with a favorable polygraph; therefore, all candidates must be U.S. citizens
  • Master's degree in Computer Science or a related discipline from an accredited college or university, plus three (3) years of experience as a SWE in programs and contracts of similar scope, type, and complexity; or
  • Bachelor's degree in Computer Science or a related discipline from an accredited college or university, plus five (5) years of experience as a SWE in programs and contracts of similar scope, type, and complexity; or
  • Seven (7) years of experience as a SWE in programs and contracts of similar scope, type, and complexity
  • Experience using the Linux CLI and Linux tools
  • Experience developing Bash scripts to automate manual processes
  • Recent software development experience using Python and Java
  • Experience using Apache Airflow (DAG design, scheduling, operators, sensors) to orchestrate, schedule, and monitor complex workflows
  • Experience using distributed big data processing engines such as Apache Spark
  • Experience with containerization technologies such as Docker and Podman
  • Experience with Git Source Control System

Nice To Haves

  • Experience using the Atlassian Tool Suite (JIRA, Confluence)
  • Familiarity with AWS cloud services and infrastructure

Responsibilities

  • Creating and maintaining data workflows and automation pipelines using Apache Airflow.
  • Building reliable, scalable, and observable workflow orchestration solutions that support data engineering, analytics, and operational use cases.
  • Collaborating closely with data engineers, platform teams, and stakeholders to ensure workflows are efficient, secure, and production ready.

Benefits

  • Annual Salary: $130,000 - $270,000 (depending on years of experience)
  • Up to 20% 401k contribution (No Matching Required and Vested from Day 1)
  • Above Market Hourly Rates
  • $3,600 HSA Contribution
  • 6 Weeks Paid Time Off
  • Company Paid Employee Medical/Dental/Vision Insurance/Life Insurance/Short-Term & Long-Term Disability/AD&D