Databricks DevOps Engineer

ECS Tech Inc
$130,000 - $140,000 · Remote

About The Position

Please note: this position is contingent upon additional funding. ECS is seeking a talented Databricks DevOps Engineer to work remotely and join our dynamic data team supporting the Department of Homeland Security (DHS) Office of the Chief Information Officer (OCIO). Our team supports a new enterprise Databricks platform implementation focused on providing solutions that empower our federal customers with the tools and capabilities needed to turn data into actionable insights. As a DevOps/Platform Engineer, you will help maintain and enhance our robust data-sharing platform and work closely with the Databricks Architect to operate and manage the platform for the enterprise.

Requirements

  • Must be a US Citizen
  • Must be able to obtain a Public Trust Clearance
  • Prior professional services or federal consulting experience; experience supporting DHS preferred
  • Strong technical skills, including proficiency in programming languages such as Python
  • Familiar with DevOps and CI/CD processes, methodologies, and philosophies
  • Experience with Docker/Podman containerization and code management in GitHub or GitLab
  • Strong background in Linux system administration/shell scripting
  • Experience with working in cloud environments such as AWS/Azure
  • Cloud security (AWS/Azure) certifications (e.g., AWS Certified DevOps Engineer - Professional)
  • Experience with configuration or infrastructure management tools (e.g., Ansible, Terraform)
  • Excellent communication, interpersonal, and leadership skills
  • Understanding of modern Big Data technologies
  • Experience with database technologies such as PostgreSQL
  • Previous experience in working with data lakes and data warehouses
  • Experience documenting architectures, SOPs, and engineering workflows
  • Databricks platform performance optimization, monitoring, and reporting
  • Delta Sharing setup, operations, maintenance, and overall management/execution
  • Experience with Unity Catalog and MLflow within Databricks
  • Experience with agentic AI tools such as Databricks' Agent Bricks and Genie

Responsibilities

  • Working closely with the Databricks Architect to manage and implement use cases within the Databricks platform for enterprise use
  • Managing configuration and testing of complex software engineering infrastructures used by code development teams
  • Working closely with architects, developers, technical leads, and other stakeholders to collect requirements and design solutions
  • Using scripting/programming skills to solve problems
  • Applying containerization technologies (e.g., Docker/Podman)
  • Developing and implementing CI/CD pipelines for automated deployment and testing of applications and infrastructure
  • Performing O&M support on applications of various sizes and complexity
  • Continuously evaluating and recommending the best technologies for platform implementation and improvement
  • Automating repetitive tasks and processes to improve efficiency and reduce manual effort
  • Monitoring system performance, troubleshooting issues, and ensuring high availability and reliability of the platform
  • Being an active and collaborative member of an Agile/Scrum team and following all Agile/Scrum best practices
  • Collaborating as part of a cross-functional team that includes data architects, data scientists, DBAs, security engineers, and others