Staff Data Engineer

Intrado Life & Safety, Inc.
$210,000 - $225,000

About The Position

Intrado is dedicated to saving lives and protecting communities, helping them prepare for, respond to, and recover from critical events. Our cutting-edge company strives to become the most trusted, data-centric emergency services partner by uniting fragmented communications into actionable intelligence for first responders. At Intrado, all of our work truly matters.

We are looking for an exceptional Staff Data Engineer to build the high-performance foundation that powers our company’s internal business analytics. In this pivotal role, you will partner with a newly hired Staff Analytics Engineer to design and build the end-to-end delivery of our data ecosystem, ensuring that our leadership team has the timely, actionable insights it needs to make informed decisions. You will be responsible for building “the plumbing” of this data ecosystem: ingesting data from diverse sources into our Azure data lake, transforming it in Databricks, and delivering gold-layer data to visualization tools. You will enable the seamless flow of financial and operational data from source systems to decision-makers, eliminating the technical bottlenecks that delay critical business insights.

This is a demanding role in a results-oriented environment with high expectations for agency, speed, and ownership.
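
For illustration only, the following is a minimal PySpark sketch of the kind of bronze-to-gold flow described above. It assumes a Databricks environment with Delta tables; the storage path, table names (bronze.invoices, gold.daily_revenue_by_region), and column names are hypothetical placeholders, not actual Intrado assets.

    # Minimal sketch of a bronze -> gold flow (hypothetical paths, tables, and columns).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Bronze: land raw ingested data as-is, stamped with load metadata.
    bronze = (
        spark.read.format("json")
        .load("abfss://landing@examplelake.dfs.core.windows.net/invoices/")
        .withColumn("_ingested_at", F.current_timestamp())
    )
    bronze.write.format("delta").mode("append").saveAsTable("bronze.invoices")

    # Gold: aggregate into a reporting-ready table that visualization tools query directly.
    gold = (
        spark.table("bronze.invoices")
        .groupBy("region", F.to_date("invoice_date").alias("invoice_day"))
        .agg(F.sum("amount").alias("daily_revenue"))
    )
    gold.write.format("delta").mode("overwrite").saveAsTable("gold.daily_revenue_by_region")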

Requirements

  • Experience: 10+ years of progressive experience in Data Engineering, with a specific focus on designing and building cloud infrastructure and high-volume data movement.
  • Cloud Infrastructure Architecture: Deep expertise in architecting the Azure Data Stack (Azure Data Factory, Azure Data Lake Storage, Databricks).
  • High-Scale Data Ingestion: Proven ability to build robust, scalable ELT/ETL pipelines using Azure Data Factory and Databricks.
  • Advanced Python & Spark: Expert-level proficiency in Python and Apache Spark for distributed data processing.
  • Governance & Security: Experience implementing enterprise-grade data governance, and data lineage.
  • DevOps & CI/CD: Strong experience implementing CI/CD pipelines (Azure DevOps or GitHub Actions) for data infrastructure.
  • LLM Application: Experience leveraging LLMs and AI-assisted development tools to accelerate data engineering workflows, improve code quality, and automate repetitive technical tasks.
  • Education: Bachelor’s degree in Computer Science, Software Engineering, Data Engineering, or a closely related technical discipline.

Nice To Haves

  • Master’s or equivalent in Computer Science, Engineering, or Cloud/Data Systems
  • Prior experience working in a technology company or SaaS environment

Responsibilities

  • Infrastructure Architecture: Design and implement the core architecture for the company’s data ecosystem used for business analytics, spanning end-to-end from data in source systems to delivery in visualization tools.
  • High-Scale Ingestion: Build robust Azure Data Factory pipelines that pull data from disparate source systems (Salesforce, ServiceNow, D365) into the Azure data lake.
  • Standards & Governance: Set the technical standards for the Business Operations engineering team. You will define how the team handles CI/CD, version control, and data quality testing at the ingestion level.
  • System Reliability: Ensure the raw and bronze data layers are available and up to date, minimizing downtime (a minimal freshness-check sketch follows this list).
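
As a rough illustration of the ingestion-level quality and freshness checks referenced in the standards and reliability items above, the following PySpark sketch assumes a Delta bronze table; the table name (bronze.invoices), columns (_ingested_at, invoice_id), and SLA threshold are hypothetical placeholders.

    # Sketch of an ingestion-level freshness/quality gate on a bronze Delta table
    # (hypothetical table name, column names, and SLA threshold).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    MAX_STALENESS_SECONDS = 2 * 60 * 60  # placeholder freshness SLA for the bronze layer

    # Age of the most recent load, computed entirely in Spark to avoid timezone mismatches.
    age = (
        spark.table("bronze.invoices")
        .agg(
            (
                F.unix_timestamp(F.current_timestamp())
                - F.unix_timestamp(F.max("_ingested_at"))
            ).alias("age_seconds")
        )
        .collect()[0]["age_seconds"]
    )

    if age is None or age > MAX_STALENESS_SECONDS:
        # In a real pipeline this would fail the orchestration run or page on-call.
        raise RuntimeError(f"bronze.invoices is stale (age: {age} seconds)")

    # Basic quality gate: reject loads with null business keys.
    null_keys = spark.table("bronze.invoices").where(F.col("invoice_id").isNull()).count()
    if null_keys:
        raise RuntimeError(f"{null_keys} rows missing invoice_id in bronze.invoices")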

Benefits

  • At Intrado, we offer a comprehensive benefits package that includes what you’d expect (medical, dental, vision, life and disability coverage, paid time off, and a 401(k) retirement plan) and several benefits that go above and beyond: paid parental leave, access to a robust library of personal and professional training resources, employee discounts, critical illness and hospital indemnity coverage, access to legal support, pet insurance, identity theft protection, an EAP (Employee Assistance Program) that includes free mental health resources/support, and more!