Data Engineer

Intrado Life & Safety, Inc.
$185,000 - $200,000

About The Position

Intrado is dedicated to saving lives and protecting communities, helping them prepare for, respond to, and recover from critical events. Our cutting-edge company strives to become the most trusted, data-centric emergency services partner by uniting fragmented communications into actionable intelligence for first responders. At Intrado, all of our work truly matters. We are seeking an exceptional Data Engineer to build the robust data pipelines that will power our company’s internal business analytics. Working under the guidance of the Staff Data Engineer, you will ensure that the raw data from multiple systems is consistently ingested, cleaned, and made ready for analysis. By building stable and efficient pipelines, you will directly support the timely generation of visualizations that leadership relies on to make informed decisions. This is a demanding role in a results-oriented environment with high expectations for agency, speed, and ownership.

Requirements

  • Experience: 5+ years of experience in Data Engineering, specifically focused on building and maintaining ETL/ELT pipelines for large-scale operational and financial data in a cloud environment.
  • Pipeline Development: Proficiency in building and optimizing data pipelines using Azure Data Factory and Databricks.
  • Technical Proficiency (SQL & Python): Strong proficiency in SQL for data analysis and Python for scripting and transformation.
  • Data Quality Assurance: Experience implementing automated data quality checks (e.g., schema validation, null checks). A proactive approach to identifying pipeline failures and implementing fixes to prevent recurrence.
  • Platform & Data Familiarity: Experience working with data schemas and APIs from common enterprise platforms such as Microsoft Dynamics 365 F&O, Salesforce, and ServiceNow.
  • LLM Application: Demonstrated experience using LLMs to streamline data engineering workflows and improve development efficiency.
  • Education: Bachelor’s degree in Computer Science, Software Engineering, Data Engineering, or a closely related technical field.

Nice To Haves

  • Prior experience working in a technology company or SaaS environment

Responsibilities

  • Pipeline Execution: Build and maintain Azure Data Factory pipelines to ingest data from multiple sources.
  • Silver Layer Transformation: Write the Python code in Databricks to clean raw data and move it into the silver layer, handling deduplication, type casting, and validation.
  • Reliability: Monitor daily jobs and troubleshoot failures. You are the first line of defense in ensuring that pipelines are stable and do not break.
  • Data Quality: Implement automated checks to verify that data arriving in the lake matches the source systems.
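The silver-layer steps listed above (null validation, deduplication, and type casting) can be sketched in plain Python. This is an illustrative sketch only; in the role these transformations would run as Python/PySpark jobs in Databricks, and the record schema here (`id`, `amount`, `created_at`) is hypothetical:

```python
from datetime import datetime

def to_silver(raw_records):
    """Clean raw records for the silver layer: validate, dedupe, cast.

    Returns (silver, rejected) where rejected holds records that
    failed the null checks. Schema is hypothetical for illustration.
    """
    seen = set()
    silver, rejected = [], []
    for rec in raw_records:
        # Null check: required fields must be present and non-null.
        if rec.get("id") is None or rec.get("amount") is None:
            rejected.append(rec)
            continue
        # Deduplication on the business key.
        if rec["id"] in seen:
            continue
        seen.add(rec["id"])
        # Type casting: normalize string inputs to proper types.
        silver.append({
            "id": int(rec["id"]),
            "amount": float(rec["amount"]),
            "created_at": datetime.fromisoformat(rec["created_at"]),
        })
    return silver, rejected
```

In a production pipeline these same checks would typically be expressed as DataFrame operations (e.g. filters, `dropDuplicates`, and column casts) so they scale across the full data volume.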

Benefits

  • At Intrado, we offer a comprehensive benefits package that includes what you’d expect (medical, dental, vision, life and disability coverage, paid time off, and a Registered Retirement Savings Plan (RRSP)) and several benefits that go above and beyond: tuition reimbursement, paid parental leave, access to a comprehensive library of personal and professional training resources, employee discounts, and more!