Data Engineer

Higginbotham
Lehi, UT

About The Position

The Data Engineer is responsible for designing, building, and maintaining the modern data infrastructure, with a primary focus on developing and supporting a Microsoft Fabric-based data warehouse. This role ensures reliable data pipelines, scalable architecture, and high-quality data availability across the organization. In addition to core data engineering responsibilities, this position will develop and maintain internal tools and user interfaces that enable business users to input, retrieve, and interact with data—such as guided workflows. These tools will integrate with internal systems to streamline data entry, updates, and reporting. The Data Engineer plays a critical role in managing the full data lifecycle—from ingestion and transformation to delivery—while enabling data-driven decision-making and operational efficiency.

Requirements

  • Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or related field (or equivalent experience)
  • 3–5+ years of experience in data engineering or related roles
  • Experience building and maintaining data warehouses and pipelines in modern data platforms (preferably Microsoft Fabric or Azure ecosystem)
  • Hands-on experience with ETL/ELT processes, data modeling, and API integrations
  • Experience with Microsoft Fabric (Data Warehouse, Lakehouse, Pipelines) for building and managing modern data platforms
  • Strong proficiency with SQL and relational databases for querying, modeling, and performance optimization
  • Experience using Python for data processing, automation, and pipeline development
  • Familiarity with APIs and data integration tools for connecting internal and external systems
  • Working knowledge of front-end or interface development (e.g., web-based tools, forms, workflow UIs) to support user interaction with data
  • Proficiency with version control systems (e.g., Git) for code management and collaboration
  • Experience with business intelligence and reporting tools (e.g., Power BI) for data visualization and analytics

Nice To Haves

  • Microsoft Azure Data Engineer Associate (DP-203) or equivalent
  • SQL or database-related certifications
  • Property & Casualty license, or willingness to obtain one within a designated timeframe with company support
  • Completion of relevant training or coursework as determined by the company upon hire

Responsibilities

  • Design, build, and maintain scalable data pipelines and data models within Microsoft Fabric using strong SQL and Python expertise.
  • Develop and manage the enterprise data warehouse, applying best practices in data modeling and warehouse design to ensure performance, reliability, and scalability.
  • Integrate and standardize data across insurance management systems, CRM, web, financial/bordereaux systems, APIs, SQL databases, and external data feeds.
  • Build and maintain internal tools and user interfaces (e.g., web-based tools, forms, workflow-driven applications) that allow business users to input and interact with data in structured workflows.
  • Ensure seamless data flow between user-facing tools and backend systems through effective API and system integration.
  • Maintain and optimize ETL/ELT processes for accuracy, efficiency, and automation.
  • Ensure data quality, integrity, and governance across all systems and pipelines.
  • Collaborate with Product and business teams to understand data requirements and translate them into scalable technical solutions.
  • Troubleshoot and resolve data-related issues across the full data stack.
  • Support reporting, analytics, and downstream data consumption needs, including integration with BI tools such as Power BI.
  • Contribute to the adoption of AI/ML-driven solutions and automation within the data ecosystem.