Associate Data Engineer

Baker Tilly – Frisco, WI

About The Position

As a Senior Consultant – Associate Data Engineer you will design, build, and optimize modern data solutions for our mid‑market and enterprise clients. Working primarily inside the Microsoft stack (Azure, Synapse, and Microsoft Fabric), you will transform raw data into trusted, analytics‑ready assets that power dashboards, advanced analytics, and AI use cases. You’ll collaborate with solution architects, analysts, and client stakeholders while sharpening both your technical depth and consulting skills.

Requirements

  • Education – Bachelor’s in Computer Science, Information Systems, Engineering, or a related field (or equivalent experience).
  • Experience – 2–3 years delivering production data solutions, preferably in a consulting or client‑facing role.
  • Technical Skills – Strong T‑SQL for data transformation and performance tuning; Python for data wrangling, orchestration, or notebook‑based development; hands‑on ETL/ELT with at least one Microsoft service (ADF, Synapse Pipelines, Fabric Data Pipelines).

Nice To Haves

  • Project experience with Microsoft Fabric (OneLake, Lakehouses, Data Pipelines, Notebooks, Warehouse, Power BI Direct Lake)
  • Familiarity with Databricks, Delta Lake, or comparable lakehouse technologies
  • Exposure to DevOps practices (YAML pipelines, Terraform/Bicep) and test automation frameworks
  • Experience integrating SaaS/ERP sources (e.g., Dynamics 365, Workday, Costpoint)

Responsibilities

  • Data Engineering: Develop scalable, well‑documented ETL/ELT pipelines using T‑SQL, Python, Azure Data Factory/Fabric Data Pipelines, and Databricks; implement best‑practice patterns for performance, security, and cost control.
  • Modeling & Storage: Design relational and lakehouse models; create Fabric OneLake shortcuts, medallion‑style layers, and dimensional/semantic models for Power BI.
  • Quality & Governance: Build automated data‑quality checks, lineage, and observability metrics; contribute to CI/CD workflows in Azure DevOps or GitHub.
  • Client Delivery: Gather requirements, demo iterative deliverables, document technical designs, and translate complex concepts to non-technical audiences.
  • Continuous Improvement: Research new capabilities, share findings in internal communities of practice, and contribute to reusable accelerators.
  • Collaboration: Work with clients and internal stakeholders to design and implement scalable data engineering solutions.