Senior Data Engineer

West Monroe – Chicago, IL

About The Position

West Monroe is seeking a Data Engineer for the Intellio Labs team, part of the Office of the CAIO (Chief AI Officer). The Data Engineer is responsible for leading the design, build, and delivery of data pipelines and data solutions that support Intellio Labs products and internal analytics assets. This role balances hands-on technical execution with delivery leadership, ensuring that data solutions meet quality, performance, reliability, and governance standards. The Data Engineer partners closely with Product, Data Science, Internal IT, and Line consultants to translate business needs into scalable, production-ready data assets that enable effective selling, delivery, and AI-driven insights.

Requirements

  • 5+ years of professional experience in data engineering; consulting experience preferred.
  • 3+ years of hands-on experience with Databricks and Spark.
  • Advanced SQL skills, including complex transformations and performance optimization.
  • Experience designing and implementing metadata-driven ETL/ELT patterns.
  • Experience with distributed data processing and streaming technologies (e.g., Spark, Kafka, Kinesis, Hadoop, Lambda).
  • Experience with ETL orchestration tools such as Azure Data Factory, AWS Glue, Informatica, Talend, or IBM DataStage.
  • Proven experience in technical testing, issue identification, and production support.
  • Strong verbal and written communication skills.

Nice To Haves

  • Experience with AWS-based data platforms
  • Exposure to analytics, BI, and AI/ML-enabled data solutions

Responsibilities

  • Design and implement scalable, secure, and reliable data pipelines, models, and integrations using Databricks, Spark, and cloud-native technologies.
  • Lead day-to-day execution of assigned data engineering workstreams, including effort estimation, task breakdown, and delivery commitments.
  • Build and maintain metadata-driven ELT/ETL pipelines that support reuse, extensibility, and operational efficiency.
  • Integrate data from multiple internal and external source systems into unified, analytics-ready outputs.
  • Develop data assets that support analytics and AI/ML workflows, including feature-ready datasets and experimentation environments.
  • Ensure data quality, reliability, security, and performance across owned pipelines and datasets.
  • Define and implement automated data quality checks, validation rules, and monitoring.
  • Troubleshoot pipeline failures, interpret system errors, and resolve data defects in collaboration with source system owners.
  • Optimize pipeline performance and ensure successful execution of scheduled jobs.
  • Participate in production support, incident resolution, and post-incident root cause analysis for owned pipelines.
  • Collaborate with Product, Engineering, Analytics, and Data Science partners to deliver data-driven capabilities.
  • Gather and document technical requirements and contribute to solution design across the Data Engineering & Analytics (DE&A) team.
  • Coordinate with application and data source owners to understand source data structures, constraints, and integration requirements.
  • Create curated data outputs that support reporting, dashboarding, and semantic models for tools such as Tableau and Power BI.
  • Review data engineering work to ensure adherence to standards, best practices, and governance guidelines.
  • Mentor and support junior engineers through code reviews, technical guidance, and knowledge sharing.
  • Contribute to continuous improvement of engineering practices, patterns, and tooling within Intellio Labs.