Moser Consulting-posted 1 day ago
$100,000 - $135,000/Yr
Full-time • Mid Level
Hybrid • Indianapolis, IN
51-100 employees

We are seeking a highly skilled Data Engineer to lead complex data transformation initiatives, including the adoption and implementation of Microsoft Fabric. This role combines advanced technical expertise in SQL and Python development with strong data modeling skills. Candidates must be able to design, build, and optimize scalable analytical solutions through best-in-class data modeling practices that deliver measurable business impact for our client engagements.

Responsibilities:

  • Design and architect end-to-end Microsoft Fabric solutions including data lakes, data warehouses, and real-time analytics platforms
  • Design, build, and troubleshoot data acquisition pipelines from diverse sources including SQL databases, APIs, Salesforce, Oracle, and Dynamics into Microsoft Fabric using Dataflows Gen2 and Azure Data Factory
  • Implement Direct Lake connectivity and optimize semantic models for high-performance analytics and reporting
  • Lead comprehensive data modeling initiatives including dimensional modeling, star/snowflake schemas, and data vault methodologies
  • Implement real-time streaming analytics solutions using Event Streams and KQL Database
  • Lead the development of complex SQL queries, stored procedures, and database optimization strategies
  • Build robust Python applications for data processing, automation, and analytics workflows
  • Implement data quality frameworks and monitoring solutions
  • Design and optimize data warehouse and data lake architectures with focus on scalable data models
  • Monitor and optimize capacity usage with the Fabric Capacity Metrics app
  • Develop high-performance SQL code for data transformation and business logic implementation
  • Design sophisticated dimensional models leveraging star/snowflake schemas optimized for high-performance BI workloads across Power BI and Microsoft Fabric platforms
  • Implement dimensional modeling best practices including slowly changing dimensions (SCD) and fact table optimization
  • Create comprehensive data documentation, lineage tracking, and data model documentation
  • Lead technical discovery sessions to understand client data landscape and requirements
  • Collaborate with business stakeholders to translate requirements into technical solutions
  • Mentor junior data engineers and guide technical decision-making
  • Conduct code reviews and establish development best practices
  • Provide technical expertise during pre-sales activities and proposal development
  • Optimize database performance through indexing, partitioning, and query tuning
  • Implement data security and privacy controls in compliance with regulations (GDPR, HIPAA, SOX)
  • Establish CI/CD pipelines for data engineering workflows
  • Monitor and troubleshoot production data systems
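
To give a concrete flavor of the slowly changing dimension work called out above, here is a minimal SCD Type 2 upsert sketched in Python/pandas. All table, column, and function names are illustrative only, not part of any client's actual schema:

```python
from datetime import date

import pandas as pd

HIGH_DATE = date(9999, 12, 31)  # sentinel "end of time" marking current rows


def apply_scd2(dim: pd.DataFrame, updates: pd.DataFrame, key: str,
               tracked: list[str], as_of: date) -> pd.DataFrame:
    """SCD Type 2: expire current rows whose tracked attributes changed,
    then append a new current version for changed and brand-new keys."""
    current = dim["end_date"] == HIGH_DATE
    # Join current dimension rows to incoming updates on the natural key.
    merged = dim.loc[current, [key] + tracked].merge(
        updates, on=key, suffixes=("_old", ""))
    # Keys whose tracked attributes differ from the current version.
    changed_keys = merged.loc[
        (merged[[f"{c}_old" for c in tracked]].to_numpy()
         != merged[tracked].to_numpy()).any(axis=1), key]
    # Expire the superseded versions as of the load date.
    expire = current & dim[key].isin(changed_keys)
    dim = dim.copy()
    dim.loc[expire, "end_date"] = as_of
    # Append new current versions for changed and never-seen keys.
    new_keys = set(changed_keys) | (set(updates[key]) - set(dim[key]))
    new_rows = updates[updates[key].isin(new_keys)].copy()
    new_rows["start_date"] = as_of
    new_rows["end_date"] = HIGH_DATE
    return pd.concat([dim, new_rows], ignore_index=True)
```

In production this logic would typically live in a Fabric Notebook or warehouse MERGE statement rather than in-memory pandas; the sketch only shows the expire-then-append pattern.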
Required Qualifications:

  • 5+ years of progressive experience in data engineering or analytics engineering roles with strong focus on cloud-native data architectures and data modeling
  • Hands-on experience with Microsoft Fabric and its components (Data Factory, OneLake, Lakehouse, Dataflows Gen2, Direct Lake, Notebooks, Power BI, KQL databases)
  • Advanced Python programming skills with expertise in data libraries (Pandas, NumPy, SQLAlchemy, PySpark, Snowflake Python Connector)
  • Expert-level proficiency in SQL with experience across multiple database platforms (SQL Server, MySQL, Snowflake, PostgreSQL)
  • Strong experience with cloud data platforms (Azure and AWS) and their integration with Snowflake
  • Proficiency with version control systems (Git) and collaborative development workflows
  • Experience with containerization technologies (Docker, Kubernetes)
  • Expert-level understanding of data modeling concepts including dimensional modeling, normalization, denormalization, and data vault methodology
  • Advanced knowledge of slowly changing dimensions (SCD Types 1-7) and their implementation patterns
  • Deep knowledge of Snowflake architecture, micro-partitions, clustering keys, and query optimization
  • Knowledge of data warehouse concepts and modern data architecture patterns including data mesh and lakehouse architectures
  • Experience with both relational and NoSQL databases
  • Understanding of data governance, lineage, and quality management principles
  • Strong problem-solving skills with ability to debug complex data issues
  • Experience with Agile/Scrum methodologies and project management practices
  • Excellent communication skills for technical and non-technical audiences
  • Ability to work independently and manage multiple client engagements simultaneously
  • Bachelor’s degree in Computer Science, Data Engineering, or related technical field
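
The data quality frameworks and quality management principles listed above often start life as simple declarative checks. A minimal rule-based sketch in pandas follows; the rule names and report format are hypothetical, not a specific framework's API:

```python
import pandas as pd


def quality_report(df: pd.DataFrame, rules: dict) -> list[str]:
    """Evaluate simple declarative quality rules against a DataFrame.

    Supported rule keys (all hypothetical):
      'not_null' -> list of columns that must contain no nulls
      'unique'   -> list of columns that must contain no duplicates
      'in_range' -> {column: (low, high)} inclusive bounds
    Returns a list of human-readable failure messages (empty = clean).
    """
    failures = []
    for col in rules.get("not_null", []):
        n = int(df[col].isna().sum())
        if n:
            failures.append(f"{col}: {n} null value(s)")
    for col in rules.get("unique", []):
        dupes = int(df[col].duplicated().sum())
        if dupes:
            failures.append(f"{col}: {dupes} duplicate value(s)")
    for col, (lo, hi) in rules.get("in_range", {}).items():
        # NaN fails .between(), so nulls also count as out of range here.
        bad = int((~df[col].between(lo, hi)).sum())
        if bad:
            failures.append(f"{col}: {bad} value(s) outside [{lo}, {hi}]")
    return failures
```

A gate like this would typically run at the end of an ingestion pipeline and fail the run (or raise an alert) when the report is non-empty.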
Preferred Qualifications:

  • Cloud certifications (Microsoft Fabric Data Engineer, Azure Administrator, Azure Solutions Architect)
  • Snowflake certifications (SnowPro Core, SnowPro Advanced Data Engineer)
  • Experience with additional programming languages (Scala, Java, R)
  • Experience with data visualization tools (Tableau, Power BI, Looker)
  • Background in specific industries (financial services, healthcare, retail, manufacturing)
  • Previous consulting experience
Benefits:

  • Training Opportunities: We believe in lifelong learning and provide numerous avenues for skill enhancement.
  • Fully Vested 401(k) Plan: We help secure your future with a fully vested 401(k) plan.
  • PPO and HDHP Medical Plans: Choose the health insurance program that best fits your needs.
  • Employer-Paid Dental and Vision Plans: We cover dental and vision plans, ensuring our employees have access to comprehensive health care.
  • Onsite Fitness Center: Stay fit and healthy with our state-of-the-art fitness center.
  • Wellness Program: We promote a healthy lifestyle with our wellness program.
  • Catered Lunches: Enjoy delicious catered lunches regularly.