Blue Margin · Posted 4 months ago
$110,000 - $140,000/Yr
Full-time • Senior
Fort Collins, CO
11-50 employees

We help mid-market companies turn their data into a strategic asset. Our clients rely on us to design and deliver reporting platforms that fuel better, faster decision-making, and we're looking for a Senior Data Engineer to strengthen our team.

As a Senior Data Engineer, you will lead the design, optimization, and scaling of the data platforms that power analytics for our clients. You will be hands-on with data pipelines, large-scale data processing, and modern cloud data stacks while mentoring team members and helping shape best practices.

Responsibilities:

  • Architect, design, and optimize large-scale data pipelines using PySpark, Spark SQL, Delta Lake, and cloud-native services.
  • Drive efficiency in incremental/delta data loading, partitioning, and performance tuning.
  • Lead implementations across Azure Synapse, Microsoft Fabric, and/or Snowflake environments.
  • Collaborate with stakeholders and analysts to translate business needs into scalable data solutions.
  • Evaluate and incorporate AI/automation to improve development speed, testing, and data quality.
  • Oversee and mentor junior data engineers, establishing coding standards and best practices.
  • Ensure high standards for data quality, security, and governance.
  • Participate in solution design for client engagements, balancing technical depth with practical outcomes.
Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • 5+ years of professional experience in data engineering, with emphasis on Python & PySpark/Apache Spark.
  • Proven ability to manage large datasets and optimize for speed, scalability, and reliability.
  • Strong SQL skills and understanding of relational and distributed data systems.
  • Experience with Azure Data Factory, Synapse Pipelines, Fivetran, Delta Lake, Microsoft Fabric, or Snowflake.
  • Knowledge of data modeling, orchestration, and Delta/Parquet file management best practices.
  • Familiarity with CI/CD, version control, and DevOps practices for data pipelines.
  • Experience leveraging AI-assisted tools to accelerate engineering workflows.
  • Strong communication skills; ability to convey complex technical details to both engineers and business stakeholders.
  • Relevant certifications (Azure, Snowflake, or Fabric) a plus.
What we offer:

  • Competitive pay ($110K - $140K)
  • Strong benefits
  • Flexible hybrid schedule, with in-office collaboration based in Fort Collins