About The Position

The Senior Engineer in Production Management, specializing in Big Data with a focus on Snowflake, will design, develop, and implement robust, scalable data solutions in a complex, critical, cross-departmental, and multi-disciplinary area. This role is crucial for building reliable pipelines, ensuring data quality, and driving performance improvements across our Big Data initiatives. It requires a comprehensive understanding of the many facets of Big Data technologies and how they interact to achieve strategic data objectives. The candidate will apply an in-depth understanding of the business impact of data-driven insights and be accountable for delivering a full range of end-to-end data engineering projects.

Requirements

  • Relevant experience in a critical Big Data engineering role with high business impact, and a strong ability to understand how data solutions deliver business value.
  • Excellent engineering skills with expertise in Snowflake data warehousing.
  • Strong working knowledge of key computer science concepts, especially as they apply to distributed systems, data structures, and algorithms for large-scale data processing.
  • Solid understanding of data engineering concepts such as data modeling, data lakes, data warehouses, and data marts.
  • Strong debugging and analytical skills: ability to isolate root causes across data pipelines, Snowflake, cloud infrastructure, and data processing layers.
  • Operational experience of designing, deploying, and managing data pipelines and Snowflake solutions at scale, potentially leveraging cloud platforms (AWS) and related services.
  • Excellent critical thinking and problem-solving skills with a strong analytical mindset.
  • Ability to work independently and collaboratively in a fast-paced environment.
  • Strong communication skills to articulate technical concepts and solutions effectively.

Nice To Haves

  • Operational experience with CI/CD orchestration and Infrastructure-as-Code tooling (Terraform, CloudFormation, dbt, etc.) is highly desirable for data platform automation.
  • Degree in computer science/mathematics/physics or related technical subject is desirable.

Responsibilities

  • Apply an in-depth understanding of the Software Development Lifecycle and its integration within the overall data technology landscape to deliver scalable, reliable, and performant Big Data solutions, particularly within Snowflake.
  • Conduct in-depth data analysis, troubleshoot complex data issues, and ensure the accuracy, reliability, and integrity of data.
  • Optimize Big Data workflows, including Snowflake query optimization, leveraging partitioning and clustering strategies in distributed storage systems.
  • Perform rigorous unit testing and validation of data pipelines and transformations.
  • Collaborate with data scientists, analysts, and other engineers to understand data requirements and deliver robust data solutions.

Benefits

  • Medical, vision, and dental benefits, a 401(k) retirement plan, variable pay/incentives, paid time off, and paid holidays are available for full-time employees.