About The Position

DataRobot delivers AI that maximizes impact and minimizes business risk. Our platform and applications integrate into core business processes so teams can develop, deliver, and govern AI at scale. DataRobot empowers practitioners to deliver predictive and generative AI, and enables leaders to secure their AI assets. Organizations worldwide rely on DataRobot for AI that makes sense for their business, today and in the future.

About DataRobot

DataRobot delivers the industry-leading agentic AI applications and platform that maximize impact and minimize risk for your business. DataRobot’s enterprise AI platform democratizes data science with end-to-end automation for building, deploying, and managing machine learning models. The platform maximizes business value by delivering AI at scale and continuously optimizing performance over time. The company’s proven combination of cutting-edge software and world-class AI implementation, training, and support services empowers any organization, regardless of size, industry, or resources, to drive better business outcomes with AI.

About the Role

As a technical driver and hands-on expert, the Senior Data Engineer will shape our end-to-end data strategy and guide the team’s technical execution. This role is responsible for building scalable data warehouse and lakehouse solutions on Snowflake, championing the ELT paradigm, and ensuring robust data governance and cost optimization. We are looking for a seasoned engineer who combines deep technical mastery with a passion for mentoring others to build and influence high-impact, data-driven solutions.
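To make the ELT paradigm mentioned above concrete: land raw data in Snowflake first, then transform it inside the warehouse. Below is a minimal, hypothetical sketch of that pattern using the snowflake-connector-python package; the object names (RAW.EVENTS, ANALYTICS.DAILY_EVENTS, TRANSFORM_WH) are illustrative assumptions, not details from this posting.

    # Minimal ELT sketch (illustrative only). Assumes the
    # snowflake-connector-python package and hypothetical object names:
    # RAW.EVENTS (already loaded by an ingestion tool) and
    # ANALYTICS.DAILY_EVENTS (the derived model).
    import os
    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",  # hypothetical virtual warehouse
    )
    try:
        cur = conn.cursor()
        # The "T" in ELT: raw rows were loaded as-is; the transform runs
        # inside Snowflake rather than in an external processing engine.
        cur.execute("""
            CREATE OR REPLACE TABLE ANALYTICS.DAILY_EVENTS AS
            SELECT event_date,
                   event_name,
                   COUNT(*) AS event_count
            FROM RAW.EVENTS
            GROUP BY event_date, event_name
        """)
    finally:
        conn.close()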

Requirements

  • 5-7 years of experience in a data engineering or data analyst role.
  • Experience building and maintaining product analytics pipelines, including the implementation of event tracking (e.g., Snowplow) and the integration of behavioral data into Snowflake from platforms like Amplitude.
  • Strong understanding of data warehousing concepts, working experience with relational databases (Snowflake, Redshift, Postgres, etc.), and SQL.
  • Experience working with cloud providers like AWS, Azure, GCP, etc.
  • Solid programming foundations and proficiency in data-related languages such as Python, Scala, and R.
  • Experience with DevOps workflows and tools like dbt, GitHub, Airflow, etc. (see the orchestration sketch after this list).
  • Experience with an infrastructure-as-code tool such as Terraform or CloudFormation.
  • Excellent communication skills, with the ability to communicate effectively with both technical and non-technical audiences.
  • Knowledge of real-time streaming technologies like AWS Firehose, Spark, etc.
  • Highly collaborative in working with teammates and stakeholders.
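As referenced in the tooling bullet above, dbt and Airflow are commonly combined by scheduling dbt as an Airflow task. The sketch below assumes Airflow 2.4+ and a hypothetical dbt project at /opt/dbt/analytics; the DAG id and schedule are illustrative.

    # Minimal Airflow DAG sketch: build dbt models daily, then run dbt tests.
    # The DAG id, schedule, and project path are hypothetical examples.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_dbt_build",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # "schedule_interval" on Airflow versions before 2.4
        catchup=False,
    ) as dag:
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt/analytics",
        )
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /opt/dbt/analytics",
        )
        dbt_run >> dbt_test  # tests only run after the models build successfully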

Nice To Haves

  • AWS cloud certification is a plus
  • BA/BS in a technical or engineering field preferred

Responsibilities

  • Architect and deliver scalable, reliable data warehouses, analytics platforms, and integration solutions; this role is critical to supporting our internal AI strategy.
  • Partner with the Product Manager, Analytics, to shape our project roadmap and lead its implementation.
  • Collaborate with and mentor cross-functional teams to design and execute sophisticated data software solutions that elevate business performance and align to coding standards and architecture.
  • Develop, deploy, and support analytic data products, such as data marts, ETL jobs (extract/transform/load), and functions (in Python/SQL/dbt), in a cloud data warehouse environment using Snowflake, Stitch/Fivetran/Airflow, and AWS services (e.g., EC2, Lambda, Kinesis).
  • Navigate various data sources and efficiently locate data in a complex data ecosystem.
  • Work closely with data analysts and data scientists to build models and metrics that support their analytics needs.
  • Enhance data models in response to upstream data changes.
  • Instrument telemetry capture and data pipelines for various environments to provide product usage visibility.
  • Maintain and support deployed ETL pipelines and ensure data quality.
  • Develop monitoring and alerting systems to provide visibility into the health of data infrastructure, cloud applications, and data pipelines (see the freshness-check sketch after this list).
  • Partner with the IT enterprise applications and engineering teams on integration efforts between systems that impact Data & Analytics.
  • Work with R&D to answer complex technical questions about product analytics and corresponding data structure.
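As referenced in the monitoring bullet above, one simple form such a check could take is a freshness probe that flags a table that has stopped receiving rows. The table name, timestamp column, and 24-hour threshold below are hypothetical, and the print statement stands in for a real alerting channel such as Slack or PagerDuty.

    # Minimal freshness-check sketch (illustrative only). Assumes a
    # hypothetical RAW.EVENTS table with a loaded_at timestamp column.
    import os
    import snowflake.connector

    FRESHNESS_THRESHOLD_HOURS = 24  # hypothetical SLA

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    try:
        cur = conn.cursor()
        cur.execute("""
            SELECT DATEDIFF(hour, MAX(loaded_at), CURRENT_TIMESTAMP())
            FROM RAW.EVENTS
        """)
        hours_stale = cur.fetchone()[0]
        if hours_stale is None or hours_stale > FRESHNESS_THRESHOLD_HOURS:
            # A real system would page via Slack/PagerDuty; print is a stand-in.
            print(f"ALERT: RAW.EVENTS stale for {hours_stale} hours")
    finally:
        conn.close()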

Benefits

  • Medical, Dental & Vision Insurance
  • Flexible Time Off Program
  • Paid Holidays
  • Paid Parental Leave
  • Global Employee Assistance Program (EAP)