Data Engineer

ROLLER
Austin, TX

About The Position

We are seeking a highly technical, hands-on Senior Data Engineer to architect and scale our data infrastructure from the ground up. As a core member of the data function within our Business Technology team, you will be responsible for building the pipelines and models that power smarter decision-making and operational efficiency across the entire organization. In this foundational role, you will architect and build our modern data stack, ensuring our infrastructure is robust, automated, and governed. You will collaborate as a technical partner with GTM, Product, Finance, and P&C to build high-performance data models and pipelines that transform raw data into actionable cross-functional insights. Joining ROLLER at this pivotal growth stage, you will have the opportunity to make an outsized impact on our technical landscape, directly influencing how we process data to deliver world-class experiences for our customers globally.

Requirements

  • 5+ years of experience in Data & Analytics or Data Engineering, with a focus on building scalable, cloud-driven data pipelines.
  • Proven experience in dimensional modeling (Star/Snowflake schema) and designing architectures that support both historical reporting and real-time needs.
  • Expert-level proficiency in Snowflake or BigQuery, including performance tuning, cost optimization, and access control (RBAC).
  • Hands-on experience with Google Cloud Platform services, specifically Cloud Functions and Cloud Run for containerized workloads and Cloud Composer for orchestration.
  • Thorough hands-on experience with dbt (Data Build Tool) to manage transformations, including writing custom macros, implementing tests, and generating documentation.
  • Solid experience using Terraform to provision cloud resources, ensuring that infrastructure is version-controlled and reproducible.
  • Proficiency in GitHub Actions to automate testing and deployment pipelines for data models and infrastructure.
  • Ability to design high-performance dashboards that prioritize user experience and fast load times.
  • Experience collaborating with Finance, GTM, and Product to define key metrics (like CAC, LTV, or Churn) and codifying them into the data layer.
  • Familiarity with data ingestion tools like Fivetran, Airbyte, Workato, and MuleSoft to sync data from various SaaS platforms into the warehouse.
  • Ability to operate as a full-stack Data Engineer, architecting and building entire data lifecycles from raw source ingestion to high-impact data visualization.
  • Familiarity with business-critical systems like Salesforce, Chargebee, Gainsight, and product analytics tools.
  • Strong communicator with a track record of working across technical and non-technical teams to drive alignment.

Nice To Haves

  • Experience partnering with IT or Business Technology teams to drive cross-functional outcomes.
  • Exposure to customer success and revenue workflows, such as onboarding journeys, renewal metrics, or support analytics.
  • Familiarity with AI/ML pipelines and experimentation frameworks that support data-driven innovation.

Responsibilities

  • Scalable Architecture: Design and implement robust Data Modeling and Architecture strategies that support high-volume data processing and maintain long-term scalability.
  • Cloud Ecosystem: Develop and manage end-to-end data pipelines using Google Cloud Platform services, specifically leveraging Cloud Run and Cloud Functions for compute, BigQuery for storage, and Cloud Composer (Airflow) for orchestration.
  • Transformation Excellence: Build, maintain, and document complex data transformation layers using dbt (Data Build Tool) to ensure high data quality and reliability.
  • Infrastructure as Code: Automate the provisioning of cloud resources using Terraform and streamline deployment workflows via GitHub Actions CI/CD pipelines.
  • Cloud Data Warehousing: Demonstrate expert-level proficiency in managing and optimizing Cloud Data Warehouses, such as Snowflake & BigQuery.
  • Team Collaboration: Partner with cross-functional stakeholders to build and scale high-performing data infrastructure that supports engineering, analytics, and Data Program needs.
  • Stack Management: Architect and maintain a modern data stack by implementing best-in-class tools including Snowflake, BigQuery, dbt, Fivetran, Looker, Tableau, and Airflow.
  • Pipeline Engineering: Develop scalable ETL/ELT workflows and robust data models to deliver real-time analytics and unified business reporting.
  • Data Visualization: Create and maintain intuitive, high-performance Looker dashboards; prior Looker experience is an added advantage.

Benefits

  • You'll get to work on a category-leading product that customers love in a fun, high-growth industry! Check our Capterra and G2 reviews.
  • 4 ROLLER Recharge days per year (When we hit our goals each quarter, we take a well-earned day off together to relax, recharge, and celebrate our wins).
  • Engage in our 'Vibe Tribe' - led by our team members; you can contribute to company-wide initiatives directly. Regular events and social activities, fundraising & cause-related campaigns...you name it. We're willing to make it happen!
  • Team Member Assistance Program to proactively support our team's health and wellbeing - access to coaching, education modules, weekly webinars, and more.
  • 16 weeks paid Parental Leave for primary carers and 4 weeks paid Parental Leave for secondary carers.
  • Work with a driven, fun, and switched-on team that likes to raise the bar in all we do!
  • Individual learning & development budget plus genuine career growth opportunities as we continue to expand!