Data Engineer

ROLLER
Austin, TX
Remote

About The Position

ROLLER is seeking a highly technical, hands-on Senior Data Engineer to architect and scale its data infrastructure. This role is a core part of the data function within the Business Technology team, responsible for building pipelines and models that improve decision-making and operational efficiency. The Senior Data Engineer will build and architect the modern data stack, ensuring robustness, automation, and governance. They will collaborate with Go-To-Market (GTM), Product, Finance, and People & Culture (P&C) teams to create high-performance data models and pipelines that transform raw data into actionable cross-functional insights. This is a foundational role with the opportunity to significantly impact ROLLER's technical landscape during a pivotal growth stage.

Requirements

  • 5+ years of experience in Data & Analytics or Data Engineering, with a focus on building scalable, cloud-driven data pipelines.
  • Proven experience in dimensional modeling (Star/Snowflake schema) and designing architectures that support both historical reporting and real-time needs.
  • Expert-level proficiency in Snowflake or BigQuery, including performance tuning, cost optimization, and access control (RBAC).
  • Hands-on experience with Google Cloud Platform services, specifically Cloud Functions and Cloud Run for containerized workloads, and Cloud Composer for orchestration.
  • Thorough hands-on experience with dbt (Data Build Tool) to manage transformations, including writing custom macros, implementing tests, and generating documentation.
  • Solid experience using Terraform to provision cloud resources, ensuring that infrastructure is version-controlled and reproducible.
  • Proficiency in GitHub Actions to automate testing and deployment pipelines for data models and infrastructure.
  • Ability to design high-performance dashboards that prioritize user experience and fast load times.
  • Experience collaborating with Finance, GTM, and Product to define key metrics (like CAC, LTV, or Churn) and codifying them into the data layer.
  • Familiarity with data ingestion tools such as Fivetran, Airbyte, Workato, and MuleSoft to sync data from various SaaS platforms into the warehouse.
  • Operates as a full-stack Data Engineer with the technical authority to architect and build entire data lifecycles, from raw source ingestion to high-impact data visualization.
  • Familiarity with business-critical systems like Salesforce, Chargebee, Gainsight, and product analytics tools.
  • Strong communicator with a track record of working across technical and non-technical teams to drive alignment.
  • High appetite for technology and AI, and a natural curiosity about how they can transform work.
  • Comfortable using AI tools to automate repetitive 'busy work,' freeing up focus for high-impact work.

Nice To Haves

  • Experience partnering with IT or Business Technology teams to drive cross-functional outcomes.
  • Exposure to customer success and revenue workflows, such as onboarding journeys, renewal metrics, or support analytics.
  • Familiarity with AI/ML pipelines and experimentation frameworks that support data-driven innovation.
  • Experience creating and maintaining intuitive, high-performance Looker dashboards.

Responsibilities

  • Design and implement robust Data Modeling and Architecture strategies that support high-volume data processing and maintain long-term scalability.
  • Develop and manage end-to-end data pipelines using Google Cloud Platform services, specifically leveraging Cloud Run and Cloud Functions for compute, BigQuery for storage, and Cloud Composer (Airflow) for orchestration.
  • Build, maintain, and document complex data transformation layers using dbt (Data Build Tool) to ensure high data quality and reliability.
  • Automate the provisioning of cloud resources using Terraform and streamline deployment workflows via GitHub Actions CI/CD pipelines.
  • Demonstrate expert-level proficiency in managing and optimizing Cloud Data Warehouses, such as Snowflake & BigQuery.
  • Partner with cross-functional stakeholders to build and scale high-performing data infrastructure that supports engineering, analytics, and data program needs.
  • Architect and maintain a modern data stack by implementing best-in-class tools, including Snowflake, BigQuery, dbt, Fivetran, Looker, Tableau, and Airflow.
  • Develop scalable ETL/ELT workflows and robust data models to deliver real-time analytics and unified business reporting.

Benefits

  • ROLLER Recharge days to celebrate and recharge once we've hit our goals
  • Engage in our 'Vibe Tribe', led by our team members, to contribute directly to company-wide initiatives, regular events and social activities, and fundraising & cause-related campaigns.
  • Team Member Assistance Program to proactively support our team's health and wellbeing - access to coaching, education modules, weekly webinars, and more.
  • 16 weeks paid Parental Leave for primary carers and 4 weeks paid Parental Leave for secondary carers.
  • Individual learning & development budget plus genuine career growth opportunities as we continue to expand!