Data Engineer

Sandboxx · Middleburg, VA
Hybrid

About The Position

You love digging into data and making it sing, and at Sandboxx you'll be the person everyone leans on when they need answers. You'll sit at the intersection of engineering and business, partnering closely with Marketing, Product, and our Executive team to build trusted pipelines, analytics models, and dashboards that power everything from day-to-day operations to board-level reporting. This is a hands-on role: you'll build and maintain ELT workflows, plug in third-party APIs (CDPs, MMPs, Iterable, partner feeds, etc.), and scale our BI stack with BigQuery, Airflow, and Looker (or Looker Studio). While we offer the flexibility to hire remote talent for the right fit, we have a strong preference for in-person collaboration at our Middleburg, VA headquarters and offer an enhanced compensation package for candidates willing to be onsite. Sandboxx's growth and operations run on data, and your work will turn messy, complex feeds into clean, trusted insights, so nobody ever has to wait in a queue for the story behind the numbers.

Why We're Hiring For This Role Now

  • Our data lives in pockets of well-built pipelines, but without integration we lack a holistic view of our customer journey.
  • Teams wrestle with one-off queries and spreadsheets just to get the numbers they need, blocking deeper analysis.
  • We don't have a unified attribution model or single source of truth, so it's hard to tie marketing and product efforts back to real impact.
  • Real-time insights are out of reach, forcing stakeholders into reactive mode instead of proactive optimization.
  • There's no dedicated owner of our BI layer, no one ensuring data consistency, reliability, and accessibility across the company.

Requirements

  • A hands-on data engineer who loves turning ambiguity into clarity and takes pride in seeing teams succeed with the metrics you build
  • Deep experience designing and maintaining ELT pipelines in a cloud warehouse (BigQuery, Snowflake, or Redshift) using orchestrators like Airflow or Prefect
  • SQL mastery—you can write, optimize, and troubleshoot complex queries without hesitation
  • Proven track record with dbt (or an equivalent) for modular, testable transformations and a CI/CD mindset for data workflows
  • Comfort modeling data and building self-serve dashboards in Looker, Tableau, or Power BI that non-technical teammates lean on daily
  • A relentless focus on data quality—automated tests, monitoring, and clear documentation are second nature to you
  • Excellent communicator who can translate technical details into actionable insights for Marketing, Product, Finance, and leadership
  • A builder’s mentality: you see gaps in our analytics stack and jump in to solve them rather than waiting for direction
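To make the data-quality requirement concrete, here is a minimal sketch of the kind of automated check in question, written in plain Python with sqlite3 standing in for a real warehouse. The `users` table and its columns are hypothetical, and the two checks simply mirror the common not-null and uniqueness tests a tool like dbt would run.

```python
import sqlite3

# Hypothetical 'users' table standing in for a warehouse model; the checks
# mirror common schema tests: not_null and unique.

def check_not_null(conn, table, column):
    """Return True if no NULLs exist in table.column."""
    nulls = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]
    return nulls == 0

def check_unique(conn, table, column):
    """Return True if every non-NULL value in table.column is distinct."""
    dupes = conn.execute(
        f"SELECT COUNT({column}) - COUNT(DISTINCT {column}) FROM {table}"
    ).fetchone()[0]
    return dupes == 0

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (user_id TEXT, email TEXT)")
conn.executemany(
    "INSERT INTO users VALUES (?, ?)",
    [("u1", "a@x.com"), ("u2", "b@x.com"), ("u3", None)],
)

print(check_unique(conn, "users", "user_id"))  # True
print(check_not_null(conn, "users", "email"))  # False: one NULL email
```

In practice these assertions would run on a schedule inside the orchestrator, so a failing check pages the data team before stakeholders ever see a broken dashboard.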

Nice To Haves

  • Have a bachelor’s degree in a technical field such as Computer Science, Data Science, Engineering, Mathematics, Statistics, or a related discipline
  • Have scaled a data stack from thousands to millions of daily events, maintaining pipeline reliability under heavy load
  • Built analytics models for subscription-driven businesses, mastering cohort, funnel, and revenue attribution analyses
  • Integrated multiple marketing and engagement platforms (Iterable, Segment, Branch, Adjust) so acquisition metrics flow seamlessly into your warehouse
  • Enjoy mentoring—running data office hours, pairing on complex queries, and helping teammates level up their SQL skills
  • Have experience with reverse-ETL or operationalizing insights to feed marketing and product tools directly
  • Prior work with military, veteran, or family‑support organizations
  • Startup or high‑growth environment exposure

Responsibilities

  • Designing, building, and owning robust ELT workflows in Apache Airflow and BigQuery that consolidate data from all internal and external systems into our warehouse
  • Integrating with third-party platforms—Iterable, mParticle, partner APIs, and more—to ensure every metric is consistently captured
  • Architecting and optimizing our BigQuery environment for performance, cost-efficiency, and scalability
  • Modeling data and authoring dashboards in Looker (or Looker Studio, Tableau, etc.) that give every team self-serve access to reliable insights
  • Implementing automated tests and monitoring to catch data issues early and maintain trust in our metrics
  • Writing clear documentation and hosting regular “office hours” to empower non-technical teammates to explore data confidently
  • Collaborating with BI analysts to establish standards, conduct code reviews, and foster company-wide data ownership
  • Contributing to data governance, access controls, and system scalability as our platform grows
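The ELT pattern these responsibilities describe can be sketched in miniature: land raw records first, then model them with SQL inside the warehouse. This is an illustrative toy, not Sandboxx's actual stack; Python's built-in sqlite3 stands in for BigQuery, and the event payloads are invented.

```python
import json
import sqlite3

RAW_EVENTS = [
    {"user_id": "u1", "event": "signup", "ts": "2024-01-05"},
    {"user_id": "u1", "event": "purchase", "ts": "2024-01-07"},
    {"user_id": "u2", "event": "signup", "ts": "2024-01-06"},
]

def load_raw(conn, events):
    """'L' step: land data untouched as JSON blobs."""
    conn.execute("CREATE TABLE IF NOT EXISTS raw_events (payload TEXT)")
    conn.executemany(
        "INSERT INTO raw_events (payload) VALUES (?)",
        [(json.dumps(e),) for e in events],
    )

def transform(conn):
    """'T' step: model a clean table with SQL inside the warehouse."""
    conn.execute("""
        CREATE TABLE events AS
        SELECT json_extract(payload, '$.user_id') AS user_id,
               json_extract(payload, '$.event')   AS event,
               json_extract(payload, '$.ts')      AS ts
        FROM raw_events
    """)

conn = sqlite3.connect(":memory:")
load_raw(conn, RAW_EVENTS)
transform(conn)
signups = conn.execute(
    "SELECT COUNT(*) FROM events WHERE event = 'signup'"
).fetchone()[0]
print(signups)  # 2
```

Transforming after loading (rather than before, as in classic ETL) is what lets BI tools and analysts query both the raw and the modeled layers, which is the core of the self-serve setup described above.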

Benefits

  • Equity incentive plan
  • 401(k) with matching and profit sharing
  • Top-tier health insurance and family benefits
  • Flexible paid vacation
  • Paid parental leave