Full Stack Cloud & Data Engineer

ShyftOff
Tampa, FL (Onsite)

About The Position

We’re hiring our next engineer: someone who’s obsessed with data, passionate about systems that (1) work, (2) are performant, and (3) are clean, in that order, and eager to design, build, and optimize data pipelines that power product growth. In this role, you’ll own the entire data product ecosystem, from how data flows through the platform to how it’s surfaced for decision-making. You’ll play a key role in designing data systems, enabling data-informed insights, and ensuring our platform scales efficiently. This role is onsite in Tampa, FL.

Requirements

  • 4+ years of professional experience in Data Engineering or a related backend engineering field.
  • Strong command of PostgreSQL (schema design, optimization, complex queries).
  • Proficient in Python and experience creating and maintaining Airflow DAGs.
  • Hands-on experience with AWS Cloud Services (S3, ECS, RDS, DynamoDB).
  • Proven ability to design, build, troubleshoot, and maintain robust ETL/ELT pipelines.
  • Strong understanding of software engineering principles, data modeling, and distributed systems.
  • Excellent communication skills and ability to collaborate effectively with non-technical teams.
  • Thrives in fast-paced startup environments where speed and ownership matter.

Nice To Haves

  • AWS Certification (Solutions Architect, Data Engineer, or equivalent).
  • Prior startup experience or experience as an early technical team member.
  • Strong GitHub presence or portfolio of open-source contributions.

Responsibilities

  • Build and maintain scalable data models, cloud infrastructure, and end-to-end pipelines that power ShyftOff’s platform.
  • Design, implement, and optimize workflows using Airflow for ETL/ELT processes.
  • Develop and maintain PostgreSQL databases.
  • Write clean, maintainable Python code (with a focus on Pandas for data manipulation).
  • Partner cross-functionally with Sales, Marketing, and Operations to drive data-informed decisions.
  • Manage integrations between internal systems, ensuring smooth data flow across the business.
  • Maintain, monitor, and troubleshoot production data systems hosted in AWS (RDS, S3, ECS, Lambda) and GCP (BigQuery, Looker Studio).
  • Own the data lifecycle—from schema design and ingestion through transformation, validation, and reporting.
  • Champion reliability, scalability, monitoring, and performance across the data platform.
  • Contribute ideas, explore new tools/technologies, and take pride in building something foundational.

Benefits

  • Competitive salary and equity
  • Health and wellness benefits
  • Professional development opportunities
  • High-impact role with visibility across the company
  • The chance to help shape the technical culture and data infrastructure of a growing startup

What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No Education Listed
  • Number of Employees: 5,001-10,000 employees
