About The Position

We are seeking a high-caliber, hands-on Senior Cloud Data Engineer. While you will provide architectural guidance, your primary impact will come from engineering directly: building production-ready data pipelines, containerizing microservices for Amazon ECS, and executing the technical migration of legacy on-premises systems to AWS.

Requirements

  • 8+ years of hands-on experience in Data Engineering and Cloud Infrastructure, with a focus on building and migrating production workloads.
  • Deep technical expertise in Amazon ECS (Fargate/EC2), including networking (ALB/NLB), task placement strategies, and container security.
  • Proven experience with modern data platforms such as Snowflake (AI Data Cloud) and cloud-native services, plus a good understanding of open-source table formats, specifically Apache Iceberg, to enable interoperability, schema evolution, and high-performance analytics across multiple engines.
  • Expert-level proficiency in Java, Python, and SQL.
  • Hands-on experience with Spark, Kafka, and orchestration tools like Apache Airflow, Dagster, or dbt.
  • Deep understanding of data warehousing and modern data lakehouse architecture.
  • Proven track record of upskilling junior engineers.
  • Ability to explain complex technical concepts to non-technical stakeholders in the wealth management business.
  • A "builder" mindset with the ability to navigate ambiguity in a fast-paced environment.
  • Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field.

Responsibilities

  • Active Migration Execution: Directly execute the migration of legacy ETL and microservices to AWS. This includes refactoring monolithic code into containerized services and deploying them to Amazon ECS (Fargate/EC2).
  • Containerization & Orchestration: Build and maintain Docker images, write complex ECS Task Definitions, and configure service-to-service communication using Amazon ECS Service Connect and AWS Cloud Map.
  • Data Pipeline Engineering: Develop end-to-end data flows using AWS Glue (PySpark), Amazon EMR, and Snowflake. Implement "Lakehouse" patterns using Apache Iceberg to ensure data portability.
  • IaC Development: Write and maintain production-grade Terraform or AWS CDK modules to provision VPCs, ECS clusters, and RDS instances. Ensure all infrastructure is version-controlled and deployed via GitHub Actions or GitLab CI.
  • AI-Augmented Coding: Actively use AI coding assistants (e.g., GitHub Copilot) to refactor legacy SQL, generate unit tests, and automate the creation of boilerplate pipeline code.
  • Toil Reduction: Identify manual bottlenecks in the migration process and build custom automation tools in Python or Go to streamline data validation and schema conversion.
  • Code Reviews & Standards: Lead rigorous peer code reviews, enforcing standards for performance, security (IAM least privilege), and maintainability.
  • Observability Implementation: Hands-on configuration of Amazon CloudWatch Container Insights and OpenTelemetry to ensure deep visibility into migrated microservices and data jobs (an instrumentation sketch follows this list).
  • Performance Tuning: Directly optimize Spark job configurations, Snowflake warehouse sizing, and ECS auto-scaling policies to balance performance and cost.
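
As a concrete reference for the containerization work above, here is a minimal sketch (Python, boto3) of registering a Fargate task definition and wiring up ECS Service Connect for a migrated microservice. The family name orders-api, cluster data-platform, namespace, subnets, and ARNs are hypothetical placeholders, not project values.

```python
"""Sketch: register a Fargate task definition and enable ECS Service Connect.

All names, ARNs, and network IDs below are placeholders.
"""
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

# Register a minimal Fargate task definition for a migrated microservice.
task_def = ecs.register_task_definition(
    family="orders-api",  # hypothetical service
    requiresCompatibilities=["FARGATE"],
    networkMode="awsvpc",
    cpu="512",
    memory="1024",
    executionRoleArn="arn:aws:iam::123456789012:role/ecsTaskExecutionRole",
    containerDefinitions=[{
        "name": "orders-api",
        "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/orders-api:latest",
        "essential": True,
        # A named port mapping is required for Service Connect.
        "portMappings": [{
            "name": "http",
            "containerPort": 8080,
            "protocol": "tcp",
            "appProtocol": "http",
        }],
    }],
)

# Attach the service to a Cloud Map namespace so sibling services can reach it
# as http://orders-api:8080 through Service Connect.
ecs.create_service(
    cluster="data-platform",  # hypothetical cluster
    serviceName="orders-api",
    taskDefinition=task_def["taskDefinition"]["taskDefinitionArn"],
    desiredCount=2,
    launchType="FARGATE",
    networkConfiguration={"awsvpcConfiguration": {
        "subnets": ["subnet-aaa", "subnet-bbb"],
        "securityGroups": ["sg-ccc"],
    }},
    serviceConnectConfiguration={
        "enabled": True,
        "namespace": "internal",  # Cloud Map namespace
        "services": [{
            "portName": "http",
            "clientAliases": [{"port": 8080, "dnsName": "orders-api"}],
        }],
    },
)
```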
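
For the pipeline bullet, a minimal AWS Glue PySpark sketch of the Iceberg "Lakehouse" pattern, assuming the Glue job runs with Iceberg support enabled (e.g. the --datalake-formats iceberg job parameter). The S3 paths and the glue_catalog.analytics.trades table are hypothetical.

```python
"""Sketch: Glue PySpark job that appends a legacy extract to an Apache Iceberg table."""
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.conf import SparkConf
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])

# Wire an Iceberg catalog into the Spark session (Glue Data Catalog + S3 warehouse).
conf = SparkConf()
conf.set("spark.sql.extensions",
         "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
conf.set("spark.sql.catalog.glue_catalog", "org.apache.iceberg.spark.SparkCatalog")
conf.set("spark.sql.catalog.glue_catalog.catalog-impl",
         "org.apache.iceberg.aws.glue.GlueCatalog")
conf.set("spark.sql.catalog.glue_catalog.io-impl", "org.apache.iceberg.aws.s3.S3FileIO")
conf.set("spark.sql.catalog.glue_catalog.warehouse",
         "s3://example-lakehouse/warehouse/")  # placeholder bucket

sc = SparkContext(conf=conf)
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a legacy extract and append it to an Iceberg table that Snowflake,
# Athena, or EMR engines can also query. Assumes the table already exists;
# use .createOrReplace() for the first load.
df = spark.read.parquet("s3://example-legacy-extracts/trades/")  # placeholder path
df.writeTo("glue_catalog.analytics.trades").append()

job.commit()
```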
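
For the IaC bullet, a compact AWS CDK (v2, Python) sketch of a stack that provisions a VPC, an ECS cluster, and an RDS instance. Construct IDs, the Postgres engine version, and instance sizing are illustrative assumptions, not project standards.

```python
"""Sketch: reusable CDK stack for the migration landing zone (placeholder names)."""
from aws_cdk import App, Stack
from aws_cdk import aws_ec2 as ec2, aws_ecs as ecs, aws_rds as rds
from constructs import Construct


class DataPlatformStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Private/public subnets across two AZs for ECS tasks and RDS.
        vpc = ec2.Vpc(self, "MigrationVpc", max_azs=2)

        # ECS cluster that the containerized microservices deploy into.
        ecs.Cluster(self, "DataPlatformCluster", vpc=vpc, container_insights=True)

        # Postgres instance for operational metadata; engine/size are placeholders.
        rds.DatabaseInstance(
            self, "MetadataDb",
            engine=rds.DatabaseInstanceEngine.postgres(
                version=rds.PostgresEngineVersion.VER_15),
            vpc=vpc,
            vpc_subnets=ec2.SubnetSelection(
                subnet_type=ec2.SubnetType.PRIVATE_WITH_EGRESS),
            instance_type=ec2.InstanceType.of(
                ec2.InstanceClass.T3, ec2.InstanceSize.MEDIUM),
        )


app = App()
DataPlatformStack(app, "data-platform-dev")
app.synth()
```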
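
For the observability bullet, a minimal OpenTelemetry (Python SDK) sketch that exports spans over OTLP to a collector sidecar (e.g. AWS Distro for OpenTelemetry) running next to the task. The service name, collector endpoint, and span attributes are assumptions for illustration.

```python
"""Sketch: emit OpenTelemetry traces from a migrated data job via an OTLP sidecar."""
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Tag every span with the service name so traces line up with the
# CloudWatch Container Insights metrics emitted for the same ECS service.
provider = TracerProvider(resource=Resource.create({"service.name": "trades-loader"}))
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4317", insecure=True))
)
trace.set_tracer_provider(provider)

tracer = trace.get_tracer(__name__)


def load_batch(batch_id: str) -> None:
    # One span per batch makes slow or failed loads visible in the trace view.
    with tracer.start_as_current_span("load-batch") as span:
        span.set_attribute("batch.id", batch_id)
        ...  # extract / transform / load work goes here


if __name__ == "__main__":
    load_batch("2024-01-01")
```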

Benefits

  • Training and development opportunities
  • Firmwide networks
  • Benefits
  • Wellness
  • Personal finance offerings
  • Mindfulness programs