About The Position

Goldman Sachs is seeking a high-caliber, hands-on Senior Cloud Data Engineer for its Asset & Wealth Management division. While the role includes architectural guidance, its primary impact comes from hands-on engineering: building production-ready data pipelines, containerizing microservices for Amazon ECS, and executing the technical migration of legacy on-premises systems to AWS. Day-to-day work spans migration execution, containerization and orchestration, data pipeline engineering, and infrastructure-as-code development.

Requirements

  • 8+ years of hands-on experience in Data Engineering and Cloud Infrastructure, with a focus on building and migrating production workloads.
  • Deep technical expertise in Amazon ECS (Fargate/EC2), including networking (ALB/NLB), task placement strategies, and container security.
  • Proven experience with modern data platforms such as Snowflake (AI Data Cloud) and cloud-native services.
  • Good understanding of open-source table formats, specifically Apache Iceberg.
  • Expert-level proficiency in Java, Python, and SQL.
  • Hands-on experience with Spark, Kafka, and orchestration and transformation tools such as Apache Airflow, Dagster, or dbt.
  • Deep understanding of data warehousing and modern data lakehouse architecture.
  • Ability to explain complex technical concepts to non-technical stakeholders in the wealth management business.
  • A "builder" mindset with the ability to navigate ambiguity in a fast-paced environment.
  • Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field.

Nice To Haves

  • Experience containerizing microservices for Amazon ECS.
  • Experience executing the technical migration of legacy on-premises systems to AWS.
  • Experience with "Lakehouse" patterns using Apache Iceberg to ensure data portability.
  • Experience with infrastructure-as-code (IaC) and automation development using Terraform or AWS CDK.
  • Experience with AI coding assistants (e.g., GitHub Copilot).
  • Experience building custom automation tools in Python or Go for data validation and schema conversion.
  • Experience implementing observability with Amazon CloudWatch Container Insights and OpenTelemetry.
  • Proven track record of upskilling junior engineers.

Responsibilities

  • Directly execute the migration of legacy ETL and microservices to AWS, including refactoring monolithic code into containerized services and deploying them to Amazon ECS (Fargate/EC2).
  • Build and maintain Docker images, write complex ECS Task Definitions, and configure service-to-service communication using Amazon ECS Service Connect and AWS Cloud Map.
  • Develop end-to-end data flows using AWS Glue (PySpark), Amazon EMR, and Snowflake. Implement "Lakehouse" patterns using Apache Iceberg.
  • Write and maintain production-grade Terraform or AWS CDK modules to provision VPCs, ECS clusters, and RDS instances.
  • Ensure all infrastructure is version-controlled and deployed via GitHub Actions or GitLab CI.
  • Actively use AI coding assistants (e.g., GitHub Copilot) to refactor legacy SQL, generate unit tests, and automate the creation of boilerplate pipeline code.
  • Identify manual bottlenecks in the migration process and build custom automation tools in Python or Go to streamline data validation and schema conversion.
  • Lead rigorous peer code reviews, enforcing standards for performance, security (IAM least privilege), and maintainability.
  • Configure Amazon CloudWatch Container Insights and OpenTelemetry hands-on to ensure deep visibility into migrated microservices and data jobs.
  • Directly optimize Spark job configurations, Snowflake warehouse sizing, and ECS auto-scaling policies to balance performance and cost.
  • Upskill junior engineers through mentorship.
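
As a flavor of the custom validation tooling described above, the sketch below compares a legacy table schema against its migrated counterpart and flags dropped, added, or retyped columns. It is a minimal illustration, not part of any platform API: the `TYPE_EQUIVALENTS` map and the `{column: type}` input format are assumptions standing in for whatever the actual source catalog and Snowflake `INFORMATION_SCHEMA` expose.

```python
from dataclasses import dataclass, field

# Hypothetical mapping from legacy column types to acceptable cloud-side
# equivalents; a real migration would derive this from the target platform.
TYPE_EQUIVALENTS = {
    "NUMBER": {"NUMBER", "DECIMAL", "INT"},
    "VARCHAR2": {"VARCHAR", "TEXT", "STRING"},
    "DATE": {"DATE", "TIMESTAMP_NTZ"},
}

@dataclass
class ValidationReport:
    missing_columns: list = field(default_factory=list)
    extra_columns: list = field(default_factory=list)
    type_mismatches: list = field(default_factory=list)

    @property
    def ok(self) -> bool:
        # Migration passes only if nothing was dropped, added, or retyped.
        return not (self.missing_columns or self.extra_columns
                    or self.type_mismatches)

def compare_schemas(source: dict, target: dict) -> ValidationReport:
    """Flag columns dropped, added, or retyped during migration.

    Both arguments are {column_name: type_name} dicts.
    """
    report = ValidationReport()
    for col, src_type in source.items():
        if col not in target:
            report.missing_columns.append(col)
        elif target[col] not in TYPE_EQUIVALENTS.get(src_type, {src_type}):
            report.type_mismatches.append((col, src_type, target[col]))
    report.extra_columns = [c for c in target if c not in source]
    return report
```

For example, migrating `VARCHAR2` to `VARCHAR` and `DATE` to `TIMESTAMP_NTZ` yields `report.ok == True`, while a dropped column lands in `missing_columns` and fails the check.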

Benefits

  • Training and development opportunities
  • Firmwide networks
  • Wellness programs
  • Personal finance offerings
  • Mindfulness programs