Asset & Wealth Management-Cloud Engineer-Analyst-Dallas

Goldman Sachs
Dallas, TX
Onsite

About The Position

Goldman Sachs is seeking a motivated Cloud Engineer to support the Wealth Management (WM) Data Engineering ecosystem. The engineer will be a key contributor to the migration and modernization of on-premises legacy data pipelines and services to AWS, working at the intersection of software engineering and data architecture to translate technical blueprints into high-performance code. The mission is to build secure, accessible, and cost-optimized data assets that power real-time client insights and advanced analytics in a cloud-native environment.

Requirements

  • 2+ years of hands-on experience in Data Engineering or Software Engineering, with a focus on cloud-based data solutions.
  • Experience with modern data platforms like Snowflake and cloud-native AWS services.
  • Understanding of open-source table formats, specifically Apache Iceberg.
  • Proficiency in Java, Python, and SQL.
  • Hands-on experience with Spark, Kafka, and orchestration tools like Apache Airflow, Dagster, or dbt.
  • Strong problem-solving "builder" mindset and the ability to communicate technical concepts within a team environment.
  • Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field.

Responsibilities

  • Build and maintain scalable data pipelines using AWS Glue, Amazon EMR, and Snowflake, transitioning legacy on-premises workloads to modern cloud-native architectures.
  • Execute the migration of on-premises microservices to AWS by containerizing workloads with Docker and deploying them to Amazon ECS.
  • Implement and manage open table formats, specifically Apache Iceberg, to ensure high-performance analytics and seamless schema evolution across the WM data lake.
  • Develop and schedule complex data workflows using Apache Airflow (MWAA) or AWS Step Functions, ensuring robust error handling and retry logic.
  • Deploy and manage cloud infrastructure using Terraform or AWS CDK, adhering to the "Infrastructure as Code" philosophy for all deployments.
  • Maintain automated deployment pipelines to ensure consistent and auditable code promotion across development, UAT, and production environments.
  • Implement data validation to ensure data integrity and accuracy during and after the migration process.
  • Build monitoring dashboards and alerting mechanisms to track pipeline health, data latency, and SLA adherence.
  • Contribute to the migration of on-premises data workloads to AWS.
  • Help build the data foundations required for predictive modeling and generative AI applications.

Benefits

  • Training and development opportunities
  • Firmwide networks
  • Wellness offerings
  • Personal finance offerings
  • Mindfulness programs
  • Fostering and advancing diversity and inclusion
  • Reasonable accommodations for candidates with special needs or disabilities


What This Job Offers

  • Job Type: Full-time
  • Career Level: Entry Level
  • Number of Employees: 5,001-10,000
