Data Engineer - Associate

Morgan Stanley · New York, NY
Posted 2 days ago · $90,000 - $150,000

About The Position

In the Technology division, we leverage innovation to build the connections and capabilities that power our Firm, enabling our clients and colleagues to redefine markets and shape the future of our communities. This is a Data Engineering position at Associate level. The job family provides specialist data analysis and expertise that drive decision-making and business insights, and is responsible for crafting data pipelines, implementing data models, and optimizing data processes for improved data accuracy and accessibility, including applying machine learning and AI-based techniques.

Since 1935, Morgan Stanley has been a global leader in financial services, providing investment banking, securities, wealth management, and investment management services in more than 40 countries. The firm is recognized for its commitment to innovation, integrity, and delivering value to clients, shareholders, and communities worldwide.

About Finance Technology: Finance Technology at Morgan Stanley delivers innovative solutions for regulatory and financial reporting, general ledger, P&L calculations, and analytics. The team leverages advanced data platforms and modern engineering practices to support the firm’s finance division, ensuring accuracy, compliance, and strategic business insights.

Requirements

  • 3–5 years of hands-on experience in data engineering, database development, or related roles.
  • Strong expertise in SQL (query optimization, data modelling, and performance tuning).
  • Proficiency with Snowflake (or similar cloud data platforms) and Python for data processing and automation (see the illustrative sketch after this list).
  • Experience with ETL tools, scripting (Shell/Python), and building scalable data pipelines.
  • Solid understanding of data warehousing concepts and best practices.
  • Excellent problem-solving and analytical skills.
  • Strong communication skills and ability to work collaboratively in a global team environment.
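
For context on the kind of day-to-day work these requirements describe, here is a minimal, hypothetical Python sketch of loading and transforming data in Snowflake. It is not part of the posting; the account, warehouse, stage, table, and column names are placeholders, and credentials would come from a secrets manager in practice.

    # Hypothetical sketch: load a daily extract into Snowflake and build a
    # reporting table. All connection details and object names are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",        # placeholder account identifier
        user="etl_service_user",     # placeholder service account
        password="***",              # fetch from a secrets manager in practice
        warehouse="FINANCE_WH",
        database="FINANCE_DB",
        schema="REPORTING",
    )
    try:
        cur = conn.cursor()
        # Load the day's CSV extract from an existing named stage into a raw table.
        cur.execute(
            "COPY INTO RAW_TRADES FROM @DAILY_STAGE/trades.csv "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
        # Aggregate the raw data into a table used by downstream reporting.
        cur.execute("""
            CREATE OR REPLACE TABLE PNL_BY_DESK AS
            SELECT desk_id, trade_date, SUM(pnl) AS total_pnl
            FROM RAW_TRADES
            GROUP BY desk_id, trade_date
        """)
    finally:
        conn.close()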

Nice To Haves

  • Familiarity with Power BI for data visualization and reporting.
  • Experience with Apache Airflow or similar workflow orchestration tools (see the example DAG sketch after this list).
  • Exposure to OLAP tools and multidimensional data modelling.
  • Experience or interest in leveraging GenAI, LLMs, or Copilot tools for data engineering, automation, or reporting use cases.
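
As a rough illustration of the orchestration item above, the sketch below defines a small daily pipeline with Apache Airflow 2.x. The DAG id, task names, and callables are assumptions made for the example, not anything specified in the posting.

    # Hypothetical Airflow 2.x DAG: run the load, then refresh reporting, once a day.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_and_load():
        # Placeholder: pull the daily extract and load it into the warehouse.
        pass

    def refresh_report():
        # Placeholder: rebuild the reporting tables consumed by dashboards.
        pass

    with DAG(
        dag_id="daily_finance_reporting",  # placeholder name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                 # 'schedule_interval' on older 2.x versions
        catchup=False,
    ) as dag:
        load = PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
        report = PythonOperator(task_id="refresh_report", python_callable=refresh_report)
        load >> report                     # load must finish before the report refresh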

Responsibilities

  • Design, develop, and maintain robust data pipelines, ETL processes, and large-scale data warehouse solutions to support enterprise-level reporting and analytics initiatives.
  • Drive automation of data workflows and reporting processes to improve efficiency and reduce manual intervention, leveraging modern tools and AI/GenAI solutions where possible.
  • Enable and optimize cloud-based data platforms for scalable data storage, processing, and reporting.
  • Collaborate with business stakeholders, data scientists, and engineering teams to understand requirements and deliver high-quality data solutions.
  • Optimize and troubleshoot SQL queries, data models, and ETL scripts for performance and reliability.
  • Ensure data integrity, quality, and security across all stages of the data lifecycle (a simple quality-check sketch follows this list).
  • Document data flows, processes, and technical solutions for ongoing support and knowledge sharing.
  • Contribute to a culture of innovation, continuous learning, and agile delivery within the team.
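
To make the data-integrity responsibility above concrete, here is a simple, hypothetical quality-check sketch in Python; the table and column names are placeholders, and the cursor can be any DB-API-style cursor such as the Snowflake connector's.

    # Hypothetical data-quality gate: fail the run if the daily load produced no
    # rows or left NULL business keys. Connection setup is omitted for brevity.
    def run_quality_checks(cursor, table="PNL_BY_DESK"):
        # Check 1: the table should not be empty after the daily load.
        cursor.execute(f"SELECT COUNT(*) FROM {table}")
        row_count = cursor.fetchone()[0]
        if row_count == 0:
            raise ValueError(f"{table} is empty after the daily load")

        # Check 2: the business key should never be NULL.
        cursor.execute(f"SELECT COUNT(*) FROM {table} WHERE desk_id IS NULL")
        null_keys = cursor.fetchone()[0]
        if null_keys > 0:
            raise ValueError(f"{table} has {null_keys} rows with a NULL desk_id")

        return {"row_count": row_count, "null_keys": null_keys}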