Research Associate, Quantitative Developer

Bridgewater Associates · New York, NY
$150,000 - $200,000 · Onsite

About The Position

We are seeking a Data Engineer to join a PM-led pod focused on systematic macro and long/short equity strategies in APAC. This is a hands-on, embedded role working directly with the portfolio manager, quantitative researchers, and quantitative developers in a fast-paced investment environment. Unlike centralized data platform roles, this position is deeply integrated with the investment process. You will own the end-to-end data lifecycle that powers alpha research, portfolio construction, and live trading – from sourcing and ingestion to profiling, validation, transformation, and delivery into research and production systems – and you will have direct impact on the data and systems that drive investment decisions.

Requirements

  • 1–5 years of experience as a Data Engineer or in a closely related role, either: embedded with systematic investment teams (hedge fund, asset manager, bank), or in a high-scale, data-intensive technology environment (e.g., consumer, payments, or platform companies).
  • Strong programming skills in Python and SQL; experience building production-quality, maintainable data pipelines.
  • Experience working with modern data platforms (e.g., Snowflake or similar cloud data warehouses).
  • Familiarity with distributed processing and workflow orchestration (e.g., Spark, Airflow, or equivalents).
  • Proven ability to reason about data correctness, lineage, versioning, and reproducibility in environments where data errors have material downstream impact.
  • Comfort using lightweight statistical analysis and data science techniques to assess data quality, coverage, and suitability for research use.
  • Demonstrated experience working with high-dimensional, messy, and evolving datasets, including financial market and reference data (e.g., prices, fundamentals, macro, corporate actions), or large-scale behavioral, transactional, or event-driven data with complex schemas and quality challenges.
  • Experience working with datasets spanning multiple geographies, where data sources, standards, and availability vary materially by region.
  • Experience navigating APAC data realities, including jurisdiction-specific macro definitions, country-specific corporate structures, and uneven disclosure and historical coverage.
  • Familiarity with differences between global and local data sources, including gaps between English-language and local-language sources.
  • In addition, to succeed within our unique culture and work environment, individuals must demonstrate humility, innate curiosity, and openness to new ideas and approaches. Candidates must be driven, confident, and goal-oriented. All Bridgewater employees are expected to be honest, exceptionally direct, and eager to provide and receive objective feedback. Our employees constantly strive for self-improvement through feedback and self-reflection and are committed to the pursuit of excellence.

Nice To Haves

  • Familiarity with APAC datasets and providers.

Responsibilities

  • Identify and assess new datasets to determine their merit in our investment process.
  • Build and maintain a catalog of datasets/vendors by attending data conferences, reading whitepapers, and taking introductory calls.
  • Design, build, and maintain robust, scalable data pipelines supporting systematic macro and long/short equity strategies.
  • Partner closely with researchers and quantitative developers to ensure data is research-ready, well-documented, and reproducible across simulation and live environments.
  • Profile and interrogate datasets to understand distributions, coverage gaps, stability over time, and structural breaks.
  • Implement data quality checks, anomaly detection, and monitoring to ensure accuracy, timeliness, and completeness of production datasets.
  • Design and maintain our data ontology and schemas.
  • Work across a variety of datasets, ranging from traditional datasets (fundamentals, market data, security master, etc.) to large alternative datasets.
  • Work with shared data engineering and platform teams to evolve the broader data ecosystem while maintaining pod-level ownership and agility.
  • Contribute to improvements in tooling, standards, and best practices that increase research velocity and system reliability.
  • Work with the data vendors on data quality and new features that will benefit our investment process.