Point72 · Posted 9 days ago
Full-time • Mid Level
Stamford, CT
1,001-5,000 employees

As Point72 reimagines the future of investing, our Technology group is constantly improving our company’s IT infrastructure, positioning us at the forefront of a rapidly evolving technology landscape. We’re a team of experts experimenting, discovering new ways to harness the power of open-source solutions, and embracing enterprise agile methodology. We encourage professional development to ensure you bring innovative ideas to our products while satisfying your own intellectual curiosity.

The Data Engineering Technology team provides Point72 with a scalable data engineering capability and a set of data services to support the firm’s expanding data needs. We focus on solutions such as cloud computing, data platforms for large-scale data processing, data governance, data quality, enterprise reference data, business automation, and high-touch service. We have team members across four continents who collaborate to support the firm’s global footprint of investment strategies and teams.

What you’ll do:

  • Design, develop, and maintain robust data pipelines and ETL workflows in Databricks to support quantitative research, trading, and risk management.
  • Collaborate closely with data scientists, analysts, and portfolio managers to understand data needs and deliver scalable data infrastructure.
  • Ingest, process, and normalize large volumes of structured and unstructured financial data from a variety of sources.
  • Optimize performance of data pipelines and ensure high availability, reliability, and data quality across all production systems.
  • Implement data governance best practices, including data lineage, cataloging, auditing, and access controls.
  • Support the integration of third-party data vendors and APIs into the broader data ecosystem.
  • Continuously evaluate and implement new tools and technologies to improve data engineering capabilities, with a focus on cloud-native and distributed processing frameworks.
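To make the ingest-and-normalize responsibilities above concrete, here is a minimal sketch in plain Python of the kind of transformation such a pipeline performs; in practice this logic would run as a PySpark job over Delta tables, and all field names and records here are hypothetical:

```python
from datetime import date

# Hypothetical raw vendor records: inconsistent ticker casing, prices as strings.
raw_records = [
    {"ticker": "aapl", "trade_date": "2024-03-01", "price": "189.95"},
    {"ticker": "MSFT", "trade_date": "2024-03-01", "price": "415.10"},
    {"ticker": "GOOG", "trade_date": "2024-03-01", "price": None},  # failing row
]

def normalize(record):
    """Normalize one raw record; return None if it fails a basic quality check."""
    if record["price"] is None:
        return None  # data-quality gate: drop rows with missing prices
    return {
        "ticker": record["ticker"].upper(),           # canonical ticker casing
        "trade_date": date.fromisoformat(record["trade_date"]),
        "price": float(record["price"]),              # string -> numeric
    }

# Keep only records that pass normalization.
clean = [r for r in (normalize(rec) for rec in raw_records) if r is not None]
print(len(clean))  # 2 rows survive the quality gate
```

At Databricks scale the same shape appears as a DataFrame transformation with schema enforcement and quality expectations, but the normalize-then-filter pattern is the same.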
What’s required:

  • 3–6 years of professional experience in data engineering or a similar role, ideally within a financial services or high-performance computing environment.
  • Expertise in Databricks, including Spark (PySpark or Scala), Delta Lake, and notebook-based development workflows.
  • Proficiency in building scalable, distributed data pipelines in a cloud environment (preferably Azure or AWS).
  • Strong programming skills in Python and SQL.
  • Solid understanding of data architecture principles, data modeling, and data warehousing.
  • Experience with version control (e.g., Git), CI/CD workflows, and modern data orchestration tools (e.g., Airflow, dbt).
  • Demonstrated ability to work collaboratively in a fast-paced, high-stakes environment with both technical and non-technical stakeholders.
  • Bachelor’s or master’s degree in computer science, engineering, or a related technical field.
  • Commitment to the highest ethical standards.
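The Python and SQL proficiency listed above often shows up together in warehousing-style aggregation work. A small self-contained sketch using the standard library’s sqlite3 module (the table and data are hypothetical, purely for illustration):

```python
import sqlite3

# Hypothetical positions table: net quantity held per book and symbol.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE positions (book TEXT, symbol TEXT, qty INTEGER)")
conn.executemany(
    "INSERT INTO positions VALUES (?, ?, ?)",
    [("alpha", "AAPL", 100), ("alpha", "MSFT", -50), ("beta", "AAPL", 25)],
)

# Aggregate the firm-wide net position per symbol across books.
rows = conn.execute(
    "SELECT symbol, SUM(qty) FROM positions GROUP BY symbol ORDER BY symbol"
).fetchall()
print(rows)  # [('AAPL', 125), ('MSFT', -50)]
conn.close()
```

In a production environment the same query pattern would target Spark SQL or a warehouse engine rather than SQLite, but the Python-driving-SQL workflow is identical.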
What we offer:

  • Fully paid health care benefits
  • Generous parental and family leave policies
  • Mental and physical wellness programs
  • Volunteer opportunities
  • Non-profit matching gift program
  • Support for employee-led affinity groups representing women, minorities, and the LGBTQ+ community
  • Tuition assistance
  • A 401(k) savings program with an employer match and more