Senior Data Implementation Engineer

Acadian Asset Management
Boston, MA (Hybrid)

About The Position

Acadian Asset Management is seeking a Senior Data Implementation Engineer to lead and accelerate its data transformation efforts. The role is central to reimagining and rebuilding the company's data platform to support scaling investment research, production data workflows, and operational capabilities. The engineer will be hands-on in designing and implementing modern data pipelines and platform capabilities using tools such as Apache Arrow and Polars, and will help shape the use of AI-assisted development tools to improve data pipeline development. This is a senior engineering role that blends strategic thinking with execution: understanding existing systems, designing transition paths, mentoring others, and building reliable, scalable, high-performance data infrastructure for the quantitative investment process.

Requirements

  • Bachelor’s degree in a relevant field.
  • A strong background in data engineering, data platform implementation, or data-intensive software engineering, ideally with 5+ years of experience in a financial, investment, or technology-driven environment.
  • Deep experience building production-grade data pipelines, including ingestion, transformation, validation, orchestration, monitoring, and deployment.
  • Strong Python programming skills, with relevant experience working with Dagster, Airflow, or similar technologies.
  • Experience with modern data architecture patterns such as data lakehouse, Medallion architecture, composable data systems, microservices, and event-driven or workflow-oriented architectures.
  • Familiarity with Data Mesh concepts, including decentralized data ownership, domain-oriented data products, governance, and discoverability.
  • Experience with orchestration, workflow, and data transformation tools such as dbt, Polars, Pandas, Apache Arrow, or comparable technologies.
  • Practical experience using AI-assisted development tools such as Claude, ChatGPT, GitHub Copilot, or similar platforms to support data pipeline development, automation, testing, documentation, and engineering productivity.
  • Demonstrated ability to mentor engineers, lead technical workstreams, and influence engineering practices across a team.
  • A strong sense of ownership and the ability to operate effectively in ambiguous environments, balancing tactical delivery with long-term architectural thinking.
  • A passion for building systems that are efficient, scalable, reliable, observable, and easy for others to use.
  • An understanding of how data infrastructure supports the broader investment process, and a desire to deliver solutions that improve outcomes for clients and create meaningful business impact.

Responsibilities

  • Lead Data Pipeline Design and Implementation: Design, build, and optimize robust data pipelines that are reliable, scalable, observable, and performant.
  • Use modern data engineering tools and patterns to support investment research, production workflows, and downstream analytics.
  • Drive System Modernization: Deeply understand existing data systems, identify architectural limitations, and lead the transition toward more modern, maintainable, and efficient data architectures.
  • Shape Technical Direction: Partner with Data Platform Engineering, Data Analysis, and investment teams to define implementation patterns, pipeline standards, and architectural approaches that can scale across the organization.
  • Apply AI Tooling to Engineering Workflows: Use AI-assisted development tools such as Claude, ChatGPT, or similar technologies to accelerate pipeline development, automate repetitive workflows, improve documentation, generate tests, and enhance developer productivity.
  • Mentor and Lead Others: Provide technical guidance to junior and mid-level engineers. Help raise the engineering bar through code reviews, design discussions, pairing, documentation, and thoughtful mentorship. Lead small project teams or workstreams when needed.
  • Improve Reliability and Operational Excellence: Develop systems with strong testing, monitoring, lineage, data quality checks, and operational supportability. Help ensure that data pipelines are resilient, transparent, and easy to maintain.
  • Collaborate Across Teams: Work closely with stakeholders across technology, data, research, and investment teams to understand requirements, resolve ambiguity, and deliver solutions that create measurable business value.
  • Continuously Improve the Platform: Identify opportunities to simplify, automate, and improve the data ecosystem. Champion practical innovation while balancing speed, stability, maintainability, and long-term architectural quality.

Benefits

  • Flexible hybrid work environment
  • Strong benefits, including health, retirement, and wellness offerings