Senior Data Implementation Engineer

Acadian Asset Management LLC
Boston, MA (Hybrid)

About The Position

Acadian Asset Management is seeking a Senior Data Implementation Engineer to lead and accelerate its data transformation efforts. The role is central to reimagining and rebuilding the data platform that supports scaling investment research, production data workflows, and operational capabilities. The engineer will be hands-on in designing and implementing modern data pipelines and platform capabilities with tools such as Apache Arrow and Polars, and will help shape the adoption of AI-assisted development tools to improve efficiency and maintainability. This is a senior engineering role that blends strategic thinking with execution: understanding existing systems, designing transition paths, mentoring others, and building reliable, scalable, high-performance data infrastructure for the quantitative investment process. Acadian offers a hybrid work environment with three days per week in the Boston office.

Requirements

  • Bachelor’s degree in a relevant field.
  • A strong background in data engineering, data platform implementation, or data-intensive software engineering, ideally with 5+ years of experience in a financial, investment, or technology-driven environment.
  • Deep experience building production-grade data pipelines, including ingestion, transformation, validation, orchestration, monitoring, and deployment.
  • Strong Python programming skills, with hands-on experience in Dagster, Airflow, or similar orchestration technologies.
  • Experience with modern data architecture patterns such as data lakehouse, Medallion architecture, composable data systems, microservices, and event-driven or workflow-oriented architectures.
  • Familiarity with Data Mesh concepts, including decentralized data ownership, domain-oriented data products, governance, and discoverability.
  • Experience with orchestration, workflow, and data transformation tools such as dbt, Polars, Pandas, Apache Arrow, or comparable technologies (a minimal pipeline sketch follows this list).
  • Practical experience using AI-assisted development tools such as Claude, ChatGPT, GitHub Copilot, or similar platforms to support data pipeline development, automation, testing, documentation, and engineering productivity.
  • Demonstrated ability to mentor engineers, lead technical workstreams, and influence engineering practices across a team.
  • A strong sense of ownership and the ability to operate effectively in ambiguous environments, balancing tactical delivery with long-term architectural thinking.
  • A passion for building systems that are efficient, scalable, reliable, observable, and easy for others to use.
  • An understanding of how data infrastructure supports the broader investment process, and a desire to deliver solutions that improve outcomes for clients and create meaningful business impact.
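
To make the stack above concrete, here is a minimal sketch of the kind of pipeline this role builds, using Dagster for orchestration and Polars for transformation. All asset names, column names, and file paths are illustrative assumptions, not Acadian's actual systems.

    import dagster as dg
    import polars as pl

    # Hypothetical path -- illustrative only, not an actual Acadian data layout.
    RAW_PATH = "data/raw/prices.parquet"

    @dg.asset
    def raw_prices() -> pl.DataFrame:
        # Ingestion: load the raw vendor file as a Polars frame.
        return pl.read_parquet(RAW_PATH)

    @dg.asset
    def clean_prices(raw_prices: pl.DataFrame) -> pl.DataFrame:
        # Transformation + validation: drop null prices, keep positive values.
        return (
            raw_prices
            .drop_nulls(subset=["price"])
            .filter(pl.col("price") > 0)
        )

    # Registering the assets lets Dagster handle orchestration, scheduling,
    # and monitoring of the dependency graph (raw_prices -> clean_prices).
    defs = dg.Definitions(assets=[raw_prices, clean_prices])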

Responsibilities

  • Design, build, and optimize robust data pipelines that are reliable, scalable, observable, and performant, using modern data engineering tools and patterns to support investment research, production workflows, and downstream analytics.
  • Deeply understand existing data systems, identify architectural limitations, and lead the transition toward more modern, maintainable, and efficient data architectures.
  • Partner with Data Platform Engineering, Data Analysis, and investment teams to define implementation patterns, pipeline standards, and architectural approaches that can scale across the organization.
  • Use AI-assisted development tools such as Claude, ChatGPT, or similar technologies to accelerate pipeline development, automate repetitive workflows, improve documentation, generate tests, and enhance developer productivity.
  • Provide technical guidance to junior and mid-level engineers, raising the engineering bar through code reviews, design discussions, pairing, documentation, and thoughtful mentorship. Lead small project teams or workstreams when needed.
  • Develop systems with strong testing, monitoring, lineage, data quality checks, and operational supportability to ensure data pipelines are resilient, transparent, and easy to maintain (see the quality-check sketch after this list).
  • Work closely with stakeholders across technology, data, research, and investment teams to understand requirements, resolve ambiguity, and deliver solutions that create measurable business value.
  • Identify opportunities to simplify, automate, and improve the data ecosystem, championing practical innovation while balancing speed, stability, maintainability, and long-term architectural quality.
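
The data quality checks mentioned above can start as a reusable check function that a pipeline step calls before publishing data. A minimal sketch in Polars, with hypothetical column names ("price", "ticker") rather than a real Acadian schema:

    import polars as pl

    def quality_failures(df: pl.DataFrame) -> list[str]:
        # Collect human-readable data-quality failures; empty list means pass.
        # Column names ("price", "ticker") are hypothetical examples.
        failures: list[str] = []
        if df.is_empty():
            failures.append("frame is empty")
        if df["price"].null_count() > 0:
            failures.append("null values in price column")
        if df.filter(pl.col("price") <= 0).height > 0:
            failures.append("non-positive prices")
        if df["ticker"].n_unique() != df.height:
            # Assumes one row per ticker in this hypothetical snapshot table.
            failures.append("duplicate ticker rows")
        return failures

    # A pipeline step can fail fast and surface the reasons to monitoring:
    # problems = quality_failures(clean_prices)
    # if problems:
    #     raise ValueError(f"data quality checks failed: {problems}")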

Benefits

  • Health, retirement, and wellness offerings as part of a comprehensive benefits program