M&A Data Integration Engineer

Waverly Advisors, LLC
Birmingham, AL

About The Position

Waverly Advisors’ primary goal is to serve our clients, one another, and our communities. We aren’t your typical wealth management firm. Our intense client focus is at the center of everything we do. We go far beyond just managing our clients’ investments, offering truly in-depth financial planning. We set ourselves apart by actually living and acting on our guiding principle, ‘Serve.’ It is the reason we go to work every day.

We are a growing organization expanding through mergers and acquisitions, and we are looking for an M&A Data Integration Engineer who is excited to build, scale, and own modern data solutions. This role is ideal for an early-to-mid-career engineer who wants hands-on development experience, exposure to complex business problems, and the opportunity to grow into a senior data engineering or platform role. You will design and build Python- and Spark-based data pipelines in Azure Databricks, playing a key role in how acquired data is migrated, validated, and operationalized across the business. As the company scales, this role will offer increasing ownership, technical influence, and room for career growth.

Requirements

  • 1–4 years of Python development experience, preferably in a data engineering or backend role.
  • Hands-on experience with Azure Databricks and PySpark.
  • Strong SQL skills and experience working with relational or analytical databases.
  • Experience building data pipelines, migrations, or system integrations.
  • Curiosity, coachability, and a strong desire to grow technical depth and ownership.

Nice To Haves

  • Experience supporting M&A integrations or working with financial / transactional data.
  • Familiarity with Azure data services (ADLS, Azure SQL, Synapse).
  • Experience with Data Lake and modern data architecture concepts.
  • Exposure to CI/CD, automation, or data platform tooling.
  • Familiarity with portfolio management systems (Orion, Tamarac, Black Diamond).

Responsibilities

  • Build and maintain Python and PySpark data pipelines in Azure Databricks to support M&A data migrations and integrations.
  • Develop automated ETL/ELT workflows for financial and operational systems.
  • Implement data validation, reconciliation, and data quality checks to ensure accuracy and audit readiness.
  • Support onboarding of acquired entities through system mapping, transformation logic, and migration execution.
  • Work with APIs, cloud storage, databases, and flat files to integrate diverse data sources.
  • Collaborate with IT, Finance, Operations, and Compliance to translate business needs into scalable data solutions.
  • Document data pipelines, Databricks notebooks, and integration standards as the platform matures.
  • Track work and dependencies using project management tools (e.g., Monday.com).
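To give a flavor of the validation and reconciliation work described above, here is a minimal sketch in plain Python. The record layout and field names (`account_id`, `balance`) are hypothetical, and a production version would run as a PySpark job in Azure Databricks rather than over in-memory lists:

```python
# Sketch of a post-migration reconciliation check: compare row counts and
# per-key amounts between a source extract and the migrated target data.
# Field names are illustrative; real pipelines would use PySpark DataFrames.

from decimal import Decimal

def reconcile(source_rows, target_rows, key="account_id", amount="balance"):
    """Return a list of human-readable discrepancies (empty if data reconciles)."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    # Use Decimal to avoid float rounding surprises with financial amounts.
    src = {r[key]: Decimal(str(r[amount])) for r in source_rows}
    tgt = {r[key]: Decimal(str(r[amount])) for r in target_rows}
    for k in sorted(src.keys() - tgt.keys()):
        issues.append(f"missing in target: {k}")
    for k in sorted(src.keys() & tgt.keys()):
        if src[k] != tgt[k]:
            issues.append(f"amount mismatch for {k}: {src[k]} != {tgt[k]}")
    return issues

source = [{"account_id": "A1", "balance": "1000.00"},
          {"account_id": "A2", "balance": "250.50"}]
target = [{"account_id": "A1", "balance": "1000.00"},
          {"account_id": "A2", "balance": "250.50"}]
print(reconcile(source, target))  # an empty list means the migration reconciles
```

An audit-ready pipeline would persist these discrepancy records alongside the migration run rather than just printing them, so Compliance can review them later.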

Benefits

  • Comprehensive Health, Dental, and Vision coverage to support your overall well‑being.
  • 401(k) retirement plan with match and profit sharing to help you invest in your future.
  • Twelve paid holidays each year.
  • An extra vacation day during your birthday week—so you can celebrate you!
  • Responsible Time Off Policy giving flexibility without annual PTO limits, while balancing team responsibilities and business needs.
  • Paid sabbatical program: Enjoy four consecutive weeks of paid time off after seven years of service.
  • Compensation commensurate with experience.