Data Engineer

Anteriad
Remote

About The Position

Anteriad is a B2B solution provider focused on innovating how B2B marketers make data-driven business decisions. The company believes data is crucial to solving marketing challenges ranging from customer acquisition to demand generation and account-based marketing. Data is central to its operations, and its team builds powerful solutions that drive results for clients through innovative technology and deep analysis.

As a Data Engineer on the Database Solutions Team, you will support custom data platforms by building, maintaining, and continuously evolving data ingestion and transformation pipelines. The role primarily uses Azure Data Factory, Databricks, and emerging Microsoft Fabric capabilities while preserving and reusing existing SQL logic. A key aspect of the role involves modernizing and migrating SQL Server–based workloads from Azure virtual machines to Azure-native data services, so hands-on experience with Azure data tools and SQL Server is highly valued. Each custom data platform is actively maintained and enhanced to support new and evolving requests from campaign services and analytics teams, and the successful candidate must be comfortable managing ongoing, parallel initiatives, balancing modernization efforts with continuous delivery of incremental enhancements.

Requirements

  • Hands‑on experience building or supporting pipelines in Azure Data Factory.
  • Experience developing data transformations in Databricks, preferably using SQL within notebooks.
  • Experience working with Azure Data Lake Storage or similar cloud storage platforms.
  • Familiarity with modern data lake and analytics concepts, including medallion architecture.
  • Experience with SQL Server, including writing, maintaining, and understanding complex SQL scripts and stored procedures.
  • Experience using Git / Azure DevOps for source control.
  • Experience with Python for use in Databricks notebooks and supporting data pipeline operations.
  • Ability to follow established architectural patterns and implementation standards.
  • Strong analytical, troubleshooting, and communication skills.

Nice To Haves

  • Exposure to Microsoft Fabric, including Lakehouse, Warehousing, or OneLake concepts.
  • Experience integrating or evaluating Fabric alongside existing Azure and Databricks ecosystems.
  • Experience migrating SQL Server workloads from Azure VMs to Azure‑native or Fabric‑based platforms.
  • Prior exposure to SSIS and translating ETL logic into cloud‑based pipelines.
  • Familiarity with data governance and data hygiene best practices, supporting clean, consistent, and trustworthy data for downstream use.

Responsibilities

  • Build and maintain data ingestion pipelines using Azure Data Factory (ADF) to move data into the corporate Azure Data Lake and related platforms.
  • Develop and enhance data transformation logic in Databricks notebooks, primarily leveraging SQL‑based transformations derived from existing SQL scripts and stored procedures.
  • Support a medallion data architecture (Bronze, Silver, Gold) with Silver and Gold datasets stored primarily in Parquet format.
  • Assist in migrating SQL Server–based processes (scripts, stored procedures, SSIS logic) running on Azure VMs into Azure‑native pipelines and Databricks transformations.
  • Continuously update and enhance existing database solutions and systems based on new and evolving requirements from marketing campaign activation and analytics teams, ensuring data remains accurate, timely, and fit for purpose.
  • Manage multiple concurrent initiatives, including modernization efforts and ongoing enhancement requests, while maintaining platform stability and data quality.
  • Configure Azure Data Factory pipelines to orchestrate Databricks notebooks and Fabric‑related processes as applicable.
  • Use Azure DevOps for source control, versioning, and deployment of Azure Data Factory, Databricks, and Fabric assets.
  • Test, monitor, and troubleshoot pipelines to ensure data reliability, performance, and quality.
  • Collaborate with senior engineers, project managers, campaign services, analytics, and platform teams to deliver ongoing data solutions.
  • Participate in the adoption and use of Microsoft Fabric, understanding how Fabric components (e.g., OneLake, Lakehouse, Warehousing) integrate with existing Azure and Databricks workflows.
  • Maintain technical documentation for data pipelines, transformations, and datasets.

Benefits

  • Flexible PTO & Company Holidays
  • Flexible Schedule
  • Continuous Professional Training and Development
  • Mix Of Collaborative & Independent Work
  • Community Outreach Opportunities via Anteriad Cares
  • Professional Global Mentoring Program - Career Guidance From Leadership
  • Comprehensive medical (choice of 3 plans), dental, and vision coverage
  • Company-paid short-term disability, long-term disability, and life insurance
  • Optional supplemental life, accident, and critical illness insurance plans
  • 401K with company match
  • Fully paid primary caregiver leave (12 weeks) & parental bonding leave (2 weeks)

What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: None specified
  • Number of Employees: 101-250
