Junior Data Engineer

Enact Mortgage Insurance | Raleigh, NC | Hybrid

About The Position

At Enact, we understand that there’s no place like home. That’s why we bring our deep expertise, insightful offerings, and extra‑mile service to work every day to help lenders put more people in homes—and keep them there.

We’re looking for a Junior Data Engineer in Raleigh, NC to join the Data & Analytics Engineering team as we modernize our data ecosystem with cloud‑first platforms, scalable data pipelines, and trusted data products. In this role, you will help build the next generation of our data lake, data warehouse, and machine learning data foundation. As we continue our modernization journey—including cloud‑native architecture, automated data pipelines, and evolving analytics platforms—you will play an active role in building and supporting ingestion, transformation, and data engineering workflows across AWS, Snowflake, and modern data engineering frameworks.

Location: Enact Headquarters, Raleigh, NC (Hybrid Schedule)

Requirements

  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or related field.
  • 3+ years of hands-on experience with data engineering.
  • 3+ years of experience building pipelines using AWS services (Lambda, S3, API Gateway, SNS/SQS, EventBridge, DMS, Glue, RDS, DynamoDB, Step Functions).
  • 2+ years of hands-on experience with CI/CD tooling such as AWS CodeBuild/CodePipeline, GitHub Actions, or GitLab CI/CD.
  • Experience developing data ingestion and transformation pipelines using Snowflake (stages, tasks, streams, procedures).
  • Strong programming skills in SQL, Python, or Spark, and familiarity with data ingestion, data preparation, and data integration patterns.
  • Ability to analyze complex data flows and integration points with high attention to detail.
  • Clear communication, a collaborative mindset, and a bias toward automation and documentation.

Nice To Haves

  • Experience integrating SaaS platforms (e.g., Salesforce, Workday, ServiceNow, enterprise LOS systems).
  • Experience with Amazon SageMaker Pipelines and MLOps tools such as MLflow.

Responsibilities

  • Build and maintain frameworks for data pipelines (ETL/ELT) and ML workflows.
  • Develop and maintain ML pipelines to orchestrate model execution in the cloud.
  • Develop and maintain a user interface for model execution.
  • Support and optimize cloud‑based data architectures leveraging AWS and Snowflake.
  • Design, build, and launch production‑grade data pipelines.
  • Contribute to processes needed to achieve operational excellence.
  • Ensure solutions meet enterprise standards for data quality and reliability.
  • Collaborate with architects, data scientists, analysts, and application teams.
  • Troubleshoot, monitor, and optimize workflows.
  • Document data flows and operational processes.

Benefits

  • Hybrid work schedule (in-office days Tues/Wed/Thurs)
  • Generous Time Off
  • 40 Hours of Volunteer Time Off
  • Tuition Reimbursement and Student Loan Repayment
  • Paid Family Leave and Flexible Spending Accounts
  • 401k with up to 5% employer match
  • Fitness and Emotional Wellness Reimbursements
  • Onsite Gym