Professional Data Engineer

Freddie Mac · McLean, VA
$109,000 - $163,000

About The Position

At Freddie Mac, our mission of Making Home Possible is what motivates us, and it’s at the core of everything we do. Since our charter in 1970, we have made home possible for more than 90 million families across the country. Join an organization where your work contributes to a greater purpose.

Position Overview: We are seeking a highly skilled Professional Data Engineer to join our team and enhance our internal data platform. This role requires expertise in modern cloud-based data infrastructure to support data-driven decision-making and modeling across the organization. The ideal candidate will have a strong background in data engineering, software engineering, and AWS.

Our Impact: We manage a critical internal data platform supporting key business operations, including prepayment model development, trading analytics, and securitization. We collaborate with teams across the organization to understand their data requirements and design systems that align with their business objectives, ensuring those systems are robust, scalable, fault-tolerant, and cost-effective.

Requirements

  • At least two years of experience developing production software
  • Strong Python skills with at least two years of experience writing production code
  • At least one year of experience in data engineering with either Apache Spark or Snowflake
  • Bachelor’s degree in computer science or equivalent experience
  • Experience writing automated unit, integration, regression, performance and acceptance tests
  • Solid understanding of software design principles
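To illustrate the kind of automated unit testing the requirements call for, here is a minimal sketch in Python. The `normalize_rate` function and its behavior are hypothetical, not part of this posting; the point is the pytest-style test shape, where each test function asserts one behavior of a small transform.

```python
# Hypothetical transform from a data-ingestion library (illustrative name and behavior).
def normalize_rate(raw):
    """Parse a percentage string such as '3.25%' into a fraction; empty/missing -> None."""
    if raw is None or not raw.strip():
        return None
    return float(raw.strip().rstrip("%")) / 100.0

# Unit tests in pytest style: each function asserts one behavior of the transform.
def test_parses_percent_string():
    assert abs(normalize_rate("3.25%") - 0.0325) < 1e-12

def test_handles_missing_value():
    assert normalize_rate("   ") is None
    assert normalize_rate(None) is None
```

Keeping transforms as small pure functions like this is what makes the unit, regression, and acceptance testing mentioned above practical.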

Responsibilities

  • Design, build, maintain, and support ETL/ELT data pipelines using AWS services (e.g., AWS EMR) and Snowflake
  • Maintain data ingestion libraries written in Java and Python
  • Design and develop new code, review existing code changes, and implement automated tests
  • Actively seek opportunities to improve the technical quality and architecture in ways that increase the product’s business value
  • Improve the product’s test automation and deployment practices to enable the team to deliver features more efficiently
  • Operate the data pipelines in production, including release management and production support
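The extract/transform/load shape of the pipelines described above can be sketched as follows. This is an illustrative stand-in, not Freddie Mac code: in the actual role, extraction would read from sources such as S3, transformation would run on Spark (AWS EMR), and loading would target Snowflake; here the standard library's `csv` and `sqlite3` stand in so the pipeline shape is runnable, and the `loan_id`/`upb` field names are assumed for the example.

```python
import csv
import io
import sqlite3

def extract(raw_csv):
    """Extract: parse raw CSV text into dict rows (stand-in for reading from S3)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: cast fields and drop records with a missing loan id
    (stand-in for a Spark job on EMR)."""
    out = []
    for row in rows:
        if not row["loan_id"]:
            continue  # drop records that fail validation
        out.append((row["loan_id"], float(row["upb"])))
    return out

def load(records, conn):
    """Load: idempotent upsert into the target table (stand-in for Snowflake)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS loans (loan_id TEXT PRIMARY KEY, upb REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO loans VALUES (?, ?)", records)
    conn.commit()

raw = "loan_id,upb\nA1,250000\n,99\nA2,310500.50\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)  # the row with no loan_id is dropped
```

Making the load step idempotent (upsert rather than blind insert) is one of the design choices that keeps such pipelines safe to re-run during the release management and production support work listed above.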