Sr Data Platform Engineer

Pennymac
Westlake Village, CA
$90,000 - $150,000

About The Position

The Sr Data Platform Engineer - Snowflake Specialist leads the design, optimization, and management of our enterprise data warehouse infrastructure, with a primary focus on advanced Snowflake implementations. This role supports critical business functions through sophisticated data platform development and data operations, ensuring scalable and performant data solutions across the organization.

Requirements

  • Degree in Computer Science, Data Engineering, Engineering, or a similar technical field
  • 5+ years of software development experience with 3+ years of advanced Snowflake experience
  • Expert-level Snowflake skills including performance optimization, cost management, and enterprise-scale implementations
  • Deep understanding of data warehousing concepts, ETL/ELT patterns, and modern data architecture principles
  • Extensive experience with databases at enterprise scale, including both SQL and NoSQL technologies
  • Demonstrated ability to master advanced data platform technologies such as real-time streaming, data orchestration, data governance, security, monitoring, or performance optimization
  • Proven track record of architecting and implementing business-critical data solutions that improve stability, security, performance, and scalability
  • Demonstrated ability to effectively communicate complex data architecture concepts to engineers, product owners, project managers, and business stakeholders
  • Demonstrated experience in multi-team collaboration and agile development practices, particularly in data-focused environments
  • Ability to collaborate across teams and design data systems that address architectural gaps and scalability challenges

Nice To Haves

  • Financial Services and mortgage industry experience, particularly with regulatory reporting and risk management data requirements

Responsibilities

  • Architect, optimize, and manage complex Snowflake environments including warehouses, data sharing, streams, tasks, and performance tuning
  • Design and implement advanced features including stored procedures, UDFs, dynamic SQL, zero-copy cloning, and time travel capabilities
  • Lead enterprise data modeling, dimensional modeling, and modern data warehouse design patterns in Snowflake
  • Develop data pipelines in Python 3, applying object-oriented programming and design patterns
  • Utilize Python data frameworks (e.g., Pandas, SQLAlchemy, Apache Airflow) and web API frameworks (FastAPI, Flask)
  • Integrate data using Amazon Web Services (AWS) and serverless technologies
  • Optimize advanced SQL queries and database performance
  • Implement REST APIs and data integration patterns
  • Collaborate using Git and development workflows
  • Apply comprehensive testing strategies including unit and integration testing
  • Work with event-driven architectures and microservices in data contexts
  • Utilize Infrastructure as Code (CloudFormation, CDK, Terraform)
  • Apply DataOps practices (CI/CD for data pipelines, automated testing)
  • Follow Agile and Scrum practices, using Jira for work tracking

Benefits

  • Comprehensive Medical, Dental, and Vision
  • Paid Time Off Programs including vacation, holidays, illness, and parental leave
  • Wellness Programs, Employee Recognition Programs, and onsite gyms and cafe style dining (select locations)
  • Retirement benefits, life insurance, 401k match, and tuition reimbursement
  • Philanthropy Programs including matching gifts, volunteer grants, charitable grants, and corporate sponsorships