
SoFi - Seattle, WA

posted 14 days ago

Full-time - Senior
Remote - Seattle, WA
Credit Intermediation and Related Activities

About the position

SoFi is driven by data! In this role, you will contribute to the long-term success of SoFi's data vision by developing distributed systems and scalable data platforms. The Data Platform Group supports data use cases across all of SoFi's diverse business units by providing a highly scalable, democratized data platform that empowers teams to ingest, model, and consume data confidently. Join the Data Platform Group as it refines its vision and establishes industry-leading standards for data lifecycle management, introducing best-in-class architectural components and processes to extract value from disparate data sources. This team is vital to the company's success, and your contributions will have a highly visible and lasting impact.

As a Principal Engineer, you'll contribute to the team's technical direction by designing innovative solutions to complex business challenges. You'll collaborate with engineering teams to maximize value for platform consumers, coordinate with squads to align with the Data Platform team's strategy, and provide guidance on testing and deployment strategies. Leveraging your expertise, you'll explore GenAI to enhance data analysis and decision-making. You'll mentor team members and represent the team during recruitment and hiring. The ideal candidate has experience in distributed systems and scalable data platforms.

Responsibilities

  • Collaborate with cross-functional teams to understand complex business requirements and translate them into scalable, high-impact technical solutions, directly influencing SoFi's data-driven decision-making processes.
  • Lead architectural design sessions for the Data Platform and its integrations (APIs, services), ensuring solutions are not only technically sound but also aligned with broader business goals and significantly contribute to SoFi's overall data strategy.
  • Drive the development of advanced features within the Experimentation platform, ensuring modular, efficient, and scalable code structures optimized for the platform's stack (Snowflake, Airflow, dbt, and AWS data services).
  • Spearhead rigorous code review processes, emphasizing best practices, efficiency, and optimal use of the underlying software components' unique capabilities and services.
  • Foster and facilitate internal technical sessions, exploring nuances of AWS data services like DMS, MSK (Kafka), and S3, and sharing best practices for integration with the broader data stack.
  • Provide technical leadership in evaluating and adopting emerging technologies within the modern data stack, ensuring SoFi remains at the forefront of data engineering innovation.
  • Drive operational excellence across the squads and act as a liaison between Data and other organizations. Implement and track operational metrics to measure progress and identify opportunities for further optimization.
  • Engineer sophisticated data pipelines using dbt, Airflow, and Snowflake, with special emphasis on performance optimization and data integrity using Great Expectations (an illustrative sketch follows this list).
  • Leverage Python and SQL scripting proficiencies for intricate data operations, custom ETL/ELT processes, and sophisticated data transformations across the platform.
  • Collaborate with data scientists and ML engineers to explore and implement GenAI solutions for data analysis, feature engineering, and predictive modeling. Contribute to the development of responsible GenAI practices within the organization, ensuring ethical and unbiased use of AI technologies.
  • Mentor technical team members in best practices for Snowflake, Airflow, dbt, and AWS services, promoting a culture of technical distinction and innovation.
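
For illustration only, here is a minimal sketch of what a pipeline like the one described above might look like: an Airflow DAG that builds dbt models and then runs a data-quality validation step. The DAG name, the dbt project path, and the run_quality_checks placeholder are hypothetical assumptions, not details of SoFi's actual platform.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator

    def run_quality_checks():
        # Hypothetical placeholder: in practice this might trigger a
        # Great Expectations checkpoint against the freshly built models.
        ...

    # Daily pipeline: build dbt models in Snowflake, then validate them.
    with DAG(
        dag_id="daily_dbt_pipeline",        # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt/analytics",  # hypothetical path
        )
        validate = PythonOperator(
            task_id="validate_models",
            python_callable=run_quality_checks,
        )
        dbt_run >> validate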

Requirements

  • A Bachelor's degree in Computer Science, Information Security, or a related field is required; an advanced degree (Master's or higher) is preferred.
  • A minimum of 10 years in a pivotal Software/Data Engineering role, with extensive experience in modern data stacks, particularly Snowflake, Airflow, dbt, Kafka, Docker/k8s, and AWS data services.
  • Strong understanding of data ingestion, orchestration, transformation, and reverse ETL best practices and design principles.
  • Proven skills in distributed systems architecture and building scalable solutions.
  • Mastery in Python, Java, and SQL for complex operations within Snowflake and AWS services like DMS, MSK (Kafka), and S3 (a brief example follows this list).
  • Solid experience with Terraform or CloudFormation as infrastructure-as-code (IaC) solutions.
  • Strong leadership and communication skills.
  • Experience working in a collaborative coding environment, refining designs together, working through code reviews, and managing pull requests.
  • Demonstrated problem-solving capabilities, especially within the modern data stack and experimentation platforms.
  • Exceptional technical communication skills, adept at liaising with both technical peers and diverse stakeholders within a data-driven organization.
  • Demonstrated ability to lead a team of developers, providing technical guidance, mentorship, and support.
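
As a hedged illustration of the Python-and-SQL proficiency the requirement above refers to, here is a minimal sketch using the snowflake-connector-python library. The connection parameters, warehouse, and database/table names are hypothetical placeholders, not SoFi's actual objects.

    import os

    import snowflake.connector  # pip install snowflake-connector-python

    # Hypothetical connection: credentials are read from the environment.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",   # hypothetical warehouse
        database="RAW",             # hypothetical database
    )

    try:
        cur = conn.cursor()
        # Example transformation: deduplicate raw events into a curated table,
        # keeping only the most recently ingested row per event_id.
        cur.execute("""
            CREATE OR REPLACE TABLE CURATED.EVENTS AS
            SELECT * FROM RAW.EVENTS
            QUALIFY ROW_NUMBER() OVER (
                PARTITION BY event_id ORDER BY ingested_at DESC
            ) = 1
        """)
    finally:
        conn.close()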

Nice-to-haves

  • Data exploration and analysis experience using SQL/Python/R/Tableau.
  • Experience with prompt engineering and fine-tuning LLMs.
  • Contributions to open-source projects.
  • Experience with data governance and security.
  • Familiarity with machine learning concepts.
  • Experience with finance / fintech or enthusiastic to learn and grow in this space.

Benefits

  • Base pay range: $192,000.00 - $330,000.00
  • Eligible for a bonus
  • Long term incentives
  • Comprehensive and competitive benefits