Mission Lane · Posted 3 months ago
$110,000 - $130,000/Yr
Full-time • Mid Level
501-1,000 employees

Mission Lane is combining the power of data, technology, and exceptional service to pave a clear way forward for millions of people on the path to financial success. By attracting top talent and leveraging cutting-edge technology, we’re enabling people to unlock real financial progress. Sound like a mission you can get behind?

The Data Engineering team at Mission Lane builds and maintains the analytical data infrastructure and the ETL/Reverse ETL pipelines that move data in and out of our data warehouse. We are a team of 13 data engineers working in a highly collaborative environment, continually striving to elevate our engineering practices and committed to building high-quality, reliable, and scalable data solutions.

Responsibilities:

  • Design, develop, and maintain high-performance data pipelines using Python, SQL, dbt, and Snowflake on GCP (see the sketch after this list).
  • Help advance code quality, test coverage, and maintainability across our data pipelines.
  • Champion and implement software engineering best practices, including code reviews, testing methodology, CI/CD, and documentation.
  • Support the adoption of data quality tools and practices (e.g., data lineage, automated alerting).
  • Research, evaluate, and recommend new technologies and tools to improve our data platform.
  • Contribute to the data architecture and design of our data warehouse.
  • Collaborate effectively with software engineering teams to define data structures, streamline ingestion processes, and ensure data consistency.
  • Work closely with stakeholders (data scientists, analysts, business users) to understand their data needs and translate them into technical requirements.
  • Partner with stakeholders to ensure that projects are well defined.
  • Troubleshoot and resolve complex data pipeline issues, ensuring data quality and reliability.
  • Contribute to the development and maintenance of our CI/CD pipelines for data infrastructure.
  • Participate in on-call rotation to support critical data pipelines.
  • Identify and address inefficiencies in our data engineering processes.
Requirements:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 2+ years of experience in data engineering.
  • Strong analytical SQL skills.
  • Strong Python skills.
  • Understanding of software engineering principles and best practices (e.g., version control, testing, CI/CD).
  • Experience with data warehousing technologies, preferably Snowflake.
  • Experience with cloud platforms, preferably GCP (Google Cloud Platform), including services like Cloud Functions and GCS.
  • Experience designing and implementing reliable and resilient ETL/ELT pipelines.
  • Excellent communication, collaboration, and problem-solving skills.
Nice to have:

  • Experience with our specific stack: Snowflake, dbt, Monte Carlo, Airflow (a minimal orchestration sketch follows this list).
  • Experience with data governance and data quality frameworks.
  • Knowledge of data modeling principles.
  • Experience with infrastructure-as-code tools (e.g., Terraform, Kubernetes Config Connector).
Compensation & benefits:

  • Base Salary: $110,000 - $130,000 USD.
  • Participation in our annual incentive program and equity.
  • Unlimited paid time off.
  • 401(k) match.
  • Monthly wellness stipend.
  • Health, dental, and vision insurance options.
  • Disability coverage.
  • Paid parental leave.
  • Flexible spending account (for childcare and healthcare).
  • Life insurance.
  • Remote-friendly work environment.