Data Engineer

LendKey Technologies, Blue Ash, OH
$110,000 - $140,000

About The Position

LendKey Technologies, Inc. is a pioneer in digital network lending, dedicated to simplifying the lending process for both financial institutions and borrowers. With more than 15 years of experience, LendKey has facilitated more than $7 billion in loans through hundreds of credit unions and community banks nationwide. Our mission is to empower local lenders with innovative, reliable technology that upholds community values while enabling access to national markets.

We are looking for an enthusiastic data engineer to join our data engineering team and help us build out our modern data platform using the Databricks lakehouse architecture. Working alongside our lead and senior data engineers, you will contribute to building a scalable, secure, and efficient data platform that serves the entire organization and our partners. This is an opportunity to help build and shape a modern, real-time data platform from the ground up.

Requirements

  • 3+ years of experience in data engineering or related roles
  • Demonstrated Python skills for data processing, transformation, and automation
  • Strong analytical and problem-solving skills
  • Ability to understand business workflows and translate business questions into data models and pipelines
  • Demonstrated, solid SQL fundamentals
  • Experience with Git-based workflows (PRs, code reviews, branching strategies)
  • Ability to effectively leverage AI in development workflows
  • Experience using AI-assisted coding tools (e.g., Claude Code, ChatGPT, Cursor, Copilot)
  • Familiarity with modern data engineering concepts, including ETL/ELT pipelines and data quality practices
  • Bachelor’s degree in Computer Science or related field, or equivalent practical experience

Nice To Haves

  • Experience with Databricks and Lakehouse architecture
  • Experience with Databricks Mosaic AI
  • Experience in financial services, lending, or regulated environments
  • Exposure to data governance and PII handling

Responsibilities

  • Maintain and support the Databricks platform, including managing access controls, permissions, and security mechanisms to protect private and PII data
  • Build out and optimize the lakehouse medallion architecture (bronze, silver, and gold layers) to support various analytics and reporting needs (an illustrative sketch of this pattern follows this list)
  • Design, develop, and maintain data ingestion pipelines from various source systems using Lakeflow for MySQL sources, API calls, and other integration methods, including file ingestion
  • Work with lead and senior data engineers to design data architecture solutions and execute implementation stories
  • Contribute to the development of data governance policies, including data normalization, anonymization, and chain of custody
  • Design, manage, and support ETL/ELT processes with ongoing data quality checks and testing
  • Identify gaps and opportunities in data infrastructure and contribute to building appropriate solutions
  • Partner with the product and engineering teams to develop scalable, extensible data systems
  • Work with end users and power users across the organization to translate business questions and requirements into appropriate data structures
  • Monitor and troubleshoot system performance, reliability, availability, and recoverability of data across the Databricks platform
  • Assist in migration efforts from legacy platforms to the Databricks lakehouse
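
To make the medallion-architecture responsibility above concrete, the following is a minimal, illustrative PySpark sketch of a bronze-to-silver-to-gold flow with a basic data quality check. It assumes a Databricks environment where a SparkSession is already available; the catalog, table, and column names are hypothetical placeholders, not LendKey's actual schemas.

# Illustrative sketch only: bronze -> silver -> gold in PySpark on Databricks.
# All catalog, table, and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks, a session is already provided

# Bronze: raw records landed as-is from a source system
bronze = spark.table("lakehouse.bronze.loan_applications_raw")

# Silver: cleaned, typed, de-duplicated records behind a simple quality gate
silver = (
    bronze
    .filter(F.col("application_id").isNotNull())             # basic data quality check
    .withColumn("applied_at", F.to_timestamp("applied_at"))  # normalize types
    .dropDuplicates(["application_id"])
)
silver.write.mode("overwrite").saveAsTable("lakehouse.silver.loan_applications")

# Gold: an aggregate shaped for reporting (e.g., daily application volume)
gold = (
    silver
    .groupBy(F.to_date("applied_at").alias("application_date"))
    .agg(F.count("*").alias("applications"))
)
gold.write.mode("overwrite").saveAsTable("lakehouse.gold.daily_application_volume")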