Senior Database Engineer

IntegriChain
Philadelphia, PA

About The Position

Modern Data Architecture & Platform Engineering

  • Design, build, and optimize database solutions using Snowflake, PostgreSQL, and Oracle RDS.
  • Design and evolve cloud-native data lakehouse architectures using Snowflake, AWS, and open data formats where appropriate.
  • Implement and manage Medallion Architecture (Bronze / Silver / Gold) patterns to support raw ingestion, curated analytics, and business-ready datasets.
  • Build and optimize hybrid data platforms spanning operational databases (PostgreSQL / RDS) and analytical systems (Snowflake).
  • Develop and maintain semantic layers and analytics models to enable consistent, reusable metrics across BI, analytics, and AI use cases.
  • Engineer efficient data models, ETL/ELT pipelines, and query performance tuning for analytical and transactional workloads.
  • Implement replication, partitioning, and data lifecycle management to enhance scalability and resilience.
  • Manage schema evolution, data versioning, and change management in multi-environment deployments.

Advanced Data Pipelines & Orchestration

  • Engineer highly reliable ELT pipelines using modern tooling (e.g., dbt, cloud-native services, event-driven ingestion).
  • Design pipelines that support batch, micro-batch, and near-real-time processing.
  • Implement data quality checks, schema enforcement, lineage, and observability across pipelines.
  • Optimize performance, cost, and scalability across ingestion, transformation, and consumption layers.

AI-Enabled Data Engineering

  • Apply AI and ML techniques to data architecture and operations, including intelligent data quality validation and anomaly detection, automated schema drift detection and impact analysis, and query optimization and workload pattern analysis.
  • Design data foundations that support ML feature stores, training datasets, and inference pipelines.
  • Collaborate with Data Science teams to ensure data platforms are AI-ready, reproducible, and governed.

Automation, DevOps & Infrastructure as Code

  • Build and manage data infrastructure as code using Terraform and cloud-native services.
  • Integrate data platforms into CI/CD pipelines, enabling automated testing, deployment, and rollback of data changes.
  • Develop tooling and automation (Python, SQL, APIs) to streamline provisioning, monitoring, and operational workflows.

Security, Governance & Compliance

  • Implement enterprise-grade data governance, including role-based access control, encryption, masking, and auditing.
  • Enforce data contracts, ownership, and lifecycle management across the lakehouse.
  • Partner with Security and Compliance teams to ensure audit readiness and regulatory alignment.
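As an illustration of the Medallion (Bronze / Silver / Gold) refinement pattern named above, a minimal Python sketch, with hypothetical table and column names, not a description of IntegriChain's actual pipelines:

```python
# Hypothetical Medallion flow: raw ingestion (Bronze) -> cleaned, typed,
# deduplicated records (Silver) -> business-ready aggregate (Gold).
from collections import defaultdict


def to_silver(bronze_rows):
    """Silver layer: drop malformed rows, deduplicate on key, enforce types."""
    seen, silver = set(), []
    for row in bronze_rows:
        key = row.get("order_id")
        if key is None or key in seen:
            continue  # skip records with no key, and exact duplicates
        seen.add(key)
        silver.append({
            "order_id": key,
            "region": row.get("region", "unknown").lower(),  # normalize casing
            "amount": float(row.get("amount", 0)),           # enforce numeric type
        })
    return silver


def to_gold(silver_rows):
    """Gold layer: business-ready metric, here revenue per region."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)


bronze = [
    {"order_id": 1, "region": "East", "amount": "10.5"},
    {"order_id": 1, "region": "East", "amount": "10.5"},  # duplicate ingest
    {"order_id": 2, "region": "west", "amount": "4.0"},
    {"region": "East", "amount": "1.0"},                  # malformed: no key
]
gold = to_gold(to_silver(bronze))
```

In practice each layer would be a dbt model or warehouse table rather than in-memory Python; the sketch only shows the refinement contract between layers.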

Requirements

  • 5+ years of experience in data engineering, database engineering, or data platform development in production environments.
  • Strong hands-on experience with Snowflake, including performance tuning, security, and cost optimization.
  • Deep expertise with PostgreSQL and AWS RDS in cloud-native architectures.
  • Proven experience designing lakehouse or modern data warehouse architectures.
  • Strong understanding of Medallion Architecture, semantic layers, and analytics engineering best practices.
  • Experience building and operating advanced ELT pipelines using modern tooling (e.g., dbt, orchestration frameworks).
  • Proficiency with SQL and Python for data transformation, automation, and tooling.
  • Experience with Terraform and infrastructure-as-code for data platforms.
  • Solid understanding of data governance, observability, and reliability engineering.
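Several of the requirements above center on pipeline data-quality checks and observability. A minimal, dependency-free sketch of that kind of validation (the rule names and columns are hypothetical):

```python
# Hypothetical data-quality gate: apply named predicate checks to rows
# and report the index of every failing row, per check.

def run_checks(rows, checks):
    """Return {check_name: [failing row indices]} for checks that fail."""
    failures = {}
    for name, predicate in checks.items():
        bad = [i for i, row in enumerate(rows) if not predicate(row)]
        if bad:
            failures[name] = bad
    return failures


rows = [
    {"id": 1, "amount": 25.0},
    {"id": 2, "amount": -3.0},    # violates the non-negative rule
    {"id": None, "amount": 7.5},  # violates the not-null rule
]
checks = {
    "id_not_null": lambda r: r["id"] is not None,
    "amount_non_negative": lambda r: r["amount"] >= 0,
}
report = run_checks(rows, checks)
```

Tools such as dbt tests or Great Expectations package this same idea declaratively; the sketch shows only the underlying pattern.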

Nice To Haves

  • Experience with dbt, AWS Glue, Airflow, or similar orchestration tools.
  • Familiarity with feature stores, ML pipelines, or MLOps workflows.
  • Exposure to data observability platforms and cost optimization strategies.
  • Relevant certifications (Snowflake SnowPro, AWS Database Specialty, etc.).

Responsibilities

  • Design, build, and optimize database solutions using Snowflake, PostgreSQL, and Oracle RDS.
  • Design and evolve cloud-native data lakehouse architectures using Snowflake, AWS, and open data formats where appropriate.
  • Implement and manage Medallion Architecture (Bronze / Silver / Gold) patterns to support raw ingestion, curated analytics, and business-ready datasets.
  • Build and optimize hybrid data platforms spanning operational databases (PostgreSQL / RDS) and analytical systems (Snowflake).
  • Develop and maintain semantic layers and analytics models to enable consistent, reusable metrics across BI, analytics, and AI use cases.
  • Engineer efficient data models, ETL/ELT pipelines, and query performance tuning for analytical and transactional workloads.
  • Implement replication, partitioning, and data lifecycle management to enhance scalability and resilience.
  • Manage schema evolution, data versioning, and change management in multi-environment deployments.
  • Engineer highly reliable ELT pipelines using modern tooling (e.g., dbt, cloud-native services, event-driven ingestion).
  • Design pipelines that support batch, micro-batch, and near-real-time processing.
  • Implement data quality checks, schema enforcement, lineage, and observability across pipelines.
  • Optimize performance, cost, and scalability across ingestion, transformation, and consumption layers.
  • Apply AI and ML techniques to data architecture and operations, including intelligent data quality validation, anomaly detection, schema drift detection, and query optimization.
  • Design data foundations that support ML feature stores, training datasets, and inference pipelines.
  • Collaborate with Data Science teams to ensure data platforms are AI-ready, reproducible, and governed.
  • Build and manage data infrastructure as code using Terraform and cloud-native services.
  • Integrate data platforms into CI/CD pipelines, enabling automated testing, deployment, and rollback of data changes.
  • Develop tooling and automation (Python, SQL, APIs) to streamline provisioning, monitoring, and operational workflows.
  • Implement enterprise-grade data governance, including role-based access control, encryption, masking, and auditing.
  • Enforce data contracts, ownership, and lifecycle management across the lakehouse.
  • Partner with Security and Compliance teams to ensure audit readiness and regulatory alignment.
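The schema-drift responsibility above can be illustrated with a small comparison routine (the column names and type labels are hypothetical, not taken from any real schema):

```python
# Hypothetical schema-drift detector: compare an expected column -> type
# mapping against the schema actually observed at ingestion time.

def detect_schema_drift(expected, observed):
    """Classify drift as added columns, removed columns, or type changes."""
    return {
        "added":   sorted(c for c in observed if c not in expected),
        "removed": sorted(c for c in expected if c not in observed),
        "retyped": sorted(c for c in expected
                          if c in observed and expected[c] != observed[c]),
    }


expected = {"order_id": "int", "amount": "float", "region": "text"}
observed = {"order_id": "int", "amount": "text", "channel": "text"}
drift = detect_schema_drift(expected, observed)
```

A production version would read both schemas from the warehouse's information schema and feed the result into alerting and impact analysis rather than returning a dict.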

Benefits

  • Excellent and affordable medical benefits
  • Non-medical perks including Student Loan Reimbursement, Flexible Paid Time Off and Paid Parental Leave
  • 401(k) Plan with a Company Match to prepare for your future
  • Robust Learning & Development opportunities, including over 700 development courses free to all employees


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No Education Listed
  • Number of Employees: 501-1,000 employees
