About The Position

Revecore is embarking on re-architecting and modernizing its core platform. The Data Technology team at Revecore empowers the business to unlock new opportunities by integrating data and machine learning across our products and business functions. The team is composed of Data Engineering, Analytics Engineering, Data Science, and Machine Learning (ML) Engineering specializations. The Senior Data Engineer will join the Technology function, solving complex data problems for the Revecore Data Platform. The ideal candidate can contribute to platform, architecture, and automation best practices while also diving deep into specific areas and finding the best path forward. You will pave the way for the data teams in data ingestion, data pipelines, and data operations, and you will be responsible for building a modern data lakehouse platform that transforms Revecore and its capabilities.

Requirements

  • 5+ years of experience designing and deploying data engineering solutions using Python, SQL, and modern orchestration/transformation frameworks (e.g., dbt, Airflow, Dagster, Prefect).
  • Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).
  • Experience designing, building, and maintaining scalable, reliable data pipelines with containerized deployments on Kubernetes or Docker.
  • Experience with modern data warehouses such as Snowflake, Redshift, or BigQuery.
  • Proven ability to develop and support multi-cloud data platforms on Azure and AWS, including data lakes (ADLS, S3) and data warehouses (Synapse, Snowflake, Redshift), and to automate infrastructure using Terraform or CloudFormation.
  • Expertise in pipeline optimization for performance, cost efficiency, and reliability using monitoring (Prometheus, Grafana), CI/CD (GitHub Actions, Jenkins), and data quality frameworks.
  • Strong understanding of infrastructure as code, SDLC, and deployment automation.
  • Proven ability to write clear documentation and promote best practices via knowledge sharing, code reviews, and technical guidance.

Nice To Haves

  • Demonstrated ability to bridge legacy systems with cloud-native architectures for platform modernization.
  • Strong data modeling skills, including dimensional modeling for cloud analytics.
  • Familiarity with data cataloging, lineage, and governance tools for data discoverability and trust.
  • Experience with real-time/streaming data technologies (Kafka, Kinesis, Event Hubs) and event-driven architectures.

Responsibilities

  • Own the implementation and quality of the data platforms, ensuring availability, reliability, resilience, and security while upholding consistent standards of engineering excellence.
  • Continuously improve both the engineering practice and the quality of the domain’s data outputs.
  • Design and develop scalable, maintainable, and reliable data pipelines to support business and analytics needs.
  • Collaborate with other Data Engineers to understand requirements and build tooling and automation that enhance developer efficiency.
  • Monitor data systems and performance, identifying opportunities to optimize warehouse and infrastructure costs.
  • Stay current with industry trends and emerging technologies across the data platform ecosystem to drive innovation and modernization.