Senior Data Engineer

Equiti Group

About The Position

We are seeking a highly skilled Senior Data Engineer to lead the next phase in the evolution of our data environment – from pipelines to platform. In this role, you will build the infrastructure that powers everything from customer insights to financial reporting and self-serve analytics. The ideal candidate will have a strong technical background and a vision for what is possible. As a Senior Data Engineer, you will play a critical role in ensuring Equiti’s data is ready to meet the needs of internal and external stakeholders. You will act as the architect for data warehouse and data pipeline design, working with the Product, Engineering, and Data Analytics teams to build a reliable and performant data platform.

Requirements

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • Minimum of 5 years of experience as a Data Engineer.
  • Proven experience in data warehouse design, implementation, and maintenance using Databricks (Delta Lake, Autoloader, Unity Catalog, Jobs) and dbt.
  • Python and SQL (Spark) development experience.
  • Knowledge of data pipeline tools such as Apache Airflow and Kafka (MSK).
  • Experience working with AWS cloud infrastructure, including S3, IAM/KMS, Aurora Postgres, and RDS Postgres.
  • Postgres DBA depth: query tuning, autovacuum/fillfactor, EXPLAIN, and Liquibase familiarity.
  • Proficiency with BI tools such as Sisense, Power BI, or Tableau.
  • Experience with real-time ETL and Kafka streaming (MSK/SQS).
  • Comfortable standardizing metrics and enabling trusted, consistent access to data.
  • Excellent problem-solving and troubleshooting skills.
  • Strong communication and interpersonal skills.
  • Ability to work independently and as part of a team.

Nice To Haves

  • Strong cross-functional communication skills and an interest in developing into leadership roles over time.
  • Experience in a highly regulated industry (healthcare, banking, etc.).
  • Experience with database performance tuning and optimization techniques.
  • Knowledge of scripting languages (e.g., Python, PowerShell).
  • Familiarity with Salesforce and its data structures.

Responsibilities

  • Design and implement scalable data pipelines and platform components, leveraging tools such as Databricks, Airflow, and AWS Aurora.
  • Partner with Product and Engineering to build scalable systems that help unlock the value of data from a wide range of sources such as backend databases and marketing platforms.
  • Lead technical vision and architecture with a holistic point of view on both short-term and long-term horizons.
  • Define and implement data quality frameworks and monitoring systems to measure data quality across the enterprise and ensure ongoing trust in data assets.
  • Enable real-time data presentation to facilitate management of a worldwide workforce.
  • Enable external use cases like customer-facing dashboards, self-serve analytics, and bulk data transfer.
  • Devise migration strategies to ensure data integrity and minimize downtime.
  • Implement and maintain robust security measures to protect sensitive data.
  • Ensure compliance with relevant data privacy regulations.
  • Collaborate with technical leadership and contribute to the evolution of data engineering best practices.
  • Serve as a subject matter expert on data engineering technologies, including Databricks, Airflow, and AWS Aurora.
  • Provide guidance and support to development teams on data engineering best practices to ensure efficient data access and integration.
  • Collaborate with database engineers, data analysts, and application developers to understand data requirements and ensure data platform alignment.
  • Stay up to date with the latest data engineering technologies and trends.