Senior Data Engineer

Brookfield Properties · Charleston, SC · Onsite

About The Position

The Senior Data Engineer is responsible for designing, building, and optimizing scalable data systems and pipelines that enable robust analytics and data-driven decision-making. This role requires strong technical expertise in modern data engineering tools, cloud environments, and programming languages. The ideal candidate excels at developing efficient data solutions, ensuring data quality, and collaborating with cross-functional teams to deliver business value.

Requirements

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • At least three years of experience in data engineering.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and collaboration skills in a cross-functional environment.
  • Strong understanding of data architecture, ETL processes, and big data systems.
  • Strong proficiency in Python and SQL; familiarity with Spark is a plus.
  • Hands-on experience with modern data warehouses such as Redshift or Snowflake.
  • Proficiency with cloud data services (AWS), particularly S3 and Redshift.
  • Expertise with ETL orchestration tools (e.g., Airflow, AWS Glue); a minimal sketch follows this list.
  • Familiarity with Git and version control best practices.
  • Experience working with Salesforce data models, APIs, and integration tools to support data ingestion, synchronization, and analytics.
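
To make the orchestration requirement above concrete, below is a minimal sketch of a daily ETL DAG, assuming Airflow 2.x with the TaskFlow API; the DAG name, tasks, paths, and table are hypothetical illustrations, not details from the posting.

```python
# Minimal daily ETL DAG sketch (Airflow 2.x TaskFlow API).
# All names and paths below are hypothetical placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def sales_etl():
    @task
    def extract() -> str:
        # In practice: pull records from a source API and land them in S3;
        # here we just return a stub object path.
        return "s3://example-bucket/raw/sales/latest.json"

    @task
    def transform(raw_path: str) -> str:
        # Clean and reshape the raw extract; return the curated location.
        return raw_path.replace("/raw/", "/curated/")

    @task
    def load(curated_path: str) -> None:
        # Load the curated file into the warehouse, e.g. via a Redshift COPY.
        print(f"COPY sales FROM '{curated_path}'")

    # Wire the tasks into an extract -> transform -> load dependency chain.
    load(transform(extract()))


sales_etl()
```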

Nice To Haves

  • Certification in Data Engineering or Cloud Architecture (AWS, Azure, Snowflake, or GCP).
  • Experience with OpenSearch, document databases, and other non-relational systems.
  • Familiarity with CI/CD pipelines, containerization (Docker), and infrastructure-as-code tools such as Terraform.
  • Experience using Jira and Scrum boards to manage sprints, track progress, and collaborate effectively within agile development teams.
  • Experience with data governance, quality frameworks, and metadata management.
  • Exposure to data visualization tools (e.g., Power BI, Tableau) for understanding downstream data use.

Responsibilities

  • Design, develop, and maintain data pipelines that extract, transform, and load data between Salesforce, external APIs, and cloud data platforms (e.g., Snowflake, Redshift, Databricks); see the sketch after this list.
  • Develop and maintain efficient, scalable, and well-documented code using modern engineering practices.
  • Partner with data analysts, scientists, and architects to improve data accessibility and reliability across the organization.
  • Implement and maintain ETL frameworks and automation to ensure timely and accurate data delivery.
  • Research emerging data technologies to enhance scalability, performance, and automation of data systems.
  • Deliver reliable, scalable, and optimized data pipelines that meet SLAs.
  • Ensure accuracy, completeness, and timeliness of data across systems.
  • Work closely with cross-functional teams, including data scientists, software engineers, and business stakeholders, to support analytical and operational initiatives.
  • Contribute to the evaluation and adoption of new tools, technologies, and processes for improving data infrastructure.
  • Maintain technical documentation and adhere to organizational best practices and coding standards.
  • Build and maintain enterprise data products spanning ingestion, storage, and organization on cloud platforms.
  • Automate and manage infrastructure and deployments to ensure reliability and scalability through DevOps practices.
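
Below is a rough sketch of the Salesforce-to-cloud ingestion described in the first bullet, assuming the simple_salesforce and boto3 libraries; the credentials, SOQL fields, bucket, and key are placeholders rather than details from the posting.

```python
# Hypothetical Salesforce -> S3 ingestion sketch using simple_salesforce
# and boto3. Credentials, object names, and the bucket are placeholders.
import json

import boto3
from simple_salesforce import Salesforce


def ingest_accounts(bucket: str = "example-data-lake") -> None:
    # Authenticate against the Salesforce REST API (placeholder credentials).
    sf = Salesforce(
        username="user@example.com",
        password="...",
        security_token="...",
    )

    # Extract: pull Account records via a SOQL query.
    records = sf.query_all("SELECT Id, Name, Industry FROM Account")["records"]

    # Light transform: strip the Salesforce API metadata from each record.
    rows = [{k: v for k, v in r.items() if k != "attributes"} for r in records]

    # Load: land the extract in S3 as newline-delimited JSON, ready for a
    # downstream Redshift COPY or Snowflake COPY INTO.
    body = "\n".join(json.dumps(row) for row in rows)
    boto3.client("s3").put_object(
        Bucket=bucket, Key="raw/salesforce/accounts.ndjson", Body=body
    )
```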

Benefits

  • Backed by Brookfield, our benefits include a 5% 401(k) match, wellness credits that reduce healthcare costs, and up to 160 hours of PTO annually for full-time employees.