Sr Data Engineer

Stifel
Memphis, TN

About The Position

The Sr Data Engineer is responsible for designing, developing, and maintaining data pipelines and infrastructure on the AWS platform. The Sr Data Engineer works with large volumes of data, ensuring its quality, reliability, and accessibility. Tasks may include data ingestion, transformation, storage, data sharing and consumption, and implementing data security and privacy measures. This role is crucial in enabling efficient and effective data-driven decision-making.

Requirements

  • Proficient in programming languages such as Python and SQL for database querying and manipulation.
  • Strong understanding of AWS services related to data engineering, such as Amazon S3, Amazon Redshift, Amazon Aurora PostgreSQL, AWS Glue, AWS Lambda, AWS Step Functions, AWS Lake Formation, Amazon DataZone, Amazon Kinesis, Amazon MSK, and Amazon EMR.
  • Knowledge of database design principles and experience with database management systems.
  • Experience with data storage and processing technologies, such as relational databases (e.g., SQL Server, PostgreSQL) and distributed data processing frameworks (e.g., PySpark).
  • Understanding of Extract, Transform, Load (ETL) processes and experience with ETL tools like AWS Glue and SQL Server Integration Services is essential.
  • Skilled at integrating disparate data sources and ensuring data quality and consistency.
  • Understanding and experience with orchestration tools like Apache Airflow, AWS Glue Workflows, AWS Step Functions, and notification services.
  • Familiarity with Infrastructure as Code (IaC) tools such as Terraform, as well as Git and DevOps pipelines.
  • Strong analytical thinking and problem-solving abilities are essential to effectively identify and resolve data-related issues.
  • Ability to analyze complex data sets, identify patterns, and derive actionable insights.
  • Awareness of data governance practices, data privacy regulations, and security protocols is crucial.
  • Minimum Required: Bachelor's Degree in Computer Science, related field, or equivalent experience.
  • Minimum Required: 10+ years of post-bachelor progressive experience in data engineering.
  • Minimum Required: Proficiency in the following programming languages: Python, SQL.
  • Minimum Required: Experience with AWS technologies including Glue, S3, Redshift, Lambda, Lake Formation, and DataZone.

Nice To Haves

  • Experience implementing data security measures and ensuring compliance with relevant standards is desirable.

Responsibilities

  • Build and maintain scalable and reliable data pipelines, ensuring the smooth flow of data from various sources to the desired destinations in the AWS cloud environment.
  • Work closely with stakeholders to understand their data requirements and design data solutions that meet their needs.
  • Understand data models and schemas, and implement ETL processes to transform raw data into a usable format at the destination.
  • Monitor and optimize the performance of data pipelines, troubleshoot any issues that arise, and ensure data quality and integrity.

Benefits

  • Comprehensive benefits package including health, dental, and vision care; 401(k); wellness initiatives; life insurance; and paid time off.