West Monroe • Posted 1 day ago
$119,500 - $161,700/Yr
Full-time • Mid Level
New York City, NY
1,001-5,000 employees

Are you ready to make an impact? As a Senior Data Engineer, you will work with clients and internal teams to develop scalable, high-performance data solutions. You will focus on building modern data pipelines, implementing cloud-native architectures, and ensuring data quality and reliability. This role requires strong technical expertise, problem-solving skills, and the ability to collaborate in a fast-paced consulting environment.

Responsibilities:
  • Design and develop robust, scalable, and efficient data pipelines using modern tools and frameworks.
  • Build and optimize ETL/ELT workflows for ingesting, transforming, and storing structured and unstructured data.
  • Implement data architectures on cloud platforms, particularly Google Cloud Platform (GCP), leveraging services such as Dataflow, BigQuery, Dataform, and Data Fusion.
  • Develop real-time data processing solutions using streaming technologies like Apache Beam and Pub/Sub.
  • Collaborate with data architects and business stakeholders to design data models optimized for analytics, reporting, and machine learning.
  • Ensure data quality, security, and governance by implementing best practices and using tools like Cloud DLP, IAM, and Dataplex.
  • Monitor, troubleshoot, and optimize data pipelines for performance and cost efficiency using tools like Cloud Monitoring and Cloud Logging.
  • Implement CI/CD pipelines for data workflows using tools like Cloud Build, GitHub Actions, or Terraform.
  • Write clean, maintainable, and well-documented code in Python, Java, or other programming languages.
  • Serve as a technical mentor to junior data engineers and contribute to knowledge-sharing initiatives within the team.
  • Collaborate with cross-functional teams to deliver end-to-end data solutions aligned with business objectives.

Qualifications:
  • 5+ years of experience in data engineering or related roles, with hands-on experience designing and implementing data pipelines and architectures.
  • Strong expertise in cloud platforms, particularly Google Cloud Platform (GCP), with experience using services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
  • Proficiency in building ETL/ELT workflows and data pipelines using tools like Apache Beam, Cloud Composer, or Cloud Data Fusion.
  • Experience with real-time data processing and streaming technologies such as Pub/Sub, Kafka, or Spark Streaming.
  • Solid understanding of SQL for data modeling, querying, and optimization.
  • Strong programming skills in Python or Java, with experience in developing reusable and scalable codebases.
  • Familiarity with infrastructure as code (IaC) tools like Terraform or Deployment Manager for provisioning cloud resources.
  • Knowledge of data governance, security, and compliance best practices, including IAM roles, encryption, and VPC Service Controls.
  • Experience with CI/CD pipelines for data workflows and DevOps practices.
  • Proven ability to work collaboratively in a team environment and communicate effectively with technical and non-technical stakeholders.
  • GCP certifications such as Professional Data Engineer or Professional Cloud Architect are preferred.
  • Experience with other cloud platforms (AWS, Azure) is a plus.
  • Familiarity with machine learning workflows and tools like Vertex AI is a plus.
  • Strong problem-solving skills and ability to troubleshoot complex data pipeline issues.
  • Travel to client sites as needed (30% to 50%).

Benefits:
  • Employees (and their families) are covered by medical, dental, vision, and basic life insurance.
  • Employees can enroll in our company’s 401(k) plan, purchase shares through our employee stock ownership program, and are eligible for annual bonuses.
  • Employees will also receive unlimited flexible time off and ten paid holidays throughout the calendar year.
  • Employees are also eligible for ten weeks of paid parental leave starting from their hire date.