West Monroe · Posted 1 day ago
$119,500 - $161,700/Yr
Full-time • Mid Level
New York, NY
1,001-5,000 employees

Are you ready to make an impact? As a Senior Data Engineer, you will work with clients and internal teams to develop scalable, high-performance data solutions. You will focus on building modern data pipelines, implementing cloud-native architectures, and ensuring data quality and reliability. This role requires strong technical expertise, problem-solving skills, and the ability to collaborate in a fast-paced consulting environment.

At West Monroe, we work with you. We're a global business and technology consulting firm passionate about creating measurable value for our clients and delivering real-world solutions. The combination of business and technology is not new, but how we bring them together is unique. We're fluent in both. We know that technology alone is not the answer; how we apply it is. We rely on data to constantly adapt and solve new challenges: actions that work today, with outcomes that generate value for years to come. At West Monroe, we zero in on the heart of the opportunity, getting to results faster and preparing people for what's next. You'll feel the difference in how we work. We show up personally. We're right there in the room with you, co-creating through the challenges. With West Monroe, collaboration isn't a lofty promise but a daily action. We work together with you to turn vision into clear action with lasting impact.

West Monroe is an Equal Employment Opportunity Employer. We believe in treating each employee and applicant for employment fairly and with dignity. We base our employment decisions on merit, experience, and potential, without regard to race, color, national origin, sex, sexual orientation, gender identity, marital status, age, religion, disability, veteran status, or any other characteristic prohibited by federal, state, or local law. To learn more about diversity, equity and inclusion at West Monroe, visit www.westmonroe.com/inclusion.
If you require a reasonable accommodation to participate in our recruiting process, please inquire by sending an email to [email protected]. Please review our current policy regarding use of generative artificial intelligence during the application process. If you are based in California, we encourage you to read West Monroe’s Notice at Collection for California residents, provided pursuant to the California Consumer Privacy Act (CCPA) and linked here.

What you'll do:
  • Design and develop robust, scalable, and efficient data pipelines using modern tools and frameworks.
  • Build and optimize ETL/ELT workflows for ingesting, transforming, and storing structured and unstructured data.
  • Implement data architectures on cloud platforms, particularly Google Cloud Platform (GCP), leveraging services such as Dataflow, BigQuery, Dataform, and Data Fusion.
  • Develop real-time data processing solutions using streaming technologies like Apache Beam and Pub/Sub.
  • Collaborate with data architects and business stakeholders to design data models optimized for analytics, reporting, and machine learning.
  • Ensure data quality, security, and governance by implementing best practices and using tools like Cloud DLP, IAM, and Dataplex.
  • Monitor, troubleshoot, and optimize data pipelines for performance and cost efficiency using tools like Cloud Monitoring and Cloud Logging.
  • Implement CI/CD pipelines for data workflows using tools like Cloud Build, GitHub Actions, or Terraform.
  • Write clean, maintainable, and well-documented code in Python, Java, or other programming languages.
  • Serve as a technical mentor to junior data engineers and contribute to knowledge-sharing initiatives within the team.
  • Collaborate with cross-functional teams to deliver end-to-end data solutions aligned with business objectives.
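The pipeline-building responsibilities above can be illustrated with a minimal sketch in plain Python: one ETL transform step that normalizes raw records and routes failures to a dead-letter list, the same pattern a Dataflow/Beam job would apply at scale. The record schema and validation rules here are hypothetical examples, not taken from the posting.

```python
from dataclasses import dataclass

# Hypothetical raw rows, standing in for records ingested from a source
# such as Cloud Storage or Pub/Sub.
RAW_ROWS = [
    {"user_id": "42", "amount": "19.99", "currency": "usd"},
    {"user_id": "", "amount": "5.00", "currency": "usd"},   # fails validation
    {"user_id": "7", "amount": "bad", "currency": "eur"},   # fails validation
]

@dataclass
class Transaction:
    user_id: int
    amount_cents: int
    currency: str

def transform(row: dict) -> Transaction:
    """Parse and normalize one raw row; raise ValueError on bad data."""
    if not row["user_id"]:
        raise ValueError("missing user_id")
    return Transaction(
        user_id=int(row["user_id"]),
        amount_cents=round(float(row["amount"]) * 100),
        currency=row["currency"].upper(),
    )

def run_pipeline(rows):
    """Split rows into clean records and a dead-letter list for review."""
    clean, dead_letter = [], []
    for row in rows:
        try:
            clean.append(transform(row))
        except ValueError as exc:
            dead_letter.append({"row": row, "error": str(exc)})
    return clean, dead_letter
```

Keeping rejected rows in a dead-letter collection rather than dropping them silently is what makes the quality and monitoring responsibilities above actionable: bad records stay visible for troubleshooting.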
What you'll bring:
  • 5+ years of experience in data engineering or related roles, with hands-on experience in designing and implementing data pipelines and architectures.
  • Strong expertise in cloud platforms, particularly Google Cloud Platform (GCP), with experience using services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
  • Proficiency in building ETL/ELT workflows and data pipelines using tools like Apache Beam, Cloud Composer, or Cloud Data Fusion.
  • Experience with real-time data processing and streaming technologies such as Pub/Sub, Kafka, or Spark Streaming.
  • Solid understanding of SQL for data modeling, querying, and optimization.
  • Strong programming skills in Python or Java, with experience in developing reusable and scalable codebases.
  • Familiarity with infrastructure as code (IaC) tools like Terraform or Deployment Manager for provisioning cloud resources.
  • Knowledge of data governance, security, and compliance best practices, including IAM roles, encryption, and VPC Service Controls.
  • Experience with CI/CD pipelines for data workflows and DevOps practices.
  • Proven ability to work collaboratively in a team environment and communicate effectively with technical and non-technical stakeholders.
  • Willingness to travel to client sites as needed (30% to 50%).
  • GCP certifications such as Professional Data Engineer or Professional Cloud Architect are preferred.
  • Experience with other cloud platforms (AWS, Azure) is a plus.
  • Familiarity with machine learning workflows and tools like Vertex AI is a plus.
  • Strong problem-solving skills and ability to troubleshoot complex data pipeline issues.
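The SQL modeling and optimization skills listed above can be sketched with Python's built-in sqlite3 module: a small event table, a covering index, and a query-plan check showing the filter is answered by an index search rather than a full scan. The `events` schema and index are hypothetical examples chosen for illustration.

```python
import sqlite3

# In-memory database; the `events` schema is a made-up example.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE events (
        event_id   INTEGER PRIMARY KEY,
        user_id    INTEGER NOT NULL,
        event_type TEXT    NOT NULL,
        ts         TEXT    NOT NULL
    )
""")
conn.executemany(
    "INSERT INTO events (user_id, event_type, ts) VALUES (?, ?, ?)",
    [(i % 100, "click", f"2024-01-{i % 28 + 1:02d}") for i in range(1000)],
)

# Without an index, the filter below scans all 1000 rows; this covering
# index lets SQLite satisfy both the WHERE clause and the COUNT from
# the index alone.
conn.execute("CREATE INDEX idx_events_user ON events (user_id, event_type)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT COUNT(*) FROM events "
    "WHERE user_id = ? AND event_type = ?",
    (42, "click"),
).fetchone()

count = conn.execute(
    "SELECT COUNT(*) FROM events WHERE user_id = ? AND event_type = ?",
    (42, "click"),
).fetchone()[0]
```

Checking `EXPLAIN QUERY PLAN` output before and after adding an index is the same habit that, on BigQuery or another warehouse, becomes reviewing the query execution plan to control cost and performance.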
Benefits:
  • Employees (and their families) are covered by medical, dental, vision, and basic life insurance.
  • Employees are able to enroll in our company’s 401k plan, purchase shares from our employee stock ownership program and be eligible to receive annual bonuses.
  • Employees will also receive unlimited flexible time off and ten paid holidays throughout the calendar year.
  • Employees are eligible for ten weeks of paid parental leave beginning on their hire date.