Google Cloud Platform (GCP) Data Engineer

Cognizant
Parsippany-Troy Hills, NJ
$80,000 - $110,000

About The Position

We are Cognizant AIA (Artificial Intelligence and Analytics).

About AI & Analytics: Artificial intelligence (AI) and the data it collects and analyzes will soon sit at the core of all intelligent, human-centric businesses. By decoding customer needs, preferences, and behaviors, our clients can understand exactly what services, products, and experiences their consumers need. Within AI & Analytics, we work to design the future: a future in which trial-and-error business decisions have been replaced by informed choices and data-supported strategies. By applying AI and data science, we help leading companies prototype, refine, validate, and scale their AI and analytics products and delivery models. Cognizant's AIA practice takes insights that are buried in data and provides businesses with a clear way to transform how they source, interpret, and consume their information. Our clients need flexible data structures and a streamlined data architecture that quickly turn data resources into informative, meaningful intelligence.

Job Summary: We are seeking a skilled Google Cloud Platform (GCP) Data Engineer to design, build, and optimize data pipelines and analytics solutions in the cloud. The ideal candidate must have hands-on experience with GCP data services, strong ETL/ELT development skills, and a solid understanding of data architecture, data modeling, data warehousing, and performance optimization.

Requirements

  • Bachelor’s degree in Computer Science, Information Technology, or related field.
  • Minimum 8 years of experience in data engineering, preferably in a cloud environment.
  • Minimum 3 years of hands-on expertise with GCP services: BigQuery, Cloud Storage, Cloud Run, Dataflow, Cloud SQL, AlloyDB, Cloud Load Balancing, Pub/Sub, IAM, and Cloud Logging and Monitoring.
  • Proficiency in SQL, Python, and Linux scripting.
  • Prior experience with ETL tools such as DataStage, Informatica, or SSIS.
  • Familiarity with data modeling (star/snowflake) and data warehouse concepts.
  • Understanding of CI/CD, version control (Git), and Infrastructure as Code (Terraform).
  • Strong problem-solving and analytical mindset.
  • Effective communication and collaboration skills.
  • Ability to work in an agile and fast-paced environment.

Nice To Haves

  • GCP Professional Data Engineer or Professional Cloud Architect certification.

Responsibilities

  • Develop ETL/ELT processes to extract data from various sources, transform it, and load it into BigQuery or other target systems.
  • Build and maintain data models, data warehouses, and data lakes for analytics and reporting.
  • Design and implement scalable, secure, and efficient data pipelines on GCP using tools such as Dataflow, Pub/Sub, Cloud Run, Python, and Linux scripting.
  • Optimize BigQuery queries, manage partitioning and clustering, and drive cost optimization.
  • Integrate data from on-premises and cloud systems using Cloud Storage and APIs.
  • Work closely with DevOps teams to automate deployments using Terraform, Cloud Build, or CI/CD pipelines.
  • Ensure security and compliance by applying IAM roles, encryption, and network controls.
  • Collaborate with data analysts, data scientists, and application teams to deliver high-quality data solutions.
  • Implement best practices for data quality, monitoring, and governance.

Benefits

  • Medical/Dental/Vision/Life Insurance
  • Paid holidays plus Paid Time Off
  • 401(k) plan and contributions
  • Long-term/Short-term Disability
  • Paid Parental Leave
  • Employee Stock Purchase Plan