Data Engineer (GCP)

Cognizant Technology Solutions, Hartford, CT
$100 - $120

About The Position

We are Cognizant AIA (Artificial Intelligence and Analytics).

About AI & Analytics: Artificial intelligence (AI) and the data it collects and analyzes will soon sit at the core of all intelligent, human-centric businesses. By decoding customer needs, preferences, and behaviors, our clients can understand exactly what services, products, and experiences their consumers need. Within AI & Analytics, we work to design the future: a future in which trial-and-error business decisions have been replaced by informed choices and data-supported strategies. By applying AI and data science, we help leading companies prototype, refine, validate, and scale their AI and analytics products and delivery models. Cognizant's AIA practice takes insights that are buried in data and provides businesses with a clear way to transform how they source, interpret, and consume their information. Our clients need flexible data structures and a streamlined data architecture that quickly turns data resources into informative, meaningful intelligence.

Job Summary: Data Engineer (GCP)

Experience: 6 to 10 years

Required Skills: Teradata, Python, Big Data, Google BigQuery, Google Cloud Dataproc, Google Cloud SQL, Google Cloud Composer, Google Cloud Pub/Sub

#LI-KN1

Requirements

  • 4 to 6+ years of experience in Data Engineering, with at least 2 years in GCP.
  • Strong hands-on experience in Teradata data warehousing, BTEQ, and complex SQL.
  • Solid knowledge of GCP services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, Composer, and Dataproc.
  • Experience with ETL/ELT pipelines using custom scripting tools (Python/Java).
  • Proven ability to refactor and translate legacy logic from Teradata to GCP.
  • Familiarity with CI/CD, Git, Argo CD, and DevOps practices in cloud data environments.
  • Teradata
  • Python
  • Big Data
  • Google BigQuery
  • Google Cloud Dataproc
  • Google Cloud SQL
  • Google Cloud Composer
  • Google Cloud Pub/Sub

Benefits

  • Medical/Dental/Vision/Life Insurance
  • Paid holidays plus Paid Time Off
  • 401(k) plan and contributions
  • Long-term/Short-term Disability
  • Paid Parental Leave
  • Employee Stock Purchase Plan

What This Job Offers

Job Type

Full-time

Career Level

Mid Level

Education Level

No Education Listed

Number of Employees

5,001-10,000 employees
