Cognizant Technology Solutions • Posted 8 months ago
$100,000 - $105,000/Yr
Full-time • Senior
Hybrid • Columbus, OH
Professional, Scientific, and Technical Services

Cognizant is one of the world's leading professional services companies, transforming clients' business, operating, and technology models for the digital era. Our unique industry-based, consultative approach helps clients envision, build, and run more innovative and efficient businesses. Headquartered in the U.S., Cognizant (a member of the NASDAQ-100 and one of Forbes World's Best Employers 2024) is consistently listed among the most admired companies in the world.

We are seeking an experienced Architect with 12 to 16 years of experience in the Property & Casualty Insurance domain. The ideal candidate will have expertise in Spark in Scala, Delta Sharing, Databricks Unity Catalog administration, the Databricks CLI, Delta Live Pipelines, Structured Streaming, risk management, Apache Airflow, Amazon S3, Amazon Redshift, Python, Databricks SQL, Databricks Delta Lake, Databricks Workflows, and PySpark. This is a hybrid role with day shifts and no travel required.

Responsibilities:
  • Lead the design and implementation of data architecture solutions for Property & Casualty Insurance projects.
  • Oversee the development and deployment of scalable data pipelines using Spark in Scala and Delta Live Pipelines.
  • Provide expertise in Databricks Unity Catalog administration and the Databricks CLI to manage and secure data assets.
  • Implement and manage Delta Sharing for seamless data sharing across different platforms.
  • Utilize Structured Streaming to process real-time data efficiently (see the streaming sketch after this list).
  • Apply risk management techniques to ensure data integrity and security.
  • Integrate Apache Airflow for orchestrating complex data workflows (see the Airflow sketch after the requirements list).
  • Manage data storage and retrieval using Amazon S3 and Amazon Redshift.
  • Develop and maintain Python scripts for data processing and automation.
  • Utilize Databricks SQL for querying and analyzing large datasets.
  • Implement Databricks Delta Lake for optimized data storage and retrieval.
  • Design and manage Databricks Workflows for efficient data processing.
  • Develop and optimize PySpark applications for large-scale data processing.
  • Collaborate with cross-functional teams to ensure alignment with business objectives.
  • Provide technical guidance and mentorship to junior team members.
  • Ensure compliance with industry standards and best practices.
  • Continuously evaluate and improve data architecture and processes.
  • Contribute to the company's purpose by delivering high-quality data solutions that enhance decision-making and operational efficiency.
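
To make the streaming responsibilities above concrete, here is a minimal PySpark sketch of a Structured Streaming job that reads claim events from Amazon S3 and appends them to a Delta Lake table. The bucket paths, schema, and filter are hypothetical illustrations, not details taken from this posting; on Databricks the Delta format is available out of the box.

    # Minimal Structured Streaming sketch: JSON claim events from S3 into Delta.
    # All paths and the schema below are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("pnc-claims-stream").getOrCreate()

    # Read a stream of claim events landed as JSON files in S3.
    claims = (
        spark.readStream
        .format("json")
        .schema("claim_id STRING, policy_id STRING, amount DOUBLE, event_ts TIMESTAMP")
        .load("s3://example-bucket/claims/raw/")
    )

    # Basic integrity checks before persisting, in the spirit of the
    # risk-management bullet above.
    valid_claims = claims.filter(col("claim_id").isNotNull() & (col("amount") >= 0))

    # Append the cleaned stream to a Delta table; the checkpoint gives
    # exactly-once semantics across restarts.
    query = (
        valid_claims.writeStream
        .format("delta")
        .option("checkpointLocation", "s3://example-bucket/claims/_checkpoints/")
        .outputMode("append")
        .start("s3://example-bucket/claims/delta/")
    )
    query.awaitTermination()
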
Requirements:
  • Must have extensive experience in Spark in Scala, Delta Sharing, Databricks Unity Catalog administration, the Databricks CLI, Delta Live Pipelines, Structured Streaming, risk management, Apache Airflow, Amazon S3, Amazon Redshift, Python, Databricks SQL, Databricks Delta Lake, Databricks Workflows, and PySpark.
  • Must have domain experience in Property & Casualty Insurance.
  • Must have strong communication skills in English (read, write, and speak).
  • Nice to have experience in other data engineering and cloud technologies.
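
As one illustration of the Airflow orchestration called out in the responsibilities, the sketch below defines a DAG that triggers an existing Databricks Workflows job nightly via the Databricks provider's DatabricksRunNowOperator. The DAG ID, connection ID, job ID, and schedule are hypothetical.

    # Minimal Airflow DAG sketch: trigger a Databricks Workflows job nightly.
    # Requires the apache-airflow-providers-databricks package; the job ID,
    # connection ID, and schedule below are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

    with DAG(
        dag_id="nightly_claims_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="0 2 * * *",  # nightly at 02:00 UTC
        catchup=False,
    ) as dag:
        # Run an existing Databricks job by its numeric job ID.
        run_claims_job = DatabricksRunNowOperator(
            task_id="run_claims_job",
            databricks_conn_id="databricks_default",
            job_id=12345,
        )
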
Benefits:
  • Medical/Dental/Vision/Life Insurance
  • Paid holidays plus Paid Time Off
  • Long-term/Short-term Disability
  • Paid Parental Leave
  • Employee Stock Purchase Plan