Cognizant Technology Solutions • posted 8 months ago
$90,000 - $123,000/Yr
Full-time • Mid Level
Hybrid • Richardson, TX
Professional, Scientific, and Technical Services

Cognizant (NASDAQ: CTSH) is a leading provider of information technology, consulting, and business process outsourcing services, dedicated to helping the world's leading companies build stronger businesses. Headquartered in Teaneck, New Jersey (U.S.), Cognizant is a member of the NASDAQ-100, the S&P 500, the Forbes Global 2000, and the Fortune 500, and we are among the top-performing and fastest-growing companies in the world. We are seeking a Sr. Developer with 5+ years of experience in Spark in Scala, Apache Airflow, Python, and Databricks SQL. The ideal candidate will have a strong background in Asset Management Operations. This hybrid role requires the candidate to work the day shift and does not require travel. The candidate will play a crucial role in developing and maintaining our data infrastructure, ensuring optimal performance and reliability.

  • Develop and maintain data pipelines using Spark in Scala to ensure efficient data processing and transformation.
  • Implement and manage workflows using Apache Airflow to automate and schedule data tasks.
  • Write and optimize complex SQL queries in Databricks SQL to support data analysis and reporting.
  • Utilize Python to develop scripts and applications for data manipulation and integration.
  • Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
  • Monitor and troubleshoot data pipelines to ensure data quality and system reliability.
  • Provide technical guidance and support to team members on best practices and industry standards.
  • Conduct code reviews to ensure code quality and adherence to development standards.
  • Stay updated with the latest technologies and trends in data engineering and asset management operations.
  • Participate in design and architecture discussions to contribute to the overall data strategy.
  • Document processes, workflows, and data models to maintain a comprehensive knowledge base.
  • Ensure compliance with data governance and security policies.
  • Contribute to continuous improvement initiatives to enhance the efficiency and effectiveness of data operations.
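To give a flavor of the pipeline work described above, here is a minimal, purely illustrative sketch of a data-transformation step over asset-management records. The role itself calls for Spark in Scala, Airflow, and Databricks SQL; plain Python is used here only to keep the example self-contained, and all record and function names are hypothetical:

```python
# Illustrative sketch only: a tiny transform step of the kind the
# responsibilities describe (aggregate raw trade records by symbol).
# All names here are hypothetical examples, not part of the actual stack.

from dataclasses import dataclass


@dataclass
class Trade:
    symbol: str
    quantity: int
    price: float


def transform(trades):
    """Aggregate raw trade records into per-symbol market value."""
    totals = {}
    for t in trades:
        totals[t.symbol] = totals.get(t.symbol, 0.0) + t.quantity * t.price
    return totals


if __name__ == "__main__":
    raw = [
        Trade("AAPL", 10, 190.0),
        Trade("MSFT", 5, 410.0),
        Trade("AAPL", 2, 191.0),
    ]
    print(transform(raw))
```

In the actual role, the same aggregation would typically be expressed as a Spark job in Scala or a Databricks SQL `GROUP BY`, scheduled as an Airflow task.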

  • Possess strong experience in Spark in Scala for developing and maintaining data pipelines.
  • Have hands-on experience with Apache Airflow for workflow automation and scheduling.
  • Demonstrate proficiency in writing and optimizing SQL queries in Databricks SQL.
  • Exhibit expertise in Python for developing data manipulation and integration scripts.
  • Show a solid understanding of asset management operations and related data requirements.
  • Display excellent problem-solving skills and the ability to troubleshoot data issues effectively.
  • Have strong communication skills to collaborate with cross-functional teams and stakeholders.
  • Be detail-oriented with a focus on delivering high-quality solutions.
  • Stay proactive in learning and adopting new technologies and best practices.
  • Maintain a strong commitment to data governance and security standards.
  • Be capable of working independently and managing multiple tasks in a hybrid work environment.
  • Demonstrate the ability to document processes and maintain a comprehensive knowledge base.
  • Show a proactive approach to continuous improvement and innovation in data operations.

  • Certified Spark Developer
  • Apache Airflow Certification
  • Python Data Science Certification
  • Databricks SQL Analyst Certification

  • Medical/Dental/Vision/Life Insurance
  • Paid holidays plus Paid Time Off
  • 401(k) plan and contributions
  • Long-term/Short-term Disability
  • Paid Parental Leave
  • Employee Stock Purchase Plan
© 2024 Teal Labs, Inc