Synechron • Posted about 1 year ago
$110,000 - $120,000/Yr
Full-time • Mid Level
Jersey City, NJ
10,001+ employees
Professional, Scientific, and Technical Services

The Scala Developer with Apache Spark at Synechron Inc. designs, develops, and maintains scalable data processing applications. The role involves close collaboration with data scientists and engineers to build data pipelines that support data-driven decision-making. The ideal candidate will apply deep expertise in Scala and Apache Spark to optimize applications and ensure the high availability and reliability of data processing systems.

Responsibilities:

  • Design, develop, and maintain scalable data processing applications using Scala and Apache Spark.
  • Collaborate with data engineers and data scientists to understand data requirements and translate them into technical specifications.
  • Optimize Spark applications for performance and scalability.
  • Implement data ingestion processes from various sources (databases, APIs, etc.) into Spark.
  • Write efficient, reusable, and maintainable code while following best practices in software development.
  • Conduct code reviews and provide mentorship to junior developers.
  • Monitor and troubleshoot Spark applications, ensuring high availability and reliability.
  • Stay updated with the latest technologies and trends in big data and distributed computing.

Requirements:
  • Proven experience as a Scala Developer, with a strong understanding of functional programming concepts.
  • Hands-on experience with Apache Spark, including Spark SQL, Spark Streaming, and MLlib.
  • Familiarity with big data technologies such as Hadoop, Kafka, or Cassandra is a plus.
  • Experience with cloud platforms (AWS, Azure, GCP) and containerization technologies (Docker, Kubernetes) is preferred.
  • Proficient in writing efficient SQL queries, with a solid understanding of database systems (e.g., PostgreSQL, MySQL, NoSQL databases).
  • Knowledge of data modeling and ETL processes.
  • Strong analytical and problem-solving skills.
  • Excellent communication and collaboration abilities.
  • Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree is a plus.

Benefits:
  • Highly competitive compensation and benefits package
  • Possibility to work abroad
  • Laptop and a mobile phone
  • 10 days of paid annual leave (plus sick leave and national holidays)
  • Maternity & Paternity leave plans
  • Comprehensive insurance plan including medical, dental, vision, life insurance, and long-/short-term disability
  • Retirement savings plans
  • Higher education certification policy
  • Commuter benefits (varies by region)
  • Extensive training opportunities
  • On-demand Udemy for Business access
  • Coaching opportunities with experienced colleagues
  • Cutting-edge projects at leading financial institutions
  • Flat and approachable organization
  • Diverse, fun-loving and global work culture