PubMatic · posted 2 months ago
Entry Level
Hybrid • Pune, IN
501-1,000 employees
Publishing Industries

PubMatic is seeking a Data Analytics-focused Software Engineer (Direct Contractor) with 1-2 years of hands-on experience in software development. The ideal candidate should have a strong foundation in Big Data technologies such as Hadoop, Spark, Scala, Kafka, and cloud-based systems, along with programming proficiency in Java, Scala, and Python. In this role, you will contribute to building and optimizing PubMatic's data analytics and processing pipelines, enhancing the platform's scalability, reliability, and analytical capabilities. The candidate should also have basic knowledge of GenAI concepts to understand the evolving analytics landscape.

Responsibilities:
  • Design, develop, and implement a highly scalable and fault-tolerant Big Data platform to process large volumes of data efficiently.
  • Build and maintain data pipelines using technologies such as Spark, Hadoop, Kafka, and Snowflake.
  • Collaborate with cross-functional teams to enhance data ingestion, transformation, and analytics capabilities.
  • Develop and maintain backend services using Java, REST APIs, JDBC, and AWS-based services.
  • Ensure services are robust, secure, and optimized for performance and scalability.
  • Participate in code reviews and contribute to best practices for design and development.
  • Work closely with product managers, data scientists, and platform engineers to design scalable solutions aligned with business goals.
  • Participate actively in Agile/Scrum processes, including sprint planning, retrospectives, backlog grooming, and story estimation.
  • Support customer issues via email or JIRA by providing updates, debugging, and deploying patches.
  • Contribute to maintaining system reliability and improving platform availability and observability.

Qualifications:
  • 1-2 years of hands-on experience in software development, preferably in backend or data engineering roles.
  • Strong programming skills in Java, with working knowledge of Scala and Python.
  • Solid understanding of data structures, algorithms, and object-oriented design principles.
  • Basic exposure to Big Data frameworks such as Spark, Hadoop, and Kafka, and to data warehouse systems like Snowflake.
  • Familiarity with cloud platforms (AWS preferred) and containerized environments (Docker/Kubernetes is a plus).
  • Basic knowledge of Generative AI (GenAI) and its applications in data and analytics.
  • Strong analytical, debugging, and problem-solving skills.
  • Excellent communication skills, both written and verbal.
  • A collaborative mindset and ability to work in a fast-paced, dynamic environment.

Benefits:
  • Paternity/maternity leave
  • Healthcare insurance
  • Broadband reimbursement
  • Kitchen loaded with healthy snacks and drinks
  • Catered lunches