Software Engineer (Big Data)

Quantcast · San Francisco, CA
$144,900 - $168,400 · Hybrid

About The Position

At Quantcast, we don't just build advertising technology; we revolutionize how it works. Our AI-powered Demand Side Platform (DSP) connects the world's most ambitious marketers with their ideal audiences across the open internet, delivering results that actually move the needle. Since 2006, we've been the industry's trailblazer, launching the first AI-powered measurement platform for publishers and the first AI-driven DSP. Our AI doesn't just optimize: it delivers the measurable outcomes that matter most to our clients, giving them the competitive edge they need in a crowded marketplace. Ready to join the team that's defining the future of digital advertising?

We are seeking a Software Engineer to join our Big Data Services team. This role focuses on building, maintaining, and optimizing large-scale data processing systems while supporting the needs of our data science and engineering teams. You will work closely with engineers, data scientists, and researchers to ensure reliable, scalable, and efficient data infrastructure that powers advanced analytics and modeling workflows. The team's primary focus is developing the systems and services that enable Quantcast to scale its distributed storage and compute platform. The ideal candidate has a passion for large-scale distributed systems (e.g., HDFS and Spark).

Requirements

  • BS in computer science or equivalent experience
  • 1-3 years of professional software engineering experience (internships included)
  • Must be authorized to work in the United States without the need for employer sponsorship
  • This is a hybrid role based in our San Francisco office. To ensure a manageable commute for in-office days, candidates must reside within a 60-mile radius of San Francisco, CA. We are not considering relocation candidates at this time
  • Familiarity with data processing frameworks such as Apache Spark, Hadoop, or similar
  • Familiarity with containerization tools (e.g., Docker and Kubernetes)
  • Proficient in Java and/or Python programming languages
  • Linux system administration/automation experience
  • Strong problem-solving and debugging skills
  • Organized, detail-oriented personality

Nice To Haves

  • Experience with workflow orchestration tools (e.g., Apache Airflow)

Responsibilities

  • Contribute to the development and optimization of large-scale data workflows using technologies such as Apache Spark or similar frameworks
  • Debug and resolve issues in distributed environments, including data inconsistencies and job failures
  • Maintain and enhance the services that support the distributed storage and compute platform
  • Assist in deploying and maintaining production systems, including CI/CD workflows
  • Work to make our platform more elastic and fault-tolerant
  • Provide technical input into roadmaps for the team
  • Write clean, maintainable, and well-tested code

Benefits

  • On top of a competitive salary, this position includes a performance bonus, equity, and a comprehensive benefits package.