Magnite • Posted 4 months ago
$150,000 - $170,000/Yr
Full-time • Mid Level
Boston, MA
501-1,000 employees

At Magnite, we cultivate an environment of continuous growth and collaboration. Our work impacts what millions of people read, watch, and buy, and we’re looking for people to help us tackle that responsibility with creativity and focus. Magnite (NASDAQ: MGNI) is the world’s largest independent sell-side advertising platform. Publishers use our technology to monetize their content across all screens and formats, including CTV / streaming, online video, display, and audio. Our tech fuels billions of transactions per day! Magnite conducts 400+ billion ad request auctions daily, carrying $450+ million of revenue annually. To stand out among competitors, Magnite strategically focuses on being the most efficient, highest-scale independent exchange.

The Streaming Organization's Data Engineering team owns the applications and infrastructure that make up the Streaming Org's data pipeline, handling ~40 billion events per day at an average of 400,000-500,000 events per second. This data underpins the Streaming business - including client reporting, internal data science, account managers, and product and business teams - and as such we need to build systems that remain scalable and efficient at this volume of data while also ensuring data consistency and reliability.

We value communication, discussion, and sharing of ideas to come to the best technical solutions to our large-scale data challenges. We are looking for people who want to get things done and value open collaboration (including constructive feedback when brainstorming). Our end-to-end ownership of the data world includes both typical data-engineering problems (think Spark pipelines, ETL processing, etc.) and more general software-engineering work as well (think Java applications, API design, etc.).
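
For a concrete flavor of the "Spark pipelines, ETL processing" work mentioned above, here is a minimal, purely illustrative Scala sketch of an hourly event rollup. The bucket paths, schema, and column names are hypothetical assumptions for illustration only, not Magnite's actual code.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, count, date_trunc, lit, sum}

    // Illustrative only: a hypothetical hourly rollup of auction events.
    // Paths, schema, and column names are assumptions, not Magnite's pipeline.
    object HourlyEventRollup {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("hourly-event-rollup")
          .getOrCreate()

        // Read raw JSON events landed in S3 (hypothetical bucket and prefix).
        val events = spark.read.json("s3://example-bucket/raw-events/dt=2024-01-01/")

        // Roll up impressions and revenue per publisher per hour.
        val hourly = events
          .withColumn("hour", date_trunc("hour", col("event_time").cast("timestamp")))
          .groupBy(col("publisher_id"), col("hour"))
          .agg(
            count(lit(1)).as("impressions"),
            sum(col("revenue_micros")).as("revenue_micros")
          )

        // Write Parquet partitioned by hour for downstream reporting and analytics.
        hourly.write
          .mode("overwrite")
          .partitionBy("hour")
          .parquet("s3://example-bucket/rollups/hourly/")

        spark.stop()
      }
    }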

In this role you will:
  • Work on internet-scale data problems
  • Help architect and build systems that process our data volume and empower all of its consumers
  • Share in full end-to-end ownership of the team's data pipeline and related systems for data delivery
  • Be a part of and promote our culture of collaboration and mentorship
What we're looking for:
  • 4+ years of software development experience
  • BS/MS in Computer Science or equivalent work experience
  • Experience designing and building systems that ingest and process data at large scale in a cloud-first setting, OR established technical excellence with a desire to learn and crush the data engineering world
  • Experience with Java or Scala for core application development
  • Familiarity with Spark (batch + streaming) for data pipeline development
  • Experience with Terraform, Docker, and Jenkins for CI/CD, infrastructure, and application deployment
  • Knowledge of Airflow for job orchestration
  • Experience with AWS-based cloud infrastructure including RDS, EC2, S3, Kinesis, ECS
Perks and Benefits:
  • Comprehensive Healthcare Coverage for You and Your Family from Day One
  • Generous Time Off: Holiday Breaks, Summer Fridays, and Quarterly Wellness Days
  • Equity and Employee Stock Purchase Plan
  • Family-Focused Benefits and Parental Leave
  • 401k Retirement Savings Plan with Employer Match
  • Disability and Life Insurance
  • Cell Phone Subsidy
  • Fitness and Wellness Reimbursement