Zillow • Posted 3 months ago
$158,200 - $252,800/Yr
Full-time • Senior
251-500 employees

The Engagement Data Warehouse team at Zillow is part of the Analytics Data Engineering organization and works closely with product and analytics teams. Your work will directly shape the design and implementation of data engineering solutions that support our Data Science teams and, in turn, improve Zillow's offerings for customers. This dedicated team is vital in supporting Zillow's Growth and Shopping business areas. The Senior Software Development Engineer, Big Data role at Zillow presents an outstanding opportunity to work at the forefront of new data engineering initiatives.

Responsibilities:
  • Design and implement scalable data pipelines to collect, process, and store large volumes of critical data from various sources.
  • Ensure data reliability and uptime by monitoring and troubleshooting pipeline performance and scalability.
  • Continuously seek to improve the team’s efficiency by automating repeatable processes.
  • Facilitate engineering discussions with collaborators, customers, partners, and team members from various departments to understand business needs and convert them into technical requirements.
  • Translate business use cases into well-thought-out data models that are easy to evolve with the business.
  • Communicate technical concepts effectively to non-technical audiences.
  • Review specifications, designs, and pull requests, providing constructive feedback that raises the quality of the team's output.
  • Consistently write high-quality code, refactor, and optimize for better scalability, performance, and readability.
  • Provide leadership within the team and mentor junior engineers.

Qualifications:
  • A degree (BS+) in Computer Science or a related field.
  • 5+ years of experience building and maintaining data-intensive applications.
  • Experience developing sophisticated data pipelines that scale to billions of rows, with production-quality deployment, monitoring, and reliability.
  • Extensive experience with modern data technologies such as Spark and Airflow.
  • Strong proficiency in programming languages such as Python, Java, or Scala.
  • Extensive experience with SQL.
  • Proven data modeling experience, translating business requirements into clean and easily evolvable data models.
  • Excellent interpersonal skills and a passion for collaborating across organizational boundaries.
  • Experience working with cloud services (AWS/Azure/GCP).
  • Experience with Databricks.
  • Understanding of data visualization tools (e.g., Tableau, Power BI).

Benefits:
  • Comprehensive medical, dental, vision, life, and disability coverage.
  • Parental leave and family benefits.
  • Retirement contributions.
  • Paid time off.