Senior Software Engineer - Data

Apple, Cupertino, CA

About The Position

We are looking for a Senior Software Engineer to join our Data Engineering Infrastructure team, which builds and operates the foundational platforms that power data ingestion, transformation, and analytics across the organization. You will design and develop high-performance, reliable, and scalable systems that enable data engineers, analysts, and ML practitioners to move, process, and govern data efficiently and securely.

Description

As a Senior Software Engineer on the Data Engineering Infrastructure team, you will design and build distributed systems and frameworks that automate the lifecycle of data, from ingestion to transformation to serving. You will work at the intersection of software engineering, distributed data processing, and cloud infrastructure, helping to define the standards, abstractions, and tools that enable our data platform to operate at scale.

You will collaborate closely with teams across data engineering, analytics, ML, and platform engineering to deliver resilient infrastructure components such as data ingestion pipelines, metadata and schema management services, workflow orchestration, and monitoring frameworks. This is a hands-on role where you will influence architecture, write production-grade code, and drive engineering excellence across the data platform.

Requirements

  • Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent practical experience).
  • 8+ years of experience in software engineering, with at least 3 years focused on data systems or platform infrastructure.
  • Strong programming skills in Java and Python.
  • Hands-on experience with distributed data frameworks such as Spark, Flink, or Kafka.
  • Solid understanding of data modeling, storage formats (Parquet/Avro/ORC), and partitioning strategies.
  • Familiarity with CI/CD, container orchestration (Kubernetes), and infrastructure-as-code tools (Terraform, CloudFormation).
  • Experience working with cloud-based data platforms (AWS, GCP, or Azure).
  • Excellent problem-solving, debugging, and communication skills.

Nice To Haves

  • Proficiency in Scala.
  • Master’s degree in Computer Science, Data Engineering, or a related field.
  • Experience building data infrastructure frameworks, SDKs, or shared libraries used by multiple data teams.
  • Expertise in Apache Spark internals, Flink stateful streaming, or Kafka Connect ecosystems.
  • Familiarity with data governance, cataloging, and schema management systems (e.g., Hive Metastore, Glue, Iceberg, Delta Lake).
  • Experience with Airflow or other workflow orchestration tools.
  • Prior exposure to observability stacks (Prometheus, OpenTelemetry, Splunk) for monitoring distributed jobs.
  • Proven track record of leading design discussions, mentoring engineers, and driving cross-team technical initiatives.