About The Position

Horizon3.ai is a fast-growing, remote cybersecurity company dedicated to the mission of enabling organizations to proactively find, fix, and verify exploitable attack vectors before criminals exploit them. Our flagship product, the NodeZero™ platform, delivers production-safe autonomous pentests and other key assessment operations that scale across the largest internal, external, cloud, and hybrid cloud environments. NodeZero has been adopted by organizations of all sizes, from small educational institutions to government agencies and Global 100 enterprises, and is used by IT Ops/SecOps teams, consulting pentesters, MSSPs, and MSPs.

We are a fusion of former U.S. Special Operations cyber operators, startup engineers and operators, and formerly frustrated cybersecurity practitioners. We're committed to solving the security problems we all share: ineffective security tools and false positives that lead to alert fatigue, blind spots, a "checkbox" security culture, the cybersecurity skills shortage, and the long lead time and expense of hiring outside consultants. Collectively, we are a team of learn-it-alls, committed to a culture of respect, collaboration, ownership, and results.

As a remote-first company, we require a minimum 25 Mbps consumer-grade broadband connection.

Requirements

  • Demonstrated expertise leading teams that design and operate cloud data warehouses in production (e.g., Redshift, Snowflake, Databricks, BigQuery).
  • Hands-on experience with the modern data stack: dbt for transformation, a pipeline orchestrator (Airflow, Dagster, or similar), and managed ingestion tooling (Fivetran, Airbyte, etc.).
  • Experience implementing data quality frameworks and observability: defining pipeline SLAs, detecting and alerting on anomalies, and establishing tiered data sets with quality guarantees (e.g., medallion architecture).
  • Able to partner with and influence peer engineering teams: drive alignment on shared standards such as pipeline patterns, data contracts, and quality guarantees across teams that own their own data sources.
  • Experience building or significantly growing a small data engineering team — including hiring, onboarding, and establishing engineering norms.
  • Proven ability to define and instill engineering culture: code review standards, definition of done, incident response, documentation practices, and culture of ownership.
  • Drives high-impact architecture decisions rigorously — requires design documents, runs structured reviews, builds consensus, and ensures decisions are well-reasoned before commitment.
  • Demonstrated experience acting as product owner for a platform or infrastructure team: maintaining a roadmap, triaging inbound requests, managing internal customer expectations, and making prioritization tradeoff decisions against capacity.
  • Proficient in SQL. Able to read, write, and review data transforms and data quality checks in SQL.
  • Proficient in Python. Ability to review pipeline code and guide engineering decisions.
  • Competent in data analysis. Able to investigate anomalies, validate data quality issues, and find insight in data.

Nice To Haves

  • Familiarity with AWS data services (Redshift, Athena, S3/Glue)
  • Experience with Databricks (Delta Lake, Unity Catalog, Spark)
  • Familiarity with Argo Workflows or Kubernetes-native job orchestration
  • Experience with high volume streaming data (Kafka, PubSub)
  • Experience supporting data science or ML workflows.
  • Exposure to cybersecurity, network telemetry, APM, or other high-volume operational SaaS data.

Responsibilities

  • Lead the team that provides the internal data platform powering analytics and operational decision-making across Horizon3, making high-quality, trustworthy data available to the business.
  • Lead execution of the modernization of Horizon3’s data architecture.
  • Define data quality and timeliness standards. Drive data quality, observability, and pipeline robustness efforts so that data consumers have reliable, performant access to data.
  • Act as product owner: capture the needs of product teams, BI teams, and other internal customers, and manage the roadmap of data engineering initiatives.
  • Grow a team of data engineers, infrastructure engineers, and data analysts. Establish a culture of collaboration and engineering excellence.

Benefits

  • Health, vision, and dental insurance for you and your family
  • A flexible vacation policy
  • Generous parental leave
  • An equity package in the form of stock options