ServiceNow · Posted 2 months ago
$187,600 - $328,300/Yr
Full-time • Senior
Pleasanton, CA
Professional, Scientific, and Technical Services

Join the Global Cloud Services organization's FinOps Tools team, which is building ServiceNow's next-generation analytics and financial governance platform. The team owns the full modern data stack: Trino for distributed queries, dbt for transformations, Iceberg for the lakehouse architecture, Lightdash for business intelligence, and Argo Workflows for orchestration.

As our founding Staff Software Developer for Cloud Development Infrastructure, you will design, architect, and rapidly implement a cloud-native data development platform built on VS Code, Coder, and Jupyter that integrates seamlessly with that stack and bridges the gap between exploratory analysis and production-grade workflows. The platform will serve our 30+ data practitioners (data scientists, analysts, and FinOps engineers) who need to collaborate and productionize analytics at scale. You will establish opinionated, automated pathways from notebook experimentation to production pipelines, moving at startup speed within an enterprise environment. The role demands aggressive execution: a working prototype in 3 months and a production-ready platform in 6 months. This is a unique opportunity to build from the ground up and define how data development happens at ServiceNow's scale.

  • Design and architect the foundational cloud development platform for notebook-based data workflows
  • Lead technical decision-making on workspace provisioning, developer experience, and productionization pathways
  • Establish best practices for notebook-to-production workflows, including git integration, parameterization, validation, and automated deployment
  • Drive innovation in data development platforms, leveraging AI/ML tools for enhanced developer productivity
  • Build and customize cloud workspace infrastructure using Coder (open source) on Kubernetes
  • Develop VS Code extensions (TypeScript) for productionization workflows: notebook validation, parameterization, and Argo Workflow generation (see the workflow-generation sketch after this list)
  • Implement opinionated notebook templates and validation rules for production-ready data pipelines
  • Create seamless integrations between notebooks and ServiceNow's data stack: Trino queries, Iceberg table outputs, Lightdash previews, and dbt transformations (a Trino connection sketch follows this list)
  • Build backend services (Python) for workflow orchestration, notebook parsing, and metadata management (a notebook-parsing sketch follows this list)
  • Deploy JupyterHub initially, then progressively replace components with custom platform features based on user feedback
  • Design container images with embedded security policies, pre-configured data access to Trino/Iceberg tables, and optimized dependencies
  • Implement git-native workflows with automated notebook versioning, code review integration, and CI/CD pipelines
  • Build observability and monitoring for workspace health, user activity, and pipeline success rates (a metrics sketch follows this list)
  • Create template-based notebook workflows with an opinionated structure: Papermill-style parameterization, Iceberg table outputs, and validation checkpoints (a parameterization sketch follows this list)
  • Build CLI and UI tooling for one-click productionization, turning a notebook into an Argo Workflow with minimal manual intervention (see the workflow-generation sketch after this list)
  • Establish developer guardrails: credential management, data access policies, resource quotas
  • Collaborate closely with early adopter data scientists to rapidly iterate on workflows and validate usability
  • Leverage cutting-edge AI development tools (e.g., Cursor, Windsurf, ChatGPT, GitHub Copilot) to accelerate development velocity
  • Establish AI-augmented development practices and mentor future team members on effective AI tool utilization
  • Collaborate with DevOps team on Kubernetes infrastructure, CI/CD pipelines, and security policies
  • Partner with FinOps Tools team members working on Trino, dbt, Lightdash, and Iceberg to ensure seamless integrations
  • Contribute to open-source projects in the notebook and developer tooling ecosystem
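
The sketches below illustrate the kinds of glue this role owns; every host, table, image, and metric name in them is an illustrative assumption, not an established team convention. First, a minimal pre-configured Trino connection of the sort a workspace image might ship with, using the open-source trino Python client:

    import trino

    # Hypothetical workspace defaults; host, catalog, and schema are placeholders.
    conn = trino.dbapi.connect(
        host="trino.internal.example.com",
        port=443,
        http_scheme="https",
        user="workspace-user",
        catalog="iceberg",
        schema="finops",
    )
    cur = conn.cursor()
    cur.execute(
        "SELECT service, SUM(cost_usd) AS cost "
        "FROM daily_costs GROUP BY service ORDER BY cost DESC LIMIT 10"
    )
    for service, cost in cur.fetchall():
        print(service, cost)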
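
Next, a sketch of the notebook-parsing backend: extracting a Papermill-style "parameters" cell so the platform can surface injectable parameters at deploy time. This assumes the nbformat library; the function and notebook names are illustrative:

    import nbformat

    def extract_parameters(notebook_path: str) -> dict:
        """Return {name: default} pairs from the cell tagged 'parameters'."""
        nb = nbformat.read(notebook_path, as_version=4)
        params = {}
        for cell in nb.cells:
            tags = cell.get("metadata", {}).get("tags", [])
            if cell.cell_type != "code" or "parameters" not in tags:
                continue
            # Papermill parameters cells hold simple `name = value` assignments.
            for line in cell.source.splitlines():
                if "=" in line and not line.lstrip().startswith("#"):
                    name, _, default = line.partition("=")
                    params[name.strip()] = default.strip()
        return params

    print(extract_parameters("daily_cost_report.ipynb"))  # hypothetical notebook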
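
For the observability bullet, a sketch of exposing workspace-health metrics with the prometheus_client library (metric names and values are assumptions):

    import time
    from prometheus_client import Counter, Gauge, start_http_server

    ACTIVE_WORKSPACES = Gauge(
        "workspaces_active", "Currently running Coder workspaces")
    PIPELINE_RUNS = Counter(
        "pipeline_runs_total", "Notebook pipeline runs", ["status"])

    if __name__ == "__main__":
        start_http_server(9100)  # Prometheus scrape endpoint
        ACTIVE_WORKSPACES.set(12)
        PIPELINE_RUNS.labels(status="success").inc()
        while True:
            time.sleep(60)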
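
The template-based workflow bullet amounts to running an opinionated notebook with injected parameters, Papermill-style. A minimal sketch, with placeholder paths and parameter names:

    import papermill as pm

    pm.execute_notebook(
        "templates/daily_cost_report.ipynb",        # opinionated template
        "runs/daily_cost_report_2024-06-01.ipynb",  # executed artifact
        parameters={
            "run_date": "2024-06-01",
            "output_table": "finops.daily_costs",   # Iceberg table output
        },
    )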
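
Finally, the one-click productionization step reduces to wrapping a notebook in an Argo Workflow manifest. A sketch of the generation logic, where the image name and container command are assumptions about the runtime environment:

    import yaml

    def notebook_to_argo_workflow(notebook_path: str, image: str) -> str:
        """Render a minimal Argo Workflow that executes one notebook."""
        workflow = {
            "apiVersion": "argoproj.io/v1alpha1",
            "kind": "Workflow",
            "metadata": {"generateName": "notebook-run-"},
            "spec": {
                "entrypoint": "run-notebook",
                "templates": [{
                    "name": "run-notebook",
                    "container": {
                        "image": image,
                        # Execute the notebook headlessly via Papermill.
                        "command": ["papermill", notebook_path, "/tmp/out.ipynb"],
                    },
                }],
            },
        }
        return yaml.safe_dump(workflow, sort_keys=False)

    print(notebook_to_argo_workflow(
        "daily_cost_report.ipynb", "finops/notebook-runner:latest"))
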
  • Experience leveraging AI in work processes, decision-making, or problem-solving, or thinking critically about how to integrate it
  • 12+ years of software engineering experience, with deep expertise in full-stack development and cloud-native architecture, and a Bachelor's degree in Computer Science, Engineering, or a related technical field; or 8 years of experience and a Master's degree; or 5 years of experience and a PhD; or equivalent experience
  • Strong Python skills for backend services, API development, and data tooling (notebook parsing, workflow generation)
  • Proven track record of rapid execution in greenfield environments with evolving requirements
  • Hands-on experience building and scaling developer platforms or internal tools at enterprise scale
  • Deep understanding of cloud development environments (Coder, GitHub Codespaces, Gitpod, or similar)
  • Strong Kubernetes and containerization expertise for cloud-native application deployment
  • Experience with data workflows and tooling: Jupyter, notebooks, orchestration systems (Airflow/Argo), data catalogs
  • Full professional proficiency in English
  • Open-source contributions, particularly in the Jupyter ecosystem or developer tooling
  • Experience with Argo Workflows, Tekton, or Kubernetes-native CI/CD systems
  • Familiarity with data validation frameworks (Great Expectations, dbt tests, etc.)
  • Experience with Apache Iceberg or lakehouse architectures
  • Conference speaking or technical blogging on developer platforms or data tooling
  • Base pay of $187,600 - $328,300, plus equity (when applicable), variable/incentive compensation, and benefits
  • Health plans, including flexible spending accounts
  • 401(k) Plan with company match
  • Employee stock purchase plan (ESPP) and matching donations
  • Flexible time away plan and family leave programs