About The Position

Join the Global Cloud Services organization's FinOps Tools team, which is building ServiceNow's next-generation analytics and financial governance platform. Our team owns the full modern data stack: Trino for distributed queries, dbt for transformations, Iceberg for the lakehouse architecture, Lightdash for business intelligence, and Argo Workflows for orchestration.

As our founding Staff Software Developer focused on Cloud Development Infrastructure, you will design, architect, and rapidly implement a cloud-native data development platform, built on VS Code, Coder, and Jupyter, that bridges the gap between exploratory analysis and production-grade workflows and integrates seamlessly with our existing data stack. You will establish opinionated, automated pathways from notebook experimentation to production pipelines, empowering our 30+ data practitioners (data scientists, analysts, and FinOps engineers) to collaborate and productionize analytics at scale.

This role demands aggressive execution at startup speed within an enterprise environment: a working prototype in 3 months and a production-ready platform in 6 months. It is a unique opportunity to build from the ground up and define how data development happens at ServiceNow's scale.

Requirements

  • Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry.
  • 12+ years of software engineering experience, with a track record of delivering high-quality products and deep expertise in full-stack development and cloud-native architecture, plus a Bachelor's degree; or 8 years of experience and a Master's degree; or 5 years of experience and a PhD in Computer Science, Engineering, or a related technical field; or equivalent experience
  • Strong Python skills for backend services, API development, and data tooling (notebook parsing, workflow generation)
  • Proven track record of rapid execution in greenfield environments with evolving requirements
  • Hands-on experience building and scaling developer platforms or internal tools at enterprise scale
  • Deep understanding of cloud development environments (Coder, GitHub Codespaces, Gitpod, or similar)
  • Strong Kubernetes and containerization expertise for cloud-native application deployment
  • Experience with data workflows and tooling: Jupyter, notebooks, orchestration systems (Airflow/Argo), data catalogs
  • Full professional proficiency in English
  • Proficiency in Python, Java, or similar object-oriented languages
  • Experience with modern front-end frameworks such as Angular, React, or Vue
  • Strong knowledge of data structures, algorithms, object-oriented design, design patterns, and performance optimization
  • Familiarity with automated testing frameworks (e.g., JUnit, Selenium, TestNG) and integrating tests into CI/CD pipelines
  • Understanding of software quality principles, including reliability, observability, and production readiness
  • Ability to troubleshoot complex systems and optimize performance across the stack
  • Experience with AI-powered tools or workflows, including validation of datasets, model predictions, and inference consistency
  • Comfort with development tools such as IDEs, debuggers, profilers, source control, and Unix-based systems
  • VS Code ecosystem: Extension API, webview development, command palette, language servers, debugger protocols
  • Coder or similar platforms: Workspace provisioning, remote development environments, infrastructure customization
  • Jupyter ecosystem: JupyterHub, Jupyter Server, Papermill, nbconvert, or similar notebook tooling
  • Kubernetes & containerization: Pod management, custom resource definitions, Helm charts, image security
  • Infrastructure as Code: Terraform, Kubernetes operators, GitOps workflows
  • Git workflows: Branching strategies, code review automation, CI/CD integration
  • Modern data stack: Familiarity with Trino, dbt, Iceberg, Argo Workflows, or similar technologies
  • API design: RESTful services, authentication (OAuth/SAML), webhook integrations
  • Proven track record building internal developer platforms or productivity tools from scratch
  • Experience designing opinionated workflows that balance flexibility with guardrails
  • Strong understanding of developer personas: data scientists, analysts, engineers
  • Ability to iterate rapidly with early adopters and incorporate feedback without over-engineering
  • Experience with workspace security: secrets management, network policies, image scanning
  • Comfort operating at startup velocity within enterprise constraints
  • Proven ability to work autonomously and drive technical decisions in ambiguous, greenfield environments
  • Strong bias toward action: prototype quickly, gather feedback, iterate aggressively
  • Strong technical writing and documentation skills for developer-facing content
  • Excellent collaboration skills across engineering, DevOps, and data teams
  • Ability to establish technical foundations for new products with long-term vision while delivering short-term results

Nice To Haves

  • Open-source contributions to the Jupyter ecosystem or developer tooling
  • Experience with Argo Workflows, Tekton, or Kubernetes-native CI/CD systems
  • Familiarity with data validation frameworks (Great Expectations, dbt tests, etc.)
  • Experience with Apache Iceberg or lakehouse architectures
  • Conference speaking or technical blogging on developer platforms or data tooling

Responsibilities

  • Design and develop scalable, maintainable, and reusable software components with a strong emphasis on performance and reliability
  • Collaborate with product managers to translate requirements into well-architected solutions, owning features from design through delivery
  • Build intuitive and extensible user experiences using modern UI frameworks, ensuring flexibility for customer-specific needs
  • Contribute to the design and implementation of new products and features while enhancing existing product capabilities
  • Integrate automated testing into development workflows to ensure consistent quality across releases
  • Participate in design and code reviews, ensuring best practices in performance, maintainability, and testability
  • Develop comprehensive test strategies covering functional, regression, integration, and performance aspects
  • Foster a culture of continuous learning, engineering craftsmanship, and knowledge-sharing by promoting thoughtful quality practices across the team
  • Design and architect the foundational cloud development platform for notebook-based data workflows
  • Lead technical decision-making on workspace provisioning, developer experience, and productionization pathways
  • Establish best practices for notebook-to-production workflows, including git integration, parameterization, validation, and automated deployment
  • Drive innovation in data development platforms, leveraging AI/ML tools for enhanced developer productivity
  • Move fast: deliver a working MVP in 3 months and a production-ready system at scale in 6 months
  • Build and customize cloud workspace infrastructure using Coder (open source) on Kubernetes
  • Develop VS Code extensions (TypeScript) for productionization workflows: notebook validation, parameterization, and Argo Workflow generation
  • Implement opinionated notebook templates and validation rules for production-ready data pipelines
  • Create seamless integrations between notebooks and ServiceNow's data stack: Trino queries, Iceberg table outputs, Lightdash previews, dbt transformations
  • Build backend services (Python) for workflow orchestration, notebook parsing, and metadata management
  • Deploy JupyterHub initially, then progressively replace components with custom platform features based on user feedback
  • Design container images with embedded security policies, pre-configured data access to Trino/Iceberg tables, and optimized dependencies
  • Implement git-native workflows with automated notebook versioning, code review integration, and CI/CD pipelines
  • Build observability and monitoring for workspace health, user activity, and pipeline success rates
  • Establish infrastructure foundation that scales from 5 early adopters to 30+ practitioners within first year
  • Create "template-based" notebook workflows with opinionated structure: parameterization (Papermill-style), Iceberg table outputs, validation checkpoints
  • Build CLI and UI tooling for one-click productionization: notebook → Argo Workflow with minimal manual intervention
  • Establish developer guardrails: credential management, data access policies, resource quotas
  • Collaborate closely with early adopter data scientists to rapidly iterate on workflows and validate usability
  • Prioritize platform stability and clear productionization paths over feature breadth in first 6 months
  • Leverage cutting-edge AI development tools (e.g., Cursor, Windsurf, ChatGPT, GitHub Copilot) to accelerate development velocity
  • Establish AI-augmented development practices and mentor future team members on effective AI tool utilization
  • Drive innovation in AI-assisted code generation, testing, and platform optimization
  • Work autonomously with guidance from Engineering and FinOps leadership
  • Collaborate with DevOps team on Kubernetes infrastructure, CI/CD pipelines, and security policies
  • Partner with FinOps Tools team members working on Trino, dbt, Lightdash, and Iceberg to ensure seamless integrations
  • Contribute to open-source projects in the notebook and developer tooling ecosystem

Benefits

  • Equity (when applicable)
  • Variable/incentive compensation
  • Health plans, including flexible spending accounts
  • A 401(k) plan with company match
  • ESPP
  • Matching donations
  • A flexible time away plan
  • Family leave programs