Salesforce · Posted 6 days ago
Full-time • Mid Level
San Francisco, CA
5,001-10,000 employees

Salesforce is the #1 AI CRM, where humans with agents drive customer success together. Here, ambition meets action. Tech meets trust. And innovation isn't a buzzword; it's a way of life. The world of work as we know it is changing, and we're looking for Trailblazers who are passionate about bettering business and the world through AI, driving innovation, and keeping Salesforce's core values at the heart of it all. Ready to level up your career at the company leading workforce transformation in the agentic era? You're in the right place! Agentforce is the future of AI, and you are the future of Salesforce.

We are looking for exceptional Senior Engineers to build the engine that powers Salesforce's enterprise intelligence. In this role, you will be a hands-on technical contributor responsible for modernizing our core data ecosystem. You will move beyond simple ETL scripts to build a robust, software-defined Data Mesh using Snowflake, dbt, Airflow, and Informatica. You will bridge the gap between "Data Engineering" and "Software Engineering": treating data pipelines as production code, automating infrastructure with Terraform, and optimizing high-scale distributed systems to enable AI and analytics across the enterprise.

What you'll do:
  • Design and implement scalable data pipelines and transformation logic using Snowflake (SQL) and dbt.
  • Replace legacy hardcoded scripts with modular, testable, and reusable data components.
  • Engineer robust workflows in Airflow: write custom Python operators and ensure DAGs are dynamic, factory-generated, and resilient to failure (see the DAG-factory sketch after this list).
  • Deep dive into query profiles, optimize pruning/clustering in Snowflake, and reduce credit consumption while improving data freshness (see the clustering sketch after this list).
  • Manage the underlying platform infrastructure (warehouses, roles, storage integration) using Terraform or Helm.
  • Ensure every PR has unit tests, schema validation, and automated deployment pipelines (see the test sketch after this list).
  • Build monitoring and alerting (Monte Carlo, Grafana, New Relic, Splunk) to detect data anomalies before stakeholders do (see the freshness-check sketch after this list).
  • Work with domain teams (Sales, Marketing, Finance) to onboard them to the platform, helping them decentralize their data ownership while adhering to platform standards.
  • Prepare structured data for AI consumption, ensuring high-quality, governed datasets are available for LLM agents and advanced analytics models.
  • Own concrete initiatives end to end, such as migrating a domain to dbt or optimizing a slow pipeline.
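
To make the "dynamic, factory-generated DAGs" bullet concrete, here is a minimal sketch of that pattern, assuming Airflow 2.x. The domain names, schedules, and the DbtRunOperator are illustrative placeholders, not Salesforce internals.

    # Minimal sketch: a custom operator plus a DAG factory that registers
    # one standardized DAG per data domain.
    import subprocess
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.models.baseoperator import BaseOperator


    class DbtRunOperator(BaseOperator):
        """Hypothetical operator that runs one domain's dbt models."""

        def __init__(self, domain: str, **kwargs):
            super().__init__(**kwargs)
            self.domain = domain

        def execute(self, context):
            # `--select` limits the run to this domain's models.
            subprocess.run(["dbt", "run", "--select", self.domain], check=True)


    def build_domain_dag(domain: str, schedule: str) -> DAG:
        """Factory: identical structure, retries, and naming for every domain."""
        dag = DAG(
            dag_id=f"{domain}_daily_load",
            start_date=datetime(2024, 1, 1),
            schedule=schedule,  # Airflow 2.4+; use schedule_interval on older 2.x
            catchup=False,
            default_args={"retries": 3, "retry_delay": timedelta(minutes=5)},
        )
        with dag:
            DbtRunOperator(task_id=f"dbt_run_{domain}", domain=domain)
        return dag


    # Registering the DAG objects in globals() is how Airflow discovers
    # dynamically generated DAGs in a module.
    for domain, schedule in {"sales": "@daily", "marketing": "@hourly"}.items():
        globals()[f"{domain}_daily_load"] = build_domain_dag(domain, schedule)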
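
For the pruning/clustering bullet, one way to quantify pruning health is Snowflake's built-in SYSTEM$CLUSTERING_INFORMATION function. The sketch below uses the snowflake-connector-python package; the account, table, and clustering key are placeholder assumptions.

    import json

    import snowflake.connector

    # Placeholder credentials; a real job would pull these from a secrets store.
    conn = snowflake.connector.connect(
        account="my_account",
        user="my_user",
        authenticator="externalbrowser",
    )
    cur = conn.cursor()

    # High average depth/overlaps mean poor partition pruning, which shows up
    # as long table scans in query profiles and wasted warehouse credits.
    cur.execute(
        "SELECT SYSTEM$CLUSTERING_INFORMATION('analytics.fct_orders', '(order_date)')"
    )
    stats = json.loads(cur.fetchone()[0])
    print("average depth:", stats["average_depth"])
    print("average overlaps:", stats["average_overlaps"])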
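
The "unit tests and schema validation on every PR" bullet can be illustrated with a small pytest case. build_revenue_summary and its schema contract are hypothetical stand-ins for a real transformation step.

    import pandas as pd

    # The schema contract this transformation must satisfy (assumed).
    EXPECTED_SCHEMA = {
        "account_id": "object",
        "revenue_usd": "float64",
        "as_of": "datetime64[ns]",
    }


    def build_revenue_summary(raw: pd.DataFrame) -> pd.DataFrame:
        """Toy transformation standing in for a real pipeline step."""
        out = raw.groupby("account_id", as_index=False)["revenue_usd"].sum()
        out["as_of"] = pd.Timestamp("2024-01-01")
        return out


    def test_revenue_summary_schema():
        raw = pd.DataFrame(
            {"account_id": ["a1", "a1", "b2"], "revenue_usd": [10.0, 5.0, 7.5]}
        )
        result = build_revenue_summary(raw)

        # Schema validation: column names and dtypes must match the contract.
        assert {c: str(t) for c, t in result.dtypes.items()} == EXPECTED_SCHEMA
        # Unit test: the aggregation logic itself is correct.
        assert result.loc[result.account_id == "a1", "revenue_usd"].item() == 15.0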
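
And for the monitoring bullet, a minimal freshness check of the kind that might feed a Grafana or Splunk alert. The table, column, and two-hour SLA are assumptions, and loaded_at is assumed to be a timezone-aware TIMESTAMP_TZ column.

    from datetime import datetime, timedelta, timezone

    import snowflake.connector

    FRESHNESS_SLA = timedelta(hours=2)  # assumed SLA for this dataset

    conn = snowflake.connector.connect(
        account="my_account", user="my_user", authenticator="externalbrowser"
    )
    cur = conn.cursor()
    cur.execute("SELECT MAX(loaded_at) FROM analytics.fct_orders")
    last_loaded = cur.fetchone()[0]  # TIMESTAMP_TZ -> timezone-aware datetime

    lag = datetime.now(timezone.utc) - last_loaded
    if lag > FRESHNESS_SLA:
        # In production this would page through the alerting stack, not print.
        print(f"ALERT: analytics.fct_orders stale; behind SLA by {lag - FRESHNESS_SLA}")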

What we're looking for:
  • Strong background in software engineering (Python/Java/Go) applied to data. You are comfortable writing custom API integrations and complex Python scripts.
  • Deep production experience with Snowflake (architecture/tuning) and dbt (Jinja/Macros/Modeling).
  • Advanced proficiency with Airflow (Managed Workflows for Apache Airflow).
  • Hands-on experience with AWS services (S3, Lambda, IAM, ECS) and containerization (Docker/Kubernetes).
  • Experience with Git, CI/CD (GitHub Actions/Jenkins), and Terraform.
  • 5+ years of relevant data or software engineering experience.
  • Familiarity with graph databases (Neo4j) or semantic standards (RDF/SPARQL, TopQuadrant) is a strong plus as we integrate these technologies into the platform.
  • Experience with Apache Iceberg or Delta Lake.
  • Experience with Kafka or Snowpipe Streaming.
  • Experience using AI coding assistants (Copilot, Cursor) to accelerate development.

Salesforce offers a comprehensive benefits package, including:
  • time off programs
  • medical
  • dental
  • vision
  • mental health support
  • paid parental leave
  • life and disability insurance
  • 401(k)
  • an employee stock purchasing program