Solution Data Engineer Sr.

Sequoia Connect
Remote

About The Position

At Sequoia Connect, we are a Talent-First Technology Ecosystem that redefines how elite professionals interact with the global digital landscape. We move beyond traditional models to act as a catalyst for the top 1% of global talent, connecting human potential with complex industrial execution. By joining our inner circle, you are not simply taking a position; you are aligning with a strategic partner dedicated to updating your "Human OS" and accelerating your growth through world-class, high-impact projects.

We are currently partnering with a global IT powerhouse that represents the connected world through innovative, customer-centric experiences. As a USD 6 billion organization and one of the top 7 IT service providers globally, our client empowers over 1,200 global customers—including several Fortune 500 companies—to "Rise™." With a network of 163,000+ professionals across 90 countries, they are at the forefront of digital transformation, leveraging next-generation technologies such as 5G, AI, Blockchain, and Quantum Computing.

This is your chance to thrive in a workplace recognized as one of the most sustainable corporations in the world. You will join an environment that values innovation and societal impact, working on end-to-end digital transformation projects for global leaders. If you are a driven professional looking for global career opportunities and exposure to high-impact projects within an international network of expertise, this is where you belong.

Requirements

  • Advanced proficiency in the Databricks Ecosystem (Workspace admin, Delta Lake, Unity Catalog).
  • Expertise in Data Warehousing & Integration, specifically with Snowflake.
  • Strong development skills in Python, SQL, and PySpark.
  • Hands-on experience with Adobe Data Ingestion (Data Feeds/Analytics).
  • Solid experience in version control using Git.
  • Advanced oral English.
  • Advanced Spanish.

Nice To Haves

  • Experience with Amazon Redshift for data warehousing operations.
  • Proficiency in orchestration tools, specifically Airflow.
  • Familiarity with Google Cloud Platform (BigQuery) operations.
  • Knowledge of AWS Data Services such as Athena and AWS Glue.

Responsibilities

  • Fully manage the Databricks environment, including configuration and workspace administration.
  • Implement data governance using Unity Catalog and manage structured storage with Delta Lake tables.
  • Write and optimize data processing logic using Python, SQL, and PySpark.
  • Manage data architecture and performance within Snowflake.
  • Handle Adobe Data Feeds and Adobe Analytics, ensuring correct ingestion and transformation of marketing data.
  • Oversee data warehousing operations in Amazon Redshift.
  • Schedule and monitor complex data workflows using Airflow.
  • Own end-to-end maintenance of data pipelines, ensuring reliability from ingestion to delivery across multiple cloud platforms.
  • Manage code lifecycle and collaborative development using Git.
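To give candidates a flavor of the day-to-day pipeline work described above (ingest raw marketing events, transform them with SQL, deliver a curated table), here is a minimal, self-contained sketch. It uses Python's standard-library sqlite3 purely as a stand-in for the Databricks/Snowflake stack; the table names, columns, and aggregation are hypothetical, not part of the actual role's codebase.

```python
import sqlite3


def run_pipeline(rows):
    """Toy ingest -> transform -> deliver step.

    rows: iterable of (event_date, page, event_type) tuples, standing in
    for an Adobe Data Feed extract. Returns daily page-view counts per
    page, standing in for a curated warehouse table.
    """
    conn = sqlite3.connect(":memory:")
    # Ingest: land the raw hit-level events.
    conn.execute(
        "CREATE TABLE raw_events (event_date TEXT, page TEXT, event_type TEXT)"
    )
    conn.executemany("INSERT INTO raw_events VALUES (?, ?, ?)", rows)
    # Transform: keep only pageviews, aggregate per day and page.
    cur = conn.execute(
        """
        SELECT event_date, page, COUNT(*) AS views
        FROM raw_events
        WHERE event_type = 'pageview'
        GROUP BY event_date, page
        ORDER BY event_date, page
        """
    )
    return cur.fetchall()


sample = [
    ("2024-01-01", "/home", "pageview"),
    ("2024-01-01", "/home", "pageview"),
    ("2024-01-01", "/home", "click"),
    ("2024-01-02", "/pricing", "pageview"),
]
print(run_pipeline(sample))
# [('2024-01-01', '/home', 2), ('2024-01-02', '/pricing', 1)]
```

In production the same shape of logic would typically run as PySpark against Delta Lake tables, be governed via Unity Catalog, and be scheduled and monitored with Airflow.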