About The Position

Worldly is hiring a hands-on data engineer with a passion for sustainability to join our team. You will take on a primary role in building, operating, maintaining, and evolving the systems that support our internal analytics and power our customer-facing analytics platforms.

In this role, you will:

  • Collaborate with stakeholders across the organization to design and implement scalable, cloud-based data solutions, integrating generative AI to drive innovation.
  • Work closely with cross-functional stakeholders (finance, product, marketing, customer support, tech, data science) to enable trusted data products for internal decision making and external-facing tools.
  • Lead the development of a data lake to complement our existing data warehouse, enabling greater flexibility in analytics and reporting.
  • Work with AWS services, automation tools, machine learning, and generative AI to improve efficiency, stability, security, and performance.

This role is expected to drive outcomes in day-to-day execution and operational stability, while partnering with senior engineering leadership on longer-range architecture direction.

Requirements

  • 5+ years of professional experience in data engineering, analytics engineering, or data platform engineering.
  • Advanced SQL expertise and strong experience with relational databases, especially Postgres.
  • Strong Python development skills applied to data pipelines, automation, and operational tooling.
  • Strong Git-based development practices (branching, PRs, code review).
  • Demonstrated experience developing and supporting DBT transformations and operational workflows.
  • Hands-on experience building AWS ingestion/ETL workflows using services such as S3, IAM, Glue, Lambda, CloudFormation (or other IaC), and AppFlow.
  • Practical DevOps experience: CI/CD pipelines, Git/GitHub workflows, and containerization fundamentals (Docker).
  • Experience with analytics data modeling and metric definition practices.
  • Experience implementing automated monitoring/alerting and data quality controls for pipelines and critical datasets.
  • Experience operating production data systems (including data quality tests, regression checks, validation frameworks, incident triage, root-cause analysis, runbooks, reliability improvements).
  • Experience working closely with analytics teams and cross-functional stakeholders; familiarity with Jira/Confluence and Agile delivery.
  • Familiarity with data security practices (PII protection, encryption controls, access management).

Nice To Haves

  • Experience with semantic layers (Cube.dev preferred).
  • Familiarity with genAI/NLP enablement.
  • Exposure to graph databases (knowledge graphs) and related concepts (Neo4j preferred).

Responsibilities

SQL + Postgres Data Warehouse

  • Operate and evolve our Postgres data warehouse: schema design, performance tuning, indexing, and access controls.
  • Build analytics-ready datasets supporting sustainability measurement, supply-chain insights, and business metrics (a sketch of this pattern follows below).
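
To give a concrete flavor of this work, here is a minimal Python sketch of one such dataset build: a periodic rollup refresh in Postgres. The table and column names (facility_scores, supplier_metrics, score, assessed_at) are hypothetical stand-ins, not actual Worldly schema.

    # Minimal sketch: rebuild a monthly rollup table for analytics consumers.
    # facility_scores and supplier_metrics are hypothetical names.
    import psycopg2

    ROLLUP_SQL = """
        SELECT supplier_id,
               date_trunc('month', assessed_at) AS month,
               avg(score)  AS avg_score,
               count(*)    AS assessments
        FROM facility_scores
        GROUP BY 1, 2
    """

    def refresh(conn):
        with conn.cursor() as cur:
            # CREATE TABLE ... AS ... WITH NO DATA defines the shape once;
            # IF NOT EXISTS keeps repeated refreshes idempotent.
            cur.execute(
                f"CREATE TABLE IF NOT EXISTS supplier_metrics AS {ROLLUP_SQL} WITH NO DATA"
            )
            cur.execute("TRUNCATE supplier_metrics")
            cur.execute(f"INSERT INTO supplier_metrics {ROLLUP_SQL}")
            # Index the common lookup path for dashboard queries.
            cur.execute("""
                CREATE INDEX IF NOT EXISTS supplier_metrics_supplier_month_idx
                ON supplier_metrics (supplier_id, month)
            """)
        conn.commit()

    if __name__ == "__main__":
        refresh(psycopg2.connect("dbname=warehouse"))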

Semantic Layer Deployment

  • Deploy and maintain multiple Cube.dev semantic layer instances with standardized configuration, CI/CD workflows, and governance practices (including documentation of processes, configurations, and troubleshooting).
  • Establish clear, consistent metric definitions and versioning across dashboards and analytics surfaces (see the sketch below).
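
As an illustration of "one metric definition, many surfaces", the sketch below reads a governed measure through Cube's REST API (the /cubejs-api/v1/load endpoint and query shape follow Cube's documented API). The deployment URL and the cube/measure names (Assessments.avgScore) are hypothetical.

    # Minimal sketch: consume a governed metric via Cube's REST API so every
    # dashboard and tool shares the same definition.
    import json
    import requests

    CUBE_URL = "https://cube.internal.example.com/cubejs-api/v1/load"
    TOKEN = "..."  # signed JWT issued for this service

    query = {
        "measures": ["Assessments.avgScore"],
        "dimensions": ["Assessments.region"],
        "timeDimensions": [
            {"dimension": "Assessments.assessedAt", "granularity": "month"}
        ],
    }

    resp = requests.get(
        CUBE_URL,
        params={"query": json.dumps(query)},
        headers={"Authorization": TOKEN},
        timeout=30,
    )
    resp.raise_for_status()
    for row in resp.json()["data"]:
        print(row)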

GenAI/NLP Enablement

  • Support integration and deployment of genAI-enabled workflows, especially NLP-based use cases: classification, extraction, normalization, and embeddings/similarity (a normalization sketch follows below).
  • Ensure that our data infrastructure is “AI-ready”.
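
A typical normalization use case might look like the sketch below: matching free-text values to a canonical vocabulary by embedding similarity. The embed() function is a placeholder for whatever embedding model the platform adopts, and the vocabulary is purely illustrative.

    # Minimal sketch: normalize free-text values against a canonical
    # vocabulary using cosine similarity over embeddings.
    import numpy as np

    def embed(texts: list[str]) -> np.ndarray:
        raise NotImplementedError("call your embedding model/API here")

    CANONICAL = ["organic cotton", "recycled polyester", "merino wool"]

    def normalize(raw_values: list[str], threshold: float = 0.8) -> list[str | None]:
        canon_vecs = embed(CANONICAL)
        raw_vecs = embed(raw_values)
        # Cosine similarity = dot product of L2-normalized vectors.
        canon_vecs = canon_vecs / np.linalg.norm(canon_vecs, axis=1, keepdims=True)
        raw_vecs = raw_vecs / np.linalg.norm(raw_vecs, axis=1, keepdims=True)
        sims = raw_vecs @ canon_vecs.T
        out = []
        for row in sims:
            best = int(row.argmax())
            # Leave values unmatched rather than forcing a bad match.
            out.append(CANONICAL[best] if row[best] >= threshold else None)
        return out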

Graph Data Enablement

  • In collaboration with data scientists, research and develop practical transition plans for evolving selected relational/warehouse data structures into a graph-based knowledge base built on Neo4j, including candidate use cases, data modeling approach, migration sequencing, and operational considerations (performance, governance, lineage, and security). A sketch of the loading pattern follows below.
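
One plausible loading pattern for such a transition is sketched below using the official neo4j Python driver: mirroring a relational relationship into graph nodes and edges with idempotent MERGE statements. The Supplier/Facility model and connection details are hypothetical.

    # Minimal sketch: mirror a relational join table into Neo4j.
    # MERGE keeps the load idempotent across repeated runs.
    from neo4j import GraphDatabase

    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "..."))

    def load_edge(tx, supplier_id, facility_id):
        tx.run(
            """
            MERGE (s:Supplier {id: $supplier_id})
            MERGE (f:Facility {id: $facility_id})
            MERGE (s)-[:OPERATES]->(f)
            """,
            supplier_id=supplier_id,
            facility_id=facility_id,
        )

    rows = [("sup-1", "fac-9"), ("sup-1", "fac-12")]  # would come from Postgres
    with driver.session() as session:
        for supplier_id, facility_id in rows:
            session.execute_write(load_edge, supplier_id, facility_id)
    driver.close()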

DBT Pipelines and Automation

  • Maintain and stabilize existing DBT pipelines that underpin reporting and analytics, including automation for incremental processing and scheduling, data quality monitoring, and performance tuning (see the monitoring sketch below).
  • Lead operational support and modernization planning: rapid triage and root-cause resolution for pipeline issues, plus evaluation and prototyping of next-generation transformation approaches with clear, low-risk transition plans in partnership with analytics and engineering stakeholders.
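
A simple version of post-run quality monitoring is sketched below: after a dbt build, scan dbt's run_results.json artifact and alert on any errored model or failed test. The notify() hook is a placeholder for a real alerting channel.

    # Minimal sketch: post-run data quality gate for dbt, reading the
    # standard target/run_results.json artifact.
    import json
    import sys
    from pathlib import Path

    def notify(message: str) -> None:
        print(message)  # swap in Slack, PagerDuty, etc.

    results = json.loads(Path("target/run_results.json").read_text())
    bad = [r for r in results["results"] if r["status"] in ("error", "fail")]
    if bad:
        for r in bad:
            notify(f"dbt failure: {r['unique_id']} -> {r['status']}")
        sys.exit(1)  # non-zero exit fails the CI/CD step
    print(f"dbt run clean: {len(results['results'])} nodes checked")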

AWS Data Pipelines

  • Build ingestion and ETL processes using S3, Glue, Lambda, and AppFlow (see the sketch below).
  • Integrate data from third-party systems and APIs (e.g., Zendesk, HubSpot, NetSuite, other platform data) with strong auditability and operational resilience.
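
As one example of this pattern, the sketch below shows an S3-triggered Lambda handler that starts a Glue job for each newly landed object via boto3, logging the JobRunId as an audit handle. The Glue job name is hypothetical; the S3 event shape and glue.start_job_run call are standard AWS.

    # Minimal sketch: S3-triggered Lambda that kicks off a Glue ETL job
    # for each newly landed object.
    import boto3

    glue = boto3.client("glue")

    def handler(event, context):
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            run = glue.start_job_run(
                JobName="ingest-raw-events",  # hypothetical job name
                Arguments={"--source_path": f"s3://{bucket}/{key}"},
            )
            # Log the JobRunId alongside the source object for auditability.
            print(f"started {run['JobRunId']} for s3://{bucket}/{key}")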

Benefits

  • Medical, Dental, and Vision Insurance are offered through multiple PPO options. Worldly covers 90% of the employee premium and 60% of spouse/dependent premiums.
  • Company-sponsored 401k with up to 4% match for US employees.
  • Incentive Stock Options.
  • 100% Paid Parental Leave.
  • Unlimited PTO. Take the time you need to recharge; our culture encourages team members to explore and rest to be their best selves.
  • 12 paid company holidays.
  • Competitive salary and performance-based bonuses.
  • Work-From-Home Stipends.