Solutions Engineer

Fetch
Remote

About The Position

As a Solutions Engineer, you will meet with data partners, assess requirements, and recommend or build pipeline improvements that make data export and delivery efficient and scalable. Our data systems consolidate multiple data sources, enabling both internal and external stakeholders to answer questions on an ongoing basis. You may work on a small, cross-functional team that includes other engineers, a product manager, data scientists, and others. Success in this role requires the ability to take on ambiguous, complex problems and promote innovative solutions that address immediate needs while supporting future growth. You will assess requirements and help design the next generation of tools that allow some of the largest brands in the world to better understand their customers and reimagine the shopping experience. Join us in transforming the way brands reach their customers and empowering consumers to Live Rewarded through the power of Fetch Points!

Requirements

  • 3+ years of professional experience in a technical role requiring advanced knowledge of data pipelines, data integration, and data quality initiatives
  • Experience in requirements analysis and functional specification authorship
  • Experience in understanding business and data problem areas and devising actionable solutions
  • Project management experience using Jira or similar task management platforms
  • Experience writing optimized SQL for processing efficiency
  • Experience with data modeling and orchestration tools
  • Strong experience with SQL-based analytical data warehouses (e.g., Snowflake) and familiarity with object storage and non-relational data stores used in production systems (e.g., S3, DynamoDB)
  • Experience owning and operating ETL and ELT processes, data warehouses, and business intelligence tools
  • Experience clearly communicating about data with internal and external stakeholders and senior leadership from both technical and nontechnical backgrounds
  • Ability to thrive in a highly autonomous, matrixed organization and manage multiple, concurrent work streams
  • Proficiency in at least one imperative programming language (e.g., Python)
  • Experience deploying and managing cloud resources on AWS, Azure, or GCP
  • Experience implementing data quality, data governance, or disaster recovery initiatives
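To make the data quality and pipeline bullets above concrete, here is a minimal sketch of the kind of row-level validation a delivery pipeline might run before exporting partner data. The `validate_rows` function, its field names, and its rules are illustrative assumptions, not part of Fetch's actual stack.

```python
# Hypothetical row-level data-quality check: split incoming rows into
# (valid, rejected) based on simple completeness and uniqueness rules.

def validate_rows(rows, required_fields=("user_id", "receipt_id", "amount")):
    """Return (valid, rejected) lists of row dicts."""
    valid, rejected = [], []
    seen_ids = set()
    for row in rows:
        # Reject rows missing a required field or carrying a duplicate receipt_id.
        if any(row.get(f) is None for f in required_fields):
            rejected.append(row)
        elif row["receipt_id"] in seen_ids:
            rejected.append(row)
        else:
            seen_ids.add(row["receipt_id"])
            valid.append(row)
    return valid, rejected

rows = [
    {"user_id": 1, "receipt_id": "a", "amount": 5.0},
    {"user_id": 1, "receipt_id": "a", "amount": 5.0},   # duplicate receipt
    {"user_id": 2, "receipt_id": "b", "amount": None},  # missing amount
]
valid, rejected = validate_rows(rows)
```

In a real pipeline this gate would feed rejected rows to a quarantine table and alerting, rather than silently dropping them.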

Nice To Haves

  • Proclivity for building and experimenting with different tools and tech, and sharing your learnings with the broader organization
  • Experience building reusable code and automation, including using infrastructure-as-code tools such as Terraform
  • Experience in measurement or AdTech

Responsibilities

  • Meet with Fetch B2B partners to share product knowledge and development lifecycle
  • Identify partners’ needs and transform them into practical business requirements
  • Offer guidance to partners and internal stakeholders in understanding and deriving value from shared data through requirements gathering, technical guidance, and individual consultation
  • Track and report status on partners’ feature requests, data issues, and incident reports
  • Promptly address partner and stakeholder questions with a sense of ownership and urgency
  • Collaborate with cross-functional teams to improve data product quality and performance
  • Provide actionable feedback on current data product performance and proposed changes
  • Manage planning and prioritization for operational support and product development
  • Maintain expertise in Fetch data pipelines and modern data management techniques to contribute to complex data pipeline projects with broad visibility
  • Model and analyze data utilizing SQL best practices for OLAP / OLTP query and database performance
  • Leverage dbt (Data Build Tool), Snowflake, Airflow, AWS infrastructure, CI/CD, testing, and engineering best practices to accomplish your work
  • Design, coordinate, and develop innovative approaches to managing datasets involving millions of daily active users and terabytes of data
  • Translate business requirements for data exports and near-real-time actionable insights into data models and artifacts
  • Communicate findings clearly verbally and in writing to a broad range of stakeholders
  • Lead the charge on data documentation and data discovery initiatives
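The SQL modeling responsibilities above can be sketched with a toy OLAP-style rollup using Python's standard-library `sqlite3`. The schema and figures are invented for illustration; a real warehouse such as Snowflake would run an equivalent aggregate over far larger tables.

```python
import sqlite3

# Toy rollup: daily active users and total spend per day.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE receipts (user_id INT, day TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO receipts VALUES (?, ?, ?)",
    [(1, "2024-01-01", 9.99), (2, "2024-01-01", 4.50), (1, "2024-01-02", 3.25)],
)

# Push the aggregation into SQL at the grain you need, rather than
# fetching raw rows and post-processing in application code.
query = """
    SELECT day,
           COUNT(DISTINCT user_id) AS daily_active_users,
           ROUND(SUM(amount), 2)  AS total_spend
    FROM receipts
    GROUP BY day
    ORDER BY day
"""
rows = conn.execute(query).fetchall()
```

Aggregating once per grain in the warehouse is the sort of "SQL best practice for OLAP performance" the responsibilities describe, since it minimizes data movement out of the database.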