Ingenio · Posted 8 days ago
$125,000 - $155,000/Yr
Full-time • Mid Level
Hybrid • San Francisco, CA

Here at Ingenio, we'd love to talk with you regardless of your qualifications or years of experience. If you believe you'd be a great fit for this role, we invite you to apply even if you do not meet every point on the job description.

Who we are: Ingenio is a global media and technology company developing products that provide guidance on love, relationships, careers, and all aspects of life. We are passionate about connecting people with the world's best advisors and content to empower everyone to live happier lives. Ingenio offers the world's largest portfolio of over 20 marketplace and media brands in the spiritual and emotional wellness space. Our flagship brands include Keen, Horoscope.com, Astrology.com, Purple Garden, Kasamba, and Kang.

How you'll be impactful: As a Data Engineer, you will build and maintain the data processing pipelines and systems that are critical to our data operations and business efficiency. This includes designing scalable data workflows, building and maintaining API integrations with marketing and affiliate partners (e.g., Google Ads, GA, LiveRamp), and ensuring accurate, reliable data for analytics and machine learning applications.

Please note: This role requires being in our SF office at least 3 days per week (Tuesday-Thursday).

What you'll do:

  • Assist in the design, construction, and maintenance of large-scale data processing systems.
  • Design and build scalable data pipelines: Architect and implement data workflows for high-volume, high-complexity datasets, ensuring data is reliable, accurate, and accessible for analytics and machine learning applications.
  • Work closely with data scientists to deploy models into production using modern MLOps practices (CI/CD for ML, automated retraining, monitoring, and rollout strategies).
  • Operationalize feature stores, model registries, and model versioning workflows.
  • Design data workflows optimized for AI/ML use cases, including real-time streaming data, vector data pipelines, or embeddings-based retrieval systems where applicable.
  • Build and maintain data applications in the cloud using FastAPI and other related technologies.
  • Implement data flow processes to integrate, transform, and summarize data from disparate sources.
  • Develop ETL scripts and SQL queries to manage data across multiple platforms.
  • Collaborate with team members to improve data efficiency and quality.
What you'll need:

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • 4+ years of experience building data pipelines, data lakes, and data warehouses.
  • Familiarity with data warehousing solutions like Snowflake.
  • Strong knowledge of ETL/ELT concepts, frameworks, or tools (e.g., Apache Airflow, dbt, or similar).
  • Experience with ML workflow orchestration tools such as MLflow, Kubeflow, Airflow, SageMaker, or similar.
  • Familiarity with model deployment, monitoring, and lifecycle management.
  • Strong experience with SQL and familiarity with programming languages such as Python or Java.
  • Good understanding of data warehousing and data modeling concepts.
  • Strong problem-solving skills and attention to detail.
What we offer:

  • Opportunity to work alongside a friendly, talented, and highly collaborative team
  • Premium medical, dental, and vision insurance
  • Generous holiday and PTO policies (including Birthday PTO!)
  • Summer Fridays
  • Technology stipend
  • 401k matching program
  • Lunch
  • Wellness allowance
  • Training and development opportunities and allowance