Data Engineering Analyst

Tiverton Advisors | Raleigh, NC | Onsite

About The Position

Tiverton is seeking a Data Engineering Analyst (Entry Level / Intern) to support our investment process and portfolio operations through data engineering, analytics, and AI-powered automation. This is an entry-level position ideal for recent graduates or current students seeking internship-to-hire opportunities. The role combines data infrastructure development with investment analytics, working across deal sourcing, due diligence, portfolio monitoring, and LP reporting.

The ideal candidate is curious, eager to learn, and excited to build solutions across the full data stack - from pipeline engineering to business intelligence - while applying AI/ML tools to solve real-world problems in agricultural private equity. The role offers broad exposure to both the investment side (deal flow, due diligence, and fund analytics) and the operations side (portfolio company data, reporting automation, and other analytics).

This role is onsite in our Raleigh, NC office. The successful candidate will be self-motivated and energized by working with a group of thoughtful, smart, and skilled colleagues, and will enjoy being part of a young, hungry, and collaborative organization focused on becoming the pre-eminent investment firm in US agriculture.

Requirements

  • Proficiency in Python and SQL through coursework or projects; familiarity with pandas, APIs, or automation a plus
  • Exposure to data pipelines, ETL concepts, or data engineering workflows through coursework or projects
  • Familiarity with cloud platforms or data warehouses (Snowflake, BigQuery, AWS) – exposure through coursework or certifications counts
  • Interest in data visualization; experience with any BI tool (Power BI, Tableau, Looker) or willingness to learn
  • Solid Excel skills including formulas, pivot tables, and basic data analysis
  • Exposure to APIs, web scraping, or data collection methods (REST APIs, Beautiful Soup, or similar); a short illustrative sketch follows this list
  • Interest in AI/ML tools and LLMs; experience with ChatGPT, Claude, or similar for productivity is a plus
  • Git version control and collaborative development workflows
  • Ability to translate business problems into technical solutions
  • Strong problem-solving skills - can debug data issues independently
  • Understanding of financial concepts and private equity metrics helpful but not required
  • Strong communication skills - can explain technical concepts to non-technical stakeholders
  • Self-directed with ability to prioritize and manage multiple projects
  • Detail-oriented with focus on data quality and reliability
  • Current senior or recent graduate (within one year) pursuing or holding a Bachelor’s degree in Computer Science, Data Science, Engineering, Finance, Economics, or a related quantitative field
  • Demonstrated interest through coursework, personal projects, hackathons, or prior internships involving data pipelines, analytics, or automation
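
To give a concrete sense of the level implied by the API and web-scraping bullets above, here is a minimal, hypothetical sketch in Python using requests, BeautifulSoup, and pandas. The endpoint URL, page URL, table id, and column names are illustrative placeholders, not real Tiverton data sources.

```python
"""Minimal sketch of the data-collection skills listed above.

All URLs, the table id, and the field names are hypothetical
placeholders, not real Tiverton systems.
"""
import pandas as pd
import requests
from bs4 import BeautifulSoup

# Pull JSON from a (hypothetical) REST endpoint into a DataFrame.
resp = requests.get("https://api.example.com/v1/crop-prices", timeout=30)
resp.raise_for_status()
prices = pd.DataFrame(resp.json()["results"])

# Scrape an HTML table from a (hypothetical) public page.
page = requests.get("https://example.com/land-values", timeout=30)
soup = BeautifulSoup(page.text, "html.parser")
rows = [
    [cell.get_text(strip=True) for cell in tr.find_all("td")]
    for tr in soup.select("table#values tr")[1:]  # skip the header row
]
land_values = pd.DataFrame(rows, columns=["county", "acres", "price_per_acre"])

# Basic cleanup of the kind typically done in pandas before loading anywhere.
land_values["price_per_acre"] = pd.to_numeric(
    land_values["price_per_acre"].str.replace(",", ""), errors="coerce"
)
print(prices.head(), land_values.head(), sep="\n")
```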

Nice To Haves

  • Experience building LLM-powered applications or automation tools
  • Familiarity with CRM systems (Affinity, Salesforce) or investment workflow tools
  • Experience with document processing and unstructured data extraction
  • Knowledge of ML libraries (scikit-learn, numpy) and model deployment
  • Exposure to private equity, venture capital, or investment banking
  • Understanding of DevOps practices - testing, monitoring, CI/CD
  • Knowledge of agricultural markets, farm credit systems, or commodity data

Responsibilities

  • Build and maintain ETL pipelines pulling data from internal and external sources into our Snowflake data warehouse (see the first sketch after this list)
  • Develop Python and SQL automation scripts for recurring data processes
  • Manage Snowflake data warehouse - schema design, query optimization, and data modeling
  • Build API integrations for third-party data sources (pricing data, B2B data providers, market intelligence)
  • Implement data quality checks, validation rules, and monitoring to ensure pipeline reliability
  • Create web scraping solutions for data collection from public sources
  • Maintain code repositories with proper version control and documentation
  • Support deal pipeline analytics and sourcing workflows in our CRM
  • Build models and analytics for sector trends (crop prices, land values, farm credit metrics)
  • Extract and analyze data from appraisal documents, financial statements, and industry reports
  • Develop due diligence analytical frameworks and data rooms for new investments
  • Create LP reporting dashboards and automated quarterly reporting processes
  • Support investment team with ad-hoc analytical requests and data visualization
  • Leverage LLMs (OpenAI, Claude) to accelerate document analysis, data extraction, and research workflows (see the second sketch after this list)
  • Build AI-powered automation for deal screening, document processing, and data enrichment
  • Implement intelligent solutions for pattern recognition, anomaly detection, and data quality
  • Use prompt engineering and AI coding assistants to rapidly prototype analytical tools
  • Develop RAG (Retrieval-Augmented Generation) systems for knowledge management (see the third sketch after this list)
  • Support portfolio company reporting requirements and data requests
  • Build dashboards and reporting tools for portfolio operations teams
  • Troubleshoot data issues and provide technical support to portfolio companies
  • Partner with investment team to ensure clean, reliable data for portfolio monitoring
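
As a rough illustration of the first few bullets (an ETL load into Snowflake with validation), the following Python sketch assumes the snowflake-connector-python package; the connection parameters, table name, and column names are placeholders, not Tiverton's actual schema.

```python
"""Minimal ETL-with-validation sketch, assuming snowflake-connector-python.

Connection parameters, the table name, and column names are placeholders.
"""
import os
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

def validate(df: pd.DataFrame) -> pd.DataFrame:
    """Simple data-quality gates: required columns, no null keys, no duplicates."""
    required = {"deal_id", "as_of_date", "value"}
    missing = required - set(df.columns)
    if missing:
        raise ValueError(f"missing columns: {missing}")
    if df["deal_id"].isna().any():
        raise ValueError("null deal_id found")
    return df.drop_duplicates(subset=["deal_id", "as_of_date"])

df = validate(pd.read_csv("extract.csv"))  # extract step stubbed as a CSV

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",   # placeholder names
    database="PORTFOLIO",
    schema="STAGING",
)
try:
    ok, _, nrows, _ = write_pandas(conn, df, "DEAL_METRICS", auto_create_table=True)
    print(f"loaded {nrows} rows, success={ok}")
finally:
    conn.close()
```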
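For the LLM-assisted document-analysis bullets, a minimal sketch using the openai Python SDK might look like the following; the model name and the extracted fields are assumptions for illustration, not a prescribed approach.

```python
"""Minimal LLM extraction sketch, assuming the openai Python SDK.

The model name and the extracted field list are placeholders.
"""
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def extract_fields(appraisal_text: str) -> dict:
    """Ask the model to pull a few structured fields out of free text."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        response_format={"type": "json_object"},
        messages=[
            {"role": "system",
             "content": "Extract appraisal fields and reply with JSON only: "
                        '{"acres": number, "value_per_acre": number, '
                        '"county": string}.'},
            {"role": "user", "content": appraisal_text},
        ],
    )
    return json.loads(response.choices[0].message.content)

print(extract_fields("Subject property: 312 acres in Wake County, "
                     "appraised at $9,100 per acre."))
```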
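And for the RAG bullet, the retrieval half can be sketched with embeddings plus cosine similarity; the embedding model and the toy documents below are placeholders. In a full system, the retrieved snippets would be appended to the LLM prompt as context.

```python
"""Minimal retrieval sketch for a RAG system, assuming the openai SDK.

The embedding model and the toy documents are placeholders.
"""
import numpy as np
from openai import OpenAI

client = OpenAI()

docs = [  # placeholder knowledge-base snippets
    "Fund II closed in 2022 with commitments focused on row crops.",
    "Quarterly LP reports are due 45 days after quarter end.",
]

def embed(texts: list[str]) -> np.ndarray:
    out = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in out.data])

doc_vecs = embed(docs)

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k documents most similar to the query (cosine similarity)."""
    q = embed([query])[0]
    sims = doc_vecs @ q / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q))
    return [docs[i] for i in np.argsort(sims)[::-1][:k]]

print(retrieve("When are LP reports due?"))
```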

Benefits

  • Healthcare
  • Dental
  • Vision
  • Group Life Insurance
  • 401(k)
  • Generous PTO