Machine Learning Intern

Marble Technologies · Lincoln, NE

About The Position

Marble is seeking a motivated and technically strong Machine Learning Intern to help build internal systems for summarizing and analyzing production data using large language models, statistical tools, and modern data infrastructure. This is a hands-on opportunity to work on applied AI systems that connect production data, analytics, and agent-based workflows in a real industrial setting.

In this role, you will help build the AI and data infrastructure that allows LLM-powered agents to analyze packoff and production data at scale. The position centers on applied AI systems and data engineering rather than traditional model training: you will develop pipelines that extract, process, and organize data from platforms such as ClickHouse and Spark, create statistical and analytical tools that LLMs can use through tool calls, and build orchestration logic for agents that identify trends, surface issues, and communicate insights. Depending on project needs, you may also support Slack integrations for notifications, human-in-the-loop guidance, and agent supervision.

This role is ideal for someone excited by applied AI, data engineering, distributed data systems, and building practical software that connects LLMs to real industrial workflows.
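For candidates unfamiliar with the tool-call pattern described above, here is a minimal sketch of how a statistical helper can be exposed to an LLM agent. All names, the schema shape, and the dispatch logic are illustrative assumptions, not Marble's actual stack; the schema follows the JSON-schema style common to most LLM tool-calling APIs.

```python
import json
import statistics

def summarize_metric(values):
    """Compute basic summary statistics for a list of production readings."""
    return {
        "count": len(values),
        "mean": statistics.fmean(values),
        "stdev": statistics.stdev(values) if len(values) > 1 else 0.0,
        "min": min(values),
        "max": max(values),
    }

# Hypothetical tool definition an orchestrator would hand to the LLM API,
# so the model can request this computation instead of doing math in text.
SUMMARIZE_TOOL = {
    "name": "summarize_metric",
    "description": "Summary statistics for a series of production readings.",
    "parameters": {
        "type": "object",
        "properties": {
            "values": {"type": "array", "items": {"type": "number"}},
        },
        "required": ["values"],
    },
}

def handle_tool_call(name, arguments_json):
    """Dispatch a tool call emitted by the model to the matching function."""
    args = json.loads(arguments_json)
    if name == "summarize_metric":
        return summarize_metric(args["values"])
    raise ValueError(f"unknown tool: {name}")
```

The key design point is that the model never touches the data directly: it emits a tool-call name plus JSON arguments, and deterministic Python code computes the answer the agent then reasons over.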

Requirements

  • Pursuing a Bachelor’s or Master’s degree in Computer Science
  • Foundational understanding of machine learning, data systems, or LLM-based applications
  • Experience with Python
  • Strong analytical and problem-solving skills
  • Ability to work independently while collaborating across engineering teams
  • Strong written and verbal communication skills

Nice To Haves

  • Experience with LLM APIs, tool calling, agent workflows, or prompt engineering
  • Experience with cloud platforms such as AWS, GCP, or Azure
  • Familiarity with version control systems such as Git
  • Interest in industrial operations, manufacturing analytics, or applied AI systems

Responsibilities

  • Build pipelines to retrieve, process, and organize packoff and production data using ClickHouse, Spark, and related systems
  • Create statistical and analytical tools that LLM-based agents can use through tool calls
  • Design workflows for agents to summarize data, identify trends, and surface anomalies
  • Support orchestration of AI systems that combine data retrieval, tool use, and language-model reasoning
  • Help integrate agent workflows with Slack or similar tools for reporting, alerts, and human guidance
  • Partner with engineering teams to define KPIs, summaries, and operational monitoring outputs
  • Validate and improve the accuracy and reliability of LLM-generated outputs
  • Document pipelines, tools, prompts, and system workflows
  • Test, debug, and improve internal AI and data systems
  • Research new methods for agent tooling, analytics, and operational reporting