Nuclearn • Posted 13 days ago
Full-time • Mid Level
Hybrid • Phoenix, AZ
11-50 employees

Nuclearn.ai builds AI-powered software for the nuclear and utility industries: tools that keep critical infrastructure reliable, efficient, and safe. Our software integrates AI-driven workflow, documentation, and research automation, and is already used at 60+ nuclear reactors across North America. You'll ship production code that operators and engineers rely on every day. We're growing quickly, expanding our team and our Phoenix HQ. The work is consequential: what you build helps real plants run safer and smarter.

Eligibility: U.S. citizenship or permanent residency (green card) is required due to DOE export compliance.

You'll own features end-to-end across a modern Python/React stack, with a heavy dose of reliability, data plumbing, and "meet the enterprise where it is" integration work:

  • Ship production features across React (frontend) and FastAPI (backend) that power our products (e.g., CAP AI, AtomAssist).
  • Design and evolve APIs and Postgres schemas for performance, correctness, and auditability (migrations, indexing/partitioning, background data corrections).
  • Containerize and deploy services with Docker/Podman and Kubernetes; help tune queues/workers (Redis, RabbitMQ, Celery) for throughput, reliability, and idempotency.
  • Make data usable: build ingestion pipelines that prefer tabular sources (CSV/Excel/JSON) but gracefully handle the "we only have PDFs" reality by minimizing OCR, adding validation, and failing safely.
  • Integrate with customer systems common in the industry (e.g., Maximo, DevonWay, Microsoft 365/Teams/OneNote).
  • Own reliability: reduce noise and fix root causes identified across Sentry and Netdata; add observability, back‑pressure, retries, and circuit breakers so we never lose a record.
  • Collaborate with customers: join lightweight customer calls with utilities to understand constraints, scope integrations, and demo new capabilities.

Reality of the role: You'll bounce between product code, schema work, a gnarly data import, a Sentry investigation, and a customer demo environment, often in the same week.

Examples of problems you might own in your first 90 days:
  • Build a DevonWay → CAP AI connector that ingests event data in tabular form, validates against our schemas, and supports safe reprocessing.
  • Add a "simulate, then apply" workflow for CAP automations (human-in-the-loop gates, dry-run diffs, full audit trails, easy rollbacks); a hedged sketch of this pattern follows the list.
  • Cut a noisy class of Sentry errors by 30% by hardening a Celery task (idempotent writes, retry policy, dead-letter queue); also sketched after the list.
  • Implement license entitlements & usage reporting for a fleet customer renewal (clean server-side enforcement plus UI visibility).
  • Deliver a small Teams/OneNote POC to integrate new data streams into AtomAssist.
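To make the "simulate, then apply" example above concrete, here is a minimal, hypothetical sketch of that pattern under stated assumptions: a dry-run planner that computes a diff, an apply step gated on an explicit approver, and an audit trail that doubles as rollback data. The names (Change, plan_changes, apply_changes) are illustrative only and are not Nuclearn's API.

```python
# Hypothetical "simulate, then apply" gate for bulk record updates.
# Every name here is illustrative; nothing is taken from Nuclearn's codebase.
from dataclasses import dataclass


@dataclass
class Change:
    record_id: str
    field: str
    old: object
    new: object


def plan_changes(records: dict[str, dict], updates: dict[str, dict]) -> list[Change]:
    """Dry run: compute the diff without touching any data."""
    plan = []
    for record_id, new_fields in updates.items():
        current = records.get(record_id, {})
        for field, new_value in new_fields.items():
            if current.get(field) != new_value:
                plan.append(Change(record_id, field, current.get(field), new_value))
    return plan


def apply_changes(records: dict[str, dict], plan: list[Change], approved_by: str) -> list[Change]:
    """Human-in-the-loop gate: refuse to apply without an approver; return the audit trail."""
    if not approved_by:
        raise PermissionError("plan must be approved before it is applied")
    audit = []
    for change in plan:
        records.setdefault(change.record_id, {})[change.field] = change.new
        audit.append(change)  # keeping old and new values makes rollback straightforward
    return audit


if __name__ == "__main__":
    db = {"CR-1001": {"status": "open", "owner": "ops"}}
    proposed = {"CR-1001": {"status": "closed"}}
    plan = plan_changes(db, proposed)      # simulate first: show the diff to a human
    print(plan)
    apply_changes(db, plan, approved_by="reviewer@example.com")  # then apply with an audit trail
```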
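Similarly, for the Celery hardening example, a minimal sketch of what "idempotent writes plus a retry policy" can look like. The broker URL, DSN, table, and column names are placeholders, and routing messages that exhaust their retries to a dead-letter queue would typically be configured on the broker side (e.g., RabbitMQ dead-letter exchanges) rather than in task code.

```python
# Hypothetical retry-safe, idempotent Celery ingestion task.
# Broker URL, DSN, table, and column names are placeholders, not Nuclearn's schema.
import json

import psycopg2
from celery import Celery

app = Celery("workers", broker="amqp://guest@localhost//")  # RabbitMQ broker (placeholder)


@app.task(
    bind=True,
    acks_late=True,                              # only acknowledge the message after the work succeeds
    autoretry_for=(psycopg2.OperationalError,),  # retry transient database failures
    retry_backoff=True,                          # exponential backoff between attempts
    retry_kwargs={"max_retries": 5},             # give up eventually; a broker-side DLQ catches the rest
)
def ingest_event(self, event: dict) -> None:
    """Upsert keyed on the source system's ID so retries and replays are no-ops."""
    conn = psycopg2.connect("dbname=capai")      # placeholder DSN
    try:
        with conn, conn.cursor() as cur:         # commits on success, rolls back on error
            cur.execute(
                """
                INSERT INTO cap_events (external_id, payload)
                VALUES (%s, %s)
                ON CONFLICT (external_id) DO NOTHING
                """,
                (event["external_id"], json.dumps(event)),
            )
    finally:
        conn.close()
```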
What we're looking for:
  • Degree in CS or a related field, or equivalent practical experience.
  • You’ve shipped production React + FastAPI and can contribute independently within ~6 weeks.
  • You care about correctness and safety: typed APIs, schema migrations with backfills, idempotent jobs, and tests that catch the sharp edges.
  • You’re comfortable with customer‑facing engineering (a quick demo, a clarifying question, a pragmatic workaround).
  • Clear, direct communicator; kind reviewer; steady under pressure.
Nice to have:
  • AI/ML or data-pipeline experience (prompting, retrieval, feature stores, vector search).
  • Prior startup experience.
  • Exposure to nuclear/utility or other safety‑critical domains (aviation, med‑device, rail, etc.).
Benefits:
  • Unlimited PTO
  • Health/dental/vision insurance