Hardware Engineer, Design Verification

Normal Computing Corporation, New York City, NY
Hybrid

About The Position

The Normal Team builds foundational software and hardware that help move technology forward - supporting the semiconductor industry, critical AI infrastructure, and the broader systems that power our world. We work as one team across New York, San Francisco, Copenhagen, Seoul, and London.

Your Role in Our Mission: You will bring your expertise in the end-to-end design verification flow to support our Verification AI team. This is a hybrid verification and product-shaping role. You will verify internal hardware (physics-inspired ASICs) while simultaneously reviewing the collateral generated by our AI to help refine product strategy and tool usability. You will act as the bridge between raw verification data and our machine learning models, ensuring our AI learns from high-quality, curated, and synthesized data.

Requirements

  • 5+ years of experience in Digital Verification at a major semiconductor or EDA tool company.
  • Advanced proficiency in SystemVerilog, the UVM methodology, EDA verification tools (vManager, Xcelium, Jasper), and Python or Perl scripting.
  • Proven expertise in end-to-end design verification, including test plan creation, stimulus generation, and feature extraction.
  • Excellent written and spoken communication skills.

Responsibilities

  • Review AI-generated collateral to help shape product strategy and refine AI outputs in collaboration with the ML team.
  • Provide design verification for internal hardware projects.
  • Set up and evaluate EDA tools, ensuring internal tool usability and effective deployment on shared computing resources.
  • Develop verification collateral: create testbench environments, assertions, and coverage models from design documents to support product development, functional coverage, and coverage closure.
  • Curate and annotate datasets to make it easier to associate specific parts of a chip specification with specific test cases.
  • Establish rigorous quality criteria for verification data and implement continuous refinement processes.
  • Implement data augmentation methods and automated quality assurance checks to ensure high-fidelity data for ML training.
  • Generate synthetic data using AI-based methods to supplement real datasets.
  • Collaborate with ML teams to ensure synthetic data effectively challenges verification models.
  • Build automated pipelines to annotate test data and link it explicitly to chip specifications.
  • Automate document parsing (e.g., datasheets, protocol specifications) for contextual tagging and traceability.