Senior Data Visualization Engineer

NVIDIA
Durham, NC · Hybrid

About The Position

NVIDIA is seeking a Senior Data Visualization Engineer to define the visual interface of our most advanced data products. You will sit at the intersection of UX design and massive-scale engineering, turning petabyte-scale data lakes into real-time, ML-infused analytical applications. We aren't just building dashboards; we are building the "cockpit" for the world’s most advanced AI infrastructure.

Requirements

  • 8+ years of hands-on experience in a data visualization, Business Intelligence, or data analyst role with a strong focus on dashboard development.
  • Large Data Experience: Proven experience working with and visualizing large-scale datasets (billions of rows) from data warehouses like Snowflake, BigQuery, Redshift, or similar.
  • ML Literacy: A solid understanding of machine learning principles and the ability to visualize model health and feature importance (SHAP, ROC, etc.).
  • Proficiency in Python (using libraries like Pandas, Plotly) or R for data manipulation and analysis.
  • Web Service Integration: Experience connecting to and consuming data from web services and APIs (e.g., REST, GraphQL) to integrate real-time or third-party data into dashboards.
  • Expert-level proficiency in SQL, with the ability to write detailed, efficient queries that extract and transform data from vast data repositories.
  • A solid grasp of visual composition principles and data visualization best practices, ensuring dashboards are impactful and user-friendly.
  • Outstanding communication and presentation skills, with the ability to convey complex insights to diverse audiences regardless of technical background.
  • Bachelor’s degree or equivalent experience in Computer Science, Data Science, Information Systems, Statistics, or a related field.
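To make the "ML Literacy" requirement concrete, here is a minimal, illustrative sketch (not NVIDIA's codebase) of one "model health" visual the role asks for: computing ROC curve points and the AUC from raw classifier scores. The toy labels and scores are invented for the example; in practice the resulting (FPR, TPR) series would be rendered with a library like Plotly.

```python
# Illustrative sketch: ROC points and trapezoidal-rule AUC for a binary
# classifier. Labels/scores below are toy data, not real model output.

def roc_points(labels, scores):
    """Return (fpr, tpr) lists swept over score thresholds, high to low."""
    pairs = sorted(zip(scores, labels), reverse=True)
    pos = sum(labels)
    neg = len(labels) - pos
    tp = fp = 0
    fpr, tpr = [0.0], [0.0]
    for _score, label in pairs:
        if label:
            tp += 1
        else:
            fp += 1
        fpr.append(fp / neg)
        tpr.append(tp / pos)
    return fpr, tpr

def auc(fpr, tpr):
    """Area under the ROC curve via the trapezoidal rule."""
    return sum((fpr[i + 1] - fpr[i]) * (tpr[i + 1] + tpr[i]) / 2
               for i in range(len(fpr) - 1))

labels = [1, 1, 0, 1, 0, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.2]
fpr, tpr = roc_points(labels, scores)
print(round(auc(fpr, tpr), 3))  # → 0.889
# e.g. plotly.express.line(x=fpr, y=tpr) would render the curve itself.
```

In production work the same numbers would typically come from `sklearn.metrics.roc_curve`; the point here is being able to explain and visualize what the curve says about model health.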

Nice To Haves

  • Experience with JavaScript-based visualization libraries (e.g., D3.js, Plotly, Highcharts) for building custom, highly interactive visuals.
  • Familiarity with data modeling concepts and ETL/ELT processes.
  • A background or strong interest in UI/UX development principles.
  • A public-facing portfolio (e.g., Tableau Public, GitHub) showcasing your dashboard and visualization projects is a significant plus.

Responsibilities

  • Rapid Dashboard Prototyping: Build and deploy high-fidelity visuals at speed. You will be responsible for taking raw requirements and delivering intuitive, production-ready dashboards on accelerated timelines.
  • Architect Intelligence Products: Design and build high-performance, interactive data applications that translate complex system telemetry into actionable insights.
  • Bridge ML and UX: Partner with Data Science teams to visualize predictive model outputs, making anomaly detection and forecasting intuitive for executive decision-makers.
  • Engineered Data Discovery: Orchestrate data across the speed layer (such as Elasticsearch) and cold storage to provide a seamless, multi-tiered analytical experience.
  • Query at Scale: Leverage SQL, Spark, and Presto to pull data directly from our S3/Delta Lake environments, delivering sub-second latency on massive datasets.
  • Optimize dashboard performance to ensure that visualizations are fast, responsive, and reliable, even when querying terabyte-scale datasets.
  • Craft compelling visual narratives that highlight key insights, trends, and opportunities, going beyond simply displaying data.
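The performance bullet above hinges on a common pattern: never ship raw terabyte-scale telemetry to the browser. A minimal sketch of server-side downsampling (illustrative only; function names and the toy data are assumptions, not NVIDIA's pipeline):

```python
# Sketch: bucket (timestamp, value) telemetry into fixed time windows and
# send one aggregate per window, so charts stay responsive at any scale.

from collections import defaultdict

def downsample(points, bucket_seconds):
    """Aggregate (timestamp, value) pairs into fixed-width time buckets,
    returning one (bucket_start, mean_value) pair per bucket, in order."""
    buckets = defaultdict(list)
    for ts, value in points:
        buckets[ts - ts % bucket_seconds].append(value)
    return [(start, sum(vals) / len(vals))
            for start, vals in sorted(buckets.items())]

# Toy telemetry: one reading per second for ten seconds.
raw = [(t, float(t)) for t in range(10)]
print(downsample(raw, 5))  # → [(0, 2.0), (5, 7.0)]
```

In practice this aggregation would usually live in the warehouse (SQL `GROUP BY` on a time bucket, or a Spark job) rather than in application code; the principle of pre-aggregating before visualization is the same.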