Data Architect

Live Nation Entertainment
Prior Lake, MN
$136,000 - $170,000

About The Position

WHO ARE WE?

Live Nation Entertainment is the world’s leading live entertainment company, comprised of global market leaders: Ticketmaster, Live Nation Concerts, and Live Nation Media & Sponsorship. Ticketmaster is the global leader in event ticketing, with over 500 million tickets sold annually and more than 12,000 clients worldwide. Live Nation Concerts is the largest provider of live entertainment in the world, promoting more than 40,000 shows and 100+ festivals annually for nearly 4,000 artists in over 40 countries. These businesses allow Live Nation Media & Sponsorship to create strategic music marketing programs that connect over 1,000 brands with the 98 million fans who attend Live Nation Entertainment events each year. For additional information, visit www.livenationentertainment.com.

WHO ARE YOU?

Passionate and motivated. Driven, with an entrepreneurial spirit. Resourceful, innovative, forward-thinking, and committed. At Live Nation Entertainment, our people embrace these qualities, so if this sounds like you, please read on!

THE TEAM

The Core Data Services org is at the center of Data and Analytics initiatives across the entire Live Nation enterprise. We are at the beginning of our journey to build an enterprise data platform capable of being the true backbone for data needs across the organization. Our mission is to make reliable data available and enable value creation by the data community of engineers, analysts, and decision makers, and to do so in an AI-enabled way. The Core Data Services org consists of Product and Engineering teams overseeing Architecture, Platform Engineering, Data Engineering, Business Intelligence Engineering, and Operations.

THE JOB

This role is about driving transformation: defining how data is modeled, governed, and delivered in a global, high-volume environment. If you’re excited to influence architecture decisions from day one, thrive in ambiguity, and want to leave a lasting mark on how data is done at scale, this role is for you.
As a Data Architect within the Core Data Services team, you will be responsible for the comprehensive solution and architecture of our Enterprise Data platform. We are seeking an experienced professional with a track record of building solutions for large-scale systems at the enterprise level. You will be responsible for designing, implementing, and optimizing end-to-end data solutions that empower our organization to harness the full potential of data. You will collaborate closely with cross-functional teams, including data engineers, data scientists, and business stakeholders, to drive innovation and ensure the seamless integration of data and analytics within our ecosystem.

Within this role, you will encounter many opportunities to effect meaningful change across various dimensions, from technological enhancements and streamlined processes to the development and mentorship of talented individuals. In particular, you will build models at high velocity using AI agents, many of which you will develop yourself, and those models will be designed for consumption by other AI agents, both yours and those built by stakeholders throughout our broader business. Your contributions will have a significant impact on our business and technology teams, reaffirming our commitment to excellence.

Requirements

  • 10+ years of tech industry experience, including 5+ years architecting and implementing big data strategies.
  • 5+ years of expertise with cloud-based data platforms (Databricks, AWS, Snowflake, Spark).
  • 1+ year of experience with AI orchestration, automated modeling, or other AI-enabled data engineering.
  • Deep knowledge of data modeling using Erwin or similar tooling (3NF, dimensional/star schema), medallion architectures, ETL processes, data integration, and Data Warehousing concepts.
  • Experience with Big Data technologies such as Databricks, Spark, Hadoop, and NoSQL databases.
  • Experience in architecting data pipelines and solutions for streaming and batch integrations using tools/frameworks like dbt, Talend, Azure Data Factory, Spark, Spark Streaming, etc.
  • Experience with Confluent Kafka for real-time data processing, and with API platforms.
  • Strong understanding of data governance, lineage, and compliance.
  • Experience optimizing for cost and performance at scale (e.g., caching, partitioning strategies).
  • Excellent communication and interpersonal skills to collaborate effectively with technical and non-technical teams and translate architecture into business value.
  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.

Responsibilities

Architecture & Strategy

  • Design and evolve our Databricks Lakehouse architecture with a focus on scalability, cost efficiency, and reliability, responding to changing requirements and the varied data needs across the business, in particular the need for all data models to be AI-ready and agent-consumable.
  • Define and implement a tiered data ecosystem with maturity layers to transform data systematically, ensuring each layer serves a specific purpose, from raw data ingestion to refined, analytics-ready data.
  • Outline the vision and requirements for the Data Lakehouse and lead its development, aligning technical architecture with business needs and long-term vision, including, in particular, AI enablement.
  • Continuously evaluate and introduce new patterns, tools, and practices to make our data platform more scalable, resilient, and reliable, then work across the team to put your ideas into action.
Modeling, Standards & Governance

  • Develop conceptual, logical, and physical data models that serve as the backbone of integrated, reusable, and scalable data supporting the current and future development of data strategies; evolve traditional modeling approaches to optimize models for AI consumption.
  • Establish and enforce standards for metadata, lineage, and data quality across the ecosystem.
  • Define patterns and playbooks that guide engineering teams in building consistent, future-proof pipelines, including AI-automated pipelines.
  • Enhance data discoverability by working closely with data analysts and business stakeholders to make data easily accessible and understandable.
  • Develop and enforce data engineering, security, and data quality standards through automation.
  • Develop and implement strategies for seamlessly integrating data from diverse sources across different systems and platforms, including real-time and batch processing.
Delivery & Enablement

  • Collaborate with engineers, analysts, and business teams to design solutions, both traditional and AI-automated, that are accessible, usable, and trustworthy.
  • Mentor and provide architectural guidance to engineering staff.
  • Lead workshops and training to raise data literacy and promote best practices.

Benefits

  • Generous vacation
  • Healthcare
  • Retirement benefits
  • Student loan repayment
  • Tuition reimbursement
  • Six months of paid caregiver leave for new parents, including foster parents
  • Perks like Roadie Babies, which helps new parents care for their babies on work trips
  • Access to free live events through our exclusive employee ticketing program