Data Engineer - Enterprise Products

Lincoln Center
New York, NY · Hybrid · Posted 1 day ago

About The Position

Lincoln Center is seeking a Data Engineer to help design, build, and operate an enterprise data platform that powers analytics, reporting, personalization, and future AI initiatives across the campus. Sitting within the Enterprise Products area, this role focuses on shared platforms and systems that serve the entire organization. You will be responsible for building secure, scalable, and cost-efficient data pipelines and datasets on AWS that unify information from ticketing, CRM, fundraising, marketing, finance, and digital engagement systems. You’ll collaborate closely with product, business development, marketing, finance, and external partners to ensure the data platform is reliable, governed, observable, and trusted across the institution.

Requirements

  • Minimum 5 years of experience in data engineering or closely related roles.
  • Strong SQL skills (CTEs, window functions, performance tuning) and solid Python experience for data processing and testing (see the sketch after this list).
  • Hands-on experience with AWS data services, including S3, Glue, Lambda, IAM, and CloudWatch.
  • Experience working with modern data warehouses and transformation tools such as Snowflake and dbt.
  • Experience designing dimensional and semantic models, incremental pipelines, and slowly changing dimensions.
  • Working knowledge of data governance concepts including RBAC/ABAC, masking, tagging, and PII handling (GDPR/CCPA).
  • Familiarity with workflow orchestration tools (Airflow or AWS-native equivalents).
  • Experience with CI/CD pipelines, Git-based version control, and infrastructure-as-code (Terraform or similar).
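As a rough illustration of the SQL requirement above (CTEs plus window functions), the sketch below computes a running revenue total per event using Python's built-in sqlite3 module. The ticket_sales table and its columns are hypothetical; in this role the equivalent logic would more likely live in a dbt model targeting Snowflake.

```python
import sqlite3

# Hypothetical example: a CTE plus a window function, run against an
# in-memory SQLite database purely for illustration (window functions
# require SQLite 3.25+, bundled with modern Python builds).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE ticket_sales (
        event_id   TEXT,
        sale_date  TEXT,
        revenue    REAL
    );
    INSERT INTO ticket_sales VALUES
        ('philharmonic', '2024-01-01', 120.0),
        ('philharmonic', '2024-01-02',  80.0),
        ('ballet',       '2024-01-01', 200.0),
        ('ballet',       '2024-01-03',  50.0);
""")

query = """
WITH daily AS (                      -- CTE: one row per event and day
    SELECT event_id, sale_date, SUM(revenue) AS daily_revenue
    FROM ticket_sales
    GROUP BY event_id, sale_date
)
SELECT
    event_id,
    sale_date,
    daily_revenue,
    SUM(daily_revenue) OVER (        -- window function: running total per event
        PARTITION BY event_id
        ORDER BY sale_date
    ) AS running_revenue
FROM daily
ORDER BY event_id, sale_date;
"""

for row in conn.execute(query):
    print(row)
```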

Responsibilities

  • Design, build, and operate scalable ETL/ELT pipelines on AWS supporting batch and near-real-time data use cases.
  • Ingest and process data using AWS-native services such as S3, Glue, Lambda, Step Functions, and CloudWatch, alongside modern data tooling.
  • Integrate data from enterprise systems including ticketing, CRM, fundraising, finance, marketing platforms, and web/app analytics.
  • Develop dimensional and semantic data models in the warehouse that serve as trusted sources of truth across departments.
  • Optimize data workflows for performance, reliability, and cost, including partitioning strategies, orchestration schedules, and compute usage.
  • Implement data governance standards using AWS IAM and warehouse controls, including role-based access, masking, tagging, and metadata management.
  • Partner with teams across Marketing, Business Development, Finance, and Programming to deliver high-impact data products such as cohorts, funnels, donor analytics, and event performance insights.
  • Improve data quality and reliability through automated testing, monitoring, alerting, and clear data ownership (see the sketch after this list).
  • Reduce technical debt by automating manual processes, deprecating legacy pipelines, and standardizing data access patterns.
  • Contribute to enterprise-wide data standards, documentation, and cloud best practices as the platform evolves.
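As a rough illustration of the automated data-quality testing mentioned above, the sketch below checks a batch of rows for missing values and duplicate keys. The column names (patron_id, order_total) are hypothetical; in practice such checks would more likely run as dbt tests or as a task in the orchestration workflow.

```python
def check_rows(rows, key_column, required_columns):
    """Return a list of human-readable data-quality failures."""
    failures = []
    seen_keys = set()
    for i, row in enumerate(rows):
        # Required columns must be present and non-null.
        for col in required_columns:
            if row.get(col) in (None, ""):
                failures.append(f"row {i}: missing value for '{col}'")
        # The key column must be unique across the batch.
        key = row.get(key_column)
        if key in seen_keys:
            failures.append(f"row {i}: duplicate {key_column} '{key}'")
        seen_keys.add(key)
    return failures


if __name__ == "__main__":
    # Hypothetical rows, e.g. parsed from a CRM or ticketing export.
    sample = [
        {"patron_id": "P-001", "order_total": 150.00},
        {"patron_id": "P-001", "order_total": 75.00},   # duplicate key
        {"patron_id": "P-002", "order_total": None},    # missing value
    ]
    for failure in check_rows(sample, "patron_id", ["patron_id", "order_total"]):
        print(failure)
```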

What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No Education Listed
  • Number of Employees: 501-1,000 employees
