Lead Data Engineer

Lennar
Irving, TX

About The Position

We are Lennar. Lennar is one of the nation's leading homebuilders, dedicated to making an impact and creating an extraordinary experience for our Homeowners, Communities, and Associates by building quality homes, providing exceptional customer service, giving back to the communities in which we work and live, and fostering a culture of opportunity and growth for our Associates throughout their careers. Lennar has been recognized as a Fortune 500® company and consistently ranked among the top homebuilders in the United States.

Join a Company that Empowers You to Build Your Future

Lennar is seeking a Lead Data Engineer to help lead the design, build, and long-term ownership of our enterprise data platform. This role sits at the center of a major infrastructure transformation (migrating our core data platform to AWS and Snowflake) and will be critical to ensuring that work is delivered with engineering rigor, operational resilience, and clear knowledge transfer back to the internal team.

The ideal candidate is a deeply technical, hands-on engineer who thrives in complex cloud environments. They bring production experience with AWS data services, dbt, Python, and Snowflake, and they know how to build pipelines and transformation layers that are not just functional but maintainable, observable, and built to scale. They can work alongside external partners without being dependent on them, and they take pride in owning the quality of what they ship.

You'll join a high-performing Data & Analytics team operating at the intersection of real estate, operations, and AI, building the data foundation that powers pricing models, operational intelligence tools, and strategic decisions across 40+ divisions of one of the nation's largest homebuilders.

A career with purpose. A career built on making dreams come true. A career built on building zero-defect homes, cost management, and adherence to schedules.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related technical field.
  • 7+ years of data engineering experience, including meaningful production ownership of cloud-native pipelines in AWS environments.
  • Comfortable operating with autonomy in ambiguous environments—scoping work, setting realistic timelines, and raising blockers proactively without waiting to be asked.
  • Strong leadership and time management skills, with experience leading cross-functional teams and managing multiple projects simultaneously.
  • Expertise in cloud data platforms (AWS, Azure, GCP) and big data ecosystems, with a focus on optimizing cost, performance, and scalability.
  • Experience with real-time data processing and event-driven architectures, leveraging tools such as Apache Kafka, Spark Streaming, and AWS Lambda.
  • Deep hands-on experience with AWS data services (S3, Glue, Lambda, ECS, Step Functions, IAM) and Snowflake as a primary data warehouse.
  • Strong Python skills with a track record of writing modular, well-tested, and production-ready data engineering code.
  • Strong SQL skills and understanding of data warehouse design principles including dimensional modeling, layered transformation patterns (raw/staging/mart), and performance optimization.
  • Experience with Terraform or other infrastructure-as-code tools, familiarity with orchestration platforms (Airflow, Prefect), or prior work supporting platform migrations alongside external implementation partners.
  • Proficient in the design of enterprise data warehouses, data lakes, and data marts.
  • In-depth knowledge of data governance practices, including data cataloging, data lineage, and data quality management.
  • Experience with enterprise data management frameworks, such as Master Data Management (MDM) and metadata management.

Nice To Haves

  • Experience with dbt
  • Experience with infrastructure-as-code practices using Terraform

Responsibilities

  • Architect and build scalable, production-grade data pipelines on AWS (S3, Glue, Lambda, ECS, MWAA) integrated with Snowflake as the core data warehouse.
  • Write clean, modular Python for data ingestion, transformation logic, and orchestration—applying software engineering best practices including testing, versioning, and code review.
  • Partner closely with an external implementation vendor during a multi-phase platform migration, ensuring technical decisions align with long-term internal ownership goals and that knowledge transfers effectively to the team.
  • Build and maintain data quality frameworks, pipeline observability, and alerting systems that give the team confidence in production data across critical domains including pricing, supply chain, and sales operations.
  • Contribute to infrastructure-as-code practices using Terraform to provision and manage cloud resources in a repeatable, auditable way (experience a plus; willingness to learn required).
  • Communicate clearly with both technical and non-technical audiences—documenting systems, setting expectations on delivery, and flagging risks early without overcommitting.
  • Develop and enforce data engineering standards, processes, and best practices, ensuring consistency and quality across all data projects.
  • Collaborate with stakeholders, including business leaders, data scientists, and IT teams, to understand and prioritize data-related business needs and translate them into actionable technical requirements that produce reliable, well-tested data products.
  • Ensure data solutions are scalable, secure, and aligned with the long-term goals of the organization, considering both current and future data needs.
  • Drive the adoption of new technologies, tools, and methodologies that improve the efficiency, reliability, and scalability of data engineering practices.
  • Oversee the development and maintenance of comprehensive data documentation, including data architecture diagrams, process flows, and technical specifications.
  • Maintain awareness of the team's technical debt and drive solutions to address it, ensuring that legacy systems are modernized and that new solutions are built with maintainability and scalability in mind.
  • Provide technical leadership in the resolution of complex data issues, guiding the team in troubleshooting and problem-solving efforts.

Benefits

  • Medical Insurance
  • Dental Insurance
  • Vision Insurance
  • 401(k) Retirement Plan with Company Match
  • Paid Parental Leave
  • Associate Assistance Plan
  • Education Assistance Program
  • Adoption Assistance
  • Vacation
  • Holiday Leave
  • Sick Leave
  • Personal Day
  • New Hire Referral Bonus Program
  • Home Purchase Discounts