Data Engineer II

Lennar · Bentonville, AR

About The Position

We are Lennar

Lennar is one of the nation's leading homebuilders, dedicated to making an impact and creating an extraordinary experience for our Homeowners, Communities, and Associates by building quality homes, providing exceptional customer service, giving back to the communities in which we work and live, and fostering a culture of opportunity and growth for our Associates throughout their careers. Lennar has been recognized as a Fortune 500® company and consistently ranks among the top homebuilders in the United States.

Join a Company that Empowers You to Build Your Future

As a Data Engineer, you are responsible for analyzing large amounts of business data, solving real-world problems, and developing metrics and business cases that enable Business Insights. You will do this by leveraging data from platforms such as Jira, Portal, and Salesforce. You will work with a team of Product Managers, Software Engineers, and Business Intelligence Engineers to automate and scale analysis and to make the data more actionable for managing the business at scale. You will own many large datasets and implement new data pipelines that feed into or draw from critical data systems.

A career with purpose. A career built on making dreams come true. A career built on building zero-defect homes, cost management, and adherence to schedules.

Requirements

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • 3-5 years of experience in data engineering or a related role, with demonstrated success in delivering data solutions.
  • Hands-on experience with AWS services, including Glue, Lambda, S3, EC2, CloudWatch, and CloudTrail.
  • Experience with dbt, Snowflake, SQL, Python, and Qlik.
  • Proficient in SQL, with the ability to write complex queries, perform query optimization, and conduct performance tuning.
  • Experience with NoSQL databases, such as MongoDB, Cassandra, or DynamoDB, and an understanding of their appropriate use cases.
  • Strong programming skills in Python, Java, or Scala, with experience in data processing frameworks (e.g., Apache Spark, Hadoop).
  • Experience with cloud platforms (AWS, Azure, GCP) and data services, such as AWS Redshift, Azure Synapse, or Google BigQuery.
  • Knowledge of big data technologies, including Hadoop, Spark, Kafka, and HBase, with experience in distributed data processing.
  • Familiarity with data orchestration tools, such as Apache Airflow, for scheduling and managing data workflows (a brief illustrative sketch follows this list).
  • Experience with data versioning and testing tools, such as DVC (Data Version Control) and dbt (data build tool).
  • Understanding of data security practices, including encryption, access controls, and data masking.
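
To make the pipeline and orchestration items above concrete, here is a minimal Apache Airflow sketch of the kind of scheduled workflow this role would build and maintain. It is an illustration only: the DAG id, task ids, and the extract/load functions are hypothetical and are not part of Lennar's actual systems.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Hypothetical extract/load steps; a real pipeline would pull from
    # sources such as Jira or Salesforce and land data in S3 or Snowflake.
    def extract():
        print("extracting source data")

    def load():
        print("loading into the warehouse")

    with DAG(
        dag_id="example_daily_pipeline",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",       # run once per day
        catchup=False,                    # skip backfilling past runs
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task  # extract must finish before load starts

The extract_task >> load_task dependency line is how Airflow expresses the scheduling and workflow management the requirement describes.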

Responsibilities

  • Design, implement, and support analytical data infrastructure, applying working knowledge of Modern Data Warehouse concepts.
  • Design, build, and maintain efficient and scalable data pipelines and ETL processes to process large volumes of structured and unstructured data.
  • Optimize data storage and retrieval methods to ensure performance, scalability, and cost-efficiency.
  • Manage AWS resources, including EC2, S3, Glue, Lambda, APIs, IAM, and CloudWatch.
  • Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL and AWS big data technologies.
  • Explore and learn the latest AWS technologies to provide new capabilities and increase efficiency.
  • Collaborate with Data Scientists and Business Intelligence Engineers (BIEs) to recognize and help adopt best practices in reporting and analysis.
  • Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
  • Maintain internal reporting platforms/tools including troubleshooting and development.
  • Interact with internal users to establish and clarify requirements in order to develop report specifications.
  • Work with Engineering partners to help shape and implement the development of BI infrastructure including Data Warehousing, reporting and analytics platforms.
  • Contribute to the development of the BI tools, skills, culture and impact.
  • Write advanced SQL queries and Python code to develop solutions.
  • Demonstrate working knowledge of Snowflake (see the illustrative sketch after this list).
  • Collaborate across teams to align AI initiatives with organizational goals, applying an understanding of AI concepts.
  • Apply knowledge of continuous integration/continuous delivery (CI/CD) pipelines and work on deployments when necessary.
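
As referenced in the Snowflake item above, the following is a minimal sketch of querying Snowflake from Python using the snowflake-connector-python package. The connection parameters, table, and column names are placeholders invented for illustration; real credentials would come from a secrets manager, never be hard-coded.

    import snowflake.connector

    # Placeholder connection details; substitute values for your environment.
    conn = snowflake.connector.connect(
        account="YOUR_ACCOUNT",
        user="YOUR_USER",
        password="YOUR_PASSWORD",
        warehouse="ANALYTICS_WH",  # hypothetical warehouse
        database="ANALYTICS_DB",   # hypothetical database
    )

    # The kind of aggregate query a reporting pipeline might run; the
    # jira_tickets table is invented for this example.
    query = """
        SELECT project, COUNT(*) AS ticket_count
        FROM jira_tickets
        GROUP BY project
        ORDER BY ticket_count DESC
    """

    cur = conn.cursor()
    try:
        cur.execute(query)
        for project, ticket_count in cur.fetchall():
            print(project, ticket_count)
    finally:
        cur.close()
        conn.close()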

Benefits

  • Robust health insurance plans, including Medical, Dental, and Vision coverage
  • 401(k) Retirement Plan, complete with a $1 for $1 Company Match up to 5%
  • Paid Parental Leave
  • Associate Assistance Plan
  • Education Assistance Program
  • Up to $30,000 in Adoption Assistance
  • Up to three weeks of vacation annually, alongside generous Holiday, Sick Leave, and Personal Day policies
  • New Hire Referral Bonus Program
  • Significant Home Purchase Discounts
  • Everyone’s Included Day