Invitation Homes · Posted 4 months ago
$118,800 - $205,920/Yr
Full-time • Senior
Washington, DC
1,001-5,000 employees

Invitation Homes is the nation’s premier home leasing company, pioneering a new industry supported by advanced, robust technology solutions that enhance the resident experience. We are looking for innovative, dynamic individuals who are passionate about building business-focused technology solutions using best-of-breed tech stacks and taking the platform to the next level.

The Data Architect will play a key role in designing, implementing, and maintaining robust data infrastructure to support our organization's data-driven initiatives. The ideal candidate has a strong background in data engineering, with expertise in data modeling, data processing, Python programming, and SQL/PL-SQL database scripting encompassing procedures, functions, and dynamic SQL. This role is crucial in shaping the architecture of our data platform solutions: building and optimizing data models, and standardizing and implementing efficient ETL processes. Active involvement in database management, ensuring data quality and governance, and applying programming skills to automation and scripting tasks are also integral to the position. If you are motivated, passionate, a quick learner, and have outstanding data engineering skills, this role is waiting for you!

Invitation Homes does not offer employment-based visa sponsorship for this role at this time.

Responsibilities:

  • Develop and implement comprehensive, scalable data models that align with business requirements and objectives.
  • Collaborate closely with data architects, analysts, and information engineers to understand data needs, ensuring that data models are optimized for both performance and analytics.
  • Regularly review and enhance existing data models to accommodate evolving business requirements and ensure long-term sustainability.
  • Design, develop, and deploy robust ETL processes to extract, transform, and load data from diverse sources into the data lake platform.
  • Work closely with business stakeholders to gather and understand data integration requirements, ensuring ETL workflows meet the organization's data processing needs.
  • Monitor and troubleshoot ETL processes, addressing issues promptly to maintain data integrity and minimize downtime.
  • Manage and configure orchestration of data processing workflows using an enterprise scheduler such as Apache Airflow on AWS (MWAA); see the pipeline sketch after this list.
  • Leverage Python and SQL to construct frameworks for source data extraction, transformation, and loading tasks.
  • Craft custom PowerShell and Unix shell scripts to build end-to-end automated pipelines.
  • Build reusable scripts to automate the code deployment process (CI/CD) across environments.
  • Develop scripts to manage and migrate infrastructure as code (IaC) using Python, Terraform, or AWS CloudFormation; see the deployment sketch after this list.
  • Implement data governance policies and practices to ensure the accuracy, consistency, and security of organizational data.
  • Actively participate in data quality improvement initiatives and provide guidance on best practices.
  • Implement solutions to automate code reviews based on organizational best practices and standardization guidelines; a linting sketch follows this list.
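
To make the orchestration and ETL responsibilities concrete, here is a minimal sketch of the kind of daily extract-transform-load DAG this role would own. It assumes Apache Airflow 2.4+ (as run on MWAA); the DAG, table, and bucket names are hypothetical, and the extract/load bodies are stubbed where a real pipeline would use database hooks and Parquet writers.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical target location -- replace with the real data lake path.
TARGET_PATH = "s3://example-data-lake/curated/leases/"

def extract(**context):
    """Pull incremental rows from the source system (stubbed here;
    a real task would use a database hook and the execution date)."""
    rows = [{"id": 1, "amount": 1850.0, "updated_at": "2024-01-01"}]
    context["ti"].xcom_push(key="rows", value=rows)

def transform(**context):
    """Drop records that fail basic quality rules."""
    rows = context["ti"].xcom_pull(key="rows", task_ids="extract")
    cleaned = [r for r in rows if r["amount"] is not None]
    context["ti"].xcom_push(key="rows", value=cleaned)

def load(**context):
    """Write curated rows to the lake (stubbed; real code would write Parquet)."""
    rows = context["ti"].xcom_pull(key="rows", task_ids="transform")
    print(f"would write {len(rows)} rows to {TARGET_PATH}")

with DAG(
    dag_id="leases_incremental_etl",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```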
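
Likewise, a minimal infrastructure-as-code deployment sketch using boto3 against AWS CloudFormation, one of the IaC routes named above. The stack name and template are hypothetical, and a production script would add error handling beyond what is shown.

```python
import json

import boto3

STACK_NAME = "data-platform-storage"  # hypothetical stack name

# Minimal hypothetical template: one raw-zone bucket for the data lake.
TEMPLATE = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "RawZoneBucket": {
            "Type": "AWS::S3::Bucket",
            "Properties": {"BucketName": "example-raw-zone-bucket"},
        }
    },
}

def deploy_stack() -> None:
    """Create the stack, or update it if it already exists."""
    cfn = boto3.client("cloudformation")
    body = json.dumps(TEMPLATE)
    try:
        cfn.create_stack(StackName=STACK_NAME, TemplateBody=body)
        waiter = cfn.get_waiter("stack_create_complete")
    except cfn.exceptions.AlreadyExistsException:
        # A production script would also catch the "No updates are to be
        # performed" ClientError that update_stack raises on a no-op.
        cfn.update_stack(StackName=STACK_NAME, TemplateBody=body)
        waiter = cfn.get_waiter("stack_update_complete")
    waiter.wait(StackName=STACK_NAME)

if __name__ == "__main__":
    deploy_stack()
```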
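
And for the automated code review responsibility, a small illustrative linter that scans SQL files for disallowed patterns and fails a CI step on findings. The two rules shown are placeholders for an organization's actual standards.

```python
import pathlib
import re
import sys

# Placeholder rules -- a real ruleset would encode the team's standards.
RULES = [
    (re.compile(r"\bSELECT\s+\*", re.IGNORECASE),
     "avoid SELECT *; list columns explicitly"),
    (re.compile(r"\bDELETE\s+FROM\s+\w+\s*;", re.IGNORECASE),
     "DELETE without a WHERE clause"),
]

def lint_sql(path: pathlib.Path) -> list[str]:
    """Return one finding per rule violation in the given SQL file."""
    findings = []
    for lineno, line in enumerate(path.read_text().splitlines(), start=1):
        for pattern, message in RULES:
            if pattern.search(line):
                findings.append(f"{path}:{lineno}: {message}")
    return findings

def main(root: str) -> int:
    findings = []
    for path in sorted(pathlib.Path(root).rglob("*.sql")):
        findings.extend(lint_sql(path))
    print("\n".join(findings) or "no findings")
    return 1 if findings else 0  # nonzero exit code fails the CI step

if __name__ == "__main__":
    sys.exit(main(sys.argv[1] if len(sys.argv) > 1 else "."))
```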

Qualifications:

  • Bachelor’s degree in Computer Science or equivalent work experience.
  • 7+ years of professional experience building enterprise-scale data warehousing, data engineering, and data lake solutions.
  • 3+ years of experience in data modeling, e.g., dimensional modeling, normalized models, and One Big Table (OBT) designs.
  • Knowledge of one or more ETL/ELT tools such as AWS Glue, dbt, SSIS, Apache Spark, or Informatica.
  • 3+ years of hands-on experience with modern columnar data platforms such as Snowflake, AWS Redshift, or Azure Synapse.
  • Extensive experience with SQL/PL-SQL scripting, dynamic SQL, and performance tuning.
  • 3+ years of experience using Python to implement data engineering solutions.
  • Good knowledge of data pipeline orchestration tools such as Airflow, Control-M, and AutoSys.
  • 3+ years of experience working on a cloud platform, preferably AWS, and related data engineering services such as Glue, S3, Lambda, CloudWatch, Parameter Store, and MWAA.
  • Strong understanding of core infrastructure components (servers, network, storage).
  • Experience with version control tools such as Git in a team environment.
  • Experience working in an agile development environment.
  • Experience working with Salesforce and Yardi in the real estate domain is highly desirable.
  • Experience with building analytic solutions applicable to Sales, Finance, Product, Operations, and Marketing organizations in an enterprise.
  • Experience managing, measuring, and improving data quality in a data warehouse; see the quality-check sketch after this list.
  • Experience working in large teams using CI/CD and agile methodologies.
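
As a concrete example of measuring warehouse data quality, the following sketch runs declarative row-count and null-key checks over a dimension table. sqlite3 stands in for the warehouse connection so the snippet is self-contained; in practice the connection would come from the Snowflake or Redshift DB-API driver, and the check list would be configuration-driven. The table and column names are hypothetical.

```python
import sqlite3

# Declarative checks: (name, probe query, pass condition).
CHECKS = [
    ("row_count",
     "SELECT COUNT(*) FROM dim_property",
     lambda n: n > 0),
    ("null_keys",
     "SELECT COUNT(*) FROM dim_property WHERE property_id IS NULL",
     lambda n: n == 0),
]

def run_checks(conn) -> dict[str, bool]:
    """Execute each probe query and evaluate its pass condition."""
    return {name: passes(conn.execute(query).fetchone()[0])
            for name, query, passes in CHECKS}

if __name__ == "__main__":
    # In-memory stand-in for the warehouse connection.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE dim_property (property_id INTEGER, city TEXT)")
    conn.execute("INSERT INTO dim_property VALUES (1, 'Washington')")
    print(run_checks(conn))  # {'row_count': True, 'null_keys': True}
```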

Benefits:

  • Annual bonus program
  • Health, dental, vision, and life insurance
  • Long-term and short-term disability insurance
  • Generous paid time off plans, including vacation accrual, sick time, standard holidays, and floating holidays
  • 401(k) with company matching contributions
  • Awesome work environment with casual dress
  • Team events and gatherings (pre- and post-COVID)