Data Warehouse Engineer II

The Larry H. Miller Company, Sandy, UT

About The Position

At the Larry H. Miller Company, our vision to be the best and our mission to enrich lives propel our strategic growth in operations, investments, and philanthropic efforts. Our business acumen, paired with our values — hard work, service, integrity, and stewardship — is what sets us apart; it’s who we are. Our founders, Larry and Gail, built our reputation with this unique perspective. It is the foundation of our legacy and our future: a future that promises transformational change through visionary leadership as we navigate an unprecedented landscape. By expanding our influence over an increasingly diverse portfolio of operating companies and investments, we embrace opportunities and obligations. Because, as we grow, so does our stewardship and ability to do good — for our employees, partners, and communities.

Headquartered in Sandy, Utah, the Larry H. Miller Company is a privately owned business with operations located mainly across the western United States. LHM’s focus falls within the primary categories of real estate, health care, finance, entertainment, sports, long-term strategy and investments, and philanthropy. For more information about LHM, visit www.lhm.com.

Reports to: Director of Data and Analytics

Requirements

  • The individual must demonstrate deep knowledge of an Active Data Warehousing environment.
  • The candidate will be responsible for designing, developing, testing, implementing, and supporting complex ETL processes.
  • Must be able to understand and analyze requirements and develop, deploy, and configure ETL packages.
  • The developer will use their experience to implement robust, maintainable solutions that meet business requirements.
  • Experience implementing ETL for Data Warehouse and Business Intelligence solutions
  • Experience in RDBMS design and development, and performance tuning
  • Deep familiarity with database technologies and cloud infrastructure
  • Experience in classifying and handling data types covered by HIPAA, PCI, CCPA, and similar regulations
  • Minimum of 5 years’ experience with an ETL tool (SSIS, Informatica, etc.)
  • Experience with scheduling ETL jobs using a scheduling tool
  • Experience in data streaming services
  • Ability to read and write effective, modular, dynamic (parameterized), and robust code, and to follow established code standards and the existing ETL framework
  • Excellent troubleshooting and optimization skills: interpreting ETL logs, performing data validation, dissecting code, and understanding the benefits and drawbacks of parallelism; experience with change data capture, expressions, variable scoping, commonly used transforms, event handlers, and logging providers; ability to understand and optimize surrogate key generation and inconsistent data type handling
  • Clear understanding of the fuzzy grouping, fuzzy lookup, term extraction, term lookup, and import/export column transforms
  • Experience in performance tuning and in deploying and administering ETL packages; proficiency with scripting
  • Experience designing jobs that can be promoted seamlessly from one environment (Dev) to another (Test or Prod) without modification
To perform the job successfully, an individual should demonstrate the following competencies, which are essential functions of this position:
  • Cultivate Innovation & Strategy: Create new and better ways for the organization to be successful. Come up with useful ideas that are new, better, or unique. Introduce new ways of looking at problems. Encourage diverse thinking to promote and nurture innovation.
  • Communicate Effectively: Develop and deliver multi-mode communications that convey a clear understanding of the unique needs of different audiences. Listen attentively to others. Adjust to fit the audience and the message. Provide timely and helpful information to others across the organization. Encourage the open expression of diverse ideas and opinions.
  • Collaborate: Build partnerships and work cooperatively with others across the organization to achieve shared objectives. Represent your own interests while being fair to others and their areas. Partner with others to get work done. Credit others for their contributions and accomplishments. Gain the trust and support of others.
  • Operate with Integrity: Demand the highest ethical standards from self and others by setting an example of positive attitude and professionalism. Demonstrate ethical and followership behaviors that promote Larry H. Miller standards.
  • Show stewardship in providing a neat, orderly, and safe work environment.
  • Follow LHMCO policy and procedures when conducting business with customers, other employees, vendors, and government officials.
  • Observe safety and security procedures and use equipment and materials properly.
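The parameterization requirement above (jobs promotable from Dev to Test or Prod without modification) can be sketched briefly. This is a minimal illustration, assuming an `ETL_ENV` variable and example server names that are not part of the posting:

```python
import os

# Hypothetical sketch: one way to make an ETL job promotable from Dev to
# Test to Prod without modification is to resolve every environment-specific
# value from a single parameter. The ETL_ENV variable, server names, and
# batch sizes below are illustrative assumptions, not details from the posting.

ENVIRONMENTS = {
    "dev":  {"server": "sql-dev.example.com",  "database": "DW_Dev",  "batch_size": 1000},
    "test": {"server": "sql-test.example.com", "database": "DW_Test", "batch_size": 10000},
    "prod": {"server": "sql-prod.example.com", "database": "DW_Prod", "batch_size": 50000},
}

def job_config(env=None):
    """Build job parameters for the target environment; nothing is hard-coded
    in the job itself, so the same package runs unchanged everywhere."""
    env = (env or os.environ.get("ETL_ENV", "dev")).lower()
    if env not in ENVIRONMENTS:
        raise ValueError("unknown environment: " + env)
    cfg = dict(ENVIRONMENTS[env])
    cfg["connection_string"] = (
        "Server={server};Database={database};Trusted_Connection=yes;".format(**cfg)
    )
    return cfg
```

In SSIS the same idea is usually expressed with project parameters and environment references in the SSIS catalog; the sketch only shows the underlying principle.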

Responsibilities

  • Protect the legal, financial, and moral well-being of the LHMCO and the portfolio companies.
  • Be a teacher to support the efforts of other employees to be successful.
  • Seek ways to improve business operations efficiencies.
  • Serve as a resource to the other Data Engineers on the team
  • Solidify and extend existing ETL Processes and Framework
  • Design ETL jobs and reusable components to implement specified business requirements
  • Troubleshoot and optimize ETL code: interpret ETL logs, perform data validation, dissect code, and weigh the benefits and drawbacks of parallelism; apply best practices for change data capture, expressions, variable scoping, commonly used transforms, event handlers, and logging providers; understand and optimize surrogate key generation and inconsistent data type handling
  • Create technical specifications documents and design process diagrams
  • Develop functional specifications for data acquisition, transformation, and load processes.
  • Develop scripts for data file processing and process integration tasks
  • Define job parameters, reference lookups, filter criteria
  • Conduct code reviews and participate in technical design
  • Conduct performance analysis and optimize scheduled jobs, custom SQL, and T-SQL jobs
  • Prepare unit and integration test plan documents; perform unit and system integration testing and document test results
  • Support team members during functional, UAT, regression, and performance testing.
  • Manage data replication from various data sources
  • Design and develop BI reports based on business requirements
  • Design and implement semantic models for advanced analytics
  • Provide on-call support for ETL and BI processes
  • Create jobs that will pull data from a variety of databases and flat files
  • All other duties as assigned
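The surrogate key and inconsistent-type responsibilities above can be illustrated with a small sketch. The class and function names are hypothetical, not from the posting:

```python
# Hypothetical sketch of the surrogate key pattern in the responsibilities:
# a dimension lookup assigns one stable integer key per business key, and
# inconsistent source types ("00042", 42, 42.0) are normalized first so the
# same entity never receives two surrogate keys.

def normalize_business_key(raw):
    """Coerce differently typed representations of the same key to one form."""
    if isinstance(raw, float) and raw.is_integer():
        raw = int(raw)                 # 42.0 -> 42
    text = str(raw).strip()
    if text.isdigit():
        text = str(int(text))          # "00042" -> "42"
    return text

class SurrogateKeyGenerator:
    """In-memory stand-in for a dimension table's key lookup."""
    def __init__(self):
        self._keys = {}                # normalized business key -> surrogate key
        self._next = 1

    def key_for(self, raw_business_key):
        bk = normalize_business_key(raw_business_key)
        if bk not in self._keys:       # unseen entity: assign the next key
            self._keys[bk] = self._next
            self._next += 1
        return self._keys[bk]
```

In a real warehouse the lookup would be a cached dimension-table query rather than a dict; the normalization step is what prevents one entity from getting duplicate surrogate keys.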
© 2024 Teal Labs, Inc