Staff Enterprise Data Engineer

TriNet
Atlanta, GA
Onsite

About The Position

TriNet is a leading provider of comprehensive human resources solutions for small to midsize businesses (SMBs). We enhance business productivity by enabling our clients to outsource their HR function to one strategic partner, allowing them to focus on operating and growing their core businesses. Our full-service HR solutions include payroll processing, human capital consulting, employment law compliance and employee benefits, including health insurance, retirement plans and workers' compensation insurance. TriNet has a nationwide presence and an experienced executive team. Our stock is publicly traded on the NYSE under the ticker symbol TNET. If you're passionate about innovation and making an impact on the large SMB market, come join us as we power our clients' business success with extraordinary HR.

Don't meet every single requirement? Studies have shown that many potential applicants discourage themselves from applying to jobs unless they meet every requirement. TriNet strives to hire the most qualified candidate for each role so we can deliver outstanding results for our small and medium-size customers. If you're excited about this role but your past experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway; nobody's perfect, and you may be the right candidate for this or other roles.

Job Summary

The Data Engineer will build, manage and optimize data pipelines and move them into production for key data and analytics consumers such as business/data analysts, data scientists and other business partners who need curated data for analytics use cases across the enterprise. This role is the key interface in operationalizing data and analytics on behalf of the business unit(s) and organizational outcomes, and it requires creative and collaborative work with IT and the wider business. It also involves evangelizing effective data management practices and promoting a better understanding of data and analytics. In collaboration with key business partners and IT specialists, the Data Engineer will plan and implement optimal analytics and data science solutions using a variety of technologies, including on-premises and cloud provider services (AWS or Azure).

Requirements

  • Bachelor's Degree in Computer Science/Engineering or equivalent experience - preferred
  • Typically 8 or more years of experience implementing Data & BI projects in a large-scale enterprise data lake/warehouse environment - required
  • Typically 8 or more years of experience in ETL/ELT architecture, with hands-on experience developing ETL/ELT jobs using tools such as Informatica PowerCenter, Informatica Cloud (IICS) and AWS Glue - required
  • Must have working experience with SFDC and PeopleSoft in the Sales, Marketing, Finance or Support domains
  • Knowledge of or experience working with cloud data warehouses such as Snowflake and AWS Redshift, along with Python and ANSI SQL
  • Proven hands-on experience with Informatica components such as Designer, Workflow Manager, Workflow Monitor and Repository Manager, as well as Python
  • Hands-on experience with performance tuning of databases such as Oracle and Postgres, and strong command of SQL, PL/SQL and ANSI SQL, plus Python, Unix shell and Perl scripting
  • Informatica PowerCenter/Informatica Cloud (IICS) - preferred
  • Knowledge of end-to-end SDLC process in EDW, Data Lake, BI & MLOps projects - Advanced
  • Dimensional modeling and data warehouse/ODS concepts, such as star schemas, snowflake schemas and normalized data models - Advanced
  • Ability to coordinate effectively with on-site and offshore resources through Managed Service Providers & IT Teams - Intermediate
  • Knowledge of reporting tools such as Tableau is desired - Intermediate
  • Excellent verbal and written communication skills - Advanced

Responsibilities

  • Create, maintain, and optimize data pipelines from development to production for specific use cases.
  • Use innovative, modern tools, data services, techniques and well-architected frameworks to partially or completely automate the most common, repeatable and tedious data preparation and integration tasks, minimizing manual, error-prone processes and improving productivity.
  • Assist with renovating the data management infrastructure to drive automation in data integration and management. This will include (but not be limited to):
      • Learning and using modern data preparation, integration and AI-enabled metadata management tools and techniques, including MLOps frameworks and infrastructure automation
      • Tracking data consumption patterns
      • Performing intelligent sampling and caching
      • Monitoring schema changes
      • Recommending, or sometimes even automating, existing and future integration flows
  • Collaborate closely with data science teams and business (data) analysts to refine their data requirements for various data and analytics initiatives and their data consumption needs.
  • Build, model and curate the data lake/warehouse and other data consumption methods.
  • Perform other duties as assigned.
  • Comply with all policies and standards.

Benefits

  • medical, dental, and vision plans
  • life and disability insurance
  • a 401(k) savings plan
  • an employee stock purchase plan
  • eleven (11) Company-observed holidays
  • PTO and a comprehensive leave program