Data Engineer Associate

Teacher Retirement System of Texas (TRS), Austin, TX
Onsite

About The Position

The Information Technology (IT) Division lays the foundation for TRS to deliver excellent service experiences across the organization and with our members. We serve with purpose through mentorship and collaboration across a broad variety of teams unified by innovation to create technology and information solutions that have a positive impact on our members’ lives. We invite you to join one of Austin’s Top Workplaces. TRS offers a best-in-class combination of technology and continuous learning opportunities to equip you to solve problems, expand your knowledge, and create impact for 1 in 20 Texans.

The Data Engineer Associate is responsible for supporting the development and maintenance of data platforms and pipelines, ensuring smooth operations and data integrity. The incumbent will assist with ETL processes, CI/CD deployments, data modeling, managing data catalogs for accessibility, performing basic data analysis, creating reports and dashboards, and collaborating with stakeholders to support data-driven applications and governance initiatives. This position works in a highly collaborative environment and contributes to Agile team activities, proactively partnering with the Data Engineering Team, IT staff, enterprise stakeholders, and agency employees.

Requirements

  • Bachelor's degree from an accredited college or university in computer information systems, computer science, data management, information systems, information science, mathematics or a closely related field.
  • High school diploma or equivalent and additional full-time experience in data pipeline builds and implementation may be substituted on an equivalent year-for-year basis.
  • One (1) year of full-time experience in data pipeline builds and implementation, including using ETL tools to create ETL scripts and using Power BI or an equivalent modern data visualization package, or related experience.
  • Experience may be concurrent.
  • A master's degree or doctoral degree in a closely related field may be substituted on an equivalent year-for-year basis.
  • Strong knowledge of business intelligence tools, databases, SQL, and object-oriented programming languages to support data analysis and predictive analytics.
  • Database modeling and data warehousing principles.
  • System development life cycle concepts.
  • Basic understanding of data platforms and the ability to assist in platform management.
  • Understanding of basic ETL concepts and the ability to assist in building pipelines.
  • Basic knowledge of data architecture and modeling.
  • Basic familiarity with Microsoft Fabric, Databricks, Snowflake, Spark, and cloud platforms.
  • Basic understanding of web services and data formats (REST, SOAP, XML, JSON) and of encryption and secure transfer (SSL, SSH, SFTP, certificates).
  • Basic proficiency in Python and SQL.
  • Understanding of data governance principles and metadata management practices.
  • Experience using technical and statistical analysis skills to deliver clear, concise, and visually appealing management metrics and reports to inform decision making and actions.
  • Writing report queries, including an understanding of relational databases.
  • Business process analysis and problem solving.
  • Completing detailed work with a high degree of accuracy.
  • Planning, organizing, and prioritizing work assignments to manage a high-volume workload in a fast-paced and changing environment.
  • Using a computer in a Windows environment with Microsoft Office word processing, spreadsheet, and other business software.
  • Effective written and verbal communications, including explaining complex information to others in an understandable manner and writing clear and precise policies, procedures, and training or other materials.
  • Working in a team environment and influencing a diverse group of stakeholders.
  • Operating effectively in a fast-paced environment with competing and shifting priorities.
  • Continually learning new concepts and tools to support job needs, and striving for ongoing professional development.
  • Establishing and maintaining harmonious working relationships with co-workers, agency staff, and external contacts.
  • Working effectively in a professional team environment.
  • Ability to assist with automating metadata ingestion and contribute to improving catalog usability through user feedback.
  • Support integration of data catalogs with BI tools, ETL pipelines, and APIs under guidance.
  • Ability to participate in Scrum activities, with a basic understanding of Agile methodologies.

Nice To Haves

  • Experience with data architecture and modeling.
  • Experience with Microsoft Fabric, Databricks, Snowflake, Spark, SQL, Python and cloud platforms.
  • Experience with web services and data formats (REST, SOAP, XML, JSON) and with encryption and secure transfer (SSL, SSH, SFTP, certificates).
  • Experience with AI and data governance.

Responsibilities

  • Assist in managing data platforms, ensuring smooth operation and helping troubleshoot basic issues.
  • Support in building and maintaining ETL pipelines, ensuring data is properly processed and transferred between systems.
  • Assist in the implementation of CI/CD deployment processes to streamline data pipeline deployments.
  • Assist in creating and maintaining basic data models to ensure structured data storage and accessibility.
  • Help manage data catalogs to maintain an organized and easily accessible repository of data resources.
  • Perform administrative tasks and assist in basic system administration duties, contributing to the overall management and upkeep of data systems.
  • Assist in managing data permissions, ensuring proper access control and data security.
  • Contribute to the maintenance and optimization of the data warehouse, ensuring data integrity and availability.
  • Help develop data-driven applications, contributing to the creation of data solutions that support business needs.
  • Conduct basic data analysis and assist in larger data analysis projects, providing insights and support to business stakeholders.
  • Participate in scrum activities and gain exposure to Agile methodologies.
  • Generate basic reports using reporting tools, and provide support in more advanced reporting tasks as required.
  • Work with stakeholders to gather requirements and assist in defining project scopes and objectives.
  • Assist in developing visually appealing and intuitive dashboards that provide visibility into key performance, risk, and control indicators across agency divisions and departments.
  • Identify and register new data sources into the data catalog.
  • Tag datasets with relevant metadata (e.g., business terms, sensitivity level, owner).
  • Connect the catalog to data sources, BI tools, and ETL pipelines.
  • Automate metadata ingestion from new systems.
  • Implement APIs for catalog queries and updates.
  • Work with data stewards to classify data assets.
  • Provide guidance on how to search and use the catalog effectively.
  • Create documentation and training materials for business users.
  • Gather feedback to improve catalog usability.
  • Perform related work as assigned.