CFO Data Architect

TX-HHSC-DSHS-DFPS · Austin, TX
Hybrid

About The Position

The Health and Human Services Commission (HHSC) Chief Financial Officer Division's Data Architect I performs highly advanced data migration (ETL) activities, including detailed data mapping, data profiling and cleansing, authoring migration program specifications, and designing complex reconciliation reports. The role focuses on designing and leading the migration of data from legacy systems to new platforms, ensuring data accuracy, accessibility, and security throughout the process. This involves creating migration strategies, developing data models, and collaborating with various teams to minimize disruption and ensure a smooth transition.

The Data Architect provides recommendations, guidance, and leadership in business architecture and systems processing, and leads the creation of business architecture models and initiatives using project management best practices and industry architecture frameworks. The role provides analyses and consultation to agency leadership on key technology issues affecting the agency, researches emerging trends and ideas, and develops a comprehensive data center migration plan outlining the step-by-step process, timeline, and resource requirements.

Requirements

  • Knowledge of Information Technology theories and practical application approaches.
  • Knowledge of data center infrastructure and migration processes.
  • Knowledge of on-premises data storage systems, virtualization, and networking technologies.
  • Knowledge of data center architectures, data replication, and disaster recovery strategies.
  • Knowledge of data architecture, data modeling, and data warehousing.
  • Knowledge of business intelligence and big data solutions.
  • Knowledge of how to lead and manage advanced analytics projects.
  • Ability to develop, deploy, and manage applications and data storage.
  • Strong knowledge of data architecture principles and best practices.
  • Knowledge of advanced analytics tools and strategies.
  • Ability to lead teams and manage complex projects.
  • Ability to develop and maintain enterprise data models, data warehouses, and data lakes to support analytics and reporting.
  • Ability to design and implement solutions using SAS software, manage data, and integrate SAS with other systems.
  • Strong problem-solving and analytical skills, with the ability to develop effective migration plans and solutions.
  • Strong interpersonal and communication skills, with the ability to collaborate effectively with cross-functional teams.
  • Ability to work independently and efficiently manage time and tasks to meet project deadlines.
  • Ability to evaluate and integrate emerging technologies such as AI, machine learning, and big data analytics.
  • Skilled at working closely with data engineers, developers, and analysts to ensure seamless data accessibility and usability.
  • Skilled in working with ETL tools, data integration platforms, and APIs.
  • Skilled at architecting and designing ETL pipelines using Informatica tools.
  • Skilled in SQL and other database technologies.
  • Skilled in communication, with the ability to explain complex technical concepts to a non-technical audience.
  • Skills in SAS programming, data management, and system architecture.
  • Skills in communication, including with technical and non-technical audiences.
  • Skills in leadership and organizational management.
  • Bachelor's degree from a 4-year university in Information Technology, Computer Science, or a related field, or equivalent work experience (experience may be substituted for education on a year-for-year basis).
  • At least 3 years of experience in developing, deploying, and managing applications and data storage.
  • At least 2 years of experience developing and maintaining enterprise data models, data warehouses, and data lakes to support analytics and reporting.
  • At least 2 years of experience working closely with data engineers, developers, and analysts to ensure seamless data accessibility and usability.

Nice To Haves

  • 5 years of experience as a Data Center Migration Consultant or in a similar role, with expertise in data center infrastructure and migration processes.
  • Experience in data architecture, data modeling, and data warehousing.
  • Experience evaluating and integrating emerging technologies such as AI, machine learning, and big data analytics.
  • Experience in data center architectures, data replication, and disaster recovery strategies.
  • Experience using analytical skills to develop effective migration plans and solutions.
  • 4 or more years of experience handling data analytics projects, with on-the-job experience in Snowflake and Informatica.
  • Preferred certifications: Snowflake SnowPro, ITIL, or Informatica.

Responsibilities

  • Snowflake Support Duties:
      • Data Load and ETL Monitoring: monitor scheduled data loads and ETL jobs, troubleshoot failed or slow data loads, and coordinate with ETL developers on issue resolution.
      • Performance Monitoring and Optimization: monitor warehouse performance and query execution, identify and resolve bottlenecks (e.g., long-running queries), and recommend and implement warehouse scaling or clustering.
      • Maintenance and Housekeeping: manage storage usage and retention policies, archive or purge old or unnecessary data, and apply best practices for cost control.
      • Incident and Problem Management: respond to incidents, alerts, and service requests; document and escalate issues as needed; and maintain knowledge base articles for common issues.
      • Data Analysis and Ad Hoc Data Requests: work on data analysis requests, provide data extracts on demand for the program team, and automate common data requests.
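The long-running-query triage described above can be sketched as a small script. The five-minute threshold, the record shape, and the function name are illustrative assumptions, not part of the posting; in a real environment the records would come from a query against Snowflake's SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY view via the Snowflake connector.

```python
# Sketch: flag long-running Snowflake queries for review.
# Threshold and record fields are illustrative assumptions; real rows
# would be fetched from SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY.

THRESHOLD_MS = 5 * 60 * 1000  # flag anything over five minutes (assumed policy)

def flag_long_running(query_records, threshold_ms=THRESHOLD_MS):
    """Return (query_id, elapsed_ms) pairs exceeding the threshold,
    slowest first, so the worst offenders are triaged first."""
    flagged = [
        (rec["query_id"], rec["total_elapsed_ms"])
        for rec in query_records
        if rec["total_elapsed_ms"] > threshold_ms
    ]
    return sorted(flagged, key=lambda pair: pair[1], reverse=True)

# Hypothetical records shaped like QUERY_HISTORY rows:
records = [
    {"query_id": "q1", "total_elapsed_ms": 120_000},  # 2 min  -> ok
    {"query_id": "q2", "total_elapsed_ms": 900_000},  # 15 min -> flagged
    {"query_id": "q3", "total_elapsed_ms": 360_000},  # 6 min  -> flagged
]
print(flag_long_running(records))  # [('q2', 900000), ('q3', 360000)]
```

The output of a sweep like this feeds directly into the scaling/clustering recommendations the role calls for.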
  • SAS and IICS (Informatica Intelligent Cloud Services) Support Duties:
      • Job and Workflow Monitoring: monitor daily, weekly, and monthly ETL job runs; investigate and resolve job failures or performance issues; restart failed jobs and communicate status to stakeholders.
      • Connection and Integration Management: set up and maintain connections to source and target systems (databases, cloud applications, etc.) and troubleshoot connectivity issues.
      • Change and Release Management: deploy new or updated mappings, workflows, and tasks; validate and test changes in lower environments before promoting to production.
      • Incident and Service Request Handling: respond to tickets and service requests, provide root cause analysis for recurring issues, and document solutions and update runbooks.
      • Common Support Activities Across Both Platforms: maintain up-to-date documentation for processes, runbooks, and troubleshooting guides; provide regular status reports on incidents, job status, and platform health.
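The job-monitoring and restart workflow above can be sketched as follows. The job names, statuses, and record shape are hypothetical; in practice run records would come from the IICS monitoring API or SAS job logs.

```python
# Sketch: summarize ETL job runs and identify restart candidates.
# Job names and statuses are hypothetical illustrations.

from collections import Counter

def summarize_runs(runs):
    """Count runs per status, e.g. Counter({'SUCCESS': 2, 'FAILED': 1})."""
    return Counter(run["status"] for run in runs)

def jobs_to_restart(runs):
    """Names of jobs whose most recent run failed (restart candidates)."""
    latest = {}
    for run in sorted(runs, key=lambda r: r["run_ts"]):
        latest[run["job"]] = run["status"]  # later runs overwrite earlier
    return sorted(job for job, status in latest.items() if status == "FAILED")

runs = [
    {"job": "load_claims",  "run_ts": 1, "status": "SUCCESS"},
    {"job": "load_claims",  "run_ts": 2, "status": "FAILED"},
    {"job": "load_vendors", "run_ts": 1, "status": "SUCCESS"},
]
print(jobs_to_restart(runs))  # ['load_claims']
```

Keying on the most recent run per job avoids re-running a job whose earlier failure was already superseded by a successful retry.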
  • Coordinates with internal CFO division data users regarding data needs and system performance, communicates changes to CFO division users, and serves as liaison to the Information Technology division and other HHSC business areas regarding data needs and system performance.
  • Drafts and submits reports; responds to inquiries; participates in briefings, meetings, presentations, and other activities to ensure CFO division leadership is aware of data needs and system performance; develops and recommends solutions to data-related problems and concerns; and advises on training needs for CFO division staff to optimize user experience and expertise related to data.

Benefits

  • Our comprehensive benefits package includes 100% paid employee health insurance for full-time eligible employees, a defined benefit pension plan, generous time off benefits, numerous opportunities for career advancement and more.