Data Analyst V

TX-HHSC-DSHS-DFPS, Austin, TX (Hybrid)

About The Position

The Texas Department of Family and Protective Services (DFPS) works to build on the strengths of families and communities to keep children and vulnerable adults safe so that they can thrive. We do this through investigations, services, and referrals. This position performs highly complex (senior-level) data analysis and data research work, including statistical modeling on existing data sets to identify patterns and trends and to report on the results. Works under limited supervision, with moderate latitude for the use of initiative and independent judgment.

Requirements

  • Knowledge of statistics and data set analysis; of running queries, writing reports, and presenting findings; and of record keeping, including security procedures for handling, protecting, and distributing confidential data.
  • Knowledge of relational and non-relational databases, data modeling, and SQL programming.
  • Skill in creating dashboards and visualizations with Tableau or Power BI to present findings.
  • Ability to establish and support data security and privacy compliance across data platforms.
  • Familiarity with data models, mining, and segmentation techniques in analytics workflows.
  • Skill in the use of a computer and applicable software, in conducting data searches, in evaluating and translating large amounts of data, and in critical thinking.
  • Skill in building, maintaining, and troubleshooting data pipelines (ETL/ELT).
  • Skill in Python and/or R programming for data analysis and automation.
  • Skill in advanced troubleshooting and problem isolation in complex data environments.
  • Experience with cloud-based integration platforms (e.g., Azure Databricks).
  • Ability to compile, review, and analyze data; to prepare reports; to maintain accuracy and attention to detail; and to communicate effectively.
  • Ability to prepare concise, comprehensive technical reports and to justify data revisions.
  • Ability to automate recurring reporting using Tableau Cloud or on-premises reporting platforms.
  • Ability to communicate effectively with both technical and non-technical audiences, including leadership.
  • Ability to maintain accuracy and attention to detail when translating large datasets into insights.
  • Knowledge of data models, database design and development, data mining, and segmentation techniques.
  • Ability to collaborate on the development and maintenance of logical and physical data models for analytics and operations.
  • Ability to work closely with IT and business teams to implement data architecture standards and integration processes.
  • Skill in using ETL/ELT tools (e.g., Informatica, Talend, Azure Data Factory, SSIS) to integrate enterprise data.
  • Knowledge of master data management (MDM) practices for managing golden records and enterprise reference data.
  • Skill in analyzing problems, applying critical thinking, and devising effective solutions to complex data challenges.
  • Ability to continuously improve data workflows through performance tuning and automation.
  • Ability to monitor database performance and implement efficiency improvements.
  • Ability to provide proactive and reactive support to resolve data pipeline and database issues.
  • Ability to oversee and/or supervise the work of others.
  • Ability to provide guidance and training on Databricks, MDM concepts, and ETL best practices to team members.
  • Ability to serve as a technical consultant and mentor to colleagues across project teams.
  • Ability to facilitate adoption of data governance policies within IT processes and project activities.
  • Ability to coordinate cross-functional collaboration among developers, architects, DBAs, and analysts.
  • Graduation from a four-year college or university with major coursework in computer engineering, computer science, information systems, information technology, or a STEM-related field. Work experience may be substituted for education on a year-for-year basis.

Responsibilities

  • Collect, query, and analyze data using standard statistical tools, applications, methods, and techniques.
  • Analyze and interpret datasets in Databricks notebooks using standard statistical tools to derive actionable insights.
  • Create dashboards and visualizations in Tableau or Power BI to communicate results.
  • Develop programming scripts to support statistical modeling and data analysis.
  • Integrate data from disparate sources via ETL processes to enable comprehensive analyses.
  • Interpret data analysis results to identify significant differences, patterns, and trends, and translate them into actionable insights that inform decisions.
  • Prepare concise, comprehensive technical reports with recommendations for decision makers.
  • Partner with stakeholders to align findings with program objectives and technology needs.
  • Consult with internal and external customers to identify data analytics needs.
  • Work with stakeholders to understand user needs and day-to-day process pain points.
  • Collaborate with developers, architects, DBAs, system administrators, and business analysts to scope solutions.
  • Provide guidance and training to team members on Databricks, MDM concepts, and ETL best practices.
  • Provide proactive and reactive database management support and training to development teams.
  • Serve as a technical consultant to colleagues across IT and program areas.
  • Clean and prune data to discard irrelevant information.
  • Identify data gaps, errors, anomalies, inconsistencies, and redundancies across sources.
  • Develop data quality measures, analyze results, and implement changes to improve quality.
  • Recommend and deploy tools to monitor and control data quality with business data owners.
  • Define data quality rules and measures within MDM initiatives.
  • Identify and interpret data patterns and trends and assess data quality.
  • Interpret results to highlight patterns, trends, and quality issues that impact business outcomes.
  • Monitor database performance and recommend efficiency improvements.
  • Use ETL processes to collect, clean, and integrate data from disparate systems for analysis.
  • Leverage cloud-based big data environments (e.g., Databricks) to monitor pipeline outputs for anomalies.
  • Prepare concise, comprehensive technical reports to present and interpret data, identify alternatives, and make and justify recommendations on data revisions.
  • Build supporting dashboards (Power BI/Tableau) to visualize findings for leadership.
  • Communicate results effectively to technical and non-technical audiences.
  • Document assumptions, methods, and limitations to support transparency and reuse.
  • Establish and maintain standard work procedures governing the appropriate use of data.
  • Define and implement standards, usage guidelines, and procedures for data management and reporting tools.
  • Provide data management and governance guidance for both cloud and on-premises environments.
  • Assist in designing and implementing database backup, recovery, and continuity procedures.
  • Coordinate with IT and business teams to document data architecture standards and integration processes.
  • Implement business metadata processes for naming, definitions, and valid values in the repository.
  • Guide the selection of data management tools and the development of standards, usage guidelines, and procedures for those tools.
  • Leverage ETL/ELT tools (e.g., Informatica, Talend, Databricks, Azure Data Factory, SSIS) and cloud platforms to meet requirements.
  • Identify key datasets as MDM candidates and define associated data quality rules and measures.
  • Develop an agency-wide access point for program terminology and reference data.
  • Assist in documenting procedures for data ingestion, transformation, storage, and retrieval.
  • Define, develop, and implement data and reporting standards.
  • Work with cross-functional teams to define data architecture standards and governance processes.
  • Document and implement procedures for data ingestion, transformation, storage, and retrieval.
  • Establish data security and privacy compliance throughout data platforms and workflows.
  • Standardize reporting semantics using a business glossary and metadata repository.
  • Develop software applications or programs for statistical modeling, data analysis, and graphical analysis.
  • Build and automate data pipelines using Databricks and ETL/ELT tools.

Benefits

  • 100% paid health insurance for you, and 50% paid for eligible family members—saving you hundreds every month in out-of-pocket medical costs
  • Retirement plans with lifetime monthly payments after five years of state service, plus options to save even more with 401(k) and 457 plans
  • Paid vacation, holidays, and sick leave so you can recharge and take care of life outside work (that’s time off you’re actually paid for)
  • Optional dental, vision, and life insurance—at rates much lower than most private plans
  • Flexible spending accounts for added tax savings on health and dependent care
  • Employee discounts on things like gym memberships, electronics, and entertainment
© 2024 Teal Labs, Inc