Data Engineer III

Dallas College

About The Position

The Data Engineer III will lead the design and implementation of robust, scalable data solutions and analytical models to support enterprise analytics and AI initiatives. This role requires advanced technical expertise, leadership in setting architectural standards, and the ability to drive innovation through automation and cloud integration. The position involves leading technical initiatives, mentoring team members, and ensuring compliance with data governance and regulatory requirements.

Requirements

  • Advanced data engineering principles and architectural design.
  • Cloud integration strategies and AI/ML pipeline design.
  • Regulatory compliance frameworks (FERPA, TRAIGA).
  • Best practices for data security, privacy, and governance.
  • DevOps methodologies for data engineering.
  • Ability to lead technical initiatives and set organizational standards.
  • Expertise in automation and CI/CD for data workflows.
  • Strong analytical and problem-solving capabilities.
  • Proficiency in performance tuning and cost optimization.
  • Effective leadership and mentoring skills.
  • Advanced communication skills for technical and executive audiences.
  • Ability to design and enforce governance standards across the enterprise.
  • Ability to evaluate and integrate emerging AI technologies.
  • Ability to manage complex projects and prioritize competing demands.
  • Ability to foster collaboration across technical and business teams.
  • Ability to anticipate future data needs and plan scalable solutions.
  • Bachelor’s degree in Computer Science, Data Engineering, or a related field.
  • 5+ years of relevant experience in data engineering or a related field.
  • Advanced proficiency in data processing frameworks, programming languages (e.g., SQL, Python), and databases.
  • Expertise in designing scalable architectures and analytical models.
  • Strong knowledge of cloud-based integration, AI, and analytics platforms.
  • Ability to automate pipelines using DevOps practices.
  • Excellent leadership, communication, and problem-solving skills.
  • Ability to maintain the security or integrity of the critical infrastructure of Dallas College.
  • Microsoft Fabric (OneLake, Data Factory, Synapse Data Engineering, Synapse Data Warehouse)
  • Power BI (semantic models, dataflows, paginated reports)
  • Microsoft Purview (data governance, lineage, cataloging)
  • Azure OpenAI
  • Copilot Studio
  • Completion of required Dallas College professional development hours: all employees must complete a minimum of 19 hours per academic year.

Nice To Haves

  • Bilingual or multilingual preferred

Additional Information

  • Will be subject to a criminal background check.
  • Some positions may be subject to a fingerprint check.

Responsibilities

  • Design and implement enterprise-scale data architectures and pipelines using Microsoft Fabric components (OneLake, Data Factory, Synapse Data Engineering, Synapse Data Warehouse).
  • Develop analytical data models to support business intelligence and advanced analytics.
  • Automate and streamline data workflows using DevOps practices and CI/CD pipelines.
  • Lead technical initiatives and establish best practices for data engineering standards.
  • Conduct performance tuning and capacity planning for large-scale data systems.
  • Evaluate emerging technologies and recommend adoption strategies.
  • Ensure high availability, disaster recovery, and security compliance for data platforms.
  • Integrate data solutions with cloud-based platforms and AI services.
  • Optimize data systems for scalability, performance, and security.
  • Implement cost optimization strategies for cloud resources.
  • Design and enforce data governance standards, including data quality frameworks and classification policies.
  • Maintain compliance with FERPA and TRAIGA requirements.
  • Document data lineage and catalog datasets in Microsoft Purview.
  • Establish monitoring and auditing processes for data governance adherence.
  • Architect AI/ML Data Pipelines: Design and implement robust, scalable pipelines optimized for machine learning workloads, including feature engineering and real-time data ingestion.
  • Evaluate AI Services: Assess and recommend AI platforms and services (e.g., Azure OpenAI, Copilot Studio) for institutional use cases, ensuring alignment with strategic goals.
  • Model Operationalization: Collaborate with data scientists to deploy models into production environments, integrating monitoring and retraining workflows.
  • Experiment Management: Establish frameworks for experiment tracking, version control, and reproducibility to support research and development.
  • Compliance & Governance: Ensure all AI/ML initiatives comply with FERPA and TRAIGA (Texas Responsible AI Governance Act, effective January 2026), including ethical AI practices and risk assessments.
  • Performance Optimization: Implement strategies for model performance tuning, scalability, and cost efficiency in cloud environments.
  • AI Strategy Leadership: Contribute to institutional AI strategy by defining standards, best practices, and governance for AI-driven solutions.
  • Security & Privacy: Design secure data handling processes for sensitive datasets used in AI/ML workloads, including anonymization and encryption.
  • Mentor junior and mid-level engineers, fostering technical growth and knowledge sharing.
  • Collaborate with engineers, data scientists, analysts, and product teams to align data strategies with business objectives.
  • Lead cross-functional technical workshops and architecture reviews.
  • Perform other duties as assigned.