Data Architect (Hybrid) - Tallahassee, FL

Novalink Solutions LLC

About The Position

As part of the Department's ongoing efforts to modernize core enterprise data systems, data initiatives supporting the strategic priorities of safety, mobility, cybersecurity, and operational efficiency must be documented and modernized through a framework that enables scalable data integration and analytics. The successful candidate will design and implement a data integration framework to support the Department's enterprise data assets, which consist of interfaces, enterprise applications, databases, automated processes, and reporting programs. The candidate will guide the transition and integration of this enterprise data to unify the various asset domains.

The integration framework is anchored on three complementary components:

  • Join location data with engineering datasets.
  • Integration backbone: orchestrates extract, transform, and load (ETL) pipelines, application programming interfaces (APIs), and service endpoints; provides interoperable data flows across statewide enterprise systems; and enforces lineage, stewardship, and “collect once, use many” principles.
  • Blueprint to standardize data schemas.

Together, these components transform the Department's enterprise data ecosystem into a connected, governed, and analytics-ready environment in which each data asset shares a common foundation of enterprise technology and governance. The work involves reviewing as-is business processes, remediation strategies, reengineering, design, and integration. Ensuring compatibility with cloud architecture is a key objective, in alignment with the state's cloud-first policy. The role focuses on the analysis and remediation of data so that it is secure, scalable, and built on platforms the Department has already invested in, such as Azure, Snowflake, Informatica, and PostgreSQL. Once the data is integrated and provisioned within the agency, the candidate will verify data quality. The candidate will play a pivotal role in ensuring that the transition to an integrated, modern digital asset management system is smooth, efficient, and aligned with the Department's strategic objectives, supporting the modernization of financial management processes and data architecture.
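As a purely illustrative example of the first component, the sketch below joins a hypothetical location table with a hypothetical engineering dataset in PostgreSQL. The table names, columns, and connection string are assumptions for demonstration only, not the Department's actual schema.

```python
# Minimal sketch of the "join location data with engineering datasets" component.
# Table and column names (asset_locations, engineering_assets) are hypothetical;
# the Department's actual schemas and connection details will differ.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:password@host:5432/enterprise")

query = """
    SELECT a.asset_id,
           a.route_id,
           a.milepoint,
           e.asset_type,
           e.condition_rating
    FROM   asset_locations    AS a
    JOIN   engineering_assets AS e ON e.asset_id = a.asset_id
"""

# Load the joined result for downstream transformation or loading into Snowflake.
joined = pd.read_sql(query, engine)
print(joined.head())
```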

Requirements

  • A minimum of 7 years of experience with large and complex database management systems.
  • A minimum of 7 years of experience as a data architect or senior data analyst, with extensive experience in designing, implementing, and maintaining data models and ETL processes.
  • 10+ years of broad experience working with database management systems (such as Oracle, SQL Server, or DB2) and data integration tools.
  • Strong background in financial data systems, including data modeling, data governance, and data quality management.
  • Proven experience with ETL tools, data warehousing, and automation of data processes to ensure accuracy, timeliness, and completeness of data.
  • Experience with data visualization and reporting best practices, and familiarity with tools such as Power BI, Tableau, or similar platforms.
  • Familiarity with cloud-based data integration solutions, such as Azure or Snowflake.
  • Proficiency in SQL and scripting languages for database querying and data manipulation.
  • Strong analytical and problem-solving skills, with the ability to conduct complex systems analysis and identify areas for improvement.
  • Translate business requirements into technical solutions that scale.
  • Create architecture diagrams, data dictionaries, and governance playbooks for stakeholders.
  • Ability to design scalable and robust canonical data models and coordinate schemas across disparate systems to support business processes and reporting needs.
  • Ability to work with ETL tools to streamline and automate data extraction, transformation, and loading processes.
  • Build efficient ingestion and transformation workflows using Informatica or Azure Data Factory.
  • Skilled in developing RESTful endpoints for spatial and engineering data exchange (see the sketch following this list).
  • Skilled in verbal and written communication, with the ability to explain technical concepts to non-technical stakeholders.
  • Competency in project management, including the ability to plan, prioritize, and manage multiple tasks effectively.
  • Strong collaboration and interpersonal skills to work effectively with cross-functional teams and stakeholders to align technical and business goals.
  • Skilled in conducting data impact assessments and presenting findings to management and development teams.
  • Ability to lead and mentor technical teams, providing guidance on best practices in data architecture, integration, and quality management.
  • Ability to conduct detailed analysis of existing systems and identify opportunities for optimization and modernization.
  • Capability to develop comprehensive data models and ETL processes that align with business requirements and organizational goals.
  • Ability to translate user requirements into technical specifications and scalable data solutions.
  • Ability to integrate research, trends, and industry best practices into continuous improvement efforts.
  • Ability to communicate effectively with stakeholders at all levels, ensuring alignment between technical solutions and business needs.
  • Ability to adapt to changing requirements and priorities, maintaining a focus on quality and project deadlines.
  • Strong organizational skills and attention to detail, with the ability to document data architecture, processes, and standards.
  • Ability to ensure data solutions comply with security, privacy, and compliance standards.
  • In-depth understanding of data modeling principles, including relational and dimensional modeling techniques, canonical models, schema design, and integration patterns.
  • Comprehensive understanding of ETL processes and data integration tools like Informatica, Snowflake, or Microsoft SSIS.
  • Understanding of data pipelines, batch vs. real-time integration, and API-driven architectures.
  • Familiarity with data quality management practices, data governance frameworks, and best practices.
  • Knowledge of role-based access, encryption, and FDOT security standards.
  • Knowledge of data visualization tools (e.g., Power BI) and reporting best practices.
  • Understanding of data security, privacy regulations, and compliance requirements.
  • Proficiency with Azure, Snowflake, Informatica, PostgreSQL, and cloud-first strategies.
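By way of illustration for the RESTful endpoint requirement above, here is a minimal sketch using FastAPI. The framework choice, route, model fields, and in-memory data are assumptions for demonstration only, not the Department's actual services.

```python
# Illustrative sketch of a RESTful endpoint for spatial/engineering data exchange.
# FastAPI is used only as an example framework; the route, model, and
# in-memory data are hypothetical stand-ins for the Department's systems.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Asset Exchange API (sketch)")

class AssetFeature(BaseModel):
    asset_id: str
    asset_type: str
    route_id: str
    milepoint: float
    longitude: float
    latitude: float

# Hypothetical in-memory store standing in for Snowflake/PostgreSQL-backed data.
ASSETS = {
    "BR-0001": AssetFeature(asset_id="BR-0001", asset_type="bridge",
                            route_id="SR-20", milepoint=12.4,
                            longitude=-84.28, latitude=30.44),
}

@app.get("/assets/{asset_id}", response_model=AssetFeature)
def get_asset(asset_id: str) -> AssetFeature:
    """Return one asset as a flat JSON record (a GeoJSON wrapper could be added)."""
    asset = ASSETS.get(asset_id)
    if asset is None:
        raise HTTPException(status_code=404, detail="asset not found")
    return asset
```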

Nice To Haves

  • Experience with DB2 systems.
  • Experience with Informatica or other modern data integration platforms.
  • Strong analytical skills and the ability to translate business requirements into technical specifications.
  • Excellent communication and collaboration skills, capable of working effectively with technical and non-technical stakeholders.
  • Familiarity with the financial management Work Program Application (WPA), RCI LRS, RCI/TCI, and Pavement Systems.
  • Familiarity with the Florida PALM system or similar state or federal financial management systems.
  • Proficiency in database management systems (DBMS) such as Microsoft SQL Server and DB2.
  • Knowledge of spatial data concepts and event segmentation (illustrated in the sketch following this list).
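To illustrate the event segmentation concept mentioned above, the following is a minimal, self-contained sketch of dynamic segmentation over milepoint ranges on a linear referencing system. The event layers and attribute values are invented for demonstration; real data would come from the RCI/LRS.

```python
# Hedged illustration of "event segmentation": splitting a route into
# homogeneous segments wherever any overlaid attribute changes.
# The event tables below are made up for demonstration.
from dataclasses import dataclass

@dataclass
class Event:
    begin_mp: float   # begin milepoint
    end_mp: float     # end milepoint
    value: str

def segment(events_a, events_b):
    """Overlay two event layers and return segments with both attribute values."""
    breaks = sorted({e.begin_mp for e in events_a + events_b} |
                    {e.end_mp for e in events_a + events_b})
    out = []
    for lo, hi in zip(breaks, breaks[1:]):
        a = next((e.value for e in events_a if e.begin_mp <= lo < e.end_mp), None)
        b = next((e.value for e in events_b if e.begin_mp <= lo < e.end_mp), None)
        out.append((lo, hi, a, b))
    return out

pavement = [Event(0.0, 5.0, "asphalt"), Event(5.0, 10.0, "concrete")]
lanes    = [Event(0.0, 3.0, "2-lane"),  Event(3.0, 10.0, "4-lane")]
for seg in segment(pavement, lanes):
    print(seg)   # e.g. (0.0, 3.0, 'asphalt', '2-lane')
```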

Responsibilities

  • Analyze and verify existing data models and interfaces to assess impacts and identify potential disruptions due to the implementation of a modernized data model.
  • Design and implement comprehensive data models to ensure compatibility with business processes and reporting needs.
  • Review and enhance ETL processes to improve data quality, timeliness, and accuracy, and implement robust data quality measures to maintain high standards for production data (a minimal example of such checks follows this list).
  • Collaborate with development teams to deploy revised data models, verifying successful implementation and resolving any integration issues.
  • Provide expert guidance on data visualization and reporting, establish best practices to optimize business insights, and monitor dependencies with interfacing systems, proposing remediation where necessary.
  • Act as a liaison between technical teams, business stakeholders, OOT, and OIT to ensure clear communication regarding data architecture changes and project timelines.
  • Mentor and provide technical leadership to team members, guiding them on best practices in data integration, architecture, and quality management.
  • Ensure continuous improvement in data processes by defining roadmaps, standardizing methodologies, and collaborating on strategic initiatives.
  • Evaluate and align current data systems with the state’s cloud-first initiative, ensuring data models are scalable and compatible with cloud architecture.
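As a minimal illustration of the data quality measures referenced above, the sketch below runs a few rule-based checks against a hypothetical PostgreSQL table. The table, columns, rules, and connection string are assumptions only, not the Department's actual standards.

```python
# Minimal sketch of automated data quality checks of the kind described above.
# The table name (work_program_projects), rules, and connection string are
# hypothetical examples for illustration.
from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://user:password@host:5432/enterprise")

CHECKS = {
    # Completeness: no project should be missing a financial project number.
    "missing_project_number":
        "SELECT COUNT(*) FROM work_program_projects WHERE project_number IS NULL",
    # Validity: programmed amounts should never be negative.
    "negative_amounts":
        "SELECT COUNT(*) FROM work_program_projects WHERE programmed_amount < 0",
    # Uniqueness: project numbers should not be duplicated.
    "duplicate_project_numbers":
        "SELECT COUNT(*) FROM (SELECT project_number FROM work_program_projects "
        "GROUP BY project_number HAVING COUNT(*) > 1) d",
}

with engine.connect() as conn:
    for name, sql in CHECKS.items():
        failures = conn.execute(text(sql)).scalar_one()
        status = "PASS" if failures == 0 else f"FAIL ({failures} rows)"
        print(f"{name}: {status}")
```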