Data Modeler – Intermediate - 26-01010

Navitas Partners – Tallahassee, FL
Hybrid

About The Position

We are seeking an experienced Data Modeler – Intermediate to support the design, development, and governance of data pipelines, data models, and ETL workflows. The ideal candidate will work closely with data engineers, analysts, and stakeholders to ensure high-quality, secure, and scalable data solutions.

Requirements

  • 3–5 years of experience in data engineering, including ETL and data pipeline development (Level 4)
  • Proficiency in Python and SQL (Level 3)
  • Strong analytical and problem-solving skills (Level 4)
  • Knowledge of relational database design and data modeling (Level 3)
  • Experience with data warehouses, data lakes, or data lakehouse architectures (Level 3)
  • Ability to work independently and manage priorities effectively (Level 3–4)
  • Strong collaboration and interpersonal skills (Level 3)
  • Effective written and verbal communication skills (Level 3)

Nice To Haves

  • 3–5 years of hands-on experience with Alteryx Designer
  • Familiarity with environmental, scientific, or regulated data domains
  • Experience with business intelligence tools such as Qlik Sense or similar platforms

Responsibilities

  • Design, implement, and maintain robust data pipelines and data architectures using Alteryx Designer.
  • Create and maintain logical data models using Oracle SQL Developer Data Modeler or similar tools.
  • Read, write, update, and manage structured datasets across multiple systems.
  • Develop and maintain ETL code repositories following best practices.
  • Perform ad hoc data cleansing and transformation as needed.
  • Define, implement, and monitor data quality standards, metrics, and control procedures.
  • Identify data quality issues and recommend remediation strategies.
  • Optimize data processing workflows for performance, scalability, and reliability.
  • Ensure compliance with data privacy regulations and security best practices.
  • Monitor and tune data systems, addressing performance bottlenecks through indexing, caching, and query optimization.
  • Transform raw data into analytics-ready formats using cleansing, aggregation, filtering, and enrichment techniques.
  • Establish governance standards for data and analytical models used in reporting and automated decision-making.
  • Collaborate with data scientists and analysts to improve data quality, security, and governance.
  • Stay current with emerging tools, technologies, and methodologies in data engineering.
  • Provide guidance and mentorship to junior team members, promoting continuous improvement and best practices.