Senior Data Analytics Engineer

Technologent
Phoenix, AZ

About The Position

The Opportunity: We are looking for a Senior Data Analytics Engineer to support the design, development, and optimization of data pipelines and analytics solutions within an agile, SDLC-driven environment. This role focuses on building scalable ETL frameworks, developing complex SQL and data models, and enabling reliable reporting and analytics across enterprise data platforms. The ideal candidate brings strong experience in data engineering, big data technologies, and analytics tooling, along with a hands-on approach to building and maintaining high-quality, production-grade data solutions.

Role: Senior Data Analytics Engineer
Experience: 6–9 Years
Work Location: Phoenix, AZ
Project Duration: 12+ Month Contract

Requirements

  • 6–9 years of experience in data engineering, analytics engineering, or related roles
  • Strong proficiency in SQL (MS SQL Server, Oracle) and relational data modeling
  • Hands-on experience building ETL pipelines using Python and enterprise ETL/reporting tools (SSIS/SSRS)
  • Experience working with big data technologies and distributed data processing (Spark/PySpark)
  • Solid understanding of data warehousing, data lakes, and master data management concepts
  • Experience integrating systems using APIs and data exchange frameworks
  • Strong problem-solving skills and ability to write efficient, reusable, production-grade code

Nice To Haves

  • Experience with Oracle PL/SQL
  • Familiarity with Airflow or workflow orchestration tools
  • Exposure to cloud-based data platforms and analytics services
  • Experience working with Apptio or financial/IT analytics platforms
  • Background supporting enterprise-scale analytics and reporting environments

Responsibilities

  • Design, build, and maintain ETL and data pipelines using Python, SQL, and big data technologies
  • Develop and optimize complex SQL queries and data models across platforms such as SQL Server and Oracle
  • Support data warehousing and data lake architectures, including schema design and entity relationship modeling
  • Build and support analytics and reporting solutions using tools such as SSIS, SSRS, Tableau, Power BI, or Qlik
  • Integrate data across systems using REST/SOAP APIs, file-based pipelines, and ETL frameworks
  • Develop and maintain PySpark/Spark-based data processing workflows
  • Collaborate within an agile development team across the full SDLC
  • Ensure data quality, performance, and reliability across pipelines and reporting solutions
  • Identify opportunities to improve data platform performance, scalability, and maintainability