Information Technology - BI Developer

TCC - Fishers, IN
Hybrid

About The Position

The Cellular Connection (TCC) is looking for a full-time BI Developer with solid experience in Business Intelligence solution development and support, specifically data profiling, data warehousing, and ETL using the Microsoft toolset. The successful candidate will embody and help reinforce our company’s mission. At TCC we believe our communities need more than just a wireless retailer, and our employees want more than just a job. We are committed to making a positive and sustainable impact on the lives of our employees, customers, and communities. TCC’s business is built on five promises: Care, Drive the Business, Connect, Inspire, and Be Authentic. We will Care about the people around us and the world. We will Drive the Business to greater success so that we can do more good. We will Connect with the people around us. We will Inspire others to join us in doing good. We will Be Authentic in our words and actions.

Requirements

  • Strong experience with Microsoft Fabric, Fabric Data Factory, or Azure Data Factory.
  • Hands-on experience with Spark SQL, PySpark, and Python for data engineering.
  • Strong SQL skills with experience optimizing complex queries.
  • Experience building and supporting enterprise-scale ETL/ELT pipelines.
  • Familiarity with CI/CD, version control, and modern DevOps practices.
  • Experience supporting BI and analytics platforms (Power BI preferred)
  • Excellent written and verbal communication skills, with a proven ability to communicate effectively across all areas of TCC.
  • Ability to work independently, multi-task, and manage time and priorities to meet deadlines.
  • Strong collaboration and analytical skills to work with business partners and drive requirements for solutions
  • Mitigate problems and roadblocks, engaging internal and external stakeholders as needed.
  • Innovative mindset: ability to think both creatively and critically
  • Strong understanding of data warehousing concepts, dimensional modeling and data modeling
  • Help create ad-hoc queries and provide data as requested, with a goal of transitioning to self-service
  • Routinely analyze data across the full data lineage, from source to stage to target to reports
  • Able to evaluate priorities and manage work to critical project timelines in a fast-paced environment.
  • Demonstrated ability to interpret business data and translate it into insightful recommendations
  • Ability to prioritize and handle multiple tasks in a high-pressure environment
  • Adaptability to fast-paced retail environments and changing business priorities
  • Demonstrate leadership qualities, flexibility and adaptability in roles and responsibilities as required
  • Research and troubleshoot issues and questions related to deployed solutions
  • Maintain knowledge of the latest data technology and best practices
  • Bachelor’s degree (B.A. / B.S.) in Information Technology or related field or the equivalent combination of education and experience.
  • 5+ years of experience in Microsoft ETL (Azure Data Factory & Fabric)
  • Proficiency with SQL Server (T-SQL) and Azure Data Factory
  • 5+ years of experience with Microsoft data platforms (Azure SQL Database, Blob Storage, Data Lake Storage)
  • Familiarity with source control and CI/CD pipelines for ETL deployment – Git, Azure DevOps
  • Experience working with large-scale datasets and performance tuning techniques
  • 5+ years of experience supporting a BI solution
  • Working experience with Agile methodology
  • Experience with data warehouse modeling concepts
  • Excellent problem-solving skills and the ability to work independently or in a team
  • Continuous improvement mindset with interest in modern data and AI capabilities.
  • Strong ownership and accountability for data pipeline quality and reliability.
  • Attention to detail with a bias toward testing, stability, and defect reduction.

Nice To Haves

  • Experience with Azure Data Lake, Delta Lake, OneLake, Databricks, Python, and Scala
  • Experience enabling AI/ML workloads (feature engineering, model input pipelines).
  • Experience with lakehouse architecture and modern analytics platforms.
  • Knowledge of data governance, security, and compliance best practices.
  • Knowledge of the Microsoft Fabric catalog / Microsoft Purview for metadata management
  • Experience in the retail, finance, or healthcare industry
  • Experience creating Power BI Semantic Models

Responsibilities

  • Design, develop, and maintain end-to-end ETL/ELT pipelines using Microsoft Fabric Data Factory and Fabric-native experiences.
  • Build and optimize Spark-based transformations using Spark SQL and PySpark for large-scale data processing.
  • Build and manage scalable data workflows using Azure Data Factory and Fabric Data Factory, integrating with OneLake and Dataflows Gen2
  • Develop reusable, modular pipelines leveraging Python for data transformation, orchestration logic, and automation.
  • Implement ingestion patterns for structured and semi-structured data from internal and external data sources
  • Support AI and advanced analytics use cases by providing curated, high-quality data layers optimized for machine learning and AI workloads.
  • Partner with analytics and data science teams to enable feature-ready datasets and scalable model inputs.
  • Apply data quality, validation, and observability patterns to ensure accuracy, reliability, and trust in downstream analytics and AI outputs.
  • Apply expertise in Azure tools, specifically Data Factory, Azure Automation (Runbooks/Workbooks), and DevOps
  • Develop and tune SQL and Spark SQL queries, views, and transformations for performance and scalability.
  • Optimize data models, partitioning strategies, and storage formats to support enterprise BI and AI workloads.
  • Leverage Fabric monitoring and logging to identify bottlenecks and improve pipeline performance.
  • Implement CI/CD and deployment best practices using Azure DevOps or equivalent tooling for Fabric assets.
  • Create and manage pipelines, schedules, and dependencies with a focus on reliability and recoverability.
  • Monitor pipeline health, address failures proactively, and ensure SLAs are met.
  • Implement Logic Apps, stored procedures, ETL processes, cubes, and reports
  • Ensure data accuracy, data quality, and data governance across all ETL processes.
  • Produce clear documentation including data flows, pipeline logic, dependencies, data dictionaries, and business rules.
  • Build and maintain automated testing and validation to reduce defects and improve data quality.
  • Follow data governance, security, and compliance standards across the platform
  • Interact with business users and customers to gather requirements, understand issues, and communicate resolutions clearly.
  • Handle production support for Data and Orchestration systems, ensuring high availability and quick resolution of data issues.
  • Handle and resolve support tickets related to data discrepancies, report issues, job failures, and user queries.
  • Collaborate with business stakeholders, analysts, and data scientists to understand data requirements and deliver BI solutions.
  • Optimize data models for performance and usability in Power BI and other visualization tools.
  • Support production data pipelines and BI systems, ensuring high availability and timely issue resolution.
  • Investigate and resolve data discrepancies, job failures, and user-reported issues.
  • Collaborate closely with BI developers, analysts, QA engineers, and business stakeholders to translate requirements into technical solutions.
  • Participate in Agile ceremonies including sprint planning, stand-ups, reviews, and retrospectives.
  • Participate in code reviews, performance tuning, and process improvements
  • Handle Stage and Production migrations and deployments
  • Perform tasks following Agile/sprint methodology and participate in Scrum ceremonies
  • Participate as part of a weekend rotation schedule for support of production loads
  • Provide estimates for software deliverables to management, highlighting risks and ensuring the solution is delivered per the project plan