How will you make an impact?

We are seeking an expert Microsoft Fabric Developer & Administrator to support projects by building and improving modern analytics and AI solutions on the Microsoft Fabric platform. This role will help the company make faster, smarter decisions by delivering AI-powered insights, automation, and real-time reporting across global business functions. The ideal candidate will modernize and streamline our data systems using tools such as Power BI, Fabric Lakehouse, Data Activator, Data Agent, and Copilot, and will integrate ChatGPT and the Model Context Protocol (MCP) to deliver intelligent, conversational, and user-friendly analytics. This role requires strong technical skills, problem-solving abilities, and the capability to build secure, scalable systems that drive innovation, efficiency, and business growth.

What will you do?

- Build and enhance Power BI datasets, dashboards, and reports within the Microsoft Fabric ecosystem, ensuring performance, scalability, and an outstanding user experience.
- Leverage Copilot, ChatGPT, and MCP-based AI integrations to automate insights, enable natural language analytics, and deliver predictive and generative intelligence.
- Develop advanced data modeling, DAX, and Power Query (M) solutions to enable complex, high-volume enterprise reporting.
- Configure and coordinate Fabric capacities, workspaces, pipelines, and Lakehouses, ensuring secure, reliable, and compliant operations.
- Manage and integrate Data Activator and Data Agent for event-driven, AI-assisted analytics workflows.
- Perform capacity analysis, optimize resource utilization, and build proactive monitoring processes to sustain peak performance.
- Integrate Fabric telemetry and activity logs with Log Analytics and Azure Monitor to produce actionable intelligence on usage, performance, and cost efficiency.
- Develop CI/CD automation for Power BI and Fabric assets using Azure DevOps, Git, or GitHub Actions, ensuring version control, environment alignment, and deployment consistency.
- Collaborate with engineering and data teams to unify data across Databricks, AWS Redshift, and data lakes, establishing a governed, high-availability data fabric.
- Optimize SQL and PySpark queries for analytical performance and efficiency.
- Implement data lineage, governance, and security protocols using enterprise-level RLS/OLS and workspace oversight.
- Promote self-service reporting by applying advanced technology, MCP-based analytics, and automation so teams can create their own reports more easily and effectively.
- Lead platform optimization and capacity scaling initiatives for sustained reliability and performance.
- Demonstrate advanced problem-solving, debugging, and cross-platform integration skills across hybrid data and AI ecosystems.
Job Type: Full-time
Career Level: Mid Level
Number of Employees: 5,001-10,000 employees