NCS Canada
Full-time • Mid Level
Houston, TX
51-100 employees
Professional, Scientific, and Technical Services

The Business Intelligence (BI) Data Engineer will be responsible for designing, building, and maintaining the data infrastructure, pipelines, and analytics solutions that drive data-informed decision-making across the organization. This role bridges data engineering and business intelligence, ensuring data is clean, reliable, and transformed into meaningful insights through platforms such as Microsoft Fabric, Power BI, SQL Server, and Snowflake. It calls for experience across cloud and on-premises platforms and the ability to solve complex data challenges and turn raw data into trusted, actionable intelligence for business stakeholders.

  • Design and implement relational and dimensional databases, including schema design, indexing strategies, and performance tuning
  • Develop, maintain, and optimize ETL/ELT pipelines using Apache Airflow or Azure Data Factory (Fabric Data Factory)
  • Design scalable data models and pipelines for SQL Server and Snowflake
  • Deliver reporting and analytics solutions using Microsoft BI, including Fabric and Power BI
  • Ensure high availability, reliability, and performance of data processes
  • Build, maintain, and optimize dashboards and reports in Power BI and Fabric
  • Translate complex data sets into clear visualizations and metrics for business stakeholders
  • Partner with teams to identify KPIs and deliver actionable insights
  • Implement and monitor data validation, error handling, and performance tuning strategies
  • Contribute to best practices in data security, compliance, and governance
  • Work closely with data scientists, analysts, and business units to support cross-functional analytics initiatives
  • Participate in architectural discussions to improve scalability and efficiency of data solutions
  • Design and implement backup and restore strategies (full, differential, transaction log backups)
  • Apply disaster recovery planning and RPO/RTO requirements
  • Implement high availability solutions (SQL Server: Always On Availability Groups, Failover Clustering; Snowflake: Time Travel, Fail-safe)
  • Diagnose slow queries, deadlocks, and bottlenecks
  • Use tools like SQL Profiler, Extended Events, DMVs, Query Store, or Performance Monitor
  • Tune indexes, statistics, and query plans
  • Optimize ETL job performance and concurrency
  • Implement role-based access control (RBAC), encryption at rest/in transit, and auditing
  • Understand GDPR, SOC2, HIPAA, or other compliance frameworks
  • Manage user provisioning, privilege management, and data masking
  • Set up and monitor database maintenance plans (index rebuilds, integrity checks)
  • Automate housekeeping tasks via SQL Agent, PowerShell, or Fabric pipelines
  • Forecast storage growth and manage file groups/partitions
  • Understand I/O characteristics and underlying hardware/cloud configurations
  • Plan capacity and scalability for BI workloads
  • Manage data lifecycle and archiving strategies
  • Collaborate with data architects to align database design with business goals
  • Document and govern data assets and metadata
  • Administer Snowflake: roles, warehouses, resource monitors, and credit usage optimization
  • Administer Azure SQL, Synapse, and Fabric Data Warehouse environments
  • Apply IAM, networking, and cost-control practices in cloud data platforms
  • Provision databases with infrastructure-as-code (IaC) tools (e.g., Terraform, ARM templates)
  • Support and uphold HS&E policies and procedures of NCS and the customer
  • Align individual goals with NCS corporate goals while adhering to the NCS Promise
  • Participate in your Personal Development for Success (PDS)
  • Other duties relevant to the position may be assigned as required
  • Bachelor's degree in Computer Science, Information Technology, or equivalent experience
  • 3+ years of experience in BI development, data engineering, or similar roles
  • Strong proficiency in SQL Server (database design, query optimization, stored procedures, performance tuning)
  • Hands-on experience with Snowflake (warehousing, schema design, data sharing, performance optimization)
  • Practical knowledge of Apache Airflow or Azure Data Factory (Fabric Data Factory) for orchestration and workflow management
  • Proficiency with the Microsoft BI stack, including Fabric and Power BI
  • Track record of building and maintaining well-designed databases, complex data pipelines, and reporting solutions
  • Strong analytical skills and ability to explain technical concepts clearly to business audiences
  • Experience with Python or other scripting languages for data manipulation
  • Knowledge of CI/CD practices for data pipeline deployment
  • Exposure to data governance frameworks and compliance standards
  • Familiarity with APIs and data integration tools
  • Understanding of AI-powered BI tools, including how to prepare and connect datasets for Microsoft Copilot in Power BI/Fabric
  • Awareness of how to design data models for AI-driven analytics and natural language queries
  • Eligible for Target Discretionary Bonus
  • Laptop provided
  • On-call 24/7 for support