Senior Data Platform Engineer

Allegion
Carmel, IN

About The Position

Creating Peace of Mind by Pioneering Safety and Security

At Allegion, we help keep the people you know and love safe and secure where they live, work and visit. With more than 30 brands, 12,000+ employees globally and products sold in 130 countries, we specialize in security around the doorway and beyond. Additionally, in 2024 we were awarded the Gallup Exceptional Workplace Award, which recognizes the most engaged workplace cultures in the world.

This position serves as the senior technical expert for Data Hub processes within the Global Data & Analytics Team, owning the design and implementation of data ingestion, transformation, and transmission workflows for the Operational Data Hub. The Senior Data Platform Engineer is the go-to authority for Data Hub architecture and processes, while actively contributing technical expertise to enterprise data structure design in partnership with the Data Architect. This role mentors mid-level engineers, collaborates with Lead Engineers and cross-functional stakeholders, and balances deep technical expertise (80%) with informal leadership and mentorship (20%).

Qualified candidates must be legally authorized to be employed in the United States. The company does not intend to provide sponsorship for employment visa status (e.g., H-1B, TN, etc.) for this position.

Requirements

  • 5-7 years of experience designing and implementing data hub solutions, operational data platforms, or similar data processing architectures.
  • Expert-level experience owning data pipeline architectures, including event-driven patterns, real-time processing, and batch workflows.
  • Advanced expertise architecting serverless solutions with Azure Function Apps or equivalent compute services for data processing at scale.
  • Deep experience with data hub processes including ingestion patterns, transformation frameworks, orchestration, and operational monitoring.
  • Advanced experience designing and optimizing Azure SQL Databases or similar platforms for operational data storage within hub architectures.
  • Expert SQL skills with proven ability to optimize queries, design efficient processing logic, and contribute to data modeling discussions.
  • Strong understanding of enterprise data architecture principles, with ability to contribute technical expertise to data structure and modeling decisions.
  • Proven ability to collaborate with Data Architects, providing Data Hub process perspective on data structure designs and enterprise models.
  • Demonstrated expertise in data integration patterns involving ERP systems, APIs, and enterprise application data flows.
  • Experience with Azure Data Lake, Azure Synapse Analytics, or similar platforms, understanding how Data Hub processes interact with enterprise data layers.
  • Proven track record of owning complex Data Hub architectures and delivering technical solutions independently.
  • Strong mentorship capabilities with experience guiding engineers on Data Hub patterns, pipeline development, and operational best practices.
  • Excellent problem-solving skills for Data Hub operational challenges, including performance tuning, troubleshooting, and reliability improvements.
  • Experience with CI/CD pipelines and Infrastructure as Code for automating Data Hub deployments and configuration management.
  • Strong understanding of data security and compliance requirements, implementing controls within Data Hub processes.
  • Excellent technical communication skills to collaborate with Data Architects, document architectures, and explain Data Hub technical tradeoffs.
  • Experience contributing to architectural discussions across data structures and processes, providing balanced technical perspectives.
  • Self-motivated with strong organizational skills, able to own Data Hub initiatives while contributing to broader architectural decisions.
  • Deep experience with DataOps/DevOps practices, including monitoring, logging, observability, and operational excellence for data systems.
  • Knowledge of cloud cost optimization for data processing workloads and efficient resource utilization.
  • Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related technical discipline, or equivalent practical experience.

Responsibilities

  • Own and architect Data Hub processes and workflows, including data ingestion pipelines, transformation logic, transmission patterns, and integration orchestration.
  • Design and implement sophisticated data pipeline architectures from various ERP and source systems, establishing patterns and best practices for the Data Hub.
  • Partner with the Data Architect on enterprise data structure design, contributing Data Hub process expertise, feasibility assessments, and technical requirements.
  • Define Data Hub technical standards and patterns for pipeline development, error handling, monitoring, and operational workflows.
  • Collaborate with the Data Architect to ensure alignment between Data Hub processes and enterprise data models, providing input on how data structures impact processing efficiency.
  • Lead technical design sessions for Data Hub initiatives, making architectural decisions on ingestion patterns, transformation approaches, and integration strategies.
  • Contribute technical expertise to data structure discussions, advising on pipeline performance implications, transformation complexity, and operational considerations.
  • Provide technical mentorship to mid-level Data Platform Engineers on Data Hub architecture, patterns, and implementation best practices.
  • Drive complex problem-solving for Data Hub operational issues, including performance optimization, pipeline troubleshooting, and root cause analysis.
  • Implement advanced transformation and enrichment frameworks that leverage enterprise data structures designed by the Data Architect.
  • Own end-to-end architecture of Data Hub components, including serverless compute, orchestration, monitoring, and operational tooling.
  • Evaluate and prototype emerging technologies for Data Hub capabilities, providing recommendations on tools and platforms for data processing.
  • Establish data quality frameworks for Data Hub processes, implementing validation, testing, and monitoring strategies in collaboration with the Data Architect.
  • Design and implement Data Hub governance controls, including security, compliance, and operational best practices for data processing workflows.
  • Optimize Data Hub performance and cloud resource utilization, identifying opportunities for efficiency improvements and cost savings.
  • Document Data Hub architectures and technical patterns comprehensively, creating runbooks, design guides, and knowledge base articles.
  • Participate in on-call rotation for Data Hub production support, leading incident resolution and conducting technical post-mortems.

Benefits

  • Health, dental and vision insurance coverage, helping you “be safe, be healthy”
  • A commitment to your future with a 401(k) plan offering a 6% company match and no vesting period
  • Tuition reimbursement
  • Unlimited PTO
  • Employee discounts through Perks at Work
  • Community involvement and opportunities to give back so you can “serve others, not yourself”
  • Opportunities to leverage your unique strengths through CliftonStrengths testing and coaching