Data Architect
Hannon Armstrong Sustainable Infrastructure Capital • Posted 3 months ago
$135,000 - $175,000/Yr
Full-time
Hybrid • Annapolis, MD
Real Estate

As a Data Architect, you will be responsible for the design, development, implementation, and support of mission-critical enterprise data solutions. You will work with cross-functional teams to gather and document data, integration, and development requirements to meet business needs. You will design, develop, test, and implement database solutions that bring together complex data from disparate sources into databases, data warehouses, and data lakes. You will integrate data across systems using web-based APIs as well as semi-structured and unstructured sources, including extensive ingestion and processing of Excel files. The role also requires practical experience with data governance tools and ERP systems, specifically SAP General Ledger.

  • Develop and implement scalable, high-performance, and secure data architectures that meet business requirements and integrate with various data systems (e.g., databases, data warehouses, data marts, data lakes, Microsoft Fabric, and other cloud platforms).
  • Translate business requirements into specifications for data structures and relationships, including data imports/exports, transformations, and associated processes.
  • Create and maintain data models for complex, non-uniform datasets using modern data access and storage methodologies.
  • Lead data management and governance practices by establishing data quality standards, defining data governance policies (leveraging tools such as Microsoft Purview), ensuring compliance with regulatory requirements, and driving best practices for data security, privacy, and lifecycle management.
  • Design and oversee the integration of data from multiple sources and formats (structured, semi-structured, and unstructured), including large-scale ingestion and parsing of hundreds of Excel files.
  • Manage and support production databases, data warehouses, and ETL processes, including security, backups, and upgrades, ensuring speed, scalability, and reliability.
  • Provide expertise in Excel data modeling, Power BI, and Power Query for business analytics and reporting solutions.
  • Provide support as required to ensure availability and operation of applications and ETL processes.
  • Continuously monitor, assess, and improve the performance of data systems, ensuring efficient data processing and reducing system bottlenecks.
  • Ensure proper configuration management and change controls are executed in accordance with policy.
  • Provide training and assistance to users for any developed systems or processes.
  • Participate in business analysis activities to gather data requirements.
  • Design and implement technology using best practices to create efficient, manageable, understandable, reliable, and well-documented implementations.
  • Bachelor's degree in Computer Science, Data Science, or a related field.
  • 5+ years' experience designing and developing with Microsoft SQL Server (required).
  • 5+ years' experience developing with Python, including data parsing and transformation (required).
  • 2+ years' experience working with SAP General Ledger and performing integration/migration tasks with SAP data.
  • Hands-on experience with data governance tools such as Microsoft Purview.
  • Demonstrated ability to ingest, parse, and transform data in a wide variety of formats, including processing hundreds of Excel files as part of the data ingestion pipeline.
  • Experience with Microsoft Fabric, Excel data modeling, Power BI, and Power Query.
  • 4+ years' experience designing and developing data warehouses.
  • 3+ years' experience with modern data technologies (e.g., Snowflake, Spark, Kafka).
  • 3+ years' experience developing and deploying ETL solutions.
  • 3+ years' experience working with data pipeline and orchestration services (e.g., Airflow, SSIS, Azure Data Factory, AWS Data Pipeline/Glue, Fivetran).
  • Expert knowledge of logical and physical data modeling concepts (relational and dimensional).
  • Experience performance tuning for large, complex data sets.
  • Experience working with Git.
  • Experience working with structured, semi-structured, and unstructured data sources.
  • Familiarity with DevOps and database deployment automation tools (e.g., CI/CD, SSDT).
  • Expected salary range of $135,000 - $175,000, based on experience and location.
  • Annual bonus program.
  • 401(k) with company match.
  • Equity incentive program.
  • Comprehensive medical, dental, and vision benefits.
  • Paid time off for vacation, holidays, and sick days.