About The Position

The Senior Director, Data Architect / ETL Architect has advanced technical knowledge and data warehouse expertise and can effectively articulate and evangelize data integration solutions across the Movement. This high-performing individual is an integral member of the Data Engineering team, providing strategic direction and innovative solutions to meet the needs of the organization. The position is responsible for 15+ data pipelines and integrations serving several mission-critical applications, including the Enterprise Data Warehouse.

The complexity and uniqueness of the Girl Scout technology stack require a diverse range of Extract, Transform, Load (ETL/ELT) development and data architecture experience across multiple platforms, formats, and extraction methods. This includes expertise in a variety of platforms and projects, such as Snowflake integration; AWS cloud migration; centralized ETL; data warehouse modeling; building new data pipelines; and broadening existing data integration services to support our data domains. As the Girl Scout technology ecosystem grows, the individual is responsible for designing and building the processes necessary to make data consumable and ready for use in solving the tough challenges of our business.

The GSUSA Enterprise Data Warehouse serves the entire organization as well as our 112 Councils. The implementation is large and complex: over 2 million active members; over 18 years of history from three generations of operational systems; and fact tables with hundreds of millions of records. The Senior Director, Data Architect / ETL Architect works with Sr Leadership to understand organizational and community objectives and provides strategic and innovative solutions that leverage data to grow our membership and increase revenue.
They will make data useful by designing, developing, and maintaining scalable data pipelines from multiple sources; building out new API integrations to support continuing increases in data sources, volume, and complexity; addressing data quality issues; and modeling, transforming, distilling, combining, and delivering data to a repository or consuming system in a performant manner. The position works with various data formats (XML, JSON, text, CSV) and data sources (APIs, server logs, external data, NoSQL stores, etc.). Ultimately, it is the ETL Architect who manages Girl Scouts’ complex data landscape.

Requirements

  • Fifteen plus (15+) years of experience in a data integration/transformation driven role
  • Ten plus (10+) years of experience in data warehousing and data architecture
  • Minimum of fifteen (15) years of overall industry experience
  • Demonstrated ETL experience with, but not limited to, the following enterprise platforms: Salesforce, SAP Commerce Cloud, NetSuite, OpSuite, Qualtrics
  • Demonstrated experience with dimensional data warehouse and data lake architectures (Kimball methodology preferred)
  • Demonstrated ability to extract a variety of data from disparate enterprise platforms and third-party data enrichment services and apply complex transformations
  • Extensive experience using ETL tools and techniques in a large-scale implementation
  • Experience in architecting data pipelines and solutions for both streaming and batch integrations
  • Deep understanding of ETL industry standards and best practices, including full/incremental patterns, Change Data Capture, etc.
  • Significant experience with cloud data warehouses; Snowflake required
  • Expertise with third party API data integrations (REST and SOAP)
  • Expertise in SQL programming skills including complex query design and tuning
  • Demonstrated expertise in scripting languages (e.g., Python, JavaScript, Windows PowerShell)
  • Demonstrated ability to architect and build complex enterprise data models which incorporate data from disparate enterprise platforms and third-party data enrichment services
  • Working knowledge of change control methodologies and tools such as GitHub
  • Expertise in the following products and methodologies: Snowflake, SQL, Informatica, AWS, Azure Blob, GitHub, JIRA, Agile, Looker
  • Experience with data security, encryption, and data lakes
  • Advanced experience with Master Data Management (MDM) implementations
  • Experience with cloud migration and cloud-hosted ETL environments (e.g., Snowflake cloud data warehouse; AWS a plus)
  • Experience working in a cloud ecosystem, such as: AWS, Azure
  • Demonstrated problem-solving and troubleshooting skills
  • Ability to work independently once guidance and goals are provided
  • Excellent communication, analytical, and development skills
  • Experience with enterprise business intelligence tools (Looker required)
  • Competency in Office 365 or similar suites
  • Competency in PowerPoint or similar presentation software
  • Competency in Microsoft Excel or similar software
  • Bachelor’s degree in Mathematics, Computer Science, Software Engineering, or a related field, or equivalent experience
  • Minimum of fifteen (15) years’ experience in a corporate environment

Nice To Haves

  • Master's degree in related field is a plus
  • Snowflake SnowPro Core and SnowPro Data Engineer certifications are a plus

Responsibilities

  • Work with Sr Leadership and the Sr Data Architect to understand organizational and community goals for attracting and retaining members, and provide innovative data modeling solutions.
  • Architect and build complex enterprise data models which incorporate data from disparate enterprise platforms and third-party data enrichment services.
  • Work with Technology to develop enterprise data connectivity to drive business goals leveraging API as a Service.
  • Collaborate with and provide backup for Sr Data Architect to support and maintain the Enterprise Data Warehouse.
  • Work with other team members, including Project Manager Lead, Data Architect, DBA, Technical Architects, and Business Analysts to define the best solution, estimate development efforts and ensure accurate requirements fulfillment.
  • Participate in design and code reviews.
  • Document technical requirements and solutions.
  • Work with Sr Leadership to understand the organization and community objectives and provide strategic and innovative data pipeline solutions.
  • Work closely with Sr Data Architect to integrate data models into data integration processes.
  • Design, build, and implement scalable data pipelines to efficiently extract and transform high volumes of data.
  • Collaborate closely with stakeholders to understand business requirements and translate them into effective technical designs to ensure complete delivery of solutions.
  • Lead evaluations and recommendations of new and emerging technologies to support organizational objectives.
  • Balance multiple projects of varying difficulty in an Agile framework, assisting in change management and translating business requirements to technical requirements where necessary.
  • Assist with time-sensitive requests as needed to ensure delivery of innovative yet practical solutions that meet business needs.
  • Adhere to standards and processes to provide the operational support required for successful completion of ETL/ELT processes.
  • Assist with configuration of application software.
  • Manage change control process.
  • Performance tune ETL/ELT applications to manage high volume data transfer to and from internal and external system locations.
  • Manage and schedule driver upgrades in the Dev and Prod environments and oversee all UAT of those upgrades.
  • Provide documentation and knowledge transfer to Data Engineering team members.
  • Manage API upgrades in Dev and Prod environments including ETL/EDW modifications if needed.
  • Support and monitor existing ETL/ELT processes.
  • Troubleshoot ETL/ELT issues; recommend, test, and implement solutions, expediting as needed for mission-critical applications.
  • Engage in project planning and delivering to commitments.
  • Mentor and train Analytics & Insights team members including ETL developers, Data Engineers, Data and Business Analysts, Looker BI developers and Research specialists.
  • Serve as Data Integration SME.

Benefits

  • 20 days of paid time off
  • 2 floating holidays
  • 9 workplace holidays
  • Paid year-end office closure between Christmas and New Year's
  • Medical and Behavioral Health Coverage
  • Plan options with individual and family coverage, including wellness, hospitalization, and fertility assistance
  • Dental and vision coverage
  • Health Savings Accounts (HSAs) and Flexible Spending Accounts (FSAs) including Health, Dependent Care, and Limited FSA for those with Health Savings Accounts
  • Company-paid life insurance
  • Flexible work arrangements
  • 12 weeks of paid parental leave
  • 401(K) with company match
  • Sick leave
  • Short- and Long-Term Disability for salary continuation
  • Health and Wellness Classes and Activities throughout the year