About The Position

Authentic8 Silo places any type of digital analyst in region-specific, multi-application workspaces, securely and anonymously, anywhere across the globe. Target content can be captured, analyzed, and transformed to derive intelligence and support investigation requirements, all delivered in a cloud-native, multi-tenant platform. Compliance officers, mission managers, and administrators have their own specific audit and oversight requirements; to meet these needs, Silo also ensures compliance and appropriate use through class-leading policy enforcement and audit logging. Silo transforms how more than 750 of the world's most sophisticated organizations, from domestic and foreign government agencies to commercial entities across all sectors, conduct their digital investigations.

Authentic8 is seeking a highly capable DBA & Enterprise Data Engineer to lead all database-related operations while also building and managing the data pipelines that drive informed business decisions. In this hybrid role, you will be the primary owner of PostgreSQL database operations and play a lead role in integrating business-critical data from various enterprise systems into centralized data pools that support analytics and decision-making by Business Operations. The role requires deep experience in PostgreSQL administration and a strong understanding of data integration practices, ETL/ELT pipelines, and business data modeling. You will collaborate regularly with back-end engineers, DevOps, and Business Operations analysts to ensure the integrity, performance, and accessibility of company data.

Requirements

  • 8+ years of experience as a DBA
  • 8+ years of experience supporting SaaS platforms
  • 4+ years of experience working with automation tools and workflows
  • Hands-on experience with PostgreSQL (strongly preferred)
  • Ability to design and maintain ETL/ELT pipelines
  • Ability to work with APIs and third-party integrations for platforms like Salesforce and Jira
  • Strong SQL skills and proficiency with data modeling
  • Familiarity with cloud-based data warehousing solutions (e.g., BigQuery, Kestrel)
  • Experience working in cross-functional teams including Engineering, DevOps, and Business Operations
  • Excellent documentation and communication skills

Nice To Haves

  • Experience with Chef/Terraform
  • Proficiency in Ruby and/or Python
  • Experience with Grafana or similar monitoring tools
  • Background in InfoSec or experience handling sensitive/restricted data
  • Experience with Airbyte, Fivetran, or similar data integration platforms

Responsibilities

  • Own the availability, incident response, and operational health of the company’s PostgreSQL platform databases (Eng, QA, and Production)
  • Ensure high availability and disaster recovery processes are tested and functioning.
  • Implement PostgreSQL changes with minimal service interruption.
  • Perform live schema updates and patches in coordination with Engineering and DevOps.
  • Plan for database growth, execute tuning analysis, and conduct performance optimization.
  • Review and assist with SQL query optimization; identify and resolve deadlocks.
  • Coordinate schema deployments with DevOps.
  • Support ad hoc query needs from Customer Success, Sales, and Marketing teams.
  • Ensure all PostgreSQL security best practices are applied and monitored.
  • Help restructure or re-architect current and future databases to support performance and scalability.
  • Lead data integration projects that centralize company data from multiple systems into one or more centralized data pools (e.g., BigQuery).
  • Architect and implement data pipelines (ETL/ELT) from systems such as Salesforce (CRM), Zuora (billing and revenue management), PostgreSQL (service data), Pendo.io (product analytics), Freshdesk (customer support/ticketing), and Jira (work management and issue tracking).
  • Partner closely with Business Operations analysts to align on data models and schema designs that support reporting and analytics, and to build performant queries and transformations for ongoing business intelligence.
  • Ensure centralized data repositories are clean, consistent, and optimized for analysis.
  • Implement monitoring and alerting for data pipeline health and performance.
  • Ensure all data pipeline design and database operations meet strict internal security policies and customer privacy requirements.
  • Collaborate with Security and Compliance personnel to determine what data may or may not be ingested into centralized data pools, particularly when boundary controls, classification levels, or customer-specific restrictions apply.
  • Apply best practices in secure data handling, least privilege access, and logging/auditing for sensitive data operations.
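As illustration only (not part of the posting): the pipeline responsibilities above, keeping centralized data "clean, consistent, and optimized for analysis", typically come down to idempotent merge (upsert) logic in the load step. The minimal Python sketch below shows one such merge keyed on a record id with an `updated_at` cursor; the function and field names (`merge_incremental`, `id`, `updated_at`) are hypothetical conventions, not anything specified by the role.

```python
def merge_incremental(target, source_rows, key="id", cursor="updated_at"):
    """Idempotent upsert: apply extracted source rows to a target store,
    keeping only the newest version of each record.

    target      -- dict mapping record key -> row (the centralized pool)
    source_rows -- list of row dicts from an incremental extract
    key         -- field that uniquely identifies a record
    cursor      -- monotonically increasing field (ISO timestamps compare
                   correctly as strings)
    """
    merged = dict(target)  # leave the caller's copy untouched
    for row in source_rows:
        existing = merged.get(row[key])
        # Insert new records; overwrite only if the incoming row is newer
        # (>= makes replaying the same batch a no-op rather than an error).
        if existing is None or row[cursor] >= existing[cursor]:
            merged[row[key]] = row
    return merged


# Example: two incremental batches, the second updating record 1.
pool = merge_incremental({}, [{"id": 1, "updated_at": "2024-01-01", "plan": "basic"}])
pool = merge_incremental(pool, [
    {"id": 1, "updated_at": "2024-02-01", "plan": "pro"},
    {"id": 2, "updated_at": "2024-01-15", "plan": "basic"},
])
```

Because the merge is idempotent, a failed pipeline run can simply be replayed from its last checkpoint without producing duplicates, which is what makes monitoring-and-retry schemes for pipeline health safe.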

Benefits

  • Medical
  • Dental
  • Vision
  • Flexible PTO
  • A 401(k) program
  • Stock options

What This Job Offers

Job Type

Full-time

Career Level

Mid Level

Education Level

No Education Listed

Number of Employees

11-50 employees
