GCI embodies excellence, integrity, and professionalism. The employees supporting our customers deliver unique, high-value mission solutions while effectively leveraging the technological expertise of our valued workforce to meet critical mission requirements in the areas of Data Analytics and Software Development, Engineering, Targeting and Analysis, Operations, Training, and Cyber Operations. We maximize opportunities for success by building and maintaining trusted and reliable partnerships with our customers and industry. At GCI, we solve the hard problems.

As a Software Engineer, a typical day will include the following duties:

- Conduct comprehensive assessments of existing data pipelines, infrastructure, and data flows, including integrations with operational systems such as ServiceNow, network management platforms, and business applications, to identify technical debt, bottlenecks, and reliability issues.
- Evaluate current data architecture against industry best practices and organizational needs; develop technical recommendations and roadmaps for data infrastructure improvements.
- Design, build, and maintain production-grade data pipelines using orchestration tools such as Airflow or Prefect.
- Develop robust ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes from diverse sources: SaaS platforms, network management systems, databases, APIs, files, and streams.
- Build API integrations handling authentication (OAuth, API keys, and Single Sign-On (SSO)), rate limiting, retry logic, and error handling.
- Extract data from systems not designed for export; reverse-engineer undocumented data structures and relationships.
- Handle semi-structured data (JSON and XML) and transform it into structured datasets with consistent schemas.
- Design dimensional models, data warehouses, and data marts following industry methodologies.
- Create conceptual, logical, and physical data models optimized for query performance and storage efficiency.
- Implement slowly changing dimensions and other data warehousing patterns.
- Establish naming conventions, data standards, and modeling best practices.
- Implement comprehensive data quality checks, validation rules, and automated monitoring with alerting.
- Build error handling, failure recovery, logging, and observability into all processes.
- Optimize pipelines for performance, cost, and resource utilization.
- Develop reusable components and frameworks; refactor legacy pipelines for reliability.
- Build and maintain data infrastructure on cloud platforms (Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP)) using infrastructure-as-code tools such as Terraform and CloudFormation.
- Implement CI/CD pipelines, version control (Git), and automated testing frameworks.
- Manage database performance tuning, indexing, partitioning, and capacity planning.
- Establish backup, recovery, security controls, access controls, and compliance measures.
- Partner with analysts, software developers, and business stakeholders to translate requirements into technical solutions.
- Create comprehensive documentation for systems, processes, and integrations.
- Provide technical guidance on data availability and proper usage; enable self-service access.
- Troubleshoot pipeline failures, performance issues, and data discrepancies; perform root cause analysis.
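To give candidates a flavor of the API-integration work described above, here is a minimal sketch of retry logic with exponential backoff. It is illustrative only, not part of the role description: the `TransientError` class, the `flaky_fetch` endpoint, and the backoff parameters are all hypothetical stand-ins for a real HTTP client that retries on 429/5xx responses.

```python
import random
import time


class TransientError(Exception):
    """Hypothetical exception for retryable failures (e.g. HTTP 429 or 503)."""


def fetch_with_retry(fetch, max_attempts=4, base_delay=0.05):
    """Call fetch() and retry transient failures with exponential backoff.

    Non-transient exceptions propagate immediately; the last transient
    failure is re-raised once max_attempts is exhausted.
    """
    for attempt in range(max_attempts):
        try:
            return fetch()
        except TransientError:
            if attempt == max_attempts - 1:
                raise
            # Exponential backoff with jitter: ~0.05s, ~0.1s, ~0.2s, ...
            time.sleep(base_delay * (2 ** attempt) * (1 + random.random()))


# Demo: a fake endpoint that is rate-limited twice, then succeeds.
calls = {"n": 0}

def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TransientError("HTTP 429: rate limited")
    return {"status": "ok"}

print(fetch_with_retry(flaky_fetch))
```

In practice the same pattern wraps whatever HTTP client the pipeline uses, with the retryable status codes and backoff bounds set by the upstream API's rate-limit policy.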
Job Type
Full-time
Career Level
Senior
Education Level
No Education Listed