Software Developer 3

Oracle
Austin, TX (Remote)

About The Position

Design, develop, troubleshoot, and/or test/QA software. As a member of the software engineering division, apply knowledge of software architecture to perform tasks associated with developing, debugging, or designing software applications or operating systems according to provided design specifications. Build enhancements within an existing software architecture and/or suggest improvements to the architecture. May telecommute. (385.34734)

Employer will accept a Master's degree in Computer Science, Engineering, or a related technical field and 4 years of experience in the job offered or in a Software Developer 3-related occupation.

Only Oracle brings together the data, infrastructure, applications, and expertise to power everything from industry innovations to life-saving care. And with AI embedded across our products and services, we help customers turn that promise into a better future for all. Discover your potential at a company leading the way in AI and cloud solutions that impact billions of lives.

True innovation starts when everyone is empowered to contribute. That's why we're committed to growing a workforce that promotes opportunities for all, with competitive benefits that support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing [email protected] or by calling 1-888-404-2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability, protected veteran status, or any other characteristic protected by law.
Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Requirements

  • Master's degree in Computer Science, Engineering, or a related technical field
  • 4 years of experience in the job offered or in a Software Developer 3-related occupation
  • Experience designing and maintaining high-volume, multi-stage data ingestion and transformation pipelines integrating Snowflake, MySQL, Redis, and cloud storage systems (AWS S3 or similar), including schema design, query optimization, performance tuning, warehouse sizing, clustering strategies, and secure integrations with upstream and downstream systems
  • Experience building enterprise-grade ELT/ETL frameworks that unify data from internal tools, Salesforce/CRM platforms, operational systems, and cloud services using REST APIs, event-driven pipelines, and containerized workflows
  • Experience integrating enterprise systems with Salesforce or equivalent CRM systems, building secure API connectors, automation layers, custom data synchronization engines, and multi-directional ingestion pipelines
  • Experience engineering heuristics-driven intelligence modules to detect anomalies, optimize pipeline behavior, improve reliability, and enforce data-quality constraints across distributed ingestion platforms
  • Experience migrating legacy or monolithic applications (e.g., Ruby on Rails systems) into modern distributed cloud architectures leveraging cloud infrastructure, Kubernetes, microservices, and identity-aware service frameworks such as Network Identity Services
  • Experience implementing advanced observability, monitoring, and operational analytics (WaveFront or equivalent), including real-time anomaly detection, to ensure end-to-end reliability of multi-cluster ingestion pipelines
  • Experience performing advanced Linux and network systems engineering, including kernel-level debugging, process/thread analysis, network stack troubleshooting (TCP/IP, HTTP), container runtime analysis, and optimization of distributed services running on Linux-based cloud infrastructure
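As a hedged illustration of the ingestion and data-quality work the bullets above describe, a single transformation stage might look like the sketch below. All names (`Record`, `clean_records`) and constraints are invented for this example and are not taken from the posting.

```python
# Illustrative sketch only: one stage of a multi-stage ingestion pipeline
# that normalizes rows from heterogeneous sources (CRM, MySQL, S3 exports)
# and enforces basic data-quality constraints before loading downstream.
from dataclasses import dataclass


@dataclass
class Record:
    account_id: str
    amount: float
    source: str  # e.g. "salesforce", "mysql", "s3" (hypothetical labels)


def clean_records(raw_rows):
    """Normalize raw dict rows and split them into cleaned records and
    rejected rows that violate constraints (missing id, non-numeric or
    negative amount). Rejected rows would feed a quality dashboard."""
    cleaned, rejected = [], []
    for row in raw_rows:
        account_id = str(row.get("account_id", "")).strip()
        try:
            amount = float(row.get("amount"))
        except (TypeError, ValueError):
            rejected.append(row)
            continue
        if not account_id or amount < 0:
            rejected.append(row)
            continue
        cleaned.append(Record(account_id, amount, row.get("source", "unknown")))
    return cleaned, rejected
```

In a real pipeline this stage would sit between extraction (e.g., staged S3 files) and the warehouse load, with the rejected rows routed to a quarantine table for review.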

Responsibilities

  • Designing and maintaining high-volume, multi-stage data ingestion and transformation pipelines integrating Snowflake, MySQL, Redis, and cloud storage systems (AWS S3 or similar), including schema design, query optimization, performance tuning, warehouse sizing, clustering strategies, and secure integrations with upstream and downstream systems
  • Building enterprise-grade ELT/ETL frameworks that unify data from internal tools, Salesforce/CRM platforms, operational systems, and cloud services using REST APIs, event-driven pipelines, and containerized workflows
  • Integrating enterprise systems with Salesforce or equivalent CRM systems, building secure API connectors, automation layers, custom data synchronization engines, and multi-directional ingestion pipelines
  • Engineering heuristics-driven intelligence modules to detect anomalies, optimize pipeline behavior, improve reliability, and enforce data-quality constraints across distributed ingestion platforms
  • Migrating legacy or monolithic applications (e.g., Ruby on Rails systems) into modern distributed cloud architectures leveraging cloud infrastructure, Kubernetes, microservices, and identity-aware service frameworks such as Network Identity Services
  • Implementing advanced observability, monitoring, and operational analytics (WaveFront or equivalent), including real-time anomaly detection, to ensure end-to-end reliability of multi-cluster ingestion pipelines
  • Performing advanced Linux and network systems engineering, including kernel-level debugging, process/thread analysis, network stack troubleshooting (TCP/IP, HTTP), container runtime analysis, and optimization of distributed services running on Linux-based cloud infrastructure
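The heuristics-driven anomaly detection mentioned in the responsibilities could be sketched, in its simplest form, as a rolling-window deviation check over a pipeline throughput metric. The class name, window size, and 3-sigma threshold are assumptions made for this example, not details from the posting.

```python
# Illustrative sketch only: flag a throughput sample as anomalous if it
# deviates more than n_sigma standard deviations from a rolling window of
# recent samples. Real systems (WaveFront or equivalent) are far richer.
from collections import deque
from statistics import mean, stdev


class ThroughputMonitor:
    def __init__(self, window=20, n_sigma=3.0):
        self.samples = deque(maxlen=window)  # rolling window of recent values
        self.n_sigma = n_sigma

    def observe(self, value):
        """Record one sample; return True if it looks anomalous relative
        to the current window (needs at least 2 prior samples)."""
        anomalous = False
        if len(self.samples) >= 2:
            mu, sigma = mean(self.samples), stdev(self.samples)
            if sigma > 0 and abs(value - mu) > self.n_sigma * sigma:
                anomalous = True
        self.samples.append(value)
        return anomalous
```

A heuristic like this would typically gate alerting or trigger pipeline back-pressure rather than block ingestion outright.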

Benefits

  • flexible medical
  • life insurance
  • retirement options
  • volunteer programs