This role is categorized as hybrid. The successful candidate is expected to report to the Austin Technical Center at least three times per week.

The Role

We are looking for a Java Microservices Developer to design, build, and support scalable, resilient microservices for our Diagnostics platform team. You will work closely with product managers, architects, and DevOps engineers to deliver secure, performant APIs and back-end services that power critical business and customer-facing applications.

What You'll Do

- Own the end-to-end design, development, and operation of scalable data engineering pipelines and backend services using Java, Quarkus, and Spring Boot, ensuring reliability, observability, and maintainability.
- Lead the design and implementation of cron-based and event-driven orchestration services that retrieve and process data from multiple enterprise systems via REST APIs and messaging platforms.
- Architect and implement real-time data processing solutions using Kafka and Azure Event Hub, including schema design, consumer group strategy, and resiliency patterns.
- Design and optimize relational data models and database solutions using PostgreSQL and other relational data stores, including indexing strategies, query optimization, and performance tuning at scale.
- Drive the deployment, scaling, and lifecycle management of services on Azure Kubernetes Service (AKS), including workload identity, networking, and security configuration.
- Define and implement CI/CD pipelines using GitHub Actions workflows, and manage automated, GitOps-based deployments using ArgoCD across multiple environments.
- Lead infrastructure automation using Terraform, establishing reusable modules, environment standards, and best practices for cloud resource provisioning and governance, including Datadog monitor creation and management.
- Design and implement end-to-end observability using Prometheus, Datadog, and related tooling, including metrics, logs, traces, dashboards, and alerting with clear SLOs/SLIs.
- Build and maintain data processing workflows using Databricks and distributed data frameworks, including batch and streaming jobs, job orchestration, and cost-optimized compute.
- Collaborate closely with product, architecture, and cross-functional engineering teams to refine requirements, define technical roadmaps, and translate business outcomes into robust technical designs.
- Drive performance, reliability, and scalability improvements across data and service layers, including load testing, capacity planning, and performance benchmarking.
- Troubleshoot complex production issues, perform root cause analysis, and implement durable fixes and resiliency patterns.
- Champion engineering best practices (code reviews, testing strategy, documentation, security, and monitoring) and help evolve team standards, patterns, and reference architectures.
- Mentor and coach engineers on the team, providing technical guidance, pairing, and feedback to elevate overall engineering quality and delivery.
Job Type
Full-time
Career Level
Mid Level
Number of Employees
5,001-10,000 employees