Data Engineer

GM · Austin, TX
Hybrid

About The Position

This role is categorized as hybrid, meaning the successful candidate is expected to report to the Austin IT Innovation Center three times per week, at minimum [or other frequency dictated by the business if more than 3 days].

Requirements

  • Bachelor’s degree in Computer Science, Software Engineering, Information Systems, or a related field, or equivalent practical experience.
  • 4+ years of software or data engineering experience.
  • Strong proficiency in Java development and object-oriented programming.
  • Hands-on experience with Quarkus and Spring Boot application development.
  • Experience building and consuming REST APIs and microservices architectures.
  • Strong knowledge of event-driven architectures using Kafka or Azure Event Hub.
  • Experience with relational databases, especially PostgreSQL, including performance tuning and query optimization.
  • Hands-on experience with Azure cloud services, especially AKS, networking, and managed services.
  • Experience implementing CI/CD pipelines using GitHub Actions.
  • Infrastructure-as-Code experience using Terraform.
  • Experience with observability tools such as Prometheus and Datadog.
  • Understanding of containerization technologies such as Docker and Kubernetes.

Nice To Haves

  • Strong knowledge of Azure platform services and architecture patterns.
  • Experience with email marketing or customer communication platforms, including template design and orchestration workflows.
  • Experience integrating enterprise marketing tools with backend services.
  • Knowledge of security best practices in cloud and API development.
  • Familiarity with telemetry and log analytics in cloud environments.
  • Experience working with large-scale customer engagement or notification systems.

Responsibilities

  • Design, develop, and maintain scalable data engineering pipelines and backend services built on the Quarkus and Spring Boot frameworks.
  • Build and manage cron-based orchestration services that retrieve data from multiple enterprise systems via REST APIs.
  • Work with event streaming and messaging platforms such as Kafka and Azure Event Hub for real-time data processing.
  • Design and optimize database solutions using PostgreSQL and other relational data stores.
  • Deploy, monitor, and manage applications on Azure Kubernetes Service (AKS).
  • Implement CI/CD pipelines using GitHub Actions workflows and automate deployments using ArgoCD.
  • Write and maintain Terraform scripts for infrastructure automation and cloud resource provisioning.
  • Design observability solutions using Prometheus metrics, Datadog monitoring, and alerting systems.
  • Build and maintain data processing workflows using Databricks and distributed data frameworks.
  • Collaborate with cross-functional teams to gather requirements and translate them into technical implementations.
  • Optimize application performance, reliability, and scalability across data and service layers.
  • Build and maintain email services, templates, and customer communication workflows using Adobe Journey Optimizer (AJO) or similar tools.
  • Troubleshoot production issues and implement proactive monitoring and resiliency improvements.

Benefits

  • GM offers a variety of health and wellbeing benefit programs.
  • Benefit options include medical, dental, vision, Health Savings Account, Flexible Spending Accounts, retirement savings plan, sickness and accident benefits, life insurance, paid vacation & holidays, tuition assistance programs, employee assistance program, GM vehicle discounts and more.