The selected candidate will be responsible for architecting, developing, and implementing a next-generation data streaming and event-driven platform, applying software engineering best practices and current technologies. This includes hands-on work with data streaming, event-driven architecture, and event-processing frameworks, along with DevOps practices using tools such as Jenkins, Red Hat OpenShift, Docker, and SonarQube, and Infrastructure-as-Code and Configuration-as-Code methodologies using Ansible, Terraform, and scripting.

The candidate will administer Kafka on Linux, including installing, configuring, upgrading, migrating, deploying, automating, and troubleshooting the platform. They will provide expertise in Kafka administration, event-driven architecture, automation, application integration, monitoring and alerting, security, business process management, CI/CD pipelines, and data ingestion and data modeling.

The role requires investigating and repairing issues to ensure business continuity, whether the fault lies in the Kafka platform, business logic, middleware, networking, the CI/CD pipeline, or a database. The candidate will also brief management, customers, teams, and vendors, matching the technical depth of the communication to the audience.
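
As an illustration of the Kafka administration and automation work described above, here is a minimal sketch of topic provisioning using the confluent-kafka Python client; the broker address, topic name, and sizing values are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch: automating Kafka topic creation with the
# confluent-kafka AdminClient. Broker address, topic name, and
# sizing values are hypothetical placeholders.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "localhost:9092"})  # assumed broker

# Declare the desired topic configuration (Configuration-as-Code style).
topic = NewTopic(
    "orders.events",                        # hypothetical topic name
    num_partitions=6,
    replication_factor=3,
    config={"retention.ms": "604800000"},   # 7 days, example retention policy
)

# create_topics returns a dict of topic name -> Future; wait and report.
for name, future in admin.create_topics([topic]).items():
    try:
        future.result()                     # raises on failure
        print(f"created topic {name}")
    except Exception as exc:
        print(f"failed to create {name}: {exc}")
```

Scripts along these lines are typically invoked from Ansible playbooks or CI/CD pipeline stages so that topic definitions live in version control rather than being applied by hand.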
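
In the same spirit, a sketch of the event-processing side: a consumer loop that reads events and hands them to business logic before committing offsets. The topic, consumer group, and handle_event function are assumptions for illustration only.

```python
# Minimal sketch of an event-driven consumer loop using confluent-kafka.
# Topic, group id, and handle_event are illustrative stand-ins.
from confluent_kafka import Consumer, KafkaError

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # assumed broker
    "group.id": "orders-processor",         # hypothetical consumer group
    "auto.offset.reset": "earliest",
    "enable.auto.commit": False,            # commit only after processing
})
consumer.subscribe(["orders.events"])       # hypothetical topic

def handle_event(payload: bytes) -> None:
    """Stand-in for real business logic."""
    print(payload.decode("utf-8"))

try:
    while True:
        msg = consumer.poll(1.0)            # block up to 1s for an event
        if msg is None:
            continue
        if msg.error():
            if msg.error().code() != KafkaError._PARTITION_EOF:
                raise RuntimeError(msg.error())
            continue
        handle_event(msg.value())
        consumer.commit(message=msg)        # commit after success (at-least-once)
finally:
    consumer.close()
```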