The Copilot Security Team is chartered with securing Microsoft’s agentic and autonomous AI systems. We build the adversarial testing, mitigations, telemetry, guardrails, and evaluation capabilities that reduce real‑world risk across Copilot and emerging AI agents. Our mission: increase safety, resilience, and trustworthiness by proactively identifying weaknesses and engineering durable defenses at scale.

As a Software Engineer (IC3), you will design and implement foundational components that harden Copilot’s agentic systems against jailbreaks, prompt injection, toolchain misuse, unsafe autonomy, and other emerging XPIA‑class attack vectors. You will contribute to both adversarial evaluations and the Agentic Security Platform—the shared services, pipelines, and instrumentation that enable reproducible, auditable security evaluation across Microsoft.

This role is ideal for engineers who are passionate about secure‑by‑design engineering, love building well‑constructed systems, and want to help define the future of responsible AI.

Why Join the Copilot Security Team?

Join a team at the center of Microsoft’s most critical AI safety work. You will build systems that shape how Copilot—and future autonomous AI—behave in the world. Your work will directly reduce real‑world risk, improve product safety, and influence Microsoft‑wide engineering standards for agentic AI.

Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.
Job Type
Full-time
Career Level
Entry Level