Arctic Wolf-posted 3 days ago
$145,018 - $180,000/Yr
Full-time • Mid Level
Remote • Eden Prairie, MN
1,001-5,000 employees

At Arctic Wolf, we're not just navigating the cybersecurity landscape - we're redefining it. Our global team of dedicated Pack members is driving innovation and setting new industry standards every day. Our impact speaks for itself: we've earned recognition on the Forbes Cloud 100, CNBC Disruptor 50, Fortune Future 50, and Fortune Cyber 60 lists, and we recently took home the 2024 CRN Products of the Year award. We're proud to be named a Leader in the IDC MarketScape for Worldwide Managed Detection and Response Services and to have earned a Customers' Choice distinction from Gartner Peer Insights. Our Aurora Platform also received CRN's Products of the Year award in the inaugural Security Operations Platform category. Join a company that's not only leading, but also shaping, the future of security operations.

Our mission is simple: End Cyber Risk. We're looking for a Senior Developer to be part of making this happen.

Location: 8939 Columbine Road, Eden Prairie, MN 55347; telecommuting permissible from any location in the US.

About the Role:

  • Translate Cybersecurity data requirements from requestors of varying levels of technical expertise into data solutions.
  • Automate data flows and system maintenance.
  • Create Automated Threat Reporting Solutions.
  • Install updates or new systems, providing specifications and flowcharts to stakeholders, coordinating installation requirements.
  • Maintain database performance by calculating optimum values for database parameters, implementing new systems, completing maintenance requirements, and evaluating computer operating systems.
  • Support database functions by designing and coding utilities as needed.
  • Troubleshoot and resolve complex problems on a regular basis.
  • Stay up to date on the latest applicable technologies; attend applicable training and conferences, and learn new tools and technologies as needed.
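To illustrate the "Automated Threat Reporting" duty above, here is a minimal, hypothetical sketch (not Arctic Wolf's actual tooling): rolling raw detections up into a daily summary that a scheduler such as cron could run and deliver. All names and data are invented for illustration.

```python
# Hypothetical sketch: summarize raw (rule, host) detections into a
# daily threat report. A scheduler (e.g., cron) could run this and
# email or post the result.
from collections import Counter
from datetime import date

def daily_threat_report(detections, day=None):
    """Summarize a list of (rule_name, host) detections as report text."""
    day = day or date.today().isoformat()
    counts = Counter(rule for rule, _host in detections)
    lines = [f"Threat report for {day}"]
    for rule, n in counts.most_common():
        lines.append(f"  {rule}: {n} detection(s)")
    return "\n".join(lines)

report = daily_threat_report(
    [("credential-stuffing", "web-01"), ("ransomware", "db-02"),
     ("credential-stuffing", "web-02")],
    day="2024-06-01",
)
print(report)
```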
About You:
  • Bachelor's degree or foreign degree equivalent in Information Technology or a related field, and four (4) years of experience in the job offered or a related role.
  • Data lake experience with Databricks to develop and optimize EPP/EDR pipelines, conduct schema transformations, and execute distributed data processing tasks.
  • Amazon Web Services (AWS) cloud services, including Amazon Simple Storage Service (S3) for scalable data storage, Amazon Relational Database Service (RDS) for data warehousing, and Amazon Elastic MapReduce (EMR) for big data processing, to support data storage, processing, and analytics.
  • Experience with Databricks Apps and Streamlit to expose Delta Live Tables for high-volume queries.
  • Experience curating gold Delta tables for EDR data analytics and threat-hunting data.
  • Python programming for data transformation and data analysis.
  • Relational database management systems (RDBMS): design, maintain, and optimize database schemas, write complex Structured Query Language (SQL) queries, and perform data modeling.
  • Version control systems such as Bitbucket to manage code repositories, track changes to data pipeline code, and ensure code reliability and traceability.
  • Unix shell scripting for file manipulation, data processing, and task scheduling.
  • Continuous Integration and Continuous Deployment (CI/CD) to ensure the seamless development, testing, and deployment of data pipelines and analytical solutions. Reduce production errors and enhance data pipeline reliability.
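As a concrete (and purely hypothetical) example of the SQL-plus-Python skills listed above, the sketch below joins raw EDR-style events against a severity lookup with SQL and ranks hosts by high-severity hits. It uses Python's standard-library sqlite3 as a stand-in; the role itself would use Databricks/Spark, and all table names and data here are invented.

```python
# Hypothetical sketch: enrich raw EDR events with a severity lookup
# via a SQL join, then rank hosts by high-severity detections --
# the kind of transform-and-analyze task the qualifications describe.
import sqlite3

def top_alert_hosts(events, severities, min_severity=3):
    """Return (host, hits) rows ordered by count of events whose rule
    severity is at or above min_severity."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE events (host TEXT, rule TEXT)")
    con.execute("CREATE TABLE rules (rule TEXT, severity INTEGER)")
    con.executemany("INSERT INTO events VALUES (?, ?)", events)
    con.executemany("INSERT INTO rules VALUES (?, ?)", severities)
    rows = con.execute(
        """
        SELECT e.host, COUNT(*) AS hits
        FROM events e JOIN rules r ON e.rule = r.rule
        WHERE r.severity >= ?
        GROUP BY e.host
        ORDER BY hits DESC
        """,
        (min_severity,),
    ).fetchall()
    con.close()
    return rows

events = [("host-a", "ransomware"), ("host-a", "portscan"),
          ("host-b", "ransomware"), ("host-a", "ransomware")]
severities = [("ransomware", 5), ("portscan", 1)]
print(top_alert_hosts(events, severities))  # → [('host-a', 2), ('host-b', 1)]
```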
Benefits:
  • Equity for all employees
  • Flexible time off and paid volunteer days
  • RRSP and 401k match
  • Training and career development programs
  • Comprehensive private benefits plan including medical, mental health, dental, disability, life and AD&D, and value-added services
  • Robust Employee Assistance Program (EAP) with mental health services
  • Fertility support and paid parental leave