Senior Data Engineer

NinjaOne
$110,000 - $200,000 | Hybrid

About The Position

At NinjaOne, we’re looking for a skilled Senior Data Engineer to join our team and help drive the future of our data infrastructure. You’ll play a critical role in building, maintaining, and scaling our systems to ensure smooth data flow, accuracy, and security across the organization. This is an exciting opportunity to work on innovative projects, collaborate with cross-functional teams, and help shape how we leverage data to fuel growth, optimize products, and drive business decisions.

Location: We are flexible on remote work from home if you are located in the USA and reside in one of the following states: CA, CO, CT, FL, GA, IL, KS, MA, MD, ME, NJ, NC, NY, OR, TN, TX, VA, or WA. We have physical offices in Austin, TX and Tampa, FL if you prefer a hybrid option.

We hire the best software engineers, but experience in our stack can’t hurt: NinjaOne is built on Java, Kotlin, C++, Golang, and Postgres, supporting millions of user endpoints and running as a scalable cloud service in AWS. Familiarity with large-scale datastore bottlenecks, asynchronous application design, and client-server architecture will help you.

Requirements

  • Bachelor’s degree in Computer Science, Computer Engineering, Information Technology or equivalent work experience preferred.
  • 10+ years of experience in software development, with a strong focus on data engineering and data science.
  • Experience in building data pipelines and managing large-scale data systems using technologies like SQL and Python.
  • Expertise in Python.
  • Experience with cloud platforms such as AWS, GCP, or Azure, and with tools like Airflow, Kafka, or dbt for orchestrating data workflows.
  • Mastery of both relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra).
  • Experience with data warehousing concepts and tools such as Redshift, BigQuery, or Snowflake.
  • Solid understanding of Microservices Architecture and DevOps principles.

Nice To Haves

  • Previous experience working with large-scale data pipelines and machine learning models.
  • Understanding of Generative AI and Deep Learning frameworks.

Responsibilities

  • Data Pipeline Development: Design and implement scalable data pipelines that move and transform large volumes of data from multiple sources to central data warehouses, transforming data to enable business reporting and advanced analytics.
  • Database Management: Manage and optimize the performance of relational databases, ensuring data availability, reliability, and consistency.
  • Automation & Optimization: Automate and optimize data workflows to reduce manual processes and improve efficiency in data collection, storage, and processing.
  • Monitoring & Maintenance: Ensure the integrity and security of data across systems, monitor performance, and troubleshoot any issues that arise within the data pipeline.
  • Data Visualization: Build dashboards and reports in Tableau and Databricks to expose key data points and trends to business stakeholders.
  • Collaboration: Work closely with data scientists, analysts, and other teams to gather requirements, understand data needs, and provide solutions that support data-driven decision-making.
  • Other duties as needed.

Benefits

  • We are a collaborative, kind, and curious community.
  • We honor your flexibility needs with full-time work that is hybrid remote.
  • We have you covered with our comprehensive benefits package, which includes medical, dental, and vision insurance.
  • We help you prepare for your financial future with our 401(k) plan.
  • We prioritize your work-life balance with our unlimited PTO.
  • We reward your work with opportunity for growth and advancement.