Nebraska Furniture Mart • posted about 1 month ago
$91,934 - $138,906/Yr
Full-time • Senior
Hybrid • Omaha, NE
Electrical Equipment, Appliance, and Component Manufacturing

About the position

As NFM's Senior Cloud Engineer, you will be responsible for all technological duties associated with cloud computing, including design, planning, management, maintenance, and support. You will design, build, and manage our cloud-based data infrastructure, playing a critical role in the development and optimization of NFM's data lake and ensuring its scalability, reliability, and security. This is a remote position requiring the candidate to reside in one of the following states: NE, IA, MO, KS, or TX. Applicants must be currently authorized to work in the USA on a full-time basis. NFM will not sponsor applicants for work visas for this position.

Responsibilities

  • Design and implement scalable and secure data solutions using Fabric OneLake technology
  • Develop and maintain scalable data pipelines and architectures that support data ingestion, processing, storage, and delivery across multiple sources and destinations
  • Maintain knowledge of vendor technical solutions in the cloud space
  • Create data tools for analytics and data science team members to assist them in building and optimizing our product
  • Manage the full lifecycle and maintain the NFM cloud environment
  • Assist the IT organization and its teams with integrations into configuration management and deployment tools

Requirements

  • Bachelor's degree in computer science, engineering, or a related field, or equivalent work experience
  • 5+ years' experience in network or systems administration or engineering
  • 2+ years of cloud experience
  • Proficiency in SQL and Python, and familiarity with other programming languages and frameworks such as Scala, R, or Spark
  • Experience with cloud-based data services and platforms such as Azure, AWS, or GCP, and with data warehouse and ETL tools such as Snowflake, SSIS, or Informatica
  • Knowledge of data modeling, data quality, data governance, and data security best practices and standards
  • Proven experience with Azure Databricks, Azure Data Factory, and other Azure services
  • Strong analytical skills for working with unstructured datasets
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with data pipeline and workflow management tools
  • Experience with Azure SQL DB, Cosmos DB, or other database technologies
  • Experience with stream-processing systems
  • Strong project management and organizational skills

Nice-to-haves

  • Experience with Fabric OneLake development and management
  • Knowledge of networking within Azure Databricks, including VNet settings and firewall rules
  • Ability to set up linked services within Azure Data Factory and execute Azure Databricks notebooks
  • Familiarity with on-premises to cloud data migration and managing data across hybrid environments
  • Proficient in Azure, including Azure Databricks, Azure Data Factory, and Azure SQL Data Warehouse
  • Strong command of Python, Scala, and SQL
  • Experience with Hadoop, Spark, Kafka, and other big data technologies
  • Knowledge of various data storage solutions like Azure Blob Storage, Azure Data Lake Storage, and Cosmos DB
  • Familiarity with data processing tools such as Azure Stream Analytics and Azure HDInsight
  • Experience with CI/CD pipelines, using tools like Jenkins, Azure DevOps, or GitHub Actions
  • Understanding of data security practices, including encryption, data masking, and access control within cloud environments
  • Proficiency with monitoring tools such as Azure Monitor and log analytics solutions such as Azure Log Analytics