Data Engineer II

Shoptikal, LLC
Green Bay, WI

About The Position

Responsible for independently designing, building, and maintaining scalable data pipelines and infrastructure to support company data-driven initiatives. Take ownership of complex data systems, implement best practices, and collaborate closely with cross-functional teams to ensure the availability, reliability, and performance of the organization's data platform.

Requirements

  • Bachelor’s degree in computer science, data engineering, information systems, or a related field, or equivalent experience
  • 5+ years of hands-on experience in data engineering or related roles
  • Professional experience using Python, Java, or Scala for data processing
  • Solid understanding of SQL and analytical data warehouses (Snowflake, Redshift)
  • Hands-on experience implementing ETL/ELT best practices at scale
  • Experience with data pipeline tools (Airflow, Luigi, Azkaban, dbt)
  • Strong data modeling skills and familiarity with Kimball methodology
  • Experience with big data technologies (Hadoop, Spark, Kafka, Hive)
  • Proficiency with cloud platforms (AWS, Azure, or GCP) and their data services
  • Understanding of data governance and security best practices
  • Excellent communication skills
  • Able to work independently and as part of a team
  • Willing to share knowledge and experience with other members of the team
  • Strong analytical and problem-solving skills
  • Attention to detail and commitment to data quality
  • Solid planning and organizational skills
  • Proficiency with Microsoft Office Suite of programs
  • Ability to communicate effectively at all levels of the organization, both in writing and verbally
  • Able to read and write at a high school graduate level
  • Able to sit or stand for extended periods of time
  • Able to operate various office equipment (e.g., personal computer, telephone, fax machine, copier, etc.)
  • Able to lift 10 to 20 pounds
  • Able to work overtime and regular and/or extended (evenings, nights, and weekends) office hours to meet established deadlines
  • Able to travel independently to support Company objectives and personal development

Nice To Haves

  • Master’s degree
  • Experience with stream-processing systems
  • Knowledge of data lifecycle management processes
  • Familiarity with Agile methodologies
  • Relevant certifications (Google Professional Data Engineer, AWS Data Analytics, Cloudera CCP)

Responsibilities

  • Design, implement, and optimize end-to-end data pipelines for ingesting, processing, and transforming large volumes of structured and unstructured data
  • Develop robust ETL/ELT processes to integrate data from diverse sources into the data ecosystem
  • Implement data validation and quality checks to ensure accuracy and consistency
  • Build analytical tools that provide practical insights into key business performance indicators
  • Conduct code reviews
  • Design and maintain data models, schemas, and database structures to support analytical and operational use cases
  • Optimize data storage and retrieval mechanisms for performance and scalability
  • Evaluate and implement data storage solutions including relational databases, NoSQL databases, data lakes, and cloud storage services
  • Generate architecture recommendations and implement improvements
  • Configure and manage data infrastructure components including databases, data warehouses, data lakes, and distributed computing frameworks
  • Monitor system performance, troubleshoot issues, and implement optimizations
  • Implement data security controls and access management policies
  • Plan and execute system expansion to support company growth and analytics needs
  • Build and maintain integrations with internal and external data sources and APIs
  • Implement RESTful APIs and web services for data access and consumption
  • Ensure compatibility and interoperability between different systems and platforms
  • Work with SaaS application APIs (Salesforce, Zuora, Zendesk, Marketo)
  • Collaborate with internal and external data scientists, analysts, and stakeholders to understand requirements and deliver solutions
  • Document technical designs, workflows, and best practices
  • Provide technical guidance and support to team members
  • Perform technical interviews and contribute to hiring decisions