Data Engineer I (59869)

Core Health & Fitness, LLC · Vancouver, WA
Posted 1d · $38 - $41 · Hybrid

About The Position

At Core Health & Fitness, our purpose is to live and share our passion for fitness. We bring innovative health and fitness solutions to the global market with brands like StairMaster, Schwinn, Nautilus, Star Trac, Throwdown, and Wexer, and we're still growing. We press into the future of fitness to ensure the creation of quality products and programming that meet the needs of an ever-evolving industry. At Core we are committed to building an energetic, diverse, and inclusive workplace. We value our differences and see community strength in diversity and representation. We're always on the lookout for innovators, dreamers, and doers who are passionate about fitness and wellbeing. We explore all opportunities to improve ourselves, our business partners, and our communities. If you're looking for a fulfilling career in helping people find the best version of themselves, you've come to the right place. We are looking for a Data Engineer I to join our growing organization!

General Position Summary

As a Data Engineer at Core Health & Fitness, you will play a pivotal role in designing, developing, and maintaining our data infrastructure. This position works closely with our IT department, Business Intelligence Analysts, Systems Software Engineers, and other stakeholders to ensure the efficient collection, storage, and accessibility of data. Your expertise will enable us to make informed decisions and drive our business forward. This position is eligible to work from home with monthly in-office meetings.

Requirements

  • Education: Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field, or equivalent experience (3+ years in data engineering, data warehousing, or a related field).
  • Technical Skills: Proficiency in SQL and experience with relational databases (e.g., MySQL, PostgreSQL, Oracle).
  • Experience with ETL processes that aggregate IoT data from various external sources into a common data lake format (see the illustrative sketch after this list).
  • Experience with big data technologies.
  • Proficiency in programming languages such as Python or Java.
  • Familiarity with ETL tools (e.g., Apache NiFi, Talend, Informatica).
  • Experience with cloud platforms (e.g., Azure, AWS) and their data services (e.g., Redshift, Databricks, BigQuery).
  • Experience with data modeling and schema design.
  • Familiarity with data visualization tools (e.g., Power BI, Tableau).
  • Experience building and managing data connections and integrations for transferring data between data lakes, cloud storage, and other software systems (e.g., CRM, ERP, IoT platforms).
  • Problem-Solving: Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
  • Communication: Excellent communication skills with the ability to collaborate effectively with other stakeholders.
  • Creativity/Innovation: Generates new ideas, challenges the status quo, takes risks, supports change, encourages innovation, and solves problems creatively.
  • Initiative: Tackles problems and takes independent action, seeks out new responsibilities, acts on opportunities, generates new ideas, and practices self-development.
  • Productivity: Manages a fair workload, volunteers for additional work, prioritizes tasks, develops good work procedures, manages time well, and handles information flow.
  • Quality: Is attentive to detail and accuracy, is committed to excellence, continuously looks for improvements, monitors quality levels, finds the root cause of quality problems, and owns/acts on quality problems.
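For context on the ETL requirement referenced above (aggregating IoT data from external sources into a common data lake format), here is a minimal sketch in Python. The file paths, field names, and Parquet target are assumptions for illustration only and are not drawn from Core Health & Fitness systems.

```python
# Minimal illustrative ETL sketch: aggregate per-device IoT JSON payloads
# into one table with a common schema and land it in a data lake as Parquet.
# All paths and field names are hypothetical examples, not an actual schema.
import json
from pathlib import Path

import pandas as pd  # to_parquet() requires pyarrow or fastparquet

RAW_DIR = Path("raw/iot")                   # hypothetical drop zone for device payloads
LAKE_PATH = Path("lake/workouts.parquet")   # hypothetical data lake target

def extract(raw_dir: Path) -> list[dict]:
    """Read every raw JSON payload exported by the devices."""
    return [json.loads(p.read_text()) for p in raw_dir.glob("*.json")]

def transform(records: list[dict]) -> pd.DataFrame:
    """Map heterogeneous payloads onto one common column set."""
    df = pd.DataFrame.from_records(records)
    df = df.rename(columns={"deviceId": "device_id", "ts": "recorded_at"})
    df["recorded_at"] = pd.to_datetime(df["recorded_at"], utc=True)
    return df[["device_id", "recorded_at", "metric", "value"]]

def load(df: pd.DataFrame, target: Path) -> None:
    """Write the normalized table to the lake in a columnar format."""
    target.parent.mkdir(parents=True, exist_ok=True)
    df.to_parquet(target, index=False)

if __name__ == "__main__":
    load(transform(extract(RAW_DIR)), LAKE_PATH)
```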

Responsibilities

  • Data Pipeline Development: Design, build, and maintain scalable and robust data pipelines to ingest, process, and transform data from various sources (IoT, internal business systems, etc.).
  • Data Warehousing: Develop and optimize data warehouses and data lakes to ensure efficient storage and retrieval of large datasets.
  • ETL Processes: Create and manage ETL (Extract, Transform, Load) processes to ensure data is accurate, reliable, and up to date.
  • Collaboration: Work closely with Business Intelligence Analysts, IT, Systems Software Engineers, and other stakeholders to understand data requirements and deliver solutions that meet their needs.
  • Database Management: Administer and optimize databases, ensuring high performance, security, and availability.
  • Data Quality: Implement data quality checks and validation procedures to maintain the integrity of the data.
  • Automation: Automate repetitive tasks to improve efficiency and reduce manual effort.
  • Documentation: Maintain comprehensive documentation of data processes, architectures, and workflows.
  • Analytics & Reporting: Enable data analytics and reporting by providing well-structured, consistent data to be analyzed either within the data warehouse or within an external repository system (CRM, ERP, etc.).
  • API Development: Design, develop, and maintain RESTful APIs to facilitate data exchange between internal systems, external partners, and third-party applications, ensuring secure and efficient data integration (a rough sketch follows this list).
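As a rough illustration of the API Development responsibility above, the sketch below exposes normalized workout data over a small REST endpoint. Flask is used purely as an example framework; the route, payload shape, and token check are hypothetical and not part of the posting.

```python
# Illustrative-only REST endpoint for exchanging normalized workout data.
# Flask is just an example framework; route, fields, and auth are hypothetical.
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

# Hypothetical in-memory store standing in for a warehouse/lake query layer.
WORKOUTS = [
    {"device_id": "SM-001", "recorded_at": "2024-05-01T10:00:00Z",
     "metric": "steps", "value": 1200},
]

@app.get("/api/v1/workouts")
def list_workouts():
    """Return workout records, optionally filtered by device_id."""
    if request.headers.get("X-Api-Key") != "example-token":  # placeholder auth check
        abort(401)
    device_id = request.args.get("device_id")
    rows = [w for w in WORKOUTS if device_id is None or w["device_id"] == device_id]
    return jsonify(rows)

if __name__ == "__main__":
    app.run(port=8000)
```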