Charter Spectrum · Posted 2 months ago
Town And Country, MO
5,001-10,000 employees
Telecommunications

Curious how your expertise in data engineering could drive impactful solutions across reporting, analytics, and data science? As a Data Engineer IV at Spectrum, you will play a pivotal role in building reliable data systems and automating pipelines that support strategic decision-making. Collaborate with stakeholders, optimize data processes, and deliver scalable solutions that enable the organization to thrive through actionable insights. Your contributions will directly elevate Spectrum's data-driven capabilities.

Responsibilities:
  • Design and maintain scalable systems that support data operations for reporting, analytics, applications, and data science
  • Gather and process raw data at scale using scripts, web scraping, APIs, and SQL queries, applying ETL methods to clean and enhance datasets
  • Assess data quality, integrity, accuracy, and completeness through profiling techniques
  • Develop and implement tools, scripts, queries, and applications for ETL/ELT and data operations
  • Design, build, and automate machine learning data pipelines
  • Code, develop, and test scripts to deliver solutions on schedule, with timely reporting
  • Manage the life cycle of multiple data sources, collaborating with analysts and data scientists to meet their data needs
  • Accelerate project delivery by automating workloads and workflows
Required Qualifications:
  • Bachelor's degree in an Engineering discipline or Computer Science
  • 5+ years of hands-on working experience with RDBMS, SQL, scripting, and coding
  • 3+ years of Linux/Unix/CentOS system administration
  • Ability to read, write, speak and understand English
  • Ability to use a wide variety of open-source technologies and cloud services and identify and resolve end-to-end performance, network, server, and platform issues
  • Extensive coding/scripting experience using Python, R, and shell scripts
  • Extensive experience with Spark, Hadoop/Hive, SQL, Tableau, and ML pipeline and ETL techniques
  • Extensive background in Linux/Unix/CentOS installation and administration; Windows experience preferred
  • Extensive knowledge of data storage, including when to use a file system, relational database, or NoSQL variant
  • Extensive familiarity with JavaScript APIs, REST APIs, or data extract APIs
  • Extensive experience receiving, converting, and cleansing big data; with data virtualization concepts and software (Denodo, Teiid, JBoss); and with visualization or BI tools such as Tableau and data workflow/data prep platforms such as Informatica, Pentaho, or Talend
  • Strong attention to detail with the ability to prioritize and execute multiple tasks effectively
Preferred Qualifications:
  • Leadership experience in advanced operational analytics
  • Extensive experience with Python, JSON, REST APIs or data extract APIs, and data workflow/data prep platforms such as NiFi, KNIME, or Talend
  • Expert knowledge of best practices and IT operations in an always-up, always-available service
  • Experience with Cloud platforms and Cloud services such as AWS (S3, Glue, EMR, Redshift, Athena)
  • Experience with process-scheduling/orchestration products such as Apache Airflow, AWS Data Pipeline, or Azure Data Factory
  • Extensive experience with data virtualization concepts and software (Denodo, TIBCO, PolyBase) and with visualization or BI tools such as Tableau and Power BI
  • Ability to create proof-of-concept experiments for analytics, machine learning, or visualization tools, including a hypothesis, test plans, and outcome analysis
  • Comprehensive pay and benefits package that rewards employees for their contributions to our success, supporting all aspects of their well-being at every stage of life