Lowe's Companies, Inc. • Posted 3 months ago
Full-time • Entry Level
5,001-10,000 employees

The main purpose of this role is to build components and pipelines that deliver end-to-end data solutions for moderately complex business problems as part of a team. The role involves understanding technical requirements and architecture, and implementing and maintaining solutions as directed by more senior engineers. It also involves helping develop the right solution, following best practices, and accounting for both functional and non-functional requirements such as reliability, scalability, performance, stability, security, and long-term maintainability.

Key Responsibilities:
  • Develops scalable, flexible, and maintainable data and software solutions across business/enterprise applications, ensuring alignment with architectural standards and business requirements.
  • Designs and builds data pipelines (on-premises/cloud) to move, transform, and combine datasets from multiple systems, implementing SQL-based business metrics in collaboration with technical leads, analysts, and product owners.
  • Ensures quality through unit/functional testing, supports UAT, and follows best practices in source control and CI/CD for efficient deployment.
  • Reviews technical designs, code, and documentation; participates in peer reviews and group design sessions to maintain team alignment.
  • Analyzes and organizes data to deliver consumable datasets, reports, and insights, while troubleshooting issues, supporting root cause analysis, and contributing to infrastructure and governance compliance.
Minimum Qualifications:
  • Bachelor's degree in engineering, computer science, computer information systems (CIS), or a related field (or equivalent work experience in lieu of a degree).
  • 2 years of experience in data engineering, business intelligence, platform engineering, data warehousing/ETL, or software engineering, or equivalent experience.
  • 2 years of experience with object-oriented or structured programming, SQL, and scripting.
  • 2 years of experience with big data technologies, including cloud-based big data platforms.
  • 1 year of experience on projects that applied the software development life cycle (SDLC).
Preferred Qualifications:
  • Master's degree in computer science, CIS, or a related field, or equivalent experience.
  • 2 years of experience with Python, Spark/Dataproc, Airflow, or GCP data engineering.
Benefits:
  • Exceptional benefits and opportunities to grow your skills.