Data Engineer

Southern Company, Atlanta, GA

About The Position

The Data Engineer I (TO) is responsible for translating data into readily consumable forms and delivering integrated data to consumers by building, operationalizing, and maintaining data pipelines for Data, Analytics & Artificial Intelligence use cases across heterogeneous environments. The Data Engineer I (TO) also works with a variety of data integration tools that support a combination of data delivery styles, such as virtualization, data replication, messaging, and streaming, in hybrid and multi-cloud integration scenarios.

Requirements

  • Bachelor’s degree in Computer Science (CS), MIS, CIS, Mathematics, Statistics (Theoretical/Computational), Machine Learning, or a related field.
  • Proven knowledge of data engineering, data integration, and data science principles is required.
  • 6+ years of related work experience in a fast-paced, competitive organization driven by data and enabled by technology.
  • Working experience with batch and real-time data processing frameworks.
  • Working experience with data modelling, data access, schemas, and data storage techniques.
  • Working experience with data quality tools.
  • Experience in creating functional and technical designs for data engineering and analytics solutions.
  • Experience implementing data models of different schemas and working with diverse data source types.
  • Hands-on experience developing solutions with big data technologies such as Hadoop, Hive, and Spark.
  • Hands-on experience developing and supporting Python based AI/ML solutions.
  • 6+ years of hands-on experience designing, developing, testing, deploying, and supporting data engineering and analytics solutions using on-premises tools such as Microsoft’s BI Stack (SSIS/SSAS/SSRS), Informatica, Oracle GoldenGate, SQL, Oracle, and SQL Server.
  • 4+ years of hands-on experience designing, developing, testing, deploying, and supporting data engineering and analytics solutions using Microsoft cloud-based tools such as Azure Data Lake, Azure Data Factory, Azure Databricks, Python, Azure Synapse, Azure Key Vault, and Power BI.
  • Experience with containerization technologies such as Docker and OpenShift.
  • Experience with Agile, DevOps, and CI/CD methodologies.
  • Hands-on experience designing and developing solutions involving data sourcing, enrichment, and delivery using APIs and web services.
  • Experience working with Jira or similar tools.
  • Experience working with Kafka or similar tools.

Responsibilities

  • Designing and developing methods to process structured, semi-structured, and unstructured data using batch and real-time data processing techniques.
  • Delivering fast, reliable, and scalable data by incrementally and efficiently processing data as it arrives from files or from streaming sources such as Kafka, DBMS, and NoSQL systems (see the streaming sketch after this list).
  • Developing release pipelines to automate recurring manual tasks such as creating a build package, checking that build package into a version control repository, and deploying it to a DV/UA environment.
  • Building and maintaining templates such as code libraries, pipeline patterns, and semantic models to promote reuse and agility.
  • Establishing gatekeeping processes that monitor and control the promotion of successful data processes into production, based on an understanding of business criticality.
  • Collaborating with cross-functional teams with a combination of data, business, and technical personas, as well as a product owner/manager as necessary.
  • Advocating for data reusability by breaking down monolithic data delivery processes into modular data product delivery.
  • Ensuring data reliability by defining data quality and integrity controls within the pipeline, with clearly defined data expectations, and addressing data quality errors through predefined policies (see the data quality sketch after this list).
  • Actively working with less experienced data engineers, providing technical guidance and oversight.
  • Understanding how to use performance-optimized clusters that parallelize jobs and minimize data movement in batch and stream data processing.
  • Recommending improvements to the processes, technology, and interfaces that reduce the development time and effort and enhance the effectiveness of the team.
  • Actively participating in enterprise social networking sites, staying up to date on new data technologies and best practices, and sharing insights with others in the organization.
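
The incremental streaming responsibility above can be illustrated with a minimal PySpark Structured Streaming sketch. This is one possible approach, assuming a Databricks-style Spark environment with the Kafka connector and Delta Lake available; the topic name, schema, broker address, and storage paths are illustrative placeholders, not details from this posting.

```python
# Minimal sketch: incremental processing of records as they arrive on a Kafka topic.
# Assumptions (hypothetical): topic "meter-readings", broker "broker:9092",
# Delta Lake sink, and /mnt/... paths are placeholders for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("incremental-ingest").getOrCreate()

# Expected shape of the JSON payload on the topic.
schema = StructType([
    StructField("device_id", StringType()),
    StructField("reading", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read only new records as they arrive, rather than reprocessing the full history.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "meter-readings")
    .load()
)

# Parse the Kafka value bytes into typed columns.
parsed = raw.select(from_json(col("value").cast("string"), schema).alias("r")).select("r.*")

# Append incrementally to the curated layer; the checkpoint lets the job
# resume from where it left off after a restart.
query = (
    parsed.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/meter-readings")
    .outputMode("append")
    .start("/mnt/curated/meter_readings")
)
query.awaitTermination()
```

The checkpoint location is what makes the delivery incremental: on restart, the stream picks up only the offsets it has not yet processed instead of reloading the source.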
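
The data quality responsibility can likewise be sketched as expectations enforced inside the pipeline. This is only a sketch, assuming a PySpark batch load; the column names, value ranges, and quarantine path are hypothetical and would come from the actual data contract.

```python
# Minimal sketch: data expectations defined in the pipeline, with a predefined
# policy (quarantine) for rows that fail the checks.
# Assumptions (hypothetical): column names, the 0-10000 range, and the
# /mnt/... paths are placeholders for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

readings = spark.read.parquet("/mnt/raw/meter_readings")

# Data expectations: required keys present and measurements within a sane range.
expectations = (
    col("device_id").isNotNull()
    & col("reading").between(0.0, 10000.0)
    & col("event_time").isNotNull()
)

valid = readings.filter(expectations)
rejected = readings.filter(~expectations)

# Predefined policy: quarantine failing rows for review rather than dropping
# them silently or failing the entire load.
rejected.write.mode("append").parquet("/mnt/quarantine/meter_readings")
valid.write.mode("append").parquet("/mnt/curated/meter_readings")
```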

Benefits

  • Southern Company invests in the well-being of its employees and their families through a comprehensive total rewards strategy that includes competitive base salary, annual incentive awards for eligible employees and health, welfare and retirement benefits designed to support physical, financial, and emotional/social well-being.
  • This position may also be eligible for additional compensation, such as an incentive program, with the amount of any bonus/awards subject to the terms and conditions of the applicable incentive plan(s).
  • A summary of the benefits offered for this position can be found at https://seo.nlx.org/southernco/pdf/SOCO-Benefits.pdf
  • Additional and specific details about total compensation and benefits will also be provided during the hiring process.