Data Engineer - Software Developer II

Lucas James Talent Partners, Chicago, IL
Hybrid

About The Position

We are seeking a mid- to senior-level Software Developer with a strong Python focus to join our engineering team. This role centers on building production-grade Python services, APIs, and ETL workflows that operate at scale on AWS. While the position interfaces closely with data pipelines and cloud infrastructure, the primary focus is software development: writing clean, maintainable Python code; designing APIs; and building reliable backend services that integrate with AWS-native tools. This is an excellent opportunity for a Python-first developer who enjoys cloud-native architecture, backend systems, and infrastructure-aware application design.

Requirements

  • Strong proficiency in Python (3.x), with experience writing production-grade, testable, and maintainable code.
  • Experience building APIs or backend services using Flask, FastAPI, Django, or similar Python frameworks.
  • Hands-on AWS experience with services like S3, Lambda, Glue, API Gateway, ECS/EKS, Step Functions, DynamoDB, and CloudWatch.
  • Proven ability to write efficient, scalable, and cost-conscious code in a cloud environment.
  • Experience with API development (REST/GraphQL) and integrating APIs with AWS services.
  • Experience with containerization and orchestration (Docker, Kubernetes on EKS).
  • Familiarity with Terraform or CloudFormation for infrastructure as code.
  • Solid understanding of cloud networking, IAM, and security principles within AWS.
  • Knowledge of observability and monitoring (CloudWatch, Prometheus, Grafana, Datadog).
  • Familiarity with ETL design patterns, data pipelines, and orchestration tools (Airflow, Prefect, Dagster, or AWS-native).
  • Working knowledge of SQL and data modeling.
  • Strong debugging and problem-solving skills.

Nice To Haves

  • Experience with production ML workflows (Kubeflow, MLflow).
  • Exposure to event-driven architectures (Kinesis, Kafka, SNS/SQS).
  • Experience with cost optimization strategies in AWS (e.g., reserved/spot instances, right-sizing resources).
  • Experience connecting to a variety of client CRMs and related platforms, such as Salesforce, StackAdapt, and ThoughtSpot.
  • Experience mentoring junior engineers and instilling industry best practices and habits.

Responsibilities

  • Design, build, and maintain Python applications and APIs that support ETL workflows.
  • Integrate deeply with AWS services such as S3, Lambda, Glue, ECS/EKS, Athena, Step Functions, DynamoDB, and API Gateway to deliver scalable, cloud-native solutions.
  • Write efficient, cost-aware code that optimizes performance and leverages AWS resources responsibly.
  • Automate and orchestrate workflows using AWS tools (e.g., Step Functions, MWAA/Airflow) and infrastructure-as-code.
  • Monitor, debug, and optimize pipelines to ensure scalability and reliability across large data volumes.
  • Implement security best practices in AWS, including IAM policies, secrets management, and least-privilege access.
  • Contribute to CI/CD pipelines (e.g., GitHub Actions, CodeBuild, CodePipeline) and enforce best practices for testing and deployment.
  • Work closely with data engineers, analysts, and stakeholders to deliver robust, cloud-native solutions.

Benefits

  • Own mission-critical infrastructure in a cloud-first environment.
  • Work in a modern AWS-based ecosystem with opportunities to shape architecture and best practices.
  • Join a collaborative and fast-moving team where your expertise in Python and AWS will have a direct impact.
  • Grow your career at the intersection of software development, cloud infrastructure, and data systems.