About The Position

Want to help create the future of work? We’re putting AI and intelligent automation to real use in the hands of real people. We are looking for a Senior Data Engineer who specializes in data pipelines and working across multiple data formats as we bring our next-gen platform to the world. This is a rare opportunity for someone who wants to be at the center of the AI movement and build things that matter.

As a Senior Data Engineer at Kizen, you’ll work directly with our Senior Director of Ecosystem Engineering and partner closely with our Implementations team to design and develop scalable data pipelines and system integrations that power the platform. You’ll collaborate with product owners, solutions engineers, and software engineers to drive technical projects that require deep expertise in data transformation and real-world system connectivity.

Requirements

  • 3+ years of hands-on experience modeling, building, and maintaining production-grade, cloud-based data solutions
  • Strong proficiency in at least one of: JavaScript/TypeScript, Python, Go, or other modern programming languages
  • Advanced knowledge of SQL, database design, and data transformation techniques
  • Strong understanding of authentication mechanisms (OAuth, API keys, JWT) and API security best practices
  • Experience with containerization (Docker) and container orchestration (Kubernetes)
  • Familiarity with CI/CD pipelines and automated deployment processes
  • Solid understanding of software design patterns and architectural principles
  • Experience with version control systems (Git) and collaborative development workflows

Nice To Haves

  • Experience with cloud platforms (AWS, GCP, Azure)
  • Experience with message queues and streaming platforms (Kafka, RabbitMQ)
  • Background in enterprise software integrations across CRM, ERP, marketing automation systems
  • Experience with SQL databases and data warehousing solutions
  • Understanding of data security, compliance, and privacy considerations in software development

Responsibilities

  • Design, develop, and deploy efficient data pipelines and ETL processes using modern software engineering practices
  • Ensure data integrity and quality by implementing robust data validation, schema checks, and error-handling mechanisms to prevent data corruption and maintain reliability
  • Serve as a technical steward by continuously optimizing data performance and resource utilization for cost-effectiveness and system speed
  • Write clean, maintainable code for complex data transformation workflows between disparate systems
  • Optimize existing pipelines for performance, security, and maintainability
  • Collaborate with the engineering team on code reviews and architectural decisions
  • Create technical documentation for data specifications
  • Evaluate and recommend process modifications
  • Participate in agile development processes, including sprint planning and retrospectives

Benefits

  • Career Growth Opportunities
  • Engaging Work Culture
  • Top-Tier Compensation
  • Equity Package
  • Healthcare Coverage
  • Professional Development Stipends
  • PTO


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No Education Listed
  • Number of Employees: 11-50 employees
