Data Engineer

Kiewit Corporation | Papillion, NE
Onsite

About The Position

The Data Engineer is responsible for designing, developing, and maintaining scalable, reliable, and secure data pipelines within a modern data platform environment. This role supports self-service data consumption, reporting, and analytics, and ensures data quality, integrity, and compliance. It offers opportunities for professional development, certifications, and career advancement; exposure to large-scale structured and unstructured data environments; and participation in a dynamic, innovative, and supportive team culture.

District Overview

Kiewit Technology Group builds solutions to enable and support our company's expansive operations. Our mission is to deliver project schedule and cost certainty by employing technology designed by and for the construction industry. Our team utilizes systems and tools that manage every part of Kiewit's business and the project lifecycle to improve planning and day-to-day execution in the field. We give our people real-time data to make faster, smarter decisions.

Location

This is an in-office role located on our La Vista, NE campus. This is not a remote opportunity and requires regular, daily attendance.

Requirements

  • 2-5 years of experience in a data engineering role, preferably in a data warehouse or cloud data platform environment.
  • Proficiency in SQL, data modeling, and performance optimization.
  • Experience with ETL/ELT tools (e.g., Informatica, Azure Data Factory, dbt) and scripting languages (e.g., Python).
  • Experience with data pipeline orchestration tools (e.g., Airflow, Prefect) and cloud platforms (AWS, Azure, GCP).
  • Proven ability to interpret complex business rules and processes, and translate them into robust data integration solutions.
  • Strong commitment to meeting deadlines and delivering high-quality solutions.
  • Excellent interpersonal and communication skills, with the ability to explain technical concepts to both technical and non-technical audiences.
  • Demonstrated ability to work collaboratively in agile, cross-functional teams.
  • Passion for continuous learning and professional development in the data engineering space.
  • Regular, reliable attendance.
  • Work productively and meet deadlines in a timely manner.
  • Communicate and interact effectively and professionally with supervisors, employees, and others, individually or in a team environment.
  • Perform work safely and effectively. Understand and follow oral and written instructions, including warning signs, equipment use, and other policies.
  • Work during normal operating hours to organize and complete work within given deadlines. Work overtime and weekends as required.
  • May work at various locations; conditions may vary.

Nice To Haves

  • Familiarity with containerization (Docker, Kubernetes) and infrastructure automation (Terraform) is a plus.

Responsibilities

  • Design, develop, and optimize data pipelines and ETL/ELT processes for ingesting, transforming, and loading data from diverse sources (databases, file systems, cloud platforms) into BI and analytics platforms (e.g., Snowflake, Azure Data Lake, SQL Server).
  • Implement data orchestration using modern tools (e.g., Airflow, Prefect, Azure Data Factory).
  • Collaborate with data scientists, analysts, business stakeholders, and cross-functional IT teams to understand data requirements and deliver solutions.
  • Contribute to data modeling (star/snowflake schema), documentation, and metadata management.
  • Ensure data quality, integrity, and compliance with relevant standards and regulations (e.g., GDPR, HIPAA).
  • Monitor, troubleshoot, and resolve production issues in a timely manner.
  • Participate in agile delivery teams, including design and code review sessions, to ensure optimal, bug-free code is deployed to production.
  • Support CI/CD processes and version control (e.g., Git) for data engineering workflows.
  • Mentor junior engineers and contribute to team knowledge sharing and best practices.
  • Stay current with emerging technologies and trends in data engineering, cloud platforms, and data governance.

Benefits

  • We offer our full-time staff employees a comprehensive benefits package that's among the best in our industry, including top-tier medical, dental, and vision plans covering eligible employees and dependents; voluntary wellness and employee assistance programs; life insurance; disability; retirement plans with matching; and generous paid time off.

What This Job Offers

Job Type

Full-time

Career Level

Mid Level

Number of Employees

5,001-10,000 employees
