At PGE, our work involves dreaming about, planning for, and realizing a smarter, cleaner, more enduring Oregon neighborhood. It's core to our DNA, and we haven't stopped since we started in 1888. We energize lives, strengthen communities and drive advancements in energy that promote social, economic and environmental progress. We're always on the lookout for people who are passionate about leading, and being part of, teams that advance innovative clean energy solutions that are also affordable and accessible to all.

SUMMARY

Portland General Electric Co. seeks an Application Developer to work in Portland, OR.

RESPONSIBILITIES

The Application Developer provides advanced data engineering services for complex, large-scale data systems in a fast-paced, innovative environment. This role involves designing, developing, and optimizing scalable data pipelines, ETL processes, and data warehousing solutions using AWS services and Snowflake. Responsibilities include coding, testing, debugging, and documenting complex data processing applications using Python, ensuring high data quality and integrity, and integrating diverse data sources to meet complex data requirements. The candidate will also design and implement intricate data models and algorithms using advanced technologies such as AWS Glue, AWS Lambda, and Snowflake's unique features. They will use tools like AWS CloudWatch and Snowflake's query profiler to optimize query performance and generate analytics reports.

The role requires recommending and designing innovative data architectures that ensure scalability, security, and seamless integration with existing systems. This position also involves translating conceptual data models into efficient physical designs in Snowflake, producing detailed technical documentation, and implementing data governance policies to ensure compliance with data privacy regulations using tools like Snowflake's role-based access control (RBAC) and column-level security.
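As a rough illustration of the extract-transform-load pattern this role centers on, the following is a minimal, self-contained sketch in plain Python. All names (`meter_id`, `kwh`, the sample CSV) are hypothetical, and a production pipeline would use AWS Glue or Snowflake connectors rather than the standard library; this only shows the three-stage shape and a basic data-quality gate.

```python
import csv
import io
import json

def extract(csv_text):
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalize types and drop rows failing a basic quality check."""
    out = []
    for r in rows:
        try:
            out.append({"meter_id": r["meter_id"].strip(),
                        "kwh": float(r["kwh"])})
        except (KeyError, ValueError):
            # A real pipeline would quarantine malformed rows for review.
            continue
    return out

def load(rows):
    """Load: serialize to newline-delimited JSON, a shape warehouse
    bulk-load commands (e.g. Snowflake COPY INTO) commonly accept."""
    return "\n".join(json.dumps(r) for r in rows)

# Hypothetical input: one clean row, one malformed row.
raw = "meter_id,kwh\nM-001,12.5\nM-002,not-a-number\n"
print(load(transform(extract(raw))))
```

Running the sketch keeps the valid `M-001` row and silently drops the malformed `M-002` row, which is the essence of the data-quality gating the posting describes.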
The candidate will define and manage complex data integration processes, collaborating with cross-functional teams to align data solutions with business objectives. Responsibilities also include configuring and optimizing cloud-based data environments using AWS S3 and Snowflake, creating rigorous integration test plans, and conducting thorough performance testing. In addition, the candidate will lead troubleshooting efforts for critical data issues and implement strategic solutions to prevent recurrence. This includes developing and maintaining Python-based ETL jobs for data reconciliation, implementing robust data pipelines using Snowflake tasks and AWS Step Functions, creating efficient stored procedures in Snowflake, and configuring AWS Glue jobs for data cleansing and automation. The role also involves streamlining the software development lifecycle for data solutions using Jenkins pipelines for CI/CD.
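The data-reconciliation duty mentioned above can be sketched with a small, self-contained Python example. Everything here is illustrative (the row shapes, `reconcile`, and `row_hash` are invented names, and real jobs would pull rows from Snowflake and S3 rather than literals); it only shows one common technique, hashing canonicalized rows so source and target tables can be diffed order-independently.

```python
import hashlib
import json

def row_hash(row):
    """Hash a row's canonical JSON form; sort_keys ensures key order
    never changes the fingerprint."""
    return hashlib.sha256(json.dumps(row, sort_keys=True).encode()).hexdigest()

def reconcile(source_rows, target_rows):
    """Compare two row sets and report what is missing from or
    unexpected in the target."""
    src = {row_hash(r) for r in source_rows}
    tgt = {row_hash(r) for r in target_rows}
    return {
        "missing_in_target": src - tgt,
        "extra_in_target": tgt - src,
    }

# Hypothetical source/target extracts: the target is missing one row.
source = [{"id": 1, "kwh": 10.0}, {"id": 2, "kwh": 7.5}]
target = [{"id": 1, "kwh": 10.0}]
report = reconcile(source, target)
print(len(report["missing_in_target"]), len(report["extra_in_target"]))  # 1 0
```

Hashing rows instead of comparing them field-by-field keeps the reconciliation O(n) in set lookups and works even when the two systems return rows in different orders.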
Job Type
Full-time
Career Level
Mid Level
Number of Employees
501-1,000 employees