The Venetian Resort Las Vegas - posted 3 months ago
Las Vegas, NV
5,001-10,000 employees
Accommodation

The primary responsibility of the Data Engineer II - Enterprise Analytics is assisting in designing, developing, and deploying data-driven solutions as part of the Enterprise Analytics data strategy and goals. The Data Engineer II - Enterprise Analytics is responsible for creating reliable ETLs and scalable data pipelines to support the Analytics and BI environment (including modeling and machine learning, visualizations, reports, forecasts, etc.). The Data Engineer II - Enterprise Analytics participates in the development of robust data models by interpreting the business logic required to turn complex ideas into sustainable value-add processes. All duties are to be performed in accordance with departmental and The Venetian Resort's policies, practices, and procedures.

  • Collaborate with Enterprise Analytics BI Analysts, Data Scientists, and other business stakeholders to understand business problems and data requirements, and build data structures to be ingested by analytics products (e.g., reports, dashboards) and complex algorithms that provide unique insights into data.
  • Build data pipelines that clean, transform, and aggregate data from disparate sources.
  • Develop robust data models, including dimensional models, that can be used to answer questions for the overall business and assist Data Scientists in building predictive models.
  • Develop logic for KPIs as requested by the business leadership.
  • Troubleshoot existing and create new ETLs and pipelines, SSIS packages, DAGs, and Python/BigQuery/SQL stored procedures and jobs.
  • Write efficient and optimized SQL code for use in data pipelines and data processing.
  • Drive data quality processes like data profiling, data cleansing, etc.
  • Develop best practices and approaches to support continuous process automation for data ingestion and data pipelines.
  • Use innovative problem solving and critical thinking approaches to troubleshoot challenging data obstacles.
  • Test, optimize, troubleshoot, and fine-tune queries for maximum efficiency in addition to accuracy of results.
  • Perform QA and UAT processes to foster an agile development cycle.
  • Participate in informal reviews of design, code, QA and UAT artifacts, both for owned work and for the work of colleagues. Provide challenging and meaningful feedback when appropriate.
  • Create documentation on table design, mapping out steps and underlying logic within data marts to facilitate data adoption with minimal guidance from Enterprise Analytics management.
  • Identify opportunities for improvement not just in owned work, but also in other areas of the department.
  • Safety is an essential function of this job.
  • Consistent and regular attendance is an essential function of this job.
  • Performs other related duties as assigned.
  • Must be 21 years of age or older.
  • Proof of authorization/eligibility to work in the United States.
  • Bachelor's degree in computer science, information systems, engineering, analytics, or related field is required; Master's degree preferred.
  • Must be able to obtain and maintain a Nevada Gaming Control Board registration and any other certification or license, as required by law or policy.
  • 3+ years of experience in building data pipelines and ETL processes is required.
  • 3+ years of experience in writing advanced SQL, data mining, and working with traditional relational databases (tables, views, window functions, scalar and aggregate functions, primary/foreign keys, indexes, DML/DDL statements, joins and unions) and/or distributed systems (Hadoop, BigQuery) is required.
  • 1+ years of experience with programming/scripting languages such as Python or BigQuery scripting is required.
  • Hands-on experience using Git and working in a CI/CD development environment is preferred.
  • Excellent understanding of data types, data structures, and database systems and their specific use cases is required.
  • Experience with Microsoft Azure, Google Cloud Platform, Databricks, or other cloud-based development environments is required.
  • Ability to perform root cause analysis on external and internal processes and data to identify opportunities for improvement and answer questions.
  • Strong understanding of data modeling principles, including dimensional modeling and data normalization, is required.
  • Strong understanding of performance tuning, especially in cloud-based environments, is preferred.
  • Excellent critical thinking and effective problem-solving skills with creative solutions.
  • Strong communication skills, especially for explaining technical concepts to nontechnical business leaders.