Snowflake SnowPro Advanced - Data Engineer Certification Exam Syllabus

The Snowflake DEA-C01 exam preparation guide is designed to provide candidates with the necessary information about the SnowPro Advanced - Data Engineer exam. It includes an exam summary, sample questions, a practice test, and the exam objectives, along with guidance on interpreting those objectives, so that candidates can assess the types of questions they may encounter on the Snowflake Certified SnowPro Advanced - Data Engineer Certification exam.

All candidates are advised to review the DEA-C01 objectives and sample questions provided in this preparation guide. The Snowflake SnowPro Advanced - Data Engineer certification is aimed at candidates who want to build a career in the data engineering domain and demonstrate their expertise. We suggest using the practice exam listed in this guide to become familiar with the exam environment and to identify the knowledge areas that need more work before taking the actual Snowflake SnowPro Advanced - Data Engineer exam.

Snowflake DEA-C01 Exam Summary:

Exam Name: Snowflake SnowPro Advanced - Data Engineer
Exam Code: DEA-C01
Exam Price: $375 USD
Duration: 115 minutes
Number of Questions: 65
Passing Score: 750+ (scaled scoring from 0 to 1000)
Recommended Training / Books: Snowflake Data Engineering Training; SnowPro Advanced: Data Engineer Study Guide
Schedule Exam: Pearson VUE
Sample Questions: Snowflake DEA-C01 Sample Questions
Recommended Practice: Snowflake Certified SnowPro Advanced - Data Engineer Certification Practice Test

Snowflake SnowPro Advanced - Data Engineer Syllabus:

Data Movement

- Given a data set, load data into Snowflake.

  • Outline considerations for data loading
  • Define data loading features and potential impact

- Ingest data of various formats through the mechanics of Snowflake.

  • Required data formats
  • Outline Stages

- Troubleshoot data ingestion.

  • Identify causes of ingestion errors
  • Determine resolutions for ingestion errors 

- Design, build and troubleshoot continuous data pipelines.

  • Stages
  • Tasks
  • Streams
  • Snowpipe (for example, auto-ingest as compared to the REST API)
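
The stage/Snowpipe/stream/task building blocks above can be sketched as a minimal continuous pipeline. All object names (`raw_stage`, `raw_orders`, `orders`, `transform_wh`) are hypothetical:

```sql
-- Snowpipe with auto-ingest loads files as they land in the stage
CREATE PIPE raw_orders_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_orders FROM @raw_stage FILE_FORMAT = (TYPE = 'JSON');

-- A stream captures change records (CDC) on the landing table
CREATE STREAM raw_orders_stream ON TABLE raw_orders;

-- A task consumes the stream on a schedule, but only when changes exist
CREATE TASK merge_orders
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('raw_orders_stream')
AS
  INSERT INTO orders (order_id, payload)
    SELECT order_id, payload FROM raw_orders_stream;

ALTER TASK merge_orders RESUME;  -- tasks are created in a suspended state
```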

- Analyze and differentiate types of data pipelines.

  • Create User-Defined Functions (UDFs) and stored procedures including Snowpark
  • Design and use the Snowflake SQL API

- Install, configure, and use connectors to connect to Snowflake.
- Design and build data sharing solutions.

  • Implement a data share
  • Create a secure view
  • Implement row level filtering
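
A sketch of a share exposing a secure view with per-consumer row filtering; the mapping table `sharing_access` and all other names are hypothetical:

```sql
-- Secure view: definition and filtered-out rows are hidden from consumers
CREATE SECURE VIEW sales_db.public.shared_sales_v AS
  SELECT s.region, s.amount
  FROM sales_db.public.sales s
  JOIN sales_db.public.sharing_access a
    ON a.region = s.region
   AND a.consumer_account = CURRENT_ACCOUNT();  -- row-level filtering

-- Share the database, schema, and secure view with a consumer account
CREATE SHARE sales_share;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;
GRANT SELECT ON VIEW sales_db.public.shared_sales_v TO SHARE sales_share;
ALTER SHARE sales_share ADD ACCOUNTS = my_org.consumer_account;
```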

- Outline when to use External Tables and define how they work.

  • Partitioning external tables
  • Materialized views
  • Partitioned data unloading
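
These external-table topics can be sketched as follows (stage, table, and path layout are hypothetical):

```sql
-- External table over staged Parquet, partitioned by a path-derived column
CREATE EXTERNAL TABLE events_ext (
  event_date DATE    AS TO_DATE(SPLIT_PART(METADATA$FILENAME, '/', 2)),
  payload    VARIANT AS (VALUE)
)
PARTITION BY (event_date)
LOCATION = @ext_stage/events/
FILE_FORMAT = (TYPE = 'PARQUET')
AUTO_REFRESH = TRUE;

-- A materialized view over the external table can avoid rescanning files
CREATE MATERIALIZED VIEW daily_counts_mv AS
  SELECT event_date, COUNT(*) AS n FROM events_ext GROUP BY event_date;

-- Partitioned unloading writes files into value-based subdirectories
COPY INTO @ext_stage/unload/
  FROM events
  PARTITION BY ('event_date=' || TO_VARCHAR(event_date))
  FILE_FORMAT = (TYPE = 'PARQUET');
```
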
Performance Optimization

- Troubleshoot underperforming queries.

  • Identify underperforming queries
  • Outline telemetry around the operation
  • Increase efficiency
  • Identify the root cause

- Given a scenario, configure a solution for the best performance.

  • Scale out as compared to scale up
  • Virtual warehouse properties (for example, size, multi-cluster)
  • Query complexity
  • Micro-partitions and the impact of clustering
  • Materialized views
  • Search optimization service
  • Query acceleration service 
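
The options above can be sketched with warehouse and table DDL (all names hypothetical):

```sql
-- Scale up: a larger warehouse helps complex, single large queries
ALTER WAREHOUSE analytics_wh SET WAREHOUSE_SIZE = 'LARGE';

-- Scale out: multi-cluster warehouses help concurrency, not single-query speed
ALTER WAREHOUSE analytics_wh SET
  MIN_CLUSTER_COUNT = 1,
  MAX_CLUSTER_COUNT = 4,
  SCALING_POLICY = 'STANDARD';

-- Clustering key: improves micro-partition pruning on the filter column
ALTER TABLE orders CLUSTER BY (order_date);

-- Search optimization service: speeds up highly selective point lookups
ALTER TABLE orders ADD SEARCH OPTIMIZATION;

-- Query acceleration service: offloads eligible scan work for bursty queries
ALTER WAREHOUSE analytics_wh SET ENABLE_QUERY_ACCELERATION = TRUE;
```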

- Outline and use caching features.
- Monitor continuous data pipelines.

  • Snowpipe
  • Tasks
  • Streams
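
Typical monitoring queries for each pipeline component (object names hypothetical):

```sql
-- Snowpipe: current execution state and pending file count
SELECT SYSTEM$PIPE_STATUS('raw_orders_pipe');

-- Snowpipe/COPY: per-file load outcomes over the last 24 hours
SELECT * FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
  TABLE_NAME => 'raw_orders',
  START_TIME => DATEADD('hour', -24, CURRENT_TIMESTAMP())));

-- Tasks: recent runs, their states, and error messages
SELECT name, state, error_message, scheduled_time
FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY())
ORDER BY scheduled_time DESC;

-- Streams: whether unconsumed change records are waiting
SELECT SYSTEM$STREAM_HAS_DATA('raw_orders_stream');
```
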
Storage and Data Protection

- Implement data recovery features in Snowflake.

  • Time Travel
  • Fail-safe

- Outline the impact of Streams on Time Travel.
- Use System Functions to analyze Micro-partitions.

  • Clustering depth
  • Cluster keys
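
The micro-partition system functions look like this (table and key names hypothetical):

```sql
-- Average overlap depth of micro-partitions for the table's clustering key
SELECT SYSTEM$CLUSTERING_DEPTH('orders');

-- Detailed JSON report: depth histogram, partition overlap counts, notes
SELECT SYSTEM$CLUSTERING_INFORMATION('orders', '(order_date)');
```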

- Use Time Travel and Cloning to create new development environments.

  • Clone objects
  • Validate changes before promoting
  • Rollback changes
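
A sketch of the clone / validate / rollback workflow (database names and the query ID placeholder are hypothetical):

```sql
-- Zero-copy clone of production as of 30 minutes ago, as a dev environment
CREATE DATABASE dev_db CLONE prod_db AT (OFFSET => -60*30);

-- Validate: compare a change in dev against the historical state
SELECT COUNT(*) FROM dev_db.public.orders;
SELECT COUNT(*) FROM prod_db.public.orders AT (OFFSET => -60*30);

-- Rollback: restore a table to its state before a bad statement
CREATE OR REPLACE TABLE prod_db.public.orders
  CLONE prod_db.public.orders BEFORE (STATEMENT => '<query_id>');
```
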
Security

- Outline Snowflake security principles.

  • Authentication methods (Single Sign-On (SSO), key pair authentication, username/password, Multi-Factor Authentication (MFA))
  • Role-Based Access Control (RBAC)
  • Column-level security and how data masking works with RBAC to secure sensitive data

- Outline the system defined roles and when they should be applied.

  • The purpose of each of the system defined roles including best practices usage in each case
  • The primary differences between SECURITYADMIN and USERADMIN roles
  • The difference between the purpose and usage of the USERADMIN/SECURITYADMIN roles and SYSADMIN
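
The division of duties among these roles can be sketched as follows (user, custom role, and object names hypothetical):

```sql
-- USERADMIN: creates and manages users and roles
USE ROLE USERADMIN;
CREATE ROLE analyst;
CREATE USER jsmith DEFAULT_ROLE = analyst;

-- SECURITYADMIN: manages grants account-wide (holds MANAGE GRANTS)
USE ROLE SECURITYADMIN;
GRANT ROLE analyst TO USER jsmith;

-- SYSADMIN: creates and owns databases, warehouses, and other objects
USE ROLE SYSADMIN;
GRANT USAGE ON DATABASE sales_db TO ROLE analyst;
GRANT ROLE analyst TO ROLE SYSADMIN;  -- best practice: roll custom roles up to SYSADMIN
```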

- Manage Data Governance.

  • Explain the options available to support column level security including Dynamic Data Masking and External Tokenization
  • Explain the options available to support row level security using Snowflake Row Access Policies
  • Use DDL required to manage Dynamic Data Masking and Row Access Policies
  • Use methods and best practices for creating and applying masking policies on data
  • Use methods and best practices for Object Tagging
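
The governance DDL above can be sketched as follows (policy, table, mapping-table, and role names hypothetical):

```sql
-- Dynamic Data Masking: unmask only for a privileged role
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '***MASKED***' END;
ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask;

-- Row Access Policy: restrict rows via a role-to-region mapping table
CREATE ROW ACCESS POLICY region_policy AS (region STRING) RETURNS BOOLEAN ->
  EXISTS (SELECT 1 FROM region_map m
          WHERE m.role_name = CURRENT_ROLE() AND m.region = region);
ALTER TABLE sales ADD ROW ACCESS POLICY region_policy ON (region);

-- Object Tagging: attach governance metadata to objects
CREATE TAG cost_center;
ALTER TABLE customers SET TAG cost_center = 'marketing';
```
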
Data Transformation

- Define User-Defined Functions (UDFs) and outline how to use them.

  • Snowpark UDFs (for example, Java, Python, Scala)
  • Secure UDFs
  • SQL UDFs
  • JavaScript UDFs
  • User-Defined Table Functions (UDTFs)
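
Minimal examples of a few of these UDF flavors (function names hypothetical; add the SECURE keyword to hide a definition from non-owners):

```sql
-- SQL UDF: a scalar expression
CREATE FUNCTION area_sq_m(len FLOAT, wid FLOAT)
  RETURNS FLOAT
  AS 'len * wid';

-- Snowpark Python UDF defined inline in SQL
CREATE FUNCTION add_one(x INT)
  RETURNS INT
  LANGUAGE PYTHON
  RUNTIME_VERSION = '3.10'
  HANDLER = 'add_one'
AS $$
def add_one(x):
    return x + 1
$$;

-- UDTF: returns a set of rows rather than a scalar
CREATE FUNCTION split_to_rows(s STRING)
  RETURNS TABLE (item STRING)
  AS $$ SELECT value::STRING FROM TABLE(SPLIT_TO_TABLE(s, ',')) $$;
```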

- Define and create External Functions.

  • Secure external functions
  • Integration requirements

- Design, build, and leverage Stored Procedures.

  • Snowpark stored procedures (for example, Java, Python, Scala)
  • SQL Scripting stored procedures
  • JavaScript stored procedures
  • Transaction management
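
A SQL Scripting stored procedure sketch with explicit transaction management (table and procedure names hypothetical):

```sql
CREATE OR REPLACE PROCEDURE archive_old_orders(cutoff DATE)
  RETURNS STRING
  LANGUAGE SQL
AS
$$
BEGIN
  BEGIN TRANSACTION;
  INSERT INTO orders_archive SELECT * FROM orders WHERE order_date < :cutoff;
  DELETE FROM orders WHERE order_date < :cutoff;
  COMMIT;
  RETURN 'archived';
EXCEPTION
  WHEN OTHER THEN
    ROLLBACK;  -- undo the partial move if either statement fails
    RETURN 'failed: ' || SQLERRM;
END;
$$;

CALL archive_old_orders('2023-01-01');
```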

- Handle and transform semi-structured data.

  • Traverse and transform semi-structured data to structured data
  • Transform structured data to semi-structured data
  • Understand how to work with unstructured data
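
Both directions can be sketched in Snowflake SQL (table and column names hypothetical; `src` is assumed to be a VARIANT column holding JSON):

```sql
-- Semi-structured -> structured: traverse and flatten a JSON array
SELECT
  o.src:customer.id::NUMBER AS customer_id,
  item.value:sku::STRING    AS sku,
  item.value:qty::NUMBER    AS qty
FROM raw_orders o,
  LATERAL FLATTEN(input => o.src:items) item;

-- Structured -> semi-structured: build JSON documents from rows
SELECT OBJECT_CONSTRUCT(
         'customer_id', customer_id,
         'skus', ARRAY_AGG(sku)
       ) AS doc
FROM order_lines
GROUP BY customer_id;
```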

- Use Snowpark for data transformation.

  • Understand Snowpark architecture
  • Query and filter data using the Snowpark library
  • Perform data transformations using Snowpark (for example, aggregations)
  • Manipulate Snowpark DataFrames