Google GCP-PDE Certification Exam Sample Questions

We have prepared Google Professional Data Engineer (GCP-PDE) certification sample questions to make you aware of actual exam properties. This sample question set provides you with information about the Professional Data Engineer exam pattern, question format, the difficulty level of questions, and the time required to answer each question. To get familiar with the Google Cloud Platform - Professional Data Engineer (GCP-PDE) exam, we suggest you try our Sample Google GCP-PDE Certification Practice Exam in a simulated Google certification exam environment.

To test your knowledge and understanding of concepts with real-time, scenario-based Google GCP-PDE questions, we strongly recommend that you prepare and practice with the Premium Google Professional Data Engineer Certification Practice Exam. The premium practice exam helps you identify the topics in which you are well prepared and the topics in which you may need further training to achieve a great score in the actual Google Cloud Platform - Professional Data Engineer (GCP-PDE) exam.

Google GCP-PDE Sample Questions:

01. Several years ago, you built a machine learning model for an ecommerce company. Your model made good predictions. Then a global pandemic occurred, lockdowns were imposed, and many people started working from home. Now the quality of your model has degraded.
You want to improve the quality of your model and prevent future performance degradation. What should you do?
a) Retrain the model with data from the first 30 days of the lockdown.
b) Monitor data until usage patterns normalize, and then retrain the model.
c) Retrain the model with data from the last 30 days. After one year, return to the older model.
d) Retrain the model with data from the last 30 days. Add a step to continuously monitor model input data for changes, and retrain the model.
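For reference, the continuous monitoring step described in option (d) can be sketched as a simple input-drift check. The feature values, baseline, and threshold below are hypothetical, not part of any Google Cloud API:

```python
# Minimal sketch of input-data drift monitoring (hypothetical threshold).
# Compare the mean of a model input feature in recent data against a
# training-time baseline; a large relative shift signals retraining.

def needs_retraining(baseline_mean, recent_values, threshold=0.25):
    """Return True if the recent feature mean drifts beyond the threshold."""
    recent_mean = sum(recent_values) / len(recent_values)
    relative_shift = abs(recent_mean - baseline_mean) / abs(baseline_mean)
    return relative_shift > threshold

# Example: pre-lockdown baseline vs. work-from-home usage pattern.
print(needs_retraining(10.0, [16.0, 18.0, 17.0]))  # large shift -> True
print(needs_retraining(10.0, [10.5, 9.8, 10.2]))   # small shift -> False
```

In production this check would run on a schedule against fresh input data, with retraining triggered automatically when drift is detected.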
 
02. You used a small amount of data to build a machine learning model that gives you good inferences during testing. However, the results show more errors when real-world data is used to run the model. No additional data can be collected for testing. You want to get a more accurate view of the model's capability.
What should you do?
a) Reduce the amount of data to improve the model.
b) Cross-validate the data, and re-run the model building process.
c) Create feature crosses that will add new columns to increase the data.
d) Duplicate the data twice to increase the data, and re-run the model building process.
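The cross-validation idea behind option (b) can be sketched in pure Python: every sample serves in a validation fold exactly once, which gives a more honest estimate of model quality when no extra data can be collected. The fold counts below are illustrative:

```python
# Minimal sketch of k-fold cross-validation index splitting (pure Python).
# In practice a library routine (e.g. scikit-learn's KFold) would be used.

def k_fold_indices(n_samples, k):
    """Split range(n_samples) into k contiguous (train, validation) folds."""
    fold_size = n_samples // k
    folds = []
    for i in range(k):
        start = i * fold_size
        stop = (i + 1) * fold_size if i < k - 1 else n_samples
        validation = list(range(start, stop))
        train = [j for j in range(n_samples) if j < start or j >= stop]
        folds.append((train, validation))
    return folds

for train_idx, val_idx in k_fold_indices(10, 5):
    print(len(train_idx), len(val_idx))  # each fold: 8 train, 2 validation
```

The model is rebuilt k times, once per fold, and the k validation scores are averaged to estimate real-world performance.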
 
03. Your company is hiring several business analysts who are new to BigQuery. The analysts will use BigQuery to analyze large quantities of data. You need to control costs in BigQuery and ensure that there is no budget overrun while you maintain the quality of query results.
What should you do?
a) Set a customized project-level or user-level daily quota to acceptable values.
b) Reduce the data in the BigQuery table so that the analysts query less data, and then archive the remaining data.
c) Train the analysts to use the query validator or --dry_run to estimate costs so that the analysts can self-regulate usage.
d) Export the BigQuery daily costs, and visualize the data on Looker on a per-analyst basis so that the analysts can self-regulate usage.
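The cost estimation mentioned in option (c) works because a dry run (via the query validator or `bq query --dry_run`) reports bytes scanned without running the query. Converting that figure to dollars is simple arithmetic; the price below assumes on-demand pricing of $6.25 per TiB, which varies by region and pricing edition:

```python
# Estimate on-demand BigQuery query cost from a dry run's bytes-processed
# figure. Pricing assumption: $6.25 per TiB scanned (varies by region).

PRICE_PER_TIB_USD = 6.25
TIB = 1024 ** 4  # bytes in one tebibyte

def estimated_cost_usd(bytes_processed):
    """Convert a dry-run bytes-scanned estimate into an on-demand cost."""
    return bytes_processed * PRICE_PER_TIB_USD / TIB

print(round(estimated_cost_usd(2 * TIB), 2))  # a 2 TiB scan -> 12.5
```

Note that dry runs help analysts self-regulate, but only quotas (option a) guarantee there is no budget overrun.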
 
04. A new member of your development team works remotely. The developer will write code locally on their laptop, which will connect to a MySQL instance on Cloud SQL. The instance has an external (public) IP address. You want to follow Google-recommended practices when you give access to Cloud SQL to the new team member.
What should you do?
a) Ask the developer for their laptop's IP address, and add it to the authorized networks list.
b) Remove the external IP address, and replace it with an internal IP address. Add only the IP address for the remote developer's laptop to the authorized list.
c) Give instance access permissions in Identity and Access Management (IAM), and have the developer run Cloud SQL Auth proxy to connect to a MySQL instance.
d) Give instance access permissions in Identity and Access Management (IAM), change the access to "private service access" for security, and allow the developer to access Cloud SQL from their laptop.
 
05. You are working on optimizing BigQuery for a query that is run repeatedly on a single table. The data queried is about 1 GB, and some rows are expected to change about 10 times every hour. You have optimized the SQL statements as much as possible. You want to further optimize the query's performance.
What should you do?
a) Create a materialized view based on the table, and query that view.
b) Enable caching of the queried data so that subsequent queries are faster.
c) Create a scheduled query, and run it a few minutes before the report has to be created.
d) Reserve a larger number of slots in advance so that you have maximum compute power to execute the query.
 
06. Your company collects data about customers to regularly check their health vitals. You have millions of customers around the world. Data is ingested at an average rate of two events per 10 seconds per user. You need to be able to visualize data in Bigtable on a per-user basis.
You need to construct the Bigtable key so that the operations are performant. What should you do?
a) Construct the key as timestamp#device-id#activity-id#user-id.
b) Construct the key as timestamp#user-id#device-id#activity-id.
c) Construct the key as user-id#device-id#activity-id#timestamp.
d) Construct the key as user-id#timestamp#device-id#activity-id.
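The key layout in option (c) can be sketched as simple string concatenation: placing user-id first keeps each user's rows contiguous for per-user scans, while the timestamp last orders rows within a user. The field values and timestamp format below are hypothetical:

```python
# Sketch of building the Bigtable row key from option (c):
# user-id#device-id#activity-id#timestamp. User-id first groups each
# user's rows together; timestamp last orders rows within that group.

def build_row_key(user_id, device_id, activity_id, timestamp):
    return f"{user_id}#{device_id}#{activity_id}#{timestamp}"

key = build_row_key("user42", "watch-7", "heartrate", "2023-05-01T12:00:00Z")
print(key)  # user42#watch-7#heartrate#2023-05-01T12:00:00Z
```

Starting the key with a high-cardinality field such as user-id also spreads writes across nodes, avoiding the hotspotting that a timestamp-first key would cause.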
 
07. Your customer uses Hadoop and Spark to run data analytics on-premises. The main data is stored on hard disks that are centrally accessed. Your customer needs to migrate their workloads to Google Cloud efficiently while considering scalability. You want to select an architecture that requires minimal effort.
What should you do?
a) Use Dataproc to run Hadoop and Spark jobs. Move the data to Cloud Storage.
b) Use Dataflow to recreate the jobs in a serverless approach. Move the data to Cloud Storage.
c) Use Dataproc to run Hadoop and Spark jobs. Retain the data on a Compute Engine VM with an attached persistent disk.
d) Use Dataflow to recreate the jobs in a serverless approach. Retain the data on a Compute Engine VM with an attached persistent disk.
 
08. You are building the trading platform for a stock exchange with millions of traders. Trading data is written rapidly. You need to retrieve data quickly to show visualizations to the traders, such as the changing price of a particular stock over time. You need to choose a storage solution in Google Cloud.
What should you do?
a) Use Memorystore.
b) Use Firestore.
c) Use Cloud SQL.
d) Use Bigtable.
 
09. Your Bigtable database was recently deployed into production. The scale of data ingested and analyzed has increased significantly, but the performance has degraded. You want to identify the performance issue. What should you do?
a) Use Key Visualizer to analyze performance.
b) Use Cloud Trace to identify the performance issue.
c) Add logging statements into the code to see which inserts cause the delay.
d) Add more nodes to the cluster to see if that resolves the performance issue.
 
10. Your cryptocurrency trading company visualizes prices to help your customers make trading decisions. Because different trades happen in real time, the price data is fed to a data pipeline that uses Dataflow for processing. You want to compute moving averages. What should you do?
a) Use session windows in Dataflow.
b) Use hopping windows in Dataflow.
c) Use tumbling windows in Dataflow.
d) Use Dataflow SQL, and compute averages grouped by time.
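The hopping (sliding) windows in option (b) produce overlapping windows, which is what a moving average needs. The per-window logic can be sketched in pure Python; the window size and hop interval below are hypothetical, and in Dataflow this would be expressed with sliding windows over a timestamped stream:

```python
# Sketch of a hopping (sliding) window moving average: windows of
# `window_size` elements that advance by `hop`, so they overlap.

def hopping_averages(values, window_size, hop):
    """Average each window of window_size elements, advancing by hop."""
    averages = []
    for start in range(0, len(values) - window_size + 1, hop):
        window = values[start:start + window_size]
        averages.append(sum(window) / window_size)
    return averages

prices = [10, 12, 11, 13, 15, 14]
print(hopping_averages(prices, window_size=4, hop=2))  # [11.5, 13.25]
```

Tumbling windows (option c) do not overlap, so each price would contribute to only one average, which is why hopping windows are the better fit here.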

Answers:

Question: 01
Answer: d
Question: 02
Answer: b
Question: 03
Answer: a
Question: 04
Answer: c
Question: 05
Answer: a
Question: 06
Answer: c
Question: 07
Answer: a
Question: 08
Answer: d
Question: 09
Answer: a
Question: 10
Answer: b

Note: Please email feedback@vmexam.com to report any errors in the Google Cloud Platform - Professional Data Engineer (GCP-PDE) certification exam sample questions.

Rating: 4.9 / 5 (83 votes)