Google GCP-PCD Certification Exam Sample Questions

We have prepared Google Professional Cloud Developer (GCP-PCD) certification sample questions to make you aware of actual exam properties. This sample question set gives you information about the Professional Cloud Developer exam pattern, question format, the difficulty level of questions, and the time required to answer each question. To get familiar with the Google Cloud Platform - Professional Cloud Developer (GCP-PCD) exam, we suggest you try our Sample Google GCP-PCD Certification Practice Exam in a simulated Google certification exam environment.

To test your knowledge and understanding of concepts with real-time, scenario-based Google GCP-PCD questions, we strongly recommend that you prepare and practice with the Premium Google Professional Cloud Developer Certification Practice Exam. The premium practice exam helps you identify the topics in which you are well prepared and the topics in which you may need further training to achieve a great score in the actual Google Cloud Platform - Professional Cloud Developer (GCP-PCD) exam.

Google GCP-PCD Sample Questions:

01. You are capturing important audit activity in Stackdriver Logging. You need to read the information from Stackdriver Logging to perform real-time analysis of the logs.
You will have multiple processes performing different types of analysis on the logging data. What should you do?
a) Read the logs directly from the Stackdriver Logging API.
b) Set up a Stackdriver Logging sync to BigQuery, and read the logs from the BigQuery table.
c) Set up a Stackdriver Logging sync to Cloud Pub/Sub, and read the logs from a Cloud Pub/Sub topic.
d) Set up a Stackdriver Logging sync to Cloud Storage, and read the logs from a Cloud Storage bucket.
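The export-to-Pub/Sub pattern lets each analysis process attach its own subscription to one topic. A minimal sketch of the consumer side, assuming hypothetical project, sink, and topic names: the exported LogEntry arrives as JSON in the base64-encoded `data` field of a Pub/Sub message.

```python
import base64
import json

# A sink that exports logs to a topic can be created with, e.g.:
#   gcloud logging sinks create audit-sink \
#       pubsub.googleapis.com/projects/my-project/topics/audit-logs
# (project, sink, and topic names here are hypothetical)

def parse_log_entry(pubsub_message):
    """Decode a Pub/Sub message that carries an exported LogEntry."""
    payload = base64.b64decode(pubsub_message["data"]).decode("utf-8")
    entry = json.loads(payload)
    # Each analysis process can filter on fields such as logName or severity.
    return entry.get("logName"), entry.get("severity"), entry.get("timestamp")

# Simulate one delivered message:
sample = {
    "logName": "projects/my-project/logs/cloudaudit.googleapis.com%2Factivity",
    "severity": "NOTICE",
    "timestamp": "2020-01-01T00:00:00Z",
}
message = {"data": base64.b64encode(json.dumps(sample).encode("utf-8"))}
name, severity, ts = parse_log_entry(message)
print(severity)  # NOTICE
```

Because Pub/Sub fans out to every subscription independently, adding a new analysis process never disturbs the existing consumers.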
 
02. Your application starts on the VM as a systemd service. Your application outputs its log information to stdout.
You need to send the application logs to Stackdriver without changing the application. What should you do?
a) Review the application logs from the Compute Engine VM Instance activity logs in Stackdriver.
b) Review the application logs from the Compute Engine VM Instance data access logs in Stackdriver.
c) Install Stackdriver Logging Agent. Review the application logs from the Compute Engine VM Instance syslog logs in Stackdriver.
d) Install Stackdriver Logging Agent. Review the application logs from the Compute Engine VM Instance system event logs in Stackdriver.
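On a Compute Engine VM, installing the Logging agent looks roughly like the following sketch (using the documented install script; the agent tails syslog by default, and a systemd service's stdout is captured by the journal and forwarded to syslog, so no application change is needed):

```shell
# On the VM (sketch; requires outbound access to dl.google.com):
curl -sSO https://dl.google.com/cloudagents/add-logging-agent-repo.sh
sudo bash add-logging-agent-repo.sh --also-install

# Verify the agent (google-fluentd) is running:
sudo service google-fluentd status
```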
 
03. You have a service running on Compute Engine virtual machine instances behind a global load balancer. You need to ensure that when the instance fails, it is recovered. What should you do?
a) Set up health checks in the load balancer configuration.
b) Deploy a service to the instances to notify you when they fail.
c) Use Stackdriver alerting to trigger a workflow to reboot the instance.
d) Set up health checks in the managed instance group configuration.
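Load-balancer health checks only steer traffic away from an unhealthy instance; it is the managed instance group's autohealing policy that recreates it. A sketch with gcloud, assuming a hypothetical group "web-mig" in us-central1-a serving on port 80:

```shell
# Create a health check (names and paths are hypothetical):
gcloud compute health-checks create http web-hc \
    --port 80 --request-path /healthz

# Attach it to the managed instance group so failed instances
# are automatically recreated (autohealing):
gcloud compute instance-groups managed update web-mig \
    --zone us-central1-a \
    --health-check web-hc \
    --initial-delay 300
```

The `--initial-delay` gives new instances time to boot before autohealing starts evaluating them.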
 
04. You are building a storage layer for an analytics Hadoop cluster for your company. This cluster will run multiple jobs on a nightly basis, and you need to access the data frequently.
You want to use Cloud Storage for this purpose. Which storage option should you choose?
a) Multi-regional storage
b) Regional storage
c) Nearline storage
d) Coldline storage
 
05. You have an application that accepts inputs from users. The application needs to kick off different background tasks based on these inputs.
You want to allow for automated asynchronous execution of these tasks as soon as input is submitted by the user.
Which product should you use?
a) Cloud Tasks
b) Cloud Bigtable
c) Cloud Pub/Sub
d) Cloud Composer
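Cloud Tasks fits here because each user input maps to an explicitly created task targeting a worker. A local sketch of the HTTP-task request body in the shape the Cloud Tasks REST API expects (the worker URL and payload fields are hypothetical; Cloud Tasks requires the body to be base64-encoded):

```python
import base64
import json

def build_http_task(url, payload):
    """Build the JSON body for a Cloud Tasks HTTP task (REST API shape)."""
    return {
        "httpRequest": {
            "httpMethod": "POST",
            "url": url,  # hypothetical worker endpoint
            "headers": {"Content-Type": "application/json"},
            "body": base64.b64encode(
                json.dumps(payload).encode("utf-8")
            ).decode("ascii"),
        }
    }

task = build_http_task("https://worker.example.com/process",
                       {"action": "generate-report", "user_id": 42})
# The worker decodes the body to recover the original input:
decoded = json.loads(base64.b64decode(task["httpRequest"]["body"]))
print(decoded["action"])  # generate-report
```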
 
06. As part of their expansion, HipLocal is creating new projects in order to separate resources. They want to build a system to automate enabling of their APIs. What should they do?
a) Copy existing persistent disks to the new project.
b) Use the service management API to define a new service.
c) Use the service management API to enable the Compute API.
d) Use the service management API to enable the Cloud Storage API.
 
07. Your organization has grown, and new teams need access to manage network connectivity within and across projects. You are now seeing intermittent timeout errors in your application.
You want to find the cause of the problem. What should you do?
a) Set up wireshark on each Google Cloud Virtual Machine instance.
b) Configure VPC flow logs for each of the subnets in your VPC.
c) Review the instance admin activity logs in Stackdriver for the application instances.
d) Configure firewall rules logging for each of the firewalls in your VPC.
 
08. Your company has a successful multi-player game that has become popular in the US. Now, it wants to expand to other regions. It is launching a new feature that allows users to trade points. This feature will work for users across the globe.
Your company’s current MySQL backend is reaching the limit of the Compute Engine instance that hosts the game. Your company wants to migrate to a different database that will provide global consistency and high availability across the regions.
Which database should they choose?
a) BigQuery
b) Cloud Spanner
c) Cloud SQL
d) Cloud Bigtable
 
09. Which architecture should HipLocal use for log analysis?
a) Use Cloud Spanner to store each event.
b) Start storing key metrics in Cloud Memorystore.
c) Use Stackdriver Logging with a BigQuery sink.
d) Use Stackdriver Logging with a Cloud Storage sink.
 
10. Your company plans to expand its analytics use cases. One of the new use cases requires your data analysts to analyze events using SQL on a near real-time basis.
You expect rapid growth and want to use managed services as much as possible. What should you do?
a) Create a Cloud Pub/Sub topic and a subscription. Stream your events from the source into the Pub/Sub topic. Leverage Cloud Dataflow to ingest these events into BigQuery.
b) Create a Cloud Pub/Sub topic and a subscription. Stream your events from the source into the Pub/Sub topic. Leverage Cloud Dataflow to ingest these events into Cloud Storage.
c) Create a Kafka instance on a large Compute Engine instance. Stream your events from the source into a Kafka pipeline. Leverage Cloud Dataflow to ingest these events into Cloud Storage.
d) Create a Cloud Pub/Sub topic and a subscription. Stream your events from the source into the Pub/Sub topic. Leverage Cloud Dataflow to ingest these events into Cloud Datastore.
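The fully managed Pub/Sub → Dataflow → BigQuery path can be stood up with the Google-provided streaming template; a sketch with hypothetical topic, subscription, project, and table names:

```shell
# Create the topic and subscription (names are hypothetical):
gcloud pubsub topics create analytics-events
gcloud pubsub subscriptions create analytics-sub --topic analytics-events

# Launch the Google-provided Pub/Sub-to-BigQuery streaming template:
gcloud dataflow jobs run events-to-bq \
    --gcs-location gs://dataflow-templates/latest/PubSub_Subscription_to_BigQuery \
    --region us-central1 \
    --parameters \
inputSubscription=projects/my-project/subscriptions/analytics-sub,\
outputTableSpec=my-project:analytics.events
```

Once events land in BigQuery, analysts can query them with standard SQL within seconds of ingestion, and all three services scale without cluster management.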

Answers:

Question: 01
Answer: c
Question: 02
Answer: c
Question: 03
Answer: d
Question: 04
Answer: b
Question: 05
Answer: a
Question: 06
Answer: c
Question: 07
Answer: b
Question: 08
Answer: b
Question: 09
Answer: c
Question: 10
Answer: a

Note: Please email feedback@vmexam.com to report any errors in the Google Cloud Platform - Professional Cloud Developer (GCP-PCD) certification exam sample questions.
