Exam Details

  • Exam Code: PROFESSIONAL-CLOUD-ARCHITECT
  • Exam Name: Professional Cloud Architect on Google Cloud Platform
  • Certification: Google Certifications
  • Vendor: Google
  • Total Questions: 277 Q&As
  • Last Updated: Jun 03, 2025

Google Certifications: PROFESSIONAL-CLOUD-ARCHITECT Questions & Answers

  • Question 61:

    You have been engaged by your client to lead the migration of their application infrastructure to GCP. One of their current problems is that the on-premises high performance SAN is requiring frequent and expensive upgrades to keep up with the variety of workloads that are identified as follows: 20TB of log archives retained for legal reasons; 500 GB of VM boot/data volumes and templates; 500 GB of image thumbnails; 200 GB of customer session state data that allows customers to restart sessions even if off-line for several days.

    Which of the following best reflects your recommendations for a cost-effective storage allocation?

    A. Local SSD for customer session state data. Lifecycle-managed Cloud Storage for log archives, thumbnails, and VM boot/data volumes.

    B. Memcache backed by Cloud Datastore for the customer session state data. Lifecycle-managed Cloud Storage for log archives, thumbnails, and VM boot/data volumes.

    C. Memcache backed by Cloud SQL for customer session state data. Assorted local SSD-backed instances for VM boot/data volumes. Cloud Storage for log archives and thumbnails.

    D. Memcache backed by Persistent Disk SSD storage for customer session state data. Assorted local SSD-backed instances for VM boot/data volumes. Cloud Storage for log archives and thumbnails.
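Several of these options lean on lifecycle-managed Cloud Storage for the 20 TB of log archives. As a rough illustration, such a policy is expressed as JSON with rules that tier objects into colder storage classes; the sketch below builds one as a plain Python dict (the 30-day and 365-day thresholds are assumptions for illustration, not values from the question).

```python
# Sketch of a Cloud Storage lifecycle policy for log archives.
# The structure mirrors the documented lifecycle JSON format; the
# specific age thresholds are hypothetical.

def log_archive_lifecycle():
    """Return a lifecycle policy that tiers aging logs into colder classes."""
    return {
        "rule": [
            # After 30 days, move logs to Nearline (cheaper, rarely read).
            {"action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
             "condition": {"age": 30}},
            # After a year, move to Archive class for long-term legal retention.
            {"action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
             "condition": {"age": 365}},
        ]
    }

policy = log_archive_lifecycle()
print(policy["rule"][1]["action"]["storageClass"])  # → ARCHIVE
```

A policy like this would typically be saved to a file and applied to the bucket once, after which Cloud Storage enforces it automatically.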

  • Question 62:

    You need to develop procedures to test a disaster plan for a mission-critical application.

    You want to use Google-recommended practices and native capabilities within GCP.

    What should you do?

    A. Use Deployment Manager to automate service provisioning. Use Activity Logs to monitor and debug your tests.

    B. Use Deployment Manager to automate provisioning. Use Stackdriver to monitor and debug your tests.

    C. Use gcloud scripts to automate service provisioning. Use Activity Logs to monitor and debug your tests.

    D. Use automated scripts to automate service provisioning. Use Activity Logs to monitor and debug your tests.

  • Question 63:

    Your company plans to migrate a multi-petabyte data set to the cloud. The data set must be available 24hrs a day. Your business analysts have experience only with using a SQL interface. How should you store the data to optimize it for ease of analysis?

    A. Load data into Google BigQuery.

    B. Insert data into Google Cloud SQL.

    C. Put flat files into Google Cloud Storage.

    D. Stream data into Google Cloud Datastore.

  • Question 64:

    You need to migrate Hadoop jobs for your company's Data Science team without modifying the underlying infrastructure. You want to minimize costs and infrastructure management effort. What should you do?

    A. Create a Dataproc cluster using standard worker instances.

    B. Create a Dataproc cluster using preemptible worker instances.

    C. Manually deploy a Hadoop cluster on Compute Engine using standard instances.

    D. Manually deploy a Hadoop cluster on Compute Engine using preemptible instances.

  • Question 65:

    For this question, refer to the TerramEarth case study. You are building a microservice-based application for TerramEarth. The application is based on Docker containers. You want to follow Google-recommended practices to build the application continuously and store the build artifacts. What should you do?

    A. 1. Configure a trigger in Cloud Build for new source changes. 2. Invoke Cloud Build to build one container image, and tag the image with the label 'latest.' 3. Push the image to the Artifact Registry.

    B. 1. Configure a trigger in Cloud Build for new source changes. 2. Invoke Cloud Build to build container images for each microservice, and tag them using the code commit hash. 3. Push the images to the Artifact Registry.

    C. 1. Create a Scheduler job to check the repo every minute. 2. For any new change, invoke Cloud Build to build container images for the microservices. 3. Tag the images using the current timestamp, and push them to the Artifact Registry.

    D. 1. Configure a trigger in Cloud Build for new source changes. 2. The trigger invokes build jobs and builds container images for the microservices. 3. Tag the images with a version number, and push them to Cloud Storage.
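The tagging scheme in option B (one image per microservice, tagged with the code commit hash) can be sketched as a small helper. The region, project, and repository names below are placeholders, not values from the case study.

```python
def image_uri(project, repo, service, commit_hash, region="us-central1"):
    """Build an Artifact Registry image URI tagged with a commit hash.

    Artifact Registry Docker paths follow
    REGION-docker.pkg.dev/PROJECT/REPOSITORY/IMAGE:TAG; tagging with the
    commit hash keeps every build traceable to its source revision,
    unlike a mutable 'latest' tag.
    """
    return f"{region}-docker.pkg.dev/{project}/{repo}/{service}:{commit_hash}"

# Example: tag a hypothetical 'telemetry' microservice build with its commit.
print(image_uri("my-project", "containers", "telemetry", "a1b2c3d"))
# → us-central1-docker.pkg.dev/my-project/containers/telemetry:a1b2c3d
```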

  • Question 66:

    You are managing several projects on Google Cloud and need to interact on a daily basis with BigQuery, Bigtable, and Kubernetes Engine using the gcloud CLI tool.

    You are travelling a lot and work on different workstations during the week.

    You want to avoid having to manage the gcloud CLI manually.

    What should you do?

    A. Use a package manager to install gcloud on your workstations instead of installing it manually.

    B. Create a Compute Engine instance and install gcloud on the instance. Connect to this instance via SSH to always use the same gcloud installation when interacting with Google Cloud.

    C. Install gcloud on all of your workstations. Run the command gcloud components auto-update on each workstation.

    D. Use Google Cloud Shell in the Google Cloud Console to interact with Google Cloud.

  • Question 67:

    A small number of API requests to your microservices-based application take a very long time. You know that each request to the API can traverse many services. You want to know which service takes the longest in those cases. What should you do?

    A. Set timeouts on your application so that you can fail requests faster.

    B. Send custom metrics for each of your requests to Stackdriver Monitoring.

    C. Use Stackdriver Monitoring to look for insights that show when your API latencies are high.

    D. Instrument your application with Stackdriver Trace in order to break down the request latencies at each microservice.
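The idea behind option D is distributed tracing: time each service's share of a request so the slowest hop stands out. A minimal, library-free sketch of per-service span timing (not the actual Stackdriver/Cloud Trace API; service names and sleeps are simulated) might look like:

```python
import time
from contextlib import contextmanager

spans = []  # collected (service, duration_seconds) pairs for one request

@contextmanager
def span(service):
    """Record how long a service took, mimicking a trace span."""
    start = time.perf_counter()
    try:
        yield
    finally:
        spans.append((service, time.perf_counter() - start))

# Simulate one request traversing a frontend and two downstream services.
with span("frontend"):
    with span("auth"):
        time.sleep(0.01)
    with span("catalog"):
        time.sleep(0.03)

# The root span includes its children, so compare only the downstream spans;
# the slowest one points at the bottleneck service.
children = [s for s in spans if s[0] != "frontend"]
slowest = max(children, key=lambda s: s[1])
print(slowest[0])  # → catalog
```

A real tracing system records the same span hierarchy automatically and aggregates it across many requests, which is what makes rare slow requests diagnosable.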

  • Question 68:

    You have been asked to select the storage system for the click-data of your company's large portfolio of websites. This data is streamed in from a custom website analytics package at a typical rate of 6,000 clicks per minute, with bursts of up to 8,500 clicks per second. It must be stored for future analysis by your data science and user experience teams. Which storage infrastructure should you choose?

    A. Google Cloud SQL

    B. Google Cloud Bigtable

    C. Google Cloud Storage

    D. Google Cloud Datastore
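For sustained high-throughput writes like these, the main Bigtable design concern is the row key: keys that begin with a plain timestamp concentrate all writes on one node (hotspotting). A common mitigation, sketched below with hypothetical field names, prefixes the key with a value that spreads writes across the keyspace:

```python
import hashlib

def click_row_key(site_id, timestamp_ms):
    """Build a Bigtable row key for a click event (illustrative schema).

    Prefixing with a short hash of the site ID spreads sequential
    timestamps across the keyspace, avoiding a single hot tablet,
    while keeping all clicks for one site contiguous and scannable
    by time range.
    """
    prefix = hashlib.sha1(site_id.encode()).hexdigest()[:4]
    return f"{prefix}#{site_id}#{timestamp_ms}"

print(click_row_key("news.example.com", 1700000000000))
```

The separator and prefix length here are arbitrary choices; the principle is simply that the leading bytes of the key should not be monotonically increasing.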

  • Question 69:

    Your web application must comply with the requirements of the European Union's General Data Protection Regulation (GDPR). You are responsible for the technical architecture of your web application. What should you do?

    A. Ensure that your web application only uses native features and services of Google Cloud Platform, because Google already has various certifications and provides "pass-on" compliance when you use native features.

    B. Enable the relevant GDPR compliance setting within the GCP Console for each of the services in use within your application.

    C. Ensure that Cloud Security Scanner is part of your test planning strategy in order to pick up any compliance gaps.

    D. Define a design for the security of data in your web application that meets GDPR requirements.

  • Question 70:

    Your company acquired a healthcare startup and must retain its customers' medical information for up to 4 more years, depending on when it was created. Your corporate policy is to securely retain this data, and then delete it as soon as regulations allow.

    Which approach should you take?

    A. Store the data in Google Drive and manually delete records as they expire.

    B. Anonymize the data using the Cloud Data Loss Prevention API and store it indefinitely.

    C. Store the data using the Cloud Storage and use lifecycle management to delete files when they expire.

    D. Store the data in Cloud Storage and run a nightly batch script that deletes all expired data.
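Option C's lifecycle approach can be expressed declaratively rather than with a scheduled script. A delete rule for records that have aged past the retention window might look like the sketch below (four years is approximated as 4 × 365 days here; the exact cutoff would depend on the regulation and on leap years).

```python
RETENTION_DAYS = 4 * 365  # assumed four-year retention window

def retention_delete_rule(days=RETENTION_DAYS):
    """Cloud Storage lifecycle rule deleting objects older than `days`.

    Once applied to the bucket, Cloud Storage enforces the deletion
    automatically, with no batch jobs to run or monitor.
    """
    return {
        "rule": [
            {"action": {"type": "Delete"},
             "condition": {"age": days}},
        ]
    }

print(retention_delete_rule()["rule"][0]["condition"]["age"])  # → 1460
```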
