Exam Details

  • Exam Code: PROFESSIONAL-CLOUD-ARCHITECT
  • Exam Name: Professional Cloud Architect on Google Cloud Platform
  • Certification: Google Certifications
  • Vendor: Google
  • Total Questions: 277 Q&As
  • Last Updated: May 12, 2024

Google Certifications: PROFESSIONAL-CLOUD-ARCHITECT Questions & Answers

  • Question 31:

    TerramEarth has equipped all connected trucks with servers and sensors to collect telemetry data. Next year, they want to use the data to train machine learning models. They want to store this data in the cloud while reducing costs. What should they do? (An illustrative compress-and-upload sketch follows the answer options.)

    A. Have the vehicle's computer compress the data in hourly snapshots, and store it in a Google Cloud Storage (GCS) Nearline bucket

    B. Push the telemetry data in real-time to a streaming dataflow job that compresses the data, and store it in Google BigQuery

    C. Push the telemetry data in real-time to a streaming dataflow job that compresses the data, and store it in Cloud Bigtable

    D. Have the vehicle's computer compress the data in hourly snapshots, and store it in a GCS Coldline bucket
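
    For illustration only, a minimal Python sketch of the hourly compress-and-upload pattern described in options A and D, assuming the google-cloud-storage client library; the bucket name and snapshot path are hypothetical, and the storage class (Nearline or Coldline) would be set when the bucket is created.

        import gzip
        import shutil

        from google.cloud import storage  # pip install google-cloud-storage

        # Hypothetical names used only for illustration.
        BUCKET_NAME = "terramearth-telemetry-archive"   # bucket created with Nearline or Coldline class
        SNAPSHOT_PATH = "/var/telemetry/snapshot-2024051200.csv"

        def upload_compressed_snapshot(snapshot_path: str, bucket_name: str) -> None:
            """Compress an hourly telemetry snapshot and upload it to Cloud Storage."""
            compressed_path = snapshot_path + ".gz"
            with open(snapshot_path, "rb") as src, gzip.open(compressed_path, "wb") as dst:
                shutil.copyfileobj(src, dst)

            client = storage.Client()
            bucket = client.bucket(bucket_name)
            blob = bucket.blob("snapshots/" + compressed_path.rsplit("/", 1)[-1])
            blob.upload_from_filename(compressed_path)

        if __name__ == "__main__":
            upload_compressed_snapshot(SNAPSHOT_PATH, BUCKET_NAME)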

  • Question 32:

    Which of TerramEarth's legacy enterprise processes will experience significant change as a result of increased Google Cloud Platform adoption?

    A. Opex/capex allocation, LAN changes, capacity planning

    B. Capacity planning, TCO calculations, opex/capex allocation

    C. Capacity planning, utilization measurement, data center expansion

    D. Data Center expansion, TCO calculations, utilization measurement

  • Question 33:

    To speed up data retrieval, more vehicles will be upgraded to cellular connections and be able to transmit data to the ETL process. The current FTP process is error-prone and restarts the data transfer from the start of the file when connections fail, which happens often. You want to improve the reliability of the solution and minimize data transfer time on the cellular connections.

    What should you do? (A resumable-upload sketch follows the answer options.)

    A. Use one Google Container Engine cluster of FTP servers. Save the data to a Multi-Regional bucket. Run the ETL process using data in the bucket

    B. Use multiple Google Container Engine clusters running FTP servers located in different regions. Save the data to Multi-Regional buckets in US, EU, and Asia. Run the ETL process using the data in the bucket

    C. Directly transfer the files to different Google Cloud Multi-Regional Storage bucket locations in US, EU, and Asia using Google APIs over HTTP(S). Run the ETL process using the data in the bucket

    D. Directly transfer the files to a different Google Cloud Regional Storage bucket location in US, EU, and Asia using Google APIs over HTTP(S). Run the ETL process to retrieve the data from each Regional bucket
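
    For illustration, a minimal Python sketch of the direct HTTPS transfer described in options C and D, assuming the google-cloud-storage client library, which performs chunked, resumable uploads so an interrupted transfer resumes instead of restarting from the beginning of the file; the bucket and file names are hypothetical.

        from google.cloud import storage  # pip install google-cloud-storage

        # Hypothetical names used only for illustration.
        BUCKET_NAME = "terramearth-ingest-us"
        LOCAL_FILE = "/data/vehicle-telemetry.bin"

        def upload_over_https(local_file: str, bucket_name: str) -> None:
            """Upload a telemetry file to Cloud Storage with a chunked, resumable upload."""
            client = storage.Client()
            bucket = client.bucket(bucket_name)
            # A small chunk size keeps retransmissions short on flaky cellular links;
            # the value must be a multiple of 256 KB.
            blob = bucket.blob(local_file.rsplit("/", 1)[-1], chunk_size=256 * 1024)
            blob.upload_from_filename(local_file)

        if __name__ == "__main__":
            upload_over_https(LOCAL_FILE, BUCKET_NAME)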

  • Question 34:

    TerramEarth plans to connect all 20 million vehicles in the field to the cloud. This increases the volume to 20 million 600-byte records per second, roughly 40 TB per hour. How should you design the data ingestion? (An illustrative publishing sketch follows the answer options.)

    A. Vehicles write data directly to GCS

    B. Vehicles write data directly to Google Cloud Pub/Sub

    C. Vehicles stream data directly to Google BigQuery

    D. Vehicles continue to write data using the existing system (FTP)
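
    As a point of reference, 20 million records x 600 bytes is about 12 GB per second, or roughly 40 TB per hour. The following minimal Python sketch publishes one telemetry record with the google-cloud-pubsub client library, one of the ingestion paths listed above; the project and topic names are hypothetical.

        from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

        # Hypothetical names used only for illustration.
        PROJECT_ID = "terramearth-prod"
        TOPIC_ID = "vehicle-telemetry"

        publisher = pubsub_v1.PublisherClient()
        topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)

        def publish_record(record: bytes, vehicle_id: str) -> None:
            """Publish one 600-byte telemetry record; the client batches messages internally."""
            future = publisher.publish(topic_path, record, vehicle_id=vehicle_id)
            future.result()  # blocks until acknowledged; omit on the hot path for throughput

        if __name__ == "__main__":
            publish_record(b"\x00" * 600, vehicle_id="truck-12345")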

  • Question 35:

    You analyzed TerramEarth's business requirement to reduce downtime and found that the majority of the time savings can be achieved by reducing customers' wait time for parts. You decided to focus on reducing the 3-week aggregate reporting time.

    Which modifications to the company's processes should you recommend?

    A. Migrate from CSV to binary format, migrate from FTP to SFTP transport, and develop machine learning analysis of metrics

    B. Migrate from FTP to streaming transport, migrate from CSV to binary format, and develop machine learning analysis of metrics

    C. Increase fleet cellular connectivity to 80%, migrate from FTP to streaming transport, and develop machine learning analysis of metrics

    D. Migrate from FTP to SFTP transport, develop machine learning analysis of metrics, and increase dealer local inventory by a fixed factor

  • Question 36:

    Your development team has created a structured API to retrieve vehicle data. They want to allow third parties to develop tools for dealerships that use this vehicle event data. You want to support delegated authorization against this data.

    What should you do? (A token-verification sketch follows the answer options.)

    A. Build or leverage an OAuth-compatible access control system

    B. Build SAML 2.0 SSO compatibility into your authentication system

    C. Restrict data access based on the source IP address of the partner systems

    D. Create secondary credentials for each dealer that can be given to the trusted third party
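
    For illustration, a minimal Python sketch of bearer-token verification on the vehicle-data API, in the spirit of the OAuth-style approach in option A. It assumes Google-issued tokens and the google-auth library; the audience value below is hypothetical, and a production delegated-authorization design would also check scopes and the dealer-to-partner mapping.

        import google.auth.transport.requests
        from google.oauth2 import id_token  # pip install google-auth

        # Hypothetical audience (the API's OAuth client ID or URL) used only for illustration.
        API_AUDIENCE = "https://vehicle-api.terramearth.example.com"

        def authorize_request(authorization_header: str) -> dict:
            """Verify a 'Bearer <token>' header and return the token claims, or raise ValueError."""
            if not authorization_header.startswith("Bearer "):
                raise ValueError("missing bearer token")
            token = authorization_header.split(" ", 1)[1]
            request = google.auth.transport.requests.Request()
            claims = id_token.verify_oauth2_token(token, request, audience=API_AUDIENCE)
            # Additional checks (allowed partners, granted scopes, etc.) would go here.
            return claims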

  • Question 37:

    The TerramEarth development team wants to create an API to meet the company's business requirements. You want the development team to focus their development effort on business value versus creating a custom framework.

    Which method should they use?

    A. Use Google App Engine with Google Cloud Endpoints. Focus on an API for dealers and partners

    B. Use Google App Engine with a JAX-RS Jersey Java-based framework. Focus on an API for the public

    C. Use Google App Engine with the Swagger (Open API Specification) framework. Focus on an API for the public

    D. Use Google Container Engine with a Django Python container. Focus on an API for the public

    E. Use Google Container Engine with a Tomcat container with the Swagger (Open API Specification) framework. Focus on an API for dealers and partners

  • Question 38:

    For this question, refer to the TerramEarth case study. You are building a microservice-based application for TerramEarth. The application is based on Docker containers. You want to follow Google-recommended practices to build the application continuously and store the build artifacts. What should you do? (A build-trigger sketch follows the answer options.)

    A. Configure a trigger in Cloud Build for new source changes. Invoke Cloud Build to build container images for each microservice, and tag them using the code commit hash. Push the images to the Container Registry.

    B. Configure a trigger in Cloud Build for new source changes. The trigger invokes build jobs and build container images for the microservices. Tag the images with a version number, and push them to Cloud Storage.

    C. Create a Scheduler job to check the repo every minute. For any new change, invoke Cloud Build to build container images for the microservices. Tag the images using the current timestamp, and push them to the Container Registry.

    D. Configure a trigger in Cloud Build for new source changes. Invoke Cloud Build to build one container image, and tag the image with the label 'latest.' Push the image to the Container Registry.
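
    For illustration, a minimal Python sketch of the trigger-plus-commit-hash pattern described in option A, using the google-cloud-build client library; the project, repository, and image names are hypothetical, and $PROJECT_ID / $COMMIT_SHA are Cloud Build substitution variables resolved at build time.

        from google.cloud.devtools import cloudbuild_v1  # pip install google-cloud-build

        # Hypothetical names used only for illustration.
        PROJECT_ID = "terramearth-prod"
        REPO_NAME = "vehicle-microservice"
        IMAGE = "gcr.io/$PROJECT_ID/vehicle-microservice:$COMMIT_SHA"

        def create_ci_trigger() -> None:
            """Create a trigger that builds the image on each push and tags it with the commit hash."""
            client = cloudbuild_v1.CloudBuildClient()
            trigger = cloudbuild_v1.BuildTrigger(
                name="vehicle-microservice-ci",
                trigger_template=cloudbuild_v1.RepoSource(
                    project_id=PROJECT_ID,
                    repo_name=REPO_NAME,
                    branch_name=".*",  # fire on pushes to any branch
                ),
                build=cloudbuild_v1.Build(
                    steps=[
                        cloudbuild_v1.BuildStep(
                            name="gcr.io/cloud-builders/docker",
                            args=["build", "-t", IMAGE, "."],
                        )
                    ],
                    images=[IMAGE],  # pushed to Container Registry after a successful build
                ),
            )
            client.create_build_trigger(project_id=PROJECT_ID, trigger=trigger)

        if __name__ == "__main__":
            create_ci_trigger()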

  • Question 39:

    TerramEarth's CTO wants to use the raw data from connected vehicles to help identify approximately when a vehicle in the field will have a catastrophic failure. You want to allow analysts to centrally query the vehicle data.

    Which architecture should you recommend?

    A. Option A

    B. Option B

    C. Option C

    D. Option D

  • Question 40:

    For this question, refer to the TerramEarth case study. You start to build a new application that uses a few Cloud Functions for the backend. One use case requires a Cloud Function func_display to invoke another Cloud Function func_query. You want func_query to accept invocations only from func_display. You also want to follow Google's recommended best practices. What should you do? (An authenticated-invocation sketch follows the answer options.)

    A. Create a token and pass it in as an environment variable to func_display. When invoking func_query, include the token in the request. Pass the same token to func_query and reject the invocation if the tokens are different.

    B. Make func_query 'Require authentication.' Create a unique service account and associate it with func_display. Grant the service account the invoker role for func_query. Create an ID token in func_display and include the token in the request when invoking func_query.

    C. Make func_query 'Require authentication' and only accept internal traffic. Create those two functions in the same VPC. Create an ingress firewall rule for func_query to only allow traffic from func_display.

    D. Create those two functions in the same project and VPC. Make func_query only accept internal traffic. Create an ingress firewall for func_query to only allow traffic from func_display. Also, make sure both functions use the same service account.
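
    For illustration, a minimal Python sketch of the ID-token pattern from option B: func_display runs as a dedicated service account that holds the Cloud Functions Invoker role on func_query, fetches an identity token for func_query's URL, and sends it as a bearer token. The function URL is hypothetical, and the google-auth and requests libraries are assumed.

        import requests

        import google.auth.transport.requests
        from google.oauth2 import id_token  # pip install google-auth requests

        # Hypothetical URL of the receiving function.
        FUNC_QUERY_URL = "https://us-central1-terramearth-prod.cloudfunctions.net/func_query"

        def func_display(request):
            """HTTP Cloud Function: calls func_query with an identity token minted for its URL."""
            auth_request = google.auth.transport.requests.Request()
            # Inside the Cloud Functions runtime this mints a token for the attached service account.
            token = id_token.fetch_id_token(auth_request, FUNC_QUERY_URL)
            response = requests.get(
                FUNC_QUERY_URL,
                headers={"Authorization": f"Bearer {token}"},
                timeout=10,
            )
            return response.text, response.status_code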

Tips on How to Prepare for the Exams

Certification exams are becoming increasingly important and are required by more and more employers. But how do you prepare for an exam effectively? How do you prepare in a short time and with less effort? How do you achieve an ideal result and find the most reliable resources? Here on Vcedump.com you will find all the answers. Vcedump.com provides not only Google exam questions, answers, and explanations, but also complete assistance with your exam preparation and certification application. If you are unsure about your PROFESSIONAL-CLOUD-ARCHITECT exam preparation or Google certification application, do not hesitate to visit Vcedump.com to find your solutions.