Exam Details

  • Exam Code: PROFESSIONAL-CLOUD-DEVELOPER
  • Exam Name: Professional Cloud Developer
  • Certification: Google Certifications
  • Vendor: Google
  • Total Questions: 254 Q&As
  • Last Updated:

Google PROFESSIONAL-CLOUD-DEVELOPER Questions & Answers

  • Question 31:

    Your team is developing an application in Google Cloud that executes with user identities maintained by Cloud Identity. Each of your application's users will have an associated Pub/Sub topic to which messages are published, and a Pub/Sub subscription where the same user will retrieve published messages. You need to ensure that only authorized users can publish and subscribe to their own specific Pub/Sub topic and subscription. What should you do?

    A. Bind the user identity to the pubsub.publisher and pubsub.subscriber roles at the resource level.

    B. Grant the user identity the pubsub.publisher and pubsub.subscriber roles at the project level.

    C. Grant the user identity a custom role that contains the pubsub.topics.create and pubsub.subscriptions.create permissions.

    D. Configure the application to run as a service account that has the pubsub.publisher and pubsub.subscriber roles.
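
For context on the scoping choice in the options above: an IAM role can be granted at the project level (where it applies to every topic and subscription in the project) or at the resource level (where it applies to one specific topic or subscription). The sketch below shows what per-user, resource-level bindings look like in the standard IAM policy shape; the user and resource names are hypothetical, and it builds plain dicts rather than calling the Pub/Sub API:

```python
# Resource-level IAM sketch: publisher bound on the user's own topic,
# subscriber bound on the user's own subscription (names hypothetical).
def resource_policies(user_email: str) -> dict:
    member = f"user:{user_email}"
    return {
        "topic": {"bindings": [
            {"role": "roles/pubsub.publisher", "members": [member]},
        ]},
        "subscription": {"bindings": [
            {"role": "roles/pubsub.subscriber", "members": [member]},
        ]},
    }

policies = resource_policies("alice@example.com")
print(policies["topic"]["bindings"][0]["role"])  # roles/pubsub.publisher
```

Because each binding is attached to a single topic or subscription, no user's grant spills over to any other user's resources, which is the distinction the question turns on.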

  • Question 32:

    You are using Cloud Build to promote a Docker image to Development, Test, and Production environments. You need to ensure that the same Docker image is deployed to each of these environments. How should you identify the Docker image in your build?

    A. Use the latest Docker image tag.

    B. Use a unique Docker image name.

    C. Use the digest of the Docker image.

    D. Use a semantic version Docker image tag.
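
Background on the options above: a tag (`latest`, `v1.2.0`) is a mutable pointer that can be repointed at different image bytes, while a digest is derived from the image content itself, so identical content always yields the identical digest. A minimal stdlib sketch of that property, using placeholder manifest bytes rather than a real image:

```python
import hashlib

def image_digest(manifest_bytes: bytes) -> str:
    # A registry digest is the SHA-256 of the image manifest:
    # same bytes in, same digest out, every time.
    return "sha256:" + hashlib.sha256(manifest_bytes).hexdigest()

manifest = b'{"layers": ["layer1", "layer2"]}'  # placeholder, not a real manifest
d1 = image_digest(manifest)
d2 = image_digest(manifest)
assert d1 == d2  # deterministic, so a digest reference is immutable

# Pinning an image by digest (repository name is hypothetical):
pinned = f"gcr.io/my-project/my-app@{d1}"
print(pinned)
```

Referencing an image as `repo@sha256:...` therefore guarantees every environment pulls byte-identical content, which a reusable tag cannot.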

  • Question 33:

    You are developing a web application that will be accessible over both HTTP and HTTPS and will run on Compute Engine instances. On occasion, you will need to SSH from your remote laptop into one of the Compute Engine instances to conduct maintenance on the app. How should you configure the instances while following Google-recommended best practices?

    A. Set up a backend with Compute Engine web server instances with a private IP address behind a TCP proxy load balancer.

    B. Configure the firewall rules to allow all ingress traffic to connect to the Compute Engine web servers, with each server having a unique external IP address.

    C. Configure the Cloud Identity-Aware Proxy API for SSH access. Then configure the Compute Engine servers with private IP addresses behind an HTTP(S) load balancer for the application web traffic.

    D. Set up a backend with Compute Engine web server instances with a private IP address behind an HTTP(S) load balancer. Set up a bastion host with a public IP address and open firewall ports. Connect to the web instances using the bastion host.

  • Question 34:

    You are using Cloud Build to create a new Docker image on each source code commit to a Cloud Source Repositories repository. Your application is built on every commit to the master branch. You want to release specific commits made to the master branch using an automated method. What should you do?

    A. Manually trigger the build for new releases.

    B. Create a build trigger on a Git tag pattern. Use a Git tag convention for new releases.

    C. Create a build trigger on a Git branch name pattern. Use a Git branch naming convention for new releases.

    D. Commit your source code to a second Cloud Source Repositories repository with a second Cloud Build trigger. Use this repository for new releases only.
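
On trigger patterns: Cloud Build triggers can match Git tags against a regular expression, so adopting a tag convention such as `v<major>.<minor>.<patch>` lets a trigger fire only for release commits. A small sketch of such a pattern; the convention itself is an illustrative assumption, not something Cloud Build mandates:

```python
import re

# Hypothetical release-tag convention: v1.2.3
TAG_PATTERN = re.compile(r"^v\d+\.\d+\.\d+$")

def is_release_tag(tag: str) -> bool:
    # A tag-based build trigger configured with this regex would
    # fire only for tags that follow the release convention.
    return bool(TAG_PATTERN.match(tag))

print(is_release_tag("v1.4.0"))     # True: matches the convention
print(is_release_tag("feature-x"))  # False: ordinary branch-style name
```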

  • Question 35:

    You have an HTTP Cloud Function that is called via POST. Each submission's request body has a flat, unnested JSON structure containing numeric and text data. After the Cloud Function completes, the collected data should be immediately available for ongoing and complex analytics by many users in parallel. How should you persist the submissions?

    A. Directly persist each POST request's JSON data into Datastore.

    B. Transform the POST request's JSON data, and stream it into BigQuery.

    C. Transform the POST request's JSON data, and store it in a regional Cloud SQL cluster.

    D. Persist each POST request's JSON data as an individual file within Cloud Storage, with the file name containing the request identifier.
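
On the "transform and stream" option: a flat, unnested JSON body maps naturally onto a single analytics row, typically after light transformation such as adding an ingestion timestamp. A stdlib-only sketch of that transformation step; the field names are illustrative assumptions, and a real Cloud Function would hand the resulting row to a streaming-insert API rather than just returning it:

```python
import json
from datetime import datetime, timezone

def to_row(request_body: str) -> dict:
    """Turn a flat JSON submission into a row-shaped dict."""
    data = json.loads(request_body)
    row = dict(data)  # flat and unnested, so fields copy over as-is
    # Add an ingestion timestamp so analysts can write windowed queries.
    row["ingested_at"] = datetime.now(timezone.utc).isoformat()
    return row

row = to_row('{"user_id": "u123", "score": 42, "note": "ok"}')
print(sorted(row.keys()))
```

Streaming rows like this makes each submission queryable almost immediately, which suits the "ongoing and complex analytics by many users in parallel" requirement.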

  • Question 36:

    You recently developed a new service on Cloud Run. The new service authenticates using a custom service and then writes transactional information to a Cloud Spanner database. You need to verify that your application can support up to 5,000 read and 1,000 write transactions per second while identifying any bottlenecks that occur. Your test infrastructure must be able to autoscale. What should you do?

    A. Build a test harness to generate requests and deploy it to Cloud Run. Analyze the VPC Flow Logs using Cloud Logging.

    B. Create a Google Kubernetes Engine cluster running the Locust or JMeter images to dynamically generate load tests. Analyze the results using Cloud Trace.

    C. Create a Cloud Task to generate a test load. Use Cloud Scheduler to run 60,000 Cloud Task transactions per minute for 10 minutes. Analyze the results using Cloud Monitoring.

    D. Create a Compute Engine instance that uses a LAMP stack image from the Marketplace, and use Apache Bench to generate load tests against the service. Analyze the results using Cloud Trace.
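
On load generation: tools such as Locust and JMeter work by fanning many concurrent workers out against the target service and collecting per-request latencies. A toy stdlib sketch of that fan-out pattern, with a stub in place of the real HTTP call; the worker count and request total are illustrative, not tuned to the 5,000/1,000 TPS targets:

```python
import concurrent.futures
import time

def fake_request(i: int) -> float:
    # Stand-in for an HTTP call to the service under test.
    start = time.perf_counter()
    # ... a real load tool would issue the request here ...
    return time.perf_counter() - start

# Fan 200 "requests" out across 20 workers, as a load tool would.
with concurrent.futures.ThreadPoolExecutor(max_workers=20) as pool:
    latencies = list(pool.map(fake_request, range(200)))

print(len(latencies))  # 200 latency samples to inspect for bottlenecks
```

Running workers like these on an autoscaling cluster is what lets the test harness itself scale with the load it generates.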

  • Question 37:

    You need to migrate a standalone Java application running in an on-premises Linux virtual machine (VM) to Google Cloud in a cost-effective manner. You decide not to take the lift-and-shift approach, and instead you plan to modernize the application by converting it to a container. How should you accomplish this task?

    A. Use Migrate for Anthos to migrate the VM to your Google Kubernetes Engine (GKE) cluster as a container.

    B. Export the VM as a raw disk and import it as an image. Create a Compute Engine instance from the imported image.

    C. Use Migrate for Compute Engine to migrate the VM to a Compute Engine instance, and use Cloud Build to convert it to a container.

    D. Use Jib to build a Docker image from your source code, and upload it to Artifact Registry. Deploy the application in a GKE cluster, and test the application.

  • Question 38:

    You are developing an application that will allow clients to download a file from your website for a specific period of time. How should you design the application to complete this task while following Google-recommended best practices?

    A. Configure the application to send the file to the client as an email attachment.

    B. Generate and assign a Cloud Storage-signed URL for the file. Make the URL available for the client to download.

    C. Create a temporary Cloud Storage bucket with time expiration specified, and give download permissions to the bucket. Copy the file, and send it to the client.

    D. Generate the HTTP cookies with time expiration specified. If the time is valid, copy the file from the Cloud Storage bucket, and make the file available for the client to download.
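
On signed URLs: a Cloud Storage signed URL embeds an expiration time and a cryptographic signature, so the link authorizes the download only until the window passes and cannot be tampered with. A simplified stdlib sketch of the idea; the key, URL layout, and parameter names are illustrative, and real signed URLs use Cloud Storage's V4 signing process with a service account key:

```python
import hashlib
import hmac

SECRET = b"demo-signing-key"  # stand-in for a real signing key

def sign_url(path: str, ttl_seconds: int, now: int) -> str:
    expires = now + ttl_seconds
    payload = f"{path}?expires={expires}".encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return f"{path}?expires={expires}&sig={sig}"

def is_valid(url: str, now: int) -> bool:
    path, _, query = url.partition("?")
    params = dict(p.split("=") for p in query.split("&"))
    payload = f"{path}?expires={params['expires']}".encode()
    good_sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(good_sig, params["sig"]) and now < int(params["expires"])

t0 = 1_700_000_000
url = sign_url("/bucket/report.pdf", ttl_seconds=600, now=t0)
print(is_valid(url, now=t0 + 60))   # True: inside the window
print(is_valid(url, now=t0 + 601))  # False: expired
```

Because the expiry is part of the signed payload, a client cannot extend its own access by editing the URL.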

  • Question 39:

    You are developing an application hosted on Google Cloud that uses a MySQL relational database schema. The application will have a large volume of reads and writes to the database and will require backups and ongoing capacity planning. Your team does not have time to fully manage the database but can take on small administrative tasks. How should you host the database?

    A. Configure Cloud SQL to host the database, and import the schema into Cloud SQL.

    B. Deploy MySQL from the Google Cloud Marketplace, connect to the database using a client, and import the schema.

    C. Configure Bigtable to host the database, and import the data into Bigtable.

    D. Configure Cloud Spanner to host the database, and import the schema into Cloud Spanner.

    E. Configure Firestore to host the database, and import the data into Firestore.

  • Question 40:

    You have a mixture of packaged and internally developed applications hosted on a Compute Engine instance that is running Linux. These applications write log records as text in local files. You want the logs to be written to Cloud Logging. What should you do?

    A. Pipe the content of the files to the Linux Syslog daemon.

    B. Install a Google version of fluentd on the Compute Engine instance.

    C. Install a Google version of collectd on the Compute Engine instance.

    D. Using cron, schedule a job to copy the log files to Cloud Storage once a day.
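
On the agent-based option: Google's logging agent for Compute Engine (historically a fluentd-based agent, since succeeded by the Ops Agent) tails local log files and forwards the records to Cloud Logging, and structured one-line JSON records carry severity and metadata through that pipeline cleanly. A stdlib sketch of emitting such a line; the field names follow common structured-logging practice rather than a fixed contract:

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    def format(self, record: logging.LogRecord) -> str:
        # One JSON object per line: easy for a log agent to tail and parse.
        return json.dumps({
            "severity": record.levelname,
            "message": record.getMessage(),
            "logger": record.name,
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
log = logging.getLogger("app")
log.addHandler(handler)
log.setLevel(logging.INFO)
log.info("payment processed")  # emits one JSON line for the agent to pick up
```

The applications keep writing to local files exactly as they do today; the agent handles shipping, which is why no application change or cron-based copying is needed.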
