Exam Details

  • Exam Code: ASSOCIATE-CLOUD-ENGINEER
  • Exam Name: Associate Cloud Engineer
  • Certification: Google Certifications
  • Vendor: Google
  • Total Questions: 363 Q&As
  • Last Updated: May 08, 2024

Google Certifications: ASSOCIATE-CLOUD-ENGINEER Questions & Answers

  • Question 11:

    You are building an application that will run in your data center. The application will use Google Cloud Platform (GCP) services like AutoML. You created a service account that has appropriate access to AutoML. You need to enable authentication to the APIs from your on-premises environment. What should you do?

    A. Use service account credentials in your on-premises application.

    B. Use gcloud to create a key file for the service account that has appropriate permissions.

    C. Set up direct interconnect between your data center and Google Cloud Platform to enable authentication for your on-premises applications.

    D. Go to the IAM and admin console, grant a user account permissions similar to the service account permissions, and use this user account for authentication from your data center.
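
    A note on mechanics: the key-file workflow that option B describes generally boils down to creating a key for the service account and pointing the on-premises application at it. A minimal sketch; the service account address, project, and path are placeholders:

    # Create a JSON key for the existing service account (names are hypothetical).
    gcloud iam service-accounts keys create ~/automl-key.json \
        --iam-account=automl-sa@my-project.iam.gserviceaccount.com

    # On the on-premises host, point Google client libraries at the key file.
    export GOOGLE_APPLICATION_CREDENTIALS=~/automl-key.json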

  • Question 12:

    You are using Deployment Manager to create a Google Kubernetes Engine cluster. Using the same Deployment Manager deployment, you also want to create a DaemonSet in the kube-system namespace of the cluster. You want a solution that uses the fewest possible services. What should you do?

    A. Add the cluster's API as a new Type Provider in Deployment Manager, and use the new type to create the DaemonSet.

    B. Use the Deployment Manager Runtime Configurator to create a new Config resource that contains the DaemonSet definition.

    C. With Deployment Manager, create a Compute Engine instance with a startup script that uses kubectl to create the DaemonSet.

    D. In the cluster's definition in Deployment Manager, add a metadata entry that has kube-system as its key and the DaemonSet manifest as its value.
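
    For context, registering a cluster's API as a Deployment Manager type provider (the mechanism option A refers to) can also be done from the command line. A rough sketch only; the provider name, descriptor URL, and options file are placeholders:

    # Register the GKE cluster's API server as a custom type provider (sketch only).
    gcloud beta deployment-manager type-providers create my-cluster-provider \
        --descriptor-url='https://CLUSTER_ENDPOINT/openapi/v2' \
        --api-options-file=options.yaml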

  • Question 13:

    You are using Container Registry to centrally store your company's container images in a separate project. In another project, you want to create a Google Kubernetes Engine (GKE) cluster. You want to ensure that Kubernetes can download images from Container Registry. What should you do?

    A. In the project where the images are stored, grant the Storage Object Viewer IAM role to the service account used by the Kubernetes nodes.

    B. When you create the GKE cluster, choose the Allow full access to all Cloud APIs option under 'Access scopes'.

    C. Create a service account, and give it access to Cloud Storage. Create a P12 key for this service account and use it as an imagePullSecrets in Kubernetes.

    D. Configure the ACLs on each image in Cloud Storage to give read-only access to the default Compute Engine service account.
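
    For reference, granting a node service account read access to the registry project's image storage (the approach option A describes) is a single IAM binding. A sketch with placeholder project and service account IDs:

    # Run against the project that hosts Container Registry (IDs are placeholders).
    gcloud projects add-iam-policy-binding registry-project-id \
        --member='serviceAccount:123456789-compute@developer.gserviceaccount.com' \
        --role='roles/storage.objectViewer'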

  • Question 14:

    For analysis purposes, you need to send all the logs from all of your Compute Engine instances to a BigQuery dataset called platform-logs. You have already installed the Stackdriver Logging agent on all the instances. You want to minimize cost. What should you do?

    A. 1. Give the BigQuery Data Editor role on the platform-logs dataset to the service accounts used by your instances.

    2. Update your instances' metadata to add the following value: logs-destination: bq://platform-logs.

    B. 1. In Stackdriver Logging, create a logs export with a Cloud Pub/Sub topic called logs as a sink.

    2. Create a Cloud Function that is triggered by messages in the logs topic.

    3. Configure that Cloud Function to drop logs that are not from Compute Engine and to insert Compute Engine logs in the platform-logs dataset.

    C. 1. In Stackdriver Logging, create a filter to view only Compute Engine logs.

    2. Click Create Export.

    3. Choose BigQuery as Sink Service, and the platform-logs dataset as Sink Destination.

    D. 1. Create a Cloud Function that has the BigQuery User role on the platform-logs dataset.

    2. Configure this Cloud Function to create a BigQuery job that executes this query:

    INSERT INTO dataset.platform-logs (timestamp, log)
    SELECT timestamp, log FROM compute.logs
    WHERE timestamp > DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)

    3. Use Cloud Scheduler to trigger this Cloud Function once a day.
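
    For reference, a filtered export from Logging straight into BigQuery (the pattern option C walks through in the console) can also be created with one command. A sketch with placeholder project and sink names, keeping the dataset name from the question:

    # Create a sink that routes only Compute Engine instance logs to BigQuery.
    gcloud logging sinks create compute-to-bq \
        bigquery.googleapis.com/projects/my-project/datasets/platform-logs \
        --log-filter='resource.type="gce_instance"'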

  • Question 15:

    Your company has an existing GCP organization with hundreds of projects and a billing account. Your company recently acquired another company that also has hundreds of projects and its own billing account. You would like to consolidate all GCP costs of both GCP organizations onto a single invoice. You would like to consolidate all costs as of tomorrow. What should you do?

    A. Link the acquired company's projects to your company's billing account.

    B. Configure the acquired company's billing account and your company's billing account to export the billing data into the same BigQuery dataset.

    C. Migrate the acquired company's projects into your company's GCP organization. Link the migrated projects to your company's billing account.

    D. Create a new GCP organization and a new billing account. Migrate the acquired company's projects and your company's projects into the new GCP organization and link the projects to the new billing account.
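
    For context, moving a project onto a different billing account (the core of options A and C) is one command per project. A sketch with placeholder IDs; depending on the gcloud release this command group may live under the beta component:

    # Link one acquired project to the existing billing account (IDs are placeholders).
    gcloud billing projects link acquired-project-id \
        --billing-account=0X0X0X-0X0X0X-0X0X0X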

  • Question 16:

    You built an application on Google Cloud Platform that uses Cloud Spanner. Your support team needs to monitor the environment but should not have access to table data. You need a streamlined solution to grant the correct permissions to your support team, and you want to follow Google-recommended practices. What should you do?

    A. Add the support team group to the roles/monitoring.viewer role

    B. Add the support team group to the roles/spanner.databaseUser role.

    C. Add the support team group to the roles/spanner.databaseReader role.

    D. Add the support team group to the roles/stackdriver.accounts.viewer role.
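
    For reference, granting any of these roles to a group uses the same IAM binding shape. A sketch with a placeholder project and group, shown here with the Monitoring Viewer role from option A:

    # Bind a predefined role to the support team's group (names are placeholders).
    gcloud projects add-iam-policy-binding my-project \
        --member='group:support-team@example.com' \
        --role='roles/monitoring.viewer'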

  • Question 17:

    You need to deploy an application, which is packaged in a container image, in a new project. The application exposes an HTTP endpoint and receives very few requests per day. You want to minimize costs. What should you do?

    A. Deploy the container on Cloud Run.

    B. Deploy the container on Cloud Run on GKE.

    C. Deploy the container on App Engine Flexible.

    D. Deploy the container on Google Kubernetes Engine, with cluster autoscaling and horizontal pod autoscaling enabled.
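
    For context, deploying a container image to fully managed Cloud Run (option A) is a single command, and the service scales to zero between requests. A sketch with placeholder service, image, and region names:

    # Deploy the image to managed Cloud Run (names are placeholders).
    gcloud run deploy my-service \
        --image=gcr.io/my-project/my-app:latest \
        --platform=managed \
        --region=us-central1 \
        --allow-unauthenticated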

  • Question 18:

    You want to deploy an application on Cloud Run that processes messages from a Cloud Pub/Sub topic. You want to follow Google-recommended practices. What should you do?

    A. 1. Create a Cloud Function that uses a Cloud Pub/Sub trigger on that topic.

    2. Call your application on Cloud Run from the Cloud Function for every message.

    B. 1. Grant the Pub/Sub Subscriber role to the service account used by Cloud Run.

    2. Create a Cloud Pub/Sub subscription for that topic.

    3. Make your application pull messages from that subscription.

    C. 1. Create a service account.

    2. Give the Cloud Run Invoker role to that service account for your Cloud Run application.

    3. Create a Cloud Pub/Sub subscription that uses that service account and uses your Cloud Run application as the push endpoint.

    D. 1. Deploy your application on Cloud Run on GKE with the connectivity set to Internal.

    2. Create a Cloud Pub/Sub subscription for that topic.

    3. In the same Google Kubernetes Engine cluster as your application, deploy a container that takes the messages and sends them to your application.
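
    For reference, the push-subscription setup that option C outlines maps onto a few commands. A sketch; the service account, service name, endpoint URL, and topic are placeholders:

    # Service account that Pub/Sub will use to call the Cloud Run service.
    gcloud iam service-accounts create pubsub-invoker

    # Allow that account to invoke the service.
    gcloud run services add-iam-policy-binding my-service \
        --member='serviceAccount:pubsub-invoker@my-project.iam.gserviceaccount.com' \
        --role='roles/run.invoker' \
        --platform=managed --region=us-central1

    # Push subscription that authenticates as the invoker account.
    gcloud pubsub subscriptions create my-sub --topic=my-topic \
        --push-endpoint=https://my-service-abc123-uc.a.run.app/ \
        --push-auth-service-account=pubsub-invoker@my-project.iam.gserviceaccount.com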

  • Question 19:

    You are hosting an application on bare-metal servers in your own data center. The application needs access to Cloud Storage. However, security policies prevent the servers hosting the application from having public IP addresses or access to the internet. You want to follow Google-recommended practices to provide the application with access to Cloud Storage. What should you do?

    A. 1. Use nslookup to get the IP address for storage.googleapis.com.

    2. Negotiate with the security team to be able to give a public IP address to the servers.

    3. Only allow egress traffic from those servers to the IP addresses for storage.googleapis.com.

    B. 1. Using Cloud VPN, create a VPN tunnel to a Virtual Private Cloud (VPC) in Google Cloud Platform (GCP).

    2. In this VPC, create a Compute Engine instance and install the Squid proxy server on this instance.

    3. Configure your servers to use that instance as a proxy to access Cloud Storage.

    C. 1. Use Migrate for Compute Engine (formerly known as Velostrata) to migrate those servers to Compute Engine.

    2. Create an internal load balancer (ILB) that uses storage.googleapis.com as backend.

    3. Configure your new instances to use this ILB as proxy.

    D. 1. Using Cloud VPN or Interconnect, create a tunnel to a VPC in GCP.

    2. Use Cloud Router to create a custom route advertisement for 199.36.153.4/30. Announce that network to your on-premises network through the VPN tunnel.

    3. In your on-premises network, configure your DNS server to resolve *.googleapis.com as a CNAME to restricted.googleapis.com.
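
    For context, the custom route advertisement in option D is a Cloud Router setting; the DNS change happens on the on-premises side. A sketch of the router piece, with placeholder router and region names:

    # Advertise the restricted Google APIs range over the tunnel (names are placeholders).
    gcloud compute routers update my-router --region=us-central1 \
        --advertisement-mode=CUSTOM \
        --set-advertisement-ranges=199.36.153.4/30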

  • Question 20:

    Your organization is a financial company that needs to store audit log files for 3 years. Your organization has hundreds of Google Cloud projects. You need to implement a cost-effective approach for log file retention. What should you do?

    A. Create an export sink that saves the Cloud Audit logs to BigQuery.

    B. Create an export sink that saves the Cloud Audit logs to a Coldline Storage bucket.

    C. Write a custom script that uses the Logging API to copy the logs from Stackdriver Logging to BigQuery.

    D. Export these logs to Cloud Pub/Sub and write a Cloud Dataflow pipeline to store the logs in Cloud SQL.
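
    For reference, an organization-wide aggregated sink into a Coldline bucket (the combination option B describes) might look like the following sketch; the bucket name, sink name, and organization ID are placeholders:

    # Coldline bucket for long-term retention (bucket name and location are placeholders).
    gsutil mb -c coldline -l us-central1 gs://my-audit-log-archive/

    # Aggregated sink that routes audit logs from every project in the organization.
    gcloud logging sinks create org-audit-archive \
        storage.googleapis.com/my-audit-log-archive \
        --organization=123456789012 --include-children \
        --log-filter='logName:"cloudaudit.googleapis.com"'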

Tips on How to Prepare for the Exams

Certification exams have become increasingly important, and more and more employers require them when you apply for a job. But how do you prepare for an exam effectively? How do you prepare in a short time with less effort? How do you get an ideal result, and where do you find the most reliable resources? Here on Vcedump.com, you will find all the answers. Vcedump.com provides not only Google exam questions, answers, and explanations but also complete assistance with your exam preparation and certification application. If you are unsure about your ASSOCIATE-CLOUD-ENGINEER exam preparation or Google certification application, do not hesitate to visit Vcedump.com to find your solutions.