You are deploying a PHP App Engine Standard service with Cloud SQL as the backend. You want to minimize the number of queries to the database.
What should you do?
A. Set the memcache service level to dedicated. Create a key from the hash of the query, and return database values from memcache before issuing a query to Cloud SQL.
B. Set the memcache service level to dedicated. Create a cron task that runs every minute to populate the cache with keys containing query results.
C. Set the memcache service level to shared. Create a cron task that runs every minute to save all expected queries to a key called "cached-queries".
D. Set the memcache service level to shared. Create a key called "cached-queries", and return database values from the key before using a query to Cloud SQL.
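The cache-aside pattern described in option A can be sketched as follows. This is a minimal illustration only: the in-memory dict stands in for the App Engine memcache client, and `fake_db` stands in for a real Cloud SQL call.

```python
import hashlib

# Stand-in for the App Engine memcache service (illustrative only).
cache = {}

def cached_query(sql, run_query):
    """Build a key from the hash of the query text; return cached rows
    if present, otherwise run the query and store the result."""
    key = hashlib.sha256(sql.encode("utf-8")).hexdigest()
    if key in cache:
        return cache[key]      # cache hit: no database round trip
    rows = run_query(sql)      # cache miss: issue the query to the backend
    cache[key] = rows
    return rows

calls = []
def fake_db(sql):
    """Stand-in for a Cloud SQL query; records each call it receives."""
    calls.append(sql)
    return [("alice",), ("bob",)]

first = cached_query("SELECT name FROM users", fake_db)
second = cached_query("SELECT name FROM users", fake_db)
```

The second identical call is served from the cache, so the database sees only one query.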
Your company is designing its data lake on Google Cloud and wants to develop different ingestion pipelines to collect unstructured data from different sources. After the data is stored in Google Cloud, it will be processed in several data pipelines to build a recommendation engine for end users on the website. The structure of the data retrieved from the source systems can change at any time. The data must be stored exactly as it was retrieved for reprocessing purposes in case the data structure is incompatible with the current processing pipelines. You need to design an architecture to support the use case after you retrieve the data. What should you do?
A. Send the data through the processing pipeline, and then store the processed data in a BigQuery table for reprocessing.
B. Store the data in a BigQuery table. Design the processing pipelines to retrieve the data from the table.
C. Send the data through the processing pipeline, and then store the processed data in a Cloud Storage bucket for reprocessing.
D. Store the data in a Cloud Storage bucket. Design the processing pipelines to retrieve the data from the bucket.
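The "land raw, process later" pattern from option D can be sketched as follows. A local dict stands in for a Cloud Storage bucket (a real pipeline would use the google-cloud-storage client), and the object naming is a hypothetical convention.

```python
import json

# Stand-in for a Cloud Storage bucket: object name -> raw bytes.
bucket = {}

def ingest(source_name, payload):
    """Land the payload byte-for-byte, before any parsing, so it can be
    reprocessed later even if its structure breaks the current pipelines."""
    object_name = f"raw/{source_name}/0001.bin"  # hypothetical naming scheme
    bucket[object_name] = payload
    return object_name

def process(object_name):
    """A downstream pipeline reads the untouched raw object and parses it."""
    return json.loads(bucket[object_name])

name = ingest("crm", b'{"user": "alice", "clicks": 3}')
record = process(name)
```

Because the raw bytes stay in the bucket exactly as retrieved, a new pipeline version can reprocess them at any time.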
You are tasked with building an online analytical processing (OLAP) marketing analytics and reporting tool.
This requires a relational database that can operate on hundreds of terabytes of data. What is the Google recommended tool for such applications?
A. Cloud Spanner, because it is globally distributed
B. Cloud SQL, because it is a fully managed relational database
C. Cloud Firestore, because it offers real-time synchronization across devices
D. BigQuery, because it is designed for large-scale processing of tabular data
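The kind of workload option D describes is a large aggregation over tabular data. A hedged illustration of such a query (the project, dataset, and column names are all hypothetical):

```python
# Illustrative BigQuery SQL for an OLAP-style marketing report;
# table and column names are placeholders, not a real schema.
sql = """
SELECT campaign,
       DATE_TRUNC(event_date, MONTH) AS month,
       COUNT(*)         AS impressions,
       SUM(revenue_usd) AS revenue
FROM `my-project.marketing.events`
GROUP BY campaign, month
ORDER BY revenue DESC
"""
```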
You are developing your microservices application on Google Kubernetes Engine. During testing, you want to validate the behavior of your application in case a specific microservice should suddenly crash. What should you do?
A. Add a taint to one of the nodes of the Kubernetes cluster. For the specific microservice, configure a pod anti-affinity label that has the name of the tainted node as a value.
B. Use Istio's fault injection on the particular microservice whose faulty behavior you want to simulate.
C. Destroy one of the nodes of the Kubernetes cluster to observe the behavior.
D. Configure Istio's traffic management features to steer the traffic away from a crashing microservice.
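Istio's fault injection (option B) is configured on a VirtualService. The sketch below expresses such a manifest as a Python dict for illustration; the service name "ratings" is a hypothetical example, and the field layout should be checked against the Istio version in use.

```python
# Sketch of an Istio VirtualService that aborts all requests to a
# hypothetical "ratings" microservice with HTTP 500, simulating a crash.
virtual_service = {
    "apiVersion": "networking.istio.io/v1beta1",
    "kind": "VirtualService",
    "metadata": {"name": "ratings-crash-test"},
    "spec": {
        "hosts": ["ratings"],
        "http": [{
            "fault": {
                "abort": {
                    "httpStatus": 500,           # error returned to callers
                    "percentage": {"value": 100.0},  # inject on every request
                }
            },
            "route": [{"destination": {"host": "ratings"}}],
        }],
    },
}
```

Lowering the percentage lets you observe partial-failure behavior without touching nodes or pods.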
Your company is planning to perform a lift and shift migration of their Linux RHEL 6.5+ virtual machines. The virtual machines are running in an on-premises VMware environment. You want to migrate them to Compute Engine following Google-recommended practices. What should you do?
A. 1. Define a migration plan based on the list of the applications and their dependencies.
2. Migrate all virtual machines into Compute Engine individually with Migrate for Compute Engine.
B. 1. Perform an assessment of virtual machines running in the current VMware environment.
2. Create images of all disks. Import the disks on Compute Engine.
3. Create standard virtual machines where the boot disks are the ones you have imported.
C. 1. Perform an assessment of virtual machines running in the current VMware environment.
2. Define a migration plan, prepare a Migrate for Compute Engine migration RunBook, and execute the migration.
D. 1. Perform an assessment of virtual machines running in the current VMware environment.
2. Install a third-party agent on all selected virtual machines.
3. Migrate all virtual machines into Compute Engine.
Your company has recently activated Cloud Identity to manage users. The Google Cloud organization has been configured as well. The security team needs to secure projects that will be part of the organization. They want to prohibit IAM users outside the domain from gaining permissions from now on. What should they do?
A. Configure an organization policy to restrict identities by domain
B. Configure an organization policy to block creation of service accounts
C. Configure Cloud Scheduler to trigger a Cloud Function every hour that removes all users who don't belong to the Cloud Identity domain from all projects.
D. Create a technical user (e.g., crawler@yourdomain.com), and give it the Project Owner role at the root organization level. Write a bash script that: 1. Lists all the IAM rules of all projects within the organization. 2. Deletes all users that do not belong to the company domain. Create a Compute Engine instance in a project within the organization, and configure gcloud to be executed with the technical user's credentials. Configure a cron job that executes the bash script every hour.
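The domain-restriction approach from option A is an organization policy on the `constraints/iam.allowedPolicyMemberDomains` constraint. A hedged sketch of such a policy as a Python dict ("C0xxxxxxx" is a placeholder for the Cloud Identity directory customer ID, which identifies the allowed domain):

```python
# Sketch of a domain-restricted-sharing org policy: only identities from
# the organization's own Cloud Identity directory can be granted IAM roles.
org_policy = {
    "constraint": "constraints/iam.allowedPolicyMemberDomains",
    "listPolicy": {
        # Placeholder directory customer ID, not a real value.
        "allowedValues": ["C0xxxxxxx"],
    },
}
```

Applied at the organization level, this blocks new IAM bindings for identities outside the domain going forward, which matches the "from now on" requirement.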
You are helping the QA team to roll out a new load-testing tool to test the scalability of your primary cloud services that run on Google Compute Engine with Cloud Bigtable. Which three requirements should they include? Choose 3 answers
A. Ensure that the load tests validate the performance of Cloud Bigtable.
B. Create a separate Google Cloud project to use for the load-testing environment.
C. Schedule the load-testing tool to regularly run against the production environment.
D. Ensure all third-party systems your services use are capable of handling high load.
E. Instrument the production services to record every transaction for replay by the load-testing tool.
F. Instrument the load-testing tool and the target services with detailed logging and metrics collection.
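Requirement F (detailed logging and metrics collection) can be sketched with a small instrumentation wrapper; the wrapped `read_row` function is a hypothetical stand-in for a service call under load test.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("loadtest")

def instrumented(fn):
    """Record per-request latency for a service call, so load-test runs
    produce metrics rather than just pass/fail results."""
    def wrapper(*args, **kwargs):
        start = time.monotonic()
        try:
            return fn(*args, **kwargs)
        finally:
            elapsed_ms = (time.monotonic() - start) * 1000.0
            log.info("%s took %.2f ms", fn.__name__, elapsed_ms)
    return wrapper

@instrumented
def read_row(key):
    # Hypothetical stand-in for a Bigtable read under test.
    return {"key": key}

result = read_row("user#42")
```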
Your company has a Google Cloud project that uses BigQuery for data warehousing on a pay-per-use basis. You want to monitor queries in real time to discover the most costly queries and which users spend the most. What should you do?
A. 1. Create a Cloud Logging sink to export BigQuery data access logs to Cloud Storage. 2. Develop a Dataflow pipeline to compute the cost of queries split by users.
B. 1. Create a Cloud Logging sink to export BigQuery data access logs to BigQuery.
2. Perform a BigQuery query on the generated table to extract the information you need.
C. 1. Activate billing export into BigQuery.
2. Perform a BigQuery query on the billing table to extract the information you need.
D. 1. In the BigQuery dataset that contains all the tables to be queried, add a label for each user that can launch a query.
2. Open the Billing page of the project.
3. Select Reports.
4. Select BigQuery as the product, and filter by the user you want to check.
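Whether the per-user billed bytes come from exported data access logs (option B) or from BigQuery's built-in job metadata views, the aggregation looks similar. A hedged sketch using the `INFORMATION_SCHEMA` job view; the region qualifier and the on-demand rate are assumptions to adjust for your project:

```python
# Assumed on-demand rate in USD per TiB scanned; verify against current pricing.
PRICE_PER_TIB = 6.25

# Estimate query cost per user from BigQuery job metadata.
# The `region-us` qualifier is an assumption; use your dataset's region.
sql = f"""
SELECT user_email,
       SUM(total_bytes_billed) / POW(1024, 4) * {PRICE_PER_TIB} AS est_cost_usd
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE job_type = 'QUERY'
GROUP BY user_email
ORDER BY est_cost_usd DESC
"""
```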
Your company has a Google Cloud project that uses BigQuery for data warehousing. They have a VPN tunnel between the on-premises environment and Google Cloud that is configured with Cloud VPN. The security team wants to avoid data exfiltration by malicious insiders, compromised code, and accidental oversharing. What should they do?
A. Configure Private Google Access for on-premises only.
B. Perform the following tasks: 1) Create a service account. 2) Give the BigQuery JobUser role and Storage Reader role to the service account. 3) Remove all other IAM access from the project.
C. Configure VPC Service Controls and configure Private Google Access.
D. Configure Private Google Access.
Your BigQuery project has several users. For audit purposes, you need to see how many queries each user ran in the last month. What should you do?
A. Connect Google Data Studio to BigQuery. Create a dimension for the users and a metric for the number of queries per user.
B. In the BigQuery interface, execute a query on the JOBS table to get the required information.
C. Use `bq show` to list all jobs. Per job, use `bq ls` to list job information and get the required information.
D. Use Cloud Audit Logging to view Cloud Audit Logs, and create a filter on the query operation to get the required information.
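The audit-log approach in option D amounts to filtering the logs down to completed BigQuery query jobs and aggregating by caller. A hedged sketch of such a Cloud Logging filter; the exact method and field names vary between audit log formats, so treat them as assumptions to verify in the Logs Explorer, and the timestamp is a placeholder.

```python
# Illustrative Cloud Logging filter for BigQuery query jobs; counting per
# user would then group on protoPayload.authenticationInfo.principalEmail.
log_filter = """
resource.type="bigquery_resource"
protoPayload.methodName="jobservice.jobcompleted"
timestamp >= "2024-05-01T00:00:00Z"
"""
```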