Exam Details

  • Exam Code: PROFESSIONAL-CLOUD-DATABASE-ENGINEER
  • Exam Name: Google Cloud Certified - Professional Cloud Database Engineer
  • Certification: Google Certifications
  • Vendor: Google
  • Total Questions: 132 Q&As
  • Last Updated: May 20, 2025

Google PROFESSIONAL-CLOUD-DATABASE-ENGINEER Questions & Answers

  • Question 71:

    You are evaluating Cloud SQL for PostgreSQL as a possible destination for your on-premises PostgreSQL instances. Geography is becoming increasingly relevant to customer privacy worldwide. Your solution must support data residency requirements and include a strategy to:

    configure where data is stored

    control where the encryption keys are stored

    govern the access to data

    What should you do?

    A. Replicate Cloud SQL databases across different zones.

    B. Create a Cloud SQL for PostgreSQL instance on Google Cloud for the data that does not need to adhere to data residency requirements. Keep the data that must adhere to data residency requirements on-premises. Make application changes to support both databases.

    C. Allow application access to data only if the users are in the same region as the Google Cloud region for the Cloud SQL for PostgreSQL database.

    D. Use features like customer-managed encryption keys (CMEK), VPC Service Controls, and Identity and Access Management (IAM) policies.

  • Question 72:

    You need to provision several hundred Cloud SQL for MySQL instances for multiple project teams over a one-week period. You must ensure that all instances adhere to company standards such as instance naming conventions, database flags, and tags. What should you do?

    A. Automate instance creation by writing a Dataflow job.

    B. Automate instance creation by setting up Terraform scripts.

    C. Create the instances using the Google Cloud Console UI.

    D. Create clones from a template Cloud SQL instance.
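    Infrastructure-as-code tools such as Terraform are the usual way to enforce standards like these across hundreds of instances, because the naming convention, database flags, and tags are declared once and applied uniformly. As a rough illustration of what "adhering to company standards" means in practice, here is a minimal Python sketch that validates a proposed instance definition against a hypothetical convention (the name pattern and required flags below are invented for the example, not Google defaults):

    ```python
    import re

    # Hypothetical convention: <team>-<env>-mysql-<nn>, e.g. "payments-prod-mysql-01".
    # Pattern and required flags are illustrative assumptions only.
    NAME_PATTERN = re.compile(r"^[a-z]+-(dev|stage|prod)-mysql-\d{2}$")
    REQUIRED_FLAGS = {"slow_query_log": "on", "long_query_time": "2"}

    def validate_instance(name: str, flags: dict) -> list:
        """Return a list of standards violations for a proposed instance."""
        violations = []
        if not NAME_PATTERN.match(name):
            violations.append(f"name '{name}' violates naming convention")
        for flag, expected in REQUIRED_FLAGS.items():
            if flags.get(flag) != expected:
                violations.append(f"flag '{flag}' must be '{expected}'")
        return violations

    print(validate_instance("payments-prod-mysql-01",
                            {"slow_query_log": "on", "long_query_time": "2"}))  # []
    print(validate_instance("Payments_DB", {}))  # three violations
    ```

    In a Terraform setup the same checks would typically live in variable validation blocks or a policy tool, so non-conforming instances are rejected before creation rather than audited afterward.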

  • Question 73:

    Your company wants to migrate its MySQL, PostgreSQL, and Microsoft SQL Server on-premises databases to Google Cloud. You need a solution that provides near-zero downtime, requires no application changes, and supports change data capture (CDC).

    What should you do?

    A. Use the native export and import functionality of the source database.

    B. Create a database on Google Cloud, and use database links to perform the migration.

    C. Create a database on Google Cloud, and use Dataflow for database migration.

    D. Use Database Migration Service.

  • Question 74:

    Your organization has an existing app that just went viral. The app uses a Cloud SQL for MySQL backend database that is experiencing slow disk performance while using hard disk drives (HDDs). You need to improve performance and reduce disk I/O wait times.

    What should you do?

    A. Export the data from the existing instance, and import the data into a new instance with solid-state drives (SSDs).

    B. Edit the instance to change the storage type from HDD to SSD.

    C. Create a high availability (HA) failover instance with SSDs, and perform a failover to the new instance.

    D. Create a read replica of the instance with SSDs, and perform a failover to the new instance.

  • Question 75:

    You are responsible for designing a new database for an airline ticketing application in Google Cloud. This application must be able to:

    Work with transactions and offer strong consistency.

    Work with structured and semi-structured (JSON) data.

    Scale transparently to multiple regions globally as the operation grows.

    You need a Google Cloud database that meets all the requirements of the application.

    What should you do?

    A. Use Cloud SQL for PostgreSQL with cross-region read replicas.

    B. Use Cloud Spanner in a multi-region configuration.

    C. Use Firestore in Datastore mode.

    D. Use a Bigtable instance with clusters in multiple regions.

  • Question 76:

    Your ecommerce website captures user clickstream data to analyze customer traffic patterns in real time and support personalization features on your website. You plan to analyze this data using big data tools. You need a low-latency solution that can store 8 TB of data and can scale to millions of read and write requests per second. What should you do?

    A. Write your data into Bigtable and use Dataproc and the Apache HBase libraries for analysis.

    B. Deploy a Cloud SQL environment with read replicas for improved performance. Use Datastream to export data to Cloud Storage and analyze with Dataproc and the Cloud Storage connector.

    C. Use Memorystore to handle your low-latency requirements and for real-time analytics.

    D. Stream your data into BigQuery and use Dataproc and the BigQuery Storage API to analyze large volumes of data.
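    The scale described here (millions of reads and writes per second at low latency) is the territory of Bigtable, where throughput depends heavily on row key design: Bigtable stores rows in lexicographic key order, so monotonically increasing keys such as raw timestamps concentrate all writes on one node. A common clickstream pattern is to lead the key with a high-cardinality field like the user ID, followed by a reversed timestamp so each user's newest events sort first. A minimal sketch (field names and the timestamp ceiling are illustrative assumptions):

    ```python
    # Row keys sort lexicographically; reversing the timestamp makes
    # newer events sort before older ones within a user's key range.
    MAX_TS = 10**13  # arbitrary ceiling in milliseconds, for reversal

    def clickstream_row_key(user_id: str, event_ts_ms: int) -> str:
        reversed_ts = MAX_TS - event_ts_ms
        return f"{user_id}#{reversed_ts:013d}"

    k_older = clickstream_row_key("user42", 1700000000000)
    k_newer = clickstream_row_key("user42", 1700000001000)
    print(k_newer < k_older)  # True: the newer event sorts first
    ```

    Leading with the user ID also spreads writes across the key space, since concurrent users hash to different key prefixes and therefore different tablets.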

  • Question 77:

    Your company uses Cloud Spanner for a mission-critical inventory management system that is globally available. You recently loaded stock keeping unit (SKU) and product catalog data from a company acquisition and observed hot-spots in the Cloud Spanner database. You want to follow Google-recommended schema design practices to avoid performance degradation. What should you do? (Choose two.)

    A. Use an auto-incrementing value as the primary key.

    B. Normalize the data model.

    C. Promote low-cardinality attributes in multi-attribute primary keys.

    D. Promote high-cardinality attributes in multi-attribute primary keys.

    E. Use a bit-reversed sequential value as the primary key.
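    The hot-spotting behind this question comes from Cloud Spanner range-partitioning data by primary key: sequential keys (option A) funnel all inserts into the last split. Bit-reversing a sequential value, which Spanner also offers natively as bit-reversed sequences, scatters consecutive IDs across the key space while keeping them unique. A minimal sketch of the transformation:

    ```python
    def bit_reverse_64(n: int) -> int:
        """Reverse the bit order of a 64-bit unsigned integer."""
        result = 0
        for _ in range(64):
            result = (result << 1) | (n & 1)
            n >>= 1
        return result

    # Consecutive sequence values land far apart in the key space,
    # so back-to-back inserts hit different Spanner splits:
    for seq in (1, 2, 3):
        print(seq, bit_reverse_64(seq))
    # 1 -> 9223372036854775808
    # 2 -> 4611686018427387904
    # 3 -> 13835058055282163712
    ```

    The same reasoning is why promoting high-cardinality attributes in multi-attribute primary keys helps: the leading key component then varies widely across inserts instead of clustering.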

  • Question 78:

    You released a popular mobile game and are using a 50 TB Cloud Spanner instance to store game data in a PITR-enabled production environment. When you analyzed the game statistics, you realized that some players are exploiting a loophole to gather more points to get on the leaderboard. Another DBA accidentally ran an emergency bugfix script that corrupted some of the data in the production environment. You need to determine the extent of the data corruption and restore the production environment. What should you do? (Choose two.)

    A. If the corruption is significant, use backup and restore, and specify a recovery timestamp.

    B. If the corruption is significant, perform a stale read and specify a recovery timestamp. Write the results back.

    C. If the corruption is significant, use import and export.

    D. If the corruption is insignificant, use backup and restore, and specify a recovery timestamp.

    E. If the corruption is insignificant, perform a stale read and specify a recovery timestamp. Write the results back.
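    Both recovery paths in this question hinge on the recovery timestamp falling inside the database's version retention window, which point-in-time recovery (PITR) extends from Spanner's default of one hour up to a maximum of seven days. A small sketch of that check, assuming the instance is configured at the seven-day maximum:

    ```python
    from datetime import datetime, timedelta, timezone

    # Assumed here: version_retention_period set to its 7-day maximum.
    RETENTION = timedelta(days=7)

    def is_recoverable(recovery_ts: datetime, now: datetime) -> bool:
        """True if a stale read or timestamped restore can still reach recovery_ts."""
        return now - RETENTION <= recovery_ts <= now

    now = datetime(2025, 5, 20, tzinfo=timezone.utc)
    print(is_recoverable(now - timedelta(days=2), now))   # True: within the window
    print(is_recoverable(now - timedelta(days=10), now))  # False: data versions expired
    ```

    This is also why the extent of the corruption matters: a stale read plus write-back repairs a small set of rows in place, while a full backup-and-restore at the recovery timestamp replaces the whole 50 TB database.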

  • Question 79:

    You are managing two different applications: Order Management and Sales Reporting. Both applications interact with the same Cloud SQL for MySQL database. The Order Management application reads and writes to the database 24/7, but the Sales Reporting application is read-only. Both applications need the latest data. You need to ensure that the performance of the Order Management application is not affected by the Sales Reporting application. What should you do?

    A. Create a read replica for the Sales Reporting application.

    B. Create two separate databases in the instance, and perform dual writes from the Order Management application.

    C. Use a Cloud SQL federated query for the Sales Reporting application.

    D. Queue up all the requested reports in Pub/Sub, and execute the reports at night.

  • Question 80:

    Your company wants to migrate an Oracle-based application to Google Cloud. The application team currently uses Oracle Recovery Manager (RMAN) to back up the database to tape for long-term retention (LTR). You need a cost-effective backup and restore solution that meets a 2-hour recovery time objective (RTO) and a 15-minute recovery point objective (RPO). What should you do?

    A. Migrate the Oracle databases to Bare Metal Solution for Oracle, and store backups on tapes on-premises.

    B. Migrate the Oracle databases to Bare Metal Solution for Oracle, and use Actifio to store backup files on Cloud Storage using the Nearline Storage class.

    C. Migrate the Oracle databases to Bare Metal Solution for Oracle, and back up the Oracle databases to Cloud Storage using the Standard Storage class.

    D. Migrate the Oracle databases to Compute Engine, and store backups on tapes on-premises.
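    The RPO and RTO targets here translate directly into operational numbers: a 15-minute RPO means recovery points (backups or archived redo) can never be more than 15 minutes apart, and a 2-hour RTO bounds the end-to-end restore time, which rules out retrieving tapes from off-site storage. A small sketch of checking a proposed schedule against the question's targets (the restore estimate is an illustrative assumption):

    ```python
    # Targets taken from the question; restore estimates are assumptions.
    RPO_MINUTES = 15   # max tolerable data loss
    RTO_MINUTES = 120  # max tolerable downtime

    def schedule_meets_targets(backup_interval_min: int,
                               estimated_restore_min: int) -> bool:
        """True if the backup cadence and restore estimate satisfy RPO/RTO."""
        return (backup_interval_min <= RPO_MINUTES
                and estimated_restore_min <= RTO_MINUTES)

    print(schedule_meets_targets(15, 90))   # True: 15-min cadence, 90-min restore
    print(schedule_meets_targets(60, 90))   # False: hourly backups exceed the RPO
    ```

    Cost-effectiveness then favors a colder Cloud Storage class such as Nearline for the long-term retention copies, provided its retrieval characteristics still fit inside the restore budget.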
