Question 1:
A health care provider wishes to use Salesforce to track patient care. The following actors are involved:
1. Payment Providers: Organizations who pay for the care of patients.
2. Doctors: They provide care plans for patients, need to support multiple patients, and are given access to patient information.
3. Patients: They are individuals who need care.
A data architect needs to map the actors to Salesforce objects. What should be the optimal selection by the data architect?
A. Patients as Contacts, Payment providers as Accounts, and Doctors as Accounts
B. Patients as Person Accounts, Payment providers as Accounts, and Doctors as Contacts
C. Patients as Person Accounts, Payment providers as Accounts, and Doctors as Person Accounts
D. Patients as Accounts, Payment providers as Accounts, and Doctors as Person Accounts
C. Patients as Person Accounts, Payment providers as Accounts, and Doctors as Person Accounts
Explanation/Reference:
Patients as Person Accounts, Payment providers as Accounts, and Doctors as Person Accounts is the optimal way to map the actors to Salesforce objects. Person Accounts are a special kind of Account that can store both business and personal information for individual customers. Payment providers are organizations that pay for the care of patients, so they map naturally to business Accounts. Doctors are individuals who provide care plans and need access to patient information; because a Contact must be attached to a single Account, modeling Doctors as Person Accounts lets each doctor relate to multiple patients.
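To make the modeling difference concrete, here is a minimal Python sketch against the standard REST API: a business Account carries only a company Name, while a Person Account is created on the Account object with a person account record type and person fields such as FirstName and LastName. The instance URL, access token, and record type ID are placeholders, not values from the question.

import requests

INSTANCE = "https://yourInstance.my.salesforce.com"   # placeholder org URL
HEADERS = {"Authorization": "Bearer <ACCESS_TOKEN>",  # placeholder OAuth token
           "Content-Type": "application/json"}

# Payment provider: a business Account identified by a company name.
requests.post(f"{INSTANCE}/services/data/v58.0/sobjects/Account",
              headers=HEADERS,
              json={"Name": "Acme Health Insurance"})

# Patient: a Person Account, i.e. an Account created with a person account
# record type and person fields (FirstName/LastName) instead of Name.
requests.post(f"{INSTANCE}/services/data/v58.0/sobjects/Account",
              headers=HEADERS,
              json={"RecordTypeId": "<PERSON_ACCOUNT_RECORD_TYPE_ID>",
                    "FirstName": "Jane",
                    "LastName": "Doe"})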
Question 2:
A customer is operating in a highly regulated industry and is planning to implement Salesforce. The customer information maintained in Salesforce includes the following:
Personally identifiable information (PII)
IP restrictions on profiles organized by geographic location
Financial records that need to be private and accessible only by the assigned sales associate
Users should not be allowed to export information from Salesforce. Enterprise security has mandated that access be restricted to users within a specific geography and that user activity be monitored in detail.
Which three Salesforce Shield capabilities should a data architect recommend? Choose 3 answers:
A. Event monitoring to monitor all user activities
B. Restrict access to SF from users outside a specific geography
C. Prevent Sales users' access to customer PII information
D. Transaction security policies to prevent export of SF data
E. Encrypt sensitive customer information maintained in SF
B. Restrict access to SF from users outside a specific geography
D. Transaction security policies to prevent export of SF data
E. Encrypt sensitive customer information maintained in SF
Explanation/Reference:
The best Salesforce Shield capabilities for this customer are restricting access to Salesforce from users outside a specific geography, implementing transaction security policies to prevent export of Salesforce data, and encrypting sensitive customer information maintained in Salesforce. Salesforce Shield is a set of security features that help protect enterprise data on the Salesforce platform; it includes three components: Event Monitoring, Platform Encryption, and Field Audit Trail. Restricting access from users outside a specific geography can be done using network-based security features, such as IP whitelisting or VPN. Transaction security policies can define actions or notifications based on user behavior patterns, such as exporting data or logging in from an unknown device. Platform Encryption can encrypt data at rest using a tenant secret key that is controlled by the customer.
Question 3:
Which API should a data architect use to export 1 million records from Salesforce?
A. Bulk API
B. REST API
C. Streaming API
D. SOAP API
A. Bulk API
Explanation/Reference:
Using Bulk API to export 1 million records from Salesforce is the best option. Bulk API is a RESTful API that allows you to perform asynchronous operations on large sets of data. You can use Bulk API to create, update, delete, or query millions of records in batches. Bulk API is optimized for performance and scalability, and it can handle complex data loading scenarios.
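As an illustration (not part of the original question), here is a minimal Python sketch of an export using the Bulk API 2.0 query flow: create an asynchronous query job, poll until it completes, then download the CSV results. The instance URL, access token, and SOQL query are placeholder assumptions.

import time
import requests

INSTANCE = "https://yourInstance.my.salesforce.com"   # placeholder org URL
HEADERS = {"Authorization": "Bearer <ACCESS_TOKEN>",  # placeholder OAuth token
           "Content-Type": "application/json"}

# 1. Create an asynchronous Bulk API 2.0 query job.
job = requests.post(f"{INSTANCE}/services/data/v58.0/jobs/query",
                    headers=HEADERS,
                    json={"operation": "query",
                          "query": "SELECT Id, Name FROM Account"}).json()

# 2. Poll until Salesforce finishes processing the job server-side.
while True:
    state = requests.get(f"{INSTANCE}/services/data/v58.0/jobs/query/{job['id']}",
                         headers=HEADERS).json()["state"]
    if state in ("JobComplete", "Failed", "Aborted"):
        break
    time.sleep(10)

# 3. Download the results as CSV (large result sets are paged via the
#    Sforce-Locator response header, omitted here for brevity).
results = requests.get(f"{INSTANCE}/services/data/v58.0/jobs/query/{job['id']}/results",
                       headers=HEADERS)
print(results.text[:500])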
Question 4:
US is implementing Salesforce and will be using it to track customer complaints, provide white papers on products, and provide subscription (fee)-based support. Which license type will US users need to fulfill these requirements?
A. Lightning Platform Starter license
B. Service Cloud license
C. Salesforce license
D. Sales Cloud license
B. Service Cloud license
Explanation/Reference:
The best license type to fulfill US's requirements is the Service Cloud license. Service Cloud licenses are designed for users who need access to customer service features, such as cases, solutions, knowledge articles, entitlements, service contracts, and the service console. Service Cloud users can also access standard CRM objects, such as accounts, contacts, leads, opportunities, campaigns, and reports. The Lightning Platform Starter license is not a good option because it is intended for users who need access to one custom app and a limited set of standard objects. "Salesforce license" is not a specific license type, but rather a generic term for any license that grants access to the Salesforce platform. The Sales Cloud license is not a good option because it is intended for users who need access to sales features, such as products, price books, quotes, orders, and forecasts.
Question 5:
Cloud Kicks needs to purge detailed transactional records from Salesforce. The data should be aggregated at a summary level and available in Salesforce. What are two automated approaches to fulfill this goal? (Choose two.)
A. Third-party Integration Tool (ETL)
B. Schedulable Batch Apex
C. Third-party Business Intelligence system
D. Apex Triggers
A. Third-party Integration Tool (ETL)
B. Schedulable Batch Apex
Explanation/Reference:
Both A and B are automated approaches to purge detailed transactional records from Salesforce and aggregate them at a summary level. You can use a third-party integration tool (ETL) or schedulable Batch Apex to perform these tasks. Option C is not correct because a third-party business intelligence system does not purge data from Salesforce, but only analyzes it. Option D is not correct because Apex triggers cannot be scheduled; they execute only when a record is inserted, updated, deleted, or undeleted.
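For illustration, here is a minimal Python sketch of the ETL-style approach (option A): extract the detail records over the REST API, aggregate them in memory, load the summaries back, and purge the details. Transaction__c, Transaction_Summary__c, and their fields are hypothetical objects invented for this example, and query pagination via nextRecordsUrl is omitted.

from collections import defaultdict
import requests

INSTANCE = "https://yourInstance.my.salesforce.com"  # placeholder org URL
HEADERS = {"Authorization": "Bearer <ACCESS_TOKEN>"}  # placeholder OAuth token

# 1. Extract the aged detail records (hypothetical Transaction__c object).
soql = ("SELECT Id, Account__c, Amount__c FROM Transaction__c "
        "WHERE CreatedDate < LAST_N_DAYS:365")
rows = requests.get(f"{INSTANCE}/services/data/v58.0/query",
                    headers=HEADERS, params={"q": soql}).json()["records"]

# 2. Aggregate to one summary amount per account.
totals = defaultdict(float)
for r in rows:
    totals[r["Account__c"]] += r["Amount__c"] or 0

# 3. Load the summaries into a summary object, then purge the details.
for acct, amount in totals.items():
    requests.post(f"{INSTANCE}/services/data/v58.0/sobjects/Transaction_Summary__c",
                  headers={**HEADERS, "Content-Type": "application/json"},
                  json={"Account__c": acct, "Total_Amount__c": amount})
for r in rows:
    requests.delete(f"{INSTANCE}/services/data/v58.0/sobjects/Transaction__c/{r['Id']}",
                    headers=HEADERS)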
Question 6:
Get Cloudy Consulting monitors 15,000 servers, and these servers automatically record their status every 10 minutes. Because of company policy, these status reports must be maintained for 5 years. Managers at Get Cloudy Consulting need access to up to one week's worth of these status reports with all of their details.
An Architect is recommending what data should be integrated into Salesforce and for how long it should be stored in Salesforce.
Which two limits should the Architect be aware of? (Choose two.)
A. Data storage limits
B. Workflow rule limits
C. API Request limits
D. Webservice callout limits
A. Data storage limits
C. API Request limits
Explanation/Reference:
Data storage limits and API request limits are two important factors that affect the data integration and storage in Salesforce. Data storage limits determine how much data can be stored in Salesforce, and API request limits determine how many API calls can be made to Salesforce in a 24-hour period. Both of these limits depend on the edition and license type of the Salesforce org. Workflow rule limits and webservice callout limits are not directly related to data integration and storage, but rather to business logic and external services.
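A quick back-of-the-envelope calculation shows why both limits matter here. The sketch below assumes the common rule of thumb that most Salesforce records count as 2 KB of data storage each, regardless of actual field size; that assumption is not stated in the question.

servers = 15_000
reports_per_day = 24 * 60 // 10            # one status report every 10 minutes -> 144
daily_records = servers * reports_per_day  # 2,160,000 new records per day
weekly_records = daily_records * 7         # ~15.1 million records for one week of detail
weekly_storage_gb = weekly_records * 2 / 1024 / 1024  # ~28.8 GB at 2 KB per record

# Without Bulk API batching, one REST call per record would also consume
# ~2.16 million API requests per day against the org's 24-hour limit.
print(daily_records, weekly_records, round(weekly_storage_gb, 1))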
Question 7:
Northern Trail Outfitters needs to implement an archive solution for Salesforce data. This archive solution needs to help NTO do the following:
1. Remove outdated information not required on a day-to-day basis.
2. Improve Salesforce performance.
Which solution should be used to meet these requirements?
A. Identify a location to store archived data and use scheduled batch jobs to migrate and purge the aged data on a nightly basis.
B. Identify a location to store archived data, and move data to the location using a time-based workflow.
C. Use a formula field that shows true when a record reaches a defined age and use that field to run a report and export the report into SharePoint.
D. Create a full copy sandbox, and use it as a source for retaining archived data.
A. Identify a location to store archived data and use scheduled batch jobs to migrate and purge the aged data on a nightly basis.
Explanation/Reference:
Identifying a location to store archived data and using scheduled batch jobs to migrate and purge the aged data on a nightly basis meets both requirements. A common reference implementation uses Heroku Connect, Postgres, and Salesforce Connect to archive old data, free up space in the org, and still retain the option to unarchive the data if needed. This approach improves Salesforce performance while meeting data retention policies.
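As a rough sketch of what such a nightly job might do (scheduled externally, for example with cron), the Python below pulls aged Case records over the REST API, writes them to an external CSV archive, and then deletes them from Salesforce. The instance URL, token, retention window, and archive file are assumptions, and query pagination is omitted.

import csv
import requests

INSTANCE = "https://yourInstance.my.salesforce.com"  # placeholder org URL
HEADERS = {"Authorization": "Bearer <ACCESS_TOKEN>"}  # placeholder OAuth token

# 1. Pull records older than the assumed two-year retention window.
soql = ("SELECT Id, Subject, Status FROM Case "
        "WHERE IsClosed = true AND ClosedDate < LAST_N_YEARS:2")
rows = requests.get(f"{INSTANCE}/services/data/v58.0/query",
                    headers=HEADERS, params={"q": soql}).json()["records"]

# 2. Write the aged records to the external archive location.
with open("case_archive.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Id", "Subject", "Status"])
    for r in rows:
        writer.writerow([r["Id"], r["Subject"], r["Status"]])

# 3. Purge the archived records from Salesforce to free storage.
for r in rows:
    requests.delete(f"{INSTANCE}/services/data/v58.0/sobjects/Case/{r['Id']}",
                    headers=HEADERS)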
Question 8:
Universal Containers wishes to retain Lead data even after Leads are deleted and cleared from the Recycle Bin. Which approach should be implemented to achieve this?
A. Use a Lead standard report and filter on the IsDeleted standard field.
B. Use a Converted Lead report to display data on Leads that have been deleted.
C. Query Salesforce with the queryAll API method or using the ALL ROWS SOQL keywords.
D. Send data to a Data Warehouse and mark Leads as deleted in that system.
C. Query Salesforce with the queryAll API method or using the ALL ROWS SOQL keywords.
Explanation/Reference:
According to the exam guide, one of the objectives is to "describe how to use queryAll() or ALL ROWS keywords to access deleted records in Apex and SOQL". This implies that option C is the correct way to access deleted records in Salesforce. Option D is not correct because sending data to a data warehouse does not maintain the data in Salesforce. Options A and B are not correct because they do not apply to deleted records.
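For illustration, the REST API exposes this capability through the queryAll resource (the Apex equivalent is appending ALL ROWS to the SOQL statement). A minimal Python sketch, with the instance URL and token as placeholders:

import requests

INSTANCE = "https://yourInstance.my.salesforce.com"  # placeholder org URL
HEADERS = {"Authorization": "Bearer <ACCESS_TOKEN>"}  # placeholder OAuth token

# queryAll returns soft-deleted (Recycle Bin) records as well as active
# ones; the IsDeleted = true filter narrows it to the deleted Leads.
soql = "SELECT Id, Name, IsDeleted FROM Lead WHERE IsDeleted = true"
resp = requests.get(f"{INSTANCE}/services/data/v58.0/queryAll",
                    headers=HEADERS, params={"q": soql})
for rec in resp.json()["records"]:
    print(rec["Id"], rec["Name"])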
Question 9:
A company has 12 million records, and a nightly integration queries these records.
Which two areas should a Data Architect investigate during troubleshooting if queries are timing out? (Choose two.)
A. Make sure the query doesn't contain NULL in any filter criteria.
B. Create a formula field instead of having multiple filter criteria.
C. Create custom indexes on the fields used in the filter criteria.
D. Modify the integration users' profile to have View All Data.
A. Make sure the query doesn't contain NULL in any filter criteria.
C. Create custom indexes on the fields used in the filter criteria.
Explanation/Reference:
Making sure the query does not contain NULL in any filter criteria can avoid full table scans and leverage indexes more efficiently. Queries with NULL filters are not selective and can cause performance issues. Creating custom indexes on the fields used in the filter criteria can also enhance the query performance by reducing the number of records to scan.
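One practical way to check selectivity during troubleshooting is the REST query resource's explain parameter, which returns the optimizer's query plan instead of executing the query. Below is a minimal Python sketch; the instance URL, token, and the indexed External_Key__c custom field are assumptions for illustration.

import requests

INSTANCE = "https://yourInstance.my.salesforce.com"  # placeholder org URL
HEADERS = {"Authorization": "Bearer <ACCESS_TOKEN>"}  # placeholder OAuth token

# A NULL filter such as Status__c != null defeats the index; an equality
# filter on an indexed field lets the optimizer do a selective lookup.
selective = "SELECT Id FROM Account WHERE External_Key__c = 'A-1001'"

# Passing 'explain' instead of 'q' returns the query plan (leading
# operation type, cardinality, relative cost) without running the query.
plan = requests.get(f"{INSTANCE}/services/data/v58.0/query",
                    headers=HEADERS, params={"explain": selective}).json()
print(plan["plans"][0]["leadingOperationType"],
      plan["plans"][0]["relativeCost"])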
Question 10:
Universal Containers (UC) maintains a collection of several million Account records that represent businesses in the United States. As a logistics company, this list is one of the most valuable and important components of UC's business, and the accuracy of shipping addresses is paramount. Recently it has been noticed that too many of these businesses' addresses are inaccurate, or the businesses don't exist. Which two scalable strategies should UC consider to improve the quality of their Account addresses? (Choose two.)
A. Contact each business on the list and ask them to review and update their address information.
B. Build a team of employees that validate Accounts by searching the web and making phone calls.
C. Integrate with a third-party database or services for address validation and enrichment.
D. Leverage Data.com Clean to clean up Account address fields with the D&B database.
C. Integrate with a third-party database or services for address validation and enrichment.
D. Leverage Data.com Clean to clean up Account address fields with the D&B database.
Explanation/Reference:
Integrating with a third-party database or service for address validation and enrichment is a scalable strategy that can improve the quality of the Account addresses by comparing them with a reliable source of data. Leveraging Data.com Clean to clean up Account address fields with the D&B database is another scalable strategy that can automatically update and enrich Account records with verified information from Data.com.
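To illustrate the integration pattern in option C, here is a minimal Python sketch that reads an Account's billing address, sends it to an address validation service, and writes the standardized result back. The validation endpoint, its request and response shape, and the Salesforce URL, token, and record ID are all hypothetical placeholders.

import requests

INSTANCE = "https://yourInstance.my.salesforce.com"        # placeholder org URL
VALIDATOR = "https://api.example-validator.com/v1/verify"  # hypothetical service
HEADERS = {"Authorization": "Bearer <ACCESS_TOKEN>"}       # placeholder token

# 1. Read the billing address off an Account.
acct = requests.get(f"{INSTANCE}/services/data/v58.0/sobjects/Account/<ACCOUNT_ID>",
                    headers=HEADERS).json()

# 2. Ask the validation service for a standardized, deliverable address
#    (request/response fields are invented for this sketch).
verified = requests.post(VALIDATOR, json={
    "street": acct["BillingStreet"],
    "city": acct["BillingCity"],
    "state": acct["BillingState"],
    "zip": acct["BillingPostalCode"],
}).json()

# 3. Write the corrected address back if the service standardized it.
if verified.get("deliverable"):
    requests.patch(f"{INSTANCE}/services/data/v58.0/sobjects/Account/<ACCOUNT_ID>",
                   headers={**HEADERS, "Content-Type": "application/json"},
                   json={"BillingStreet": verified["street"],
                         "BillingCity": verified["city"],
                         "BillingState": verified["state"],
                         "BillingPostalCode": verified["zip"]})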