Salesforce Data Architect Certification Guide

By: Aaron Allport
Overview of this book

The Salesforce Data Architect exam is a prerequisite for the Application Architect half of the Salesforce Certified Technical Architect credential. This book offers complete, up-to-date coverage of the Salesforce Data Architect exam so you can take it with confidence. It is written in a clear, succinct way, with self-assessment and practice exam questions covering all the topics necessary to help you pass the exam with ease. You’ll understand the theory around Salesforce data modeling, database design, master data management (MDM), Salesforce data management (SDM), and data governance, along with the performance considerations associated with large data volumes. You’ll also get to grips with data migration and the supporting theory needed to achieve Salesforce Data Architect certification. By the end of this Salesforce book, you'll have covered everything you need to know to pass the Salesforce Data Architect certification exam and have a handy, on-the-job desktop reference guide for revisiting the concepts.
Table of Contents (23 chapters)

Section 1: Salesforce Data Architect Theory
Section 2: Salesforce Data Architect Design
Section 3: Applying What We've Learned – Practice Questions and Revision Aids

PK chunking to improve performance

PK chunking is a mechanism for extracting the data in an entire Salesforce table, for example as part of a backup routine. It works by adding ranges of record IDs as WHERE clause filters, splitting a query against a Salesforce entity into batches.
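
To make this concrete, the following sketch emulates what PK chunking does behind the scenes: one large query becomes a series of sub-queries bounded by ordered record IDs. The Account query and the boundary IDs are illustrative placeholders, not values the platform actually produces.

```python
# Illustrative emulation of PK chunking: one large query is split into
# sub-queries bounded by ordered record IDs. All IDs here are hypothetical.
base_query = "SELECT Id, Name FROM Account"

# Ordered ID boundaries; real PK chunking derives these from the
# object's primary key index.
boundaries = ["001000000000001AAA", "001000000013cRzAAI", "001000000027HXxAAM"]

chunk_queries = [
    f"{base_query} WHERE Id >= '{lo}' AND Id < '{hi}'"
    for lo, hi in zip(boundaries, boundaries[1:])
]

for query in chunk_queries:
    print(query)  # each sub-query runs as its own batch
```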

As a general rule, use PK chunking when exporting data from any Salesforce object with more than 10 million rows. PK chunking is also appropriate whenever queries against an object regularly time out.

Because PK chunking separates one big query into multiple queries, each constrained by a WHERE clause over a range of ordered IDs, the batch size can be configured. It defaults to 100,000 (that is, each batch returns up to 100,000 records) and can be set as high as 250,000. For a 10 million-row entity, a batch size of 250,000 would therefore result in 40 data batches being returned.
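
In practice, PK chunking is enabled by adding the Sforce-Enable-PKChunking header when creating a Bulk API query job. Below is a minimal sketch, assuming a valid instance URL and session ID (both placeholders here) and an illustrative API version, with the batch size raised to the 250,000 maximum:

```python
# Minimal sketch: create a Bulk API query job with PK chunking enabled.
# INSTANCE_URL and SESSION_ID are placeholders; the API version is illustrative.
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"
SESSION_ID = "REPLACE_WITH_SESSION_ID"

headers = {
    "X-SFDC-Session": SESSION_ID,
    "Content-Type": "application/json",
    # chunkSize sets records per batch: default 100,000, maximum 250,000
    "Sforce-Enable-PKChunking": "chunkSize=250000",
}

job = {
    "operation": "query",
    "object": "Account",
    "contentType": "CSV",
    "concurrencyMode": "Parallel",
}

response = requests.post(
    f"{INSTANCE_URL}/services/async/57.0/job",
    headers=headers,
    json=job,
)
response.raise_for_status()
print(response.json()["id"])  # job ID; next, submit the SOQL query as a batch
```

With the header in place, Salesforce generates the ID-bounded batches automatically; you then poll the job and download each batch's results as they complete.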

In Chapter 7, Data Migration, we walked through a practical example of how PK chunking...
