Salesforce Data Architect Certification Guide

By Aaron Allport
Overview of this book

The Salesforce Data Architect exam is a prerequisite for the Application Architect half of the Salesforce Certified Technical Architect credential. This book offers complete, up-to-date coverage of the Salesforce Data Architect exam so you can take it with confidence. Written in a clear, succinct style with self-assessment and practice exam questions, it covers all the topics necessary to pass the exam. You'll understand the theory around Salesforce data modeling, database design, master data management (MDM), Salesforce data management (SDM), and data governance, as well as the performance considerations associated with large data volumes. You'll also get to grips with data migration and the supporting theory needed to achieve Salesforce Data Architect certification. By the end of this Salesforce book, you'll have covered everything you need to know to pass the Salesforce Data Architect certification exam and will have a handy, on-the-job desktop reference guide to revisit the concepts.
Table of Contents (23 chapters)

  • Section 1: Salesforce Data Architect Theory
  • Section 2: Salesforce Data Architect Design
  • Section 3: Applying What We've Learned – Practice Questions and Revision Aids

Loading massive amounts of data

When loading very large amounts of data into the Salesforce Platform, we're essentially concerned with getting as much data as possible into our Salesforce instance, reliably, in the shortest time possible. Let's imagine we have 20 million records to load. Thinking in terms of serial versus parallel processing, we can view our loading scenario in two ways:

  • Loading 20 million records sequentially (serial)
  • Loading 20 million records in parallel, by breaking the 20 million records down into smaller batches and inserting those batches concurrently, taking less time (a code sketch of this approach follows this list)
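As a rough illustration of the parallel approach, the following minimal Python sketch (not taken from the book) splits a large record set into fixed-size batches and submits those batches concurrently using a thread pool. The insert_batch function is a hypothetical placeholder for whatever bulk insert mechanism you use, such as a Bulk API job submitted by your data-loading tool.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

BATCH_SIZE = 10_000   # records per batch (a Bulk API batch holds up to 10,000 records)
MAX_WORKERS = 10      # how many batches are in flight at once

def chunk(records, size):
    """Yield successive fixed-size batches from the full record list."""
    for start in range(0, len(records), size):
        yield records[start:start + size]

def insert_batch(batch):
    """Hypothetical stand-in for a bulk insert call (e.g., one Bulk API batch).
    Returns the number of records submitted."""
    # ... call your data-loading tool or the Bulk API here ...
    return len(batch)

def parallel_load(records):
    """Submit batches concurrently rather than one after another (serial)."""
    loaded = 0
    with ThreadPoolExecutor(max_workers=MAX_WORKERS) as pool:
        futures = [pool.submit(insert_batch, b) for b in chunk(records, BATCH_SIZE)]
        for future in as_completed(futures):
            loaded += future.result()
    return loaded

# Usage (assuming twenty_million_records is an in-memory or streamed record source):
# total = parallel_load(twenty_million_records)
```

The serial scenario corresponds to calling insert_batch for each batch in turn; the parallel scenario keeps several batches in flight at once, which is where the time saving comes from.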

To load lots of data quickly, we need to optimize our parallel data loads. There are several steps we can take before loading any data to speed up the load operation. Deferring sharing calculations until after the load operation completes means that sharing recalculation runs once, rather than for every batch of records processed. Disabling any logic that may...
