<rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom"
xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html" xmlns:media="http://search.yahoo.com/mrss/" version="2.0">
   <channel>
      <title>The Salesforce Data Blog</title>
	<description>The Salesforce Data Blog - Data Migrations and Integrations with Salesforce</description>
      <link>http://salesforcedatablog.com</link>
	  <image>
		<url>http://salesforcedatablog.com/rss/img/Me.jpg</url>
		<title>The Salesforce Data Blog</title>
		<link>http://salesforcedatablog.com</link>
	  </image>
      <language>en-us</language>
      <generator>PHP/5.6.40</generator>
	  <atom:link href="http://salesforcedatablog.com/rss/" rel="self" type="application/rss+xml"/>      <item>
         <title>The CRM Success Show: Interviewing your Host Dave Masri</title>
         <link>https://gluon.digital/blog/52/The_CRM_Success_Show_Bonus_Episode_1_Interviewing_your_Host_Dave_Masri_Maz</link>
         <guid isPermaLink="true">https://gluon.digital/blog/52/The_CRM_Success_Show_Bonus_Episode_1_Interviewing_your_Host_Dave_Masri_Maz</guid>
         <pubDate>Tue, 27 Jan 2026 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/The_CRM_Success_Show_Interviewing_your_Host_Dave_Masri.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/The_CRM_Success_Show_Interviewing_your_Host_Dave_Masri.png"><br/>
				<pre-wrap>In the latest episode of The CRM Success Show, I switched seats and became a guest on my own podcast, interviewed by my partner in the show, Khero.

We talked through my journey as an entrepreneur - from starting my own business, landing my first contracts, and building trust with clients, to hiring contractors and workers and scaling operations along the way.

It's a candid conversation about the realities of building a business from the ground up, the lessons learned the hard way, and what it really takes to grow sustainably in the CRM space.

If you missed it, here is the recording!</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Data Synchronization Patterns balancing simplicity with performance - Copenhagen Salesforce Architect Group</title>
         <link>https://gluon.digital/blog/51/Data_Synchronization_Patterns_Copenhagen_Salesforce_Architect_Group</link>
         <guid isPermaLink="true">https://gluon.digital/blog/51/Data_Synchronization_Patterns_Copenhagen_Salesforce_Architect_Group</guid>
         <pubDate>Thu, 27 Jul 2023 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/Data_Synchronization_Patterns_Copenhagen_Salesforce_Architect_Group.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/Data_Synchronization_Patterns_Copenhagen_Salesforce_Architect_Group.png"><br/>
				<pre-wrap>Last week, I gave a talk at the Copenhagen Salesforce Architect Group monthly virtual meetup on Data Synchronization Patterns: balancing performance and simplicity. In case you missed it, here is the recording!</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>On The Peiroll with Pei Mun Lim &amp; David Masri</title>
         <link>https://gluon.digital/blog/50/On_The_Peiroll_with_Pei_Mun_Lim_David_Masri</link>
         <guid isPermaLink="true">https://gluon.digital/blog/50/On_The_Peiroll_with_Pei_Mun_Lim_David_Masri</guid>
         <pubDate>Sun, 21 Aug 2022 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/onthepeiroll_podcast.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/onthepeiroll_podcast.png"><br/>
				<pre-wrap>Last week, I had a great time talking with Pei Mun Lim for her &quot;On The Pei Roll&quot; podcast, available on Spotify.

We talked about my journey (career path) to the eventual founding of Gluon Digital, covered some data migration best practices and general data security, and then took a bit of a deeper dive into the different aspects of handling sensitive data during the data migration process...</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Planning a Salesforce-to-Salesforce Data Migration (Or Salesforce Org Merging)</title>
         <link>https://gluon.digital/blog/49/Planning_a_salesforce_to_salesforce_data_migration</link>
         <guid isPermaLink="true">https://gluon.digital/blog/49/Planning_a_salesforce_to_salesforce_data_migration</guid>
         <pubDate>Sun, 12 Jun 2022 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/Planning_a_Salesforce_to_Salesforce_Data_Migration.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/Planning_a_Salesforce_to_Salesforce_Data_Migration.png"><br/>
				<pre-wrap>I've been in the CRM space for nearly 15 years. When I started, most of my projects were from clients looking to implement a divisional or enterprise-wide CRM system for the first time. Often, they had salespeople using scattered processes, tracking leads and contacts in Outlook or some other Rolodex-type software. More sophisticated salespeople would have a personal CRM, usually ACT!. As time passed, a larger percentage of project work was migrating off legacy CRMs that lost or failed in the marketplace, or simply moving away from bad implementations of more mature CRM systems.</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Announcing our partnership with VFG Consulting</title>
         <link>https://gluon.digital/blog/48/Gulon_Digital_VFG_Partnership</link>
         <guid isPermaLink="true">https://gluon.digital/blog/48/Gulon_Digital_VFG_Partnership</guid>
         <pubDate>Tue, 01 Feb 2022 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/Gulon_Digital_VFG.jpg" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/Gulon_Digital_VFG.jpg"><br/>
				<pre-wrap>Gluon Digital is partnering with VFG Consulting! I'm proud to announce that I have joined VFG Consulting as Partner/Strategic Advisor to help them build out their Azure, Data, and Analytics practice.

I have been working with VFG for nearly a year now, and I'm thrilled to take this relationship to the next level. I have seen a huge uptick in the need for everything related to Azure and Salesforce integrations. This is a huge win for our (Gluon's) existing and future clients, as we help them tackle the most complex Salesforce data projects.</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Anablock Podcast: Interview with David Masri, Data Architect, Author, and Founder of Gluon Digital</title>
         <link>https://gluon.digital/blog/47/Anablock_Podcast_Interview_with_David_Masri</link>
         <guid isPermaLink="true">https://gluon.digital/blog/47/Anablock_Podcast_Interview_with_David_Masri</guid>
         <pubDate>Mon, 19 Apr 2021 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/anablock.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/anablock.png"><br/>
				<pre-wrap>I recently had the pleasure of sitting down with Vuk Dukic of Anablock for his podcast. We spoke about the work I'm doing at Gluon Digital, why I started the firm, my book, some data best practices, synchronization patterns, the Salesforce ecosystem, and a few other interesting topics.

Check it out! </pre-wrap>
			]]>
		</description>
	<dc:creator>Vuk Dukic</dc:creator>
      </item>      <item>
         <title>Salesforce Q&amp; A with David Masri, Founder @ Gluon Digital</title>
         <link>https://www.salesforcerepublic.co/salesforceqa-david-masri/</link>
         <guid isPermaLink="true">https://www.salesforcerepublic.co/salesforceqa-david-masri/</guid>
         <pubDate>Thu, 25 Mar 2021 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/Salesforce-QA-David-Masri-800x500.jpg" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/Salesforce-QA-David-Masri-800x500.jpg"><br/>
				<pre-wrap>I recently had the pleasure of sitting down with Caitlin Edwards for a Q&amp;A as part of the #SalesforceRepublic Q&amp;A blog series.

We discussed a wide range of topics: benefits of getting involved with the community, the main challenges of data migrations, some tips on starting your own business in the Salesforce ecosystem, and more.

Check it out!</pre-wrap>
			]]>
		</description>
	<dc:creator>Caitlin Edwards</dc:creator>
      </item>      <item>
         <title>Salesforce Republic Talk (10/7/2020): Winning the War against bad CRM Data</title>
         <link>https://gluon.digital/blog/45/Salesforce_Republic_Talk_10_7_2020_Winning_the_War_against_bad_CRM_Data</link>
         <guid isPermaLink="true">https://gluon.digital/blog/45/Salesforce_Republic_Talk_10_7_2020_Winning_the_War_against_bad_CRM_Data</guid>
         <pubDate>Tue, 20 Oct 2020 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/Bad_Data_Talk.PNG" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/Bad_Data_Talk.PNG"><br/>
				<pre-wrap>A few weeks ago, I gave a talk at the UK Salesforce Republic meetup titled &quot;Winning the War against bad CRM Data&quot;.

I covered exactly that: first a discussion of the root causes of bad CRM data, and then a deep dive into a 5-step battle plan.

If you missed it, here is the recording!</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Xforce session: Salesforce Data Migrations and Attribute Driven Design</title>
         <link>https://gluon.digital/blog/44/Xforce_session_Salesforce_Data_Migrations_and_Attribute_Driven_Design</link>
         <guid isPermaLink="true">https://gluon.digital/blog/44/Xforce_session_Salesforce_Data_Migrations_and_Attribute_Driven_Design</guid>
         <pubDate>Thu, 01 Oct 2020 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/Xforce-Card-9-24-2020.PNG" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/Xforce-Card-9-24-2020.PNG"><br/>
				<pre-wrap>If you missed my Xforce session talk last week (9/24/2020), the recording is now available online!

Salesforce Data Migrations and Attribute Driven Design.
</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>SalesforceWay.com Podcast: 75. Data migration and integration | David Masri</title>
         <link>https://gluon.digital/blog/43/Xi_Xiao_podcast_Salesforce_Way_data-migration-and-integration</link>
         <guid isPermaLink="true">https://gluon.digital/blog/43/Xi_Xiao_podcast_Salesforce_Way_data-migration-and-integration</guid>
         <pubDate>Thu, 17 Sep 2020 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/Salesforceway.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/Salesforceway.png"><br/>
				<pre-wrap>I recently had the pleasure of being a guest on Xi Xiao's podcast, &quot;Salesforce Way&quot;.

We discussed a variety of topics related to Salesforce and data architecture. A very interesting talk - check it out!
</pre-wrap>
			]]>
		</description>
	<dc:creator>Xi Xiao</dc:creator>
      </item>      <item>
         <title>Why I Started Gluon Digital</title>
         <link>https://gluon.digital/blog/42/Why_I_Started_Gluon_Digital</link>
         <guid isPermaLink="true">https://gluon.digital/blog/42/Why_I_Started_Gluon_Digital</guid>
         <pubDate>Sun, 13 Sep 2020 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/Why_I_Started_Gluon_Digital.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/Why_I_Started_Gluon_Digital.png"><br/>
				<pre-wrap>From the moment I decided to change my major to computer information systems (away from finance), I knew I wanted to work with data. A big part of it was the data processing course I took. I found a beauty in data theory that was unmatched in my other computer science courses. It may very well be because, at the time, Baruch College (CUNY) decided to teach Java (then in its infancy) over C++, and the textbook assigned was horrendously full of errors (which could make any new programmer want to cry - there was no Stack Overflow to help!). Contrary to the Java textbook, I found the data processing textbook to be brilliantly written. It is in fact the only one of my college textbooks I still have (and a quick look online shows that it has stood the test of time - it's up to the 15th edition!).</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>6 Steps to Integrating Google BigQuery with Salesforce Einstein Analytics</title>
         <link>https://www.plative.com/integrating-google-bigquery-with-salesforce-einstein-analytics/</link>
         <guid isPermaLink="true">https://www.plative.com/integrating-google-bigquery-with-salesforce-einstein-analytics/</guid>
         <pubDate>Mon, 17 Aug 2020 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/Integrating_Google_BigQuery_with_Salesforce_Einstein_Analytics.jpg" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/Integrating_Google_BigQuery_with_Salesforce_Einstein_Analytics.jpg"><br/>
				<pre-wrap>Performing analytics on data warehouse data is a core use case for many of our (Plative's) clients who look to purchase Salesforce Einstein Analytics, so it's no wonder that Salesforce decided to introduce a native connector to Google's BigQuery as part of Einstein Analytics. This article will walk you through configuring BigQuery and setting up the connection in Einstein Analytics.</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Integrating FAT (locally installed apps) with Salesforce using a Custom Protocol</title>
         <link>https://gluon.digital/blog/40/Integrating_FAT_locally_installed_apps_with Salesforce</link>
         <guid isPermaLink="true">https://gluon.digital/blog/40/Integrating_FAT_locally_installed_apps_with Salesforce</guid>
         <pubDate>Tue, 28 Jul 2020 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/CustomProtocal.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/CustomProtocal.png"><br/>
				<pre-wrap>Ever need to integrate Salesforce with a local FAT application? That's right - you can launch locally installed FAT apps directly from Salesforce, with no backend code! This article explains exactly how to do it, using custom protocols.</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Introducing r/SFBlogs! (Aggregating the best Salesforce Blogs!)</title>
         <link>https://www.linkedin.com/pulse/introducing-rsfblogs-aggregating-best-salesforce-blogs-david-masri/</link>
         <guid isPermaLink="true">https://www.linkedin.com/pulse/introducing-rsfblogs-aggregating-best-salesforce-blogs-david-masri/</guid>
         <pubDate>Mon, 01 Jun 2020 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/Reddit_SFBlogs.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/Reddit_SFBlogs.png"><br/>
				<pre-wrap>Introducing r/SFBlogs! I wrote a bot that monitors the RSS feeds of the best Salesforce blogs and automatically posts them to Reddit on the r/SFBlogs subreddit. This article explains exactly how and why I did it.</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Developing Salesforce Data Migrations and Integrations [Book Review]</title>
         <link>https://www.salesforceben.com/developing-salesforce-data-migrations-and-integrations-book-review/</link>
         <guid isPermaLink="true">https://www.salesforceben.com/developing-salesforce-data-migrations-and-integrations-book-review/</guid>
         <pubDate>Wed, 13 May 2020 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/SFB_title_Developing-Salesforce-Data-Migrations-and-Integrations-Book-Review.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/SFB_title_Developing-Salesforce-Data-Migrations-and-Integrations-Book-Review.png"><br/>
				<pre-wrap>Here's my review of &quot;Developing Data Migrations and Integrations with Salesforce&quot;, a book I recommend to any Salesforce Architect (functional or technical) - or, in fact, to anyone planning a migration or building an integration on the Salesforce platform.

David Masri, a Salesforce Data Strategy and Architecture Lead, shares all that he has learned in this enjoyable book. I have to commend him here for not adding a throwaway title to the rapidly growing inventory of Salesforce books. Not only is the topic increasingly important, but the content itself is also extremely useful - let's see how.</pre-wrap>
			]]>
		</description>
	<dc:creator>Phil Weinmeister</dc:creator>
      </item>      <item>
         <title>Visualize NetSuite Data in Salesforce with Einstein Analytics</title>
         <link>https://www.plative.com/salesforce-netsuite-einstein-analytics/</link>
         <guid isPermaLink="true">https://www.plative.com/salesforce-netsuite-einstein-analytics/</guid>
         <pubDate>Thu, 07 May 2020 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/NetSuite Data in Salesforce with Einstein.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/NetSuite Data in Salesforce with Einstein.png"><br/>
				<pre-wrap>When Plative entered the ERP space, NetSuite was an obvious choice: it's a top-tier ERP, it's cloud-based, and, from a user-base perspective, it has significant overlap with Salesforce. As you can imagine, we do quite a bit of NetSuite-Salesforce integration work, and depending on the requirements, the project can be time-intensive. That's why we were thrilled when Salesforce announced the NetSuite connector for Einstein Analytics. Not only can we use that connector to build great dashboards using NetSuite data, but we can also mash that data up with Salesforce data to build cross-platform dashboards! Furthermore, we can surface that data anywhere in Salesforce, as it relates to any record, all without code! This sometimes eliminates the need to build one-way NetSuite-to-Salesforce integrations.</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri &amp; Rob MacEwan</dc:creator>
      </item>      <item>
         <title>Highlights: MarTech Panel 4/30/2020</title>
         <link>https://gluon.digital/blog/36/Martech_panel_discussion</link>
         <guid isPermaLink="true">https://gluon.digital/blog/36/Martech_panel_discussion</guid>
         <pubDate>Wed, 06 May 2020 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/Martech.jpeg" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/Martech.jpeg"><br/>
				<pre-wrap>Last week I had the honor of moderating a forum on #MarTech (Marketing Technology) as part of our very first virtual #SalesforceRepublic meetup. It was a lot of fun and a huge success - we had nearly 50 people join, and we learned quite a few great things from our panel of experts.

I would like to personally thank the panel, so thank you - Blair Jones, Patrick Downing, Alarra Tozin, Tigh Loughhead and Pia Chon.
</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Push Salesforce to its Limit Without Code [New Book Review]</title>
         <link>https://www.salesforceben.com/practical-salesforce-development-book-review/?</link>
         <guid isPermaLink="true">https://www.salesforceben.com/practical-salesforce-development-book-review/?</guid>
         <pubDate>Sun, 02 Feb 2020 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/SFB_title_Push-Salesforce-to-its-Limit-Without-Code.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/SFB_title_Push-Salesforce-to-its-Limit-Without-Code.png"><br/>
				<pre-wrap>Some six months ago, I was thrilled when I heard that Phil Weinmeister was releasing a second, updated edition of his book Practical Salesforce Development Without Code: Building Declarative Solutions on the Salesforce Platform. I trusted that Phil always creates solid content, having read his last book, Practical Guide to Salesforce Communities (also worthy of 5 stars, and highly recommended); not to mention the reputation for quality of Apress, the publisher.</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Bidirectional Data Synchronizations in Salesforce | Ask An Expert</title>
         <link>https://www.plative.com/bidirectional-data-synchronizations-in-salesforce-ask-an-expert/</link>
         <guid isPermaLink="true">https://www.plative.com/bidirectional-data-synchronizations-in-salesforce-ask-an-expert/</guid>
         <pubDate>Tue, 21 Jan 2020 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/BI-DIRECTIONAL-DATA-SYNC-IN-SALESFORCE-1.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/BI-DIRECTIONAL-DATA-SYNC-IN-SALESFORCE-1.png"><br/>
				<pre-wrap>Like most multi-user systems, Salesforce does not block users from viewing a record via the UI while an update is in progress. This can cause an issue where two users update the same record at the same time. I don't mean &quot;click Save&quot; at the same time; I am referring to an &quot;update interval&quot;, something like the following sequence of events:</pre-wrap>
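One common guard against such an update interval is an optimistic-concurrency check. The sketch below is only an illustration of that idea, using an in-memory dict as a stand-in for the org; a real integration would compare the record's SystemModstamp via the API, and all names here are illustrative.

```python
# Minimal optimistic-concurrency sketch for the "update interval" problem.
# An in-memory dict stands in for the Salesforce org; SystemModstamp plays
# the role of the record's last-modified marker.
store = {"001xx0000001": {"Name": "Acme", "SystemModstamp": "2020-01-01T00:00:00Z"}}

def read(record_id):
    # Return a snapshot of the record, including its modification stamp.
    return dict(store[record_id])

def guarded_update(record_id, changes, expected_stamp, new_stamp):
    """Apply `changes` only if the record is unchanged since we read it."""
    current = store[record_id]
    if current["SystemModstamp"] != expected_stamp:
        return False  # someone else updated during our interval; retry or merge
    current.update(changes)
    current["SystemModstamp"] = new_stamp
    return True

snapshot = read("001xx0000001")
ok = guarded_update("001xx0000001", {"Name": "Acme Corp"},
                    snapshot["SystemModstamp"], "2020-01-02T00:00:00Z")
```

A writer that loses the check simply retries with a fresh snapshot, rather than silently overwriting the other user's change.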
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Migrating Email Messages to Salesforce with Attachments - A Complete How to Guide.</title>
         <link>https://gluon.digital/blog/33/Migrating Email Messages_to_Salesforce_with_Attachments</link>
         <guid isPermaLink="true">https://gluon.digital/blog/33/Migrating Email Messages_to_Salesforce_with_Attachments</guid>
         <pubDate>Tue, 03 Dec 2019 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/Migrate_eMail_Messages.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/Migrate_eMail_Messages.png"><br/>
				<pre-wrap>Historically, when migrating email messages to Salesforce, we have been asked to load them as tasks, but with the release of the Enhanced Email functionality a while back, it's becoming increasingly common to want to migrate historical email data into the EmailMessage object.

Unfortunately, loading these objects can be tricky, and worse, the error messages returned are often obscure. Sometimes Salesforce will tell you that you have &quot;Insufficient Access Rights&quot; even if you are an admin, and other times it will just say &quot;An unexpected error occurred&quot;.

But have no fear, this article will guide you through it!</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Data Migrations and The Importance of Centralizing Your Transformation Code</title>
         <link>https://www.plative.com/data-migration-and-the-importance-of-centralizing-your-transformation-code/</link>
         <guid isPermaLink="true">https://www.plative.com/data-migration-and-the-importance-of-centralizing-your-transformation-code/</guid>
         <pubDate>Tue, 17 Sep 2019 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/CodeCentral.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/CodeCentral.png"><br/>
				<pre-wrap>When discussing data migrations, I often talk about the importance of &quot;repairability&quot; - the ability to fix data issues caused by an error in your data migration, weeks or even months post-go-live. In fact, some time ago I wrote an article on that exact subject. In that article I briefly mentioned the importance of centralizing your transformation code, but didn't really delve into why.</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Why I absolutely HATE the Apex Data Loader.</title>
         <link>https://gluon.digital/blog/31/Why_I_absolutely_HATE_the_Apex_Data_Loader</link>
         <guid isPermaLink="true">https://gluon.digital/blog/31/Why_I_absolutely_HATE_the_Apex_Data_Loader</guid>
         <pubDate>Tue, 20 Aug 2019 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/ApexDateLoaderHate.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/ApexDateLoaderHate.png"><br/>
				<pre-wrap>I really hate Salesforce's Apex Data Loader, I really do. Not because it's a bad app - it's quite a good one (at least as far as utilities go). The reason I hate it is simply that it exists. Let me explain, but first a bit of background.

I recently authored an article on Salesforce Ben titled &quot;Choosing the Right ETL Tool or Middleware for Your Salesforce Data Migration or Integration&quot;, and in that article I noted:

&quot;The APEX data loader is almost never the right tool for the job because it doesn't have any data transformation capabilities and is relatively difficult to automate.&quot;

In my book (Developing Data Migrations and Integrations with Salesforce: Patterns and Best Practices), I introduce the data loader as follows:...</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Choosing the Right ETL Tool or Middleware for Your Salesforce Data Migration or Integration</title>
         <link>https://www.salesforceben.com/choosing-the-right-etl-tool-or-middleware-for-your-salesforce-data-migration-or-integration/</link>
         <guid isPermaLink="true">https://www.salesforceben.com/choosing-the-right-etl-tool-or-middleware-for-your-salesforce-data-migration-or-integration/</guid>
         <pubDate>Tue, 06 Aug 2019 12:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/Choosing-the-right-ETL.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/Choosing-the-right-ETL.png"><br/>
				<pre-wrap>Often, when talking with people who have been put in charge of building a data migration or integration with Salesforce, the first question they ask is &quot;What tool should we use?&quot;. My response is always something along the lines of: &quot;Whoa, slow down! We need to gather the requirements first, then have our requirements drive that decision.&quot; Naturally, the next question is: &quot;What are the key requirements that drive that decision?&quot;.

In this post, I will guide you to the ETL tool/middleware that's right for you, by asking you to ask yourself these 9 questions.</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Winning the War against bad CRM data</title>
         <link>https://gluon.digital/blog/29/Winning_the_War_against_bad_CRM_data</link>
         <guid isPermaLink="true">https://gluon.digital/blog/29/Winning_the_War_against_bad_CRM_data</guid>
         <pubDate>Mon, 05 Aug 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/WarBadData.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/WarBadData.png"><br/>
				<pre-wrap>A few months ago, I authored an article titled &quot;Why are CRM systems so susceptible to bad data?&quot;, in which I laid out what I felt (and still do feel) are the root causes of bad CRM data. To quickly review, the 6 root causes I identified are as follows (see the article for details):...</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Understanding Salesforce record locking, and preventing locks from killing our load performance (or causing errors)</title>
         <link>https://gluon.digital/blog/28/Understanding_Salesforce_record_locking_and_preventing_them_from_killing_our_load_performance</link>
         <guid isPermaLink="true">https://gluon.digital/blog/28/Understanding_Salesforce_record_locking_and_preventing_them_from_killing_our_load_performance</guid>
         <pubDate>Tue, 16 Jul 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/Record_Locking.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/Record_Locking.png"><br/>
				<pre-wrap>Just like most relational database systems, Salesforce locks records while they are being modified, to prevent two people from updating the same record simultaneously, which would result in a conflict. So when someone &quot;asks&quot; Salesforce to update a record, Salesforce first locks the record so no one else can update it until the lock is released. If someone tries to update (asks for an update to) the record while it's locked, Salesforce will try up to 10 times to obtain the record lock before giving up and throwing an error.

Record locking errors are a common source of headache for people coding data migrations or integrations with Salesforce. The good news is that ...</pre-wrap>
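One widely used mitigation (a sketch of a common practice, not necessarily the article's complete recommendation) is to sort child rows by their parent record Id before batching, so that concurrently processed batches don't contend for the same parent's lock. Field names below are illustrative:

```python
# Group/sort child rows (e.g. Contacts under Accounts) by parent Id before
# batching, so each batch touches as few distinct parents as possible and
# parent-record locks don't collide across concurrent batches.
def batches_by_parent(rows, parent_key, batch_size):
    """Yield batches of rows ordered by their parent record Id."""
    ordered = sorted(rows, key=lambda r: r[parent_key])
    for i in range(0, len(ordered), batch_size):
        yield ordered[i:i + batch_size]

rows = [
    {"LastName": "A", "AccountId": "001B"},
    {"LastName": "B", "AccountId": "001A"},
    {"LastName": "C", "AccountId": "001B"},
    {"LastName": "D", "AccountId": "001A"},
]
batches = list(batches_by_parent(rows, "AccountId", batch_size=2))
```

With the rows above, each resulting batch spans a single AccountId, so two batches running in parallel never fight over the same Account lock.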
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Book review : Developing Data Migrations and Integrations with Salesforce - Patterns and Best Practice (RadixBay)</title>
         <link>https://www.radixbay.com/salesforce-book-review/</link>
         <guid isPermaLink="true">https://www.radixbay.com/salesforce-book-review/</guid>
         <pubDate>Thu, 27 Jun 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/RadixBay.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/RadixBay.png"><br/>
				<pre-wrap>Experienced Salesforce techs know that a well-planned data migration is a key component of all successful implementations. But data migration action items are often overlooked during the planning phase. The activities are usually crammed into the production cutover section, without fully understanding the complexity they bring to the project. A quick Google search for &quot;Salesforce data migration best practices&quot; will return a handful of articles. Most of the articles are quite short and provide basic information.

Enter author and Salesforce expert, David Masri. David is the Technical Director of Data Strategy and Architecture at Capgemini Invent (Global Salesforce Practice). His recently published book titled Developing Data Migrations and Integrations with Salesforce: Patterns and Best Practices provides a wealth of information on Salesforce data migrations. David's book was recently recognized as one of BookAuthority's 13 Best New Salesforce Books To Read In 2019. BookAuthority is the world's leading site for nonfiction book recommendations...</pre-wrap>
			]]>
		</description>
	<dc:creator>Christen Sisler</dc:creator>
      </item>      <item>
         <title>FAQ: What is a Pk Chunk?</title>
         <link>https://gluon.digital/blog/26/FAQ_What_is_a_Pk_Chunk</link>
         <guid isPermaLink="true">https://gluon.digital/blog/26/FAQ_What_is_a_Pk_Chunk</guid>
         <pubDate>Wed, 26 Jun 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/FAQ_8.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/FAQ_8.png"></br>
				<pre-wrap>PK Chunking is a feature added to the Bulk API back in 2015 that, when used, is supposed to improve the performance of large data downloads from Salesforce. Most native objects and all custom objects are supported (official documentation with details here).

Basically, when sending a Bulk API request, as part of the header we tell Salesforce to use PK Chunking and the chunk size. Then, when processing the request, Salesforce gets the Min &amp; Max Ids (or Primary Keys - PKs) of the object and creates a set of SOQL statements, where each statement has a where clause specifying a &quot;chunk&quot; of Primary Keys (PKs).</pre-wrap>
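Concretely, enabling PK Chunking is just an extra HTTP header on the Bulk API job request. A small Python sketch of building that header (the Sforce-Enable-PKChunking header name and the optional parent option come from the Salesforce Bulk API documentation; treat this as an illustration, not a tested call):

```python
def pk_chunking_header(chunk_size=100000, parent=None):
    """Build the Bulk API request header that enables PK Chunking.

    chunk_size is the number of records per chunk; parent is used for
    objects (like AccountShare) that are chunked by their parent's Ids.
    """
    value = f"chunkSize={chunk_size}"
    if parent:
        value = f"{value}; parent={parent}"
    return {"Sforce-Enable-PKChunking": value}
```

You would merge this dict into the headers of the job-creation request alongside your session and content-type headers.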
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Book review on: Developing Data Migrations and Integrations with Salesforce - Patterns and Best Practices (Christopher Hopper)</title>
         <link>https://www.linkedin.com/pulse/book-review-developing-data-migrations-integrations-hopper/</link>
         <guid isPermaLink="true">https://www.linkedin.com/pulse/book-review-developing-data-migrations-integrations-hopper/</guid>
         <pubDate>Sun, 23 Jun 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/Hopper.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/Hopper.png"></br>
				<pre-wrap>Quality data being in the right place, for the right person, at the right time - so it's actionable - is critical to the success of a CRM project.

While a nice user interface, fancy reports and workflow automation are also important elements of a CRM, they would significantly lose their value if the underlying data that supports these areas is problematic. 

Within this book, we are provided a comprehensive plan on how to turn the most critical and riskiest area of a project involving data migration and integration into a streamlined, low maintenance and high performing process...</pre-wrap>
			]]>
		</description>
	<dc:creator>Christopher A. Hopper</dc:creator>
      </item>      <item>
         <title>Heroes of Data History #1 - John Snow (and the London well)</title>
         <link>https://www.linkedin.com/pulse/heroes-data-history-1-john-snow-london-well-david-masri/</link>
         <guid isPermaLink="true">https://www.linkedin.com/pulse/heroes-data-history-1-john-snow-london-well-david-masri/</guid>
         <pubDate>Fri, 31 May 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/John_Snow.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/John_Snow.png"></br>
				<pre-wrap>This is the first installment of a new blog series, &quot;Heroes of Data History&quot;. This series will focus on commemorating some of our great heroes as Data people. I'll take some creative freedom with regard to history, as all good writers do - the point here is to honor those great heroes, not teach a history lesson.

Today's Hero - John Snow, no not Jon Snow, John Snow, with an h. Unlike Jon, John was not a bastard, a Targaryen or a queen slayer - he was something much more, a forgotten Hero of Data History. So forgotten to history that if you Google his name, Google assumes you mean &quot;Jon Snow&quot; and fixes the error for you- because DUHH...</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>The Ultimate Salesforce Data Reference/Reading List</title>
         <link>https://www.linkedin.com/pulse/ultimate-salesforce-data-referencereading-list-david-masri/</link>
         <guid isPermaLink="true">https://www.linkedin.com/pulse/ultimate-salesforce-data-referencereading-list-david-masri/</guid>
         <pubDate>Tue, 28 May 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/Reading_List.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/Reading_List.png"></br>
				<pre-wrap>Salesforce, and the Salesforce Ohana, do a great job providing lots of wonderful content and resources (specifically with the Trailhead and Developer sites) for self-training and learning. When I decided to write my book (Developing Data Migrations and Integrations with Salesforce: Patterns and Best Practices), I knew that what I absolutely did not want to do was simply to organize and rehash information that was available elsewhere. If I was going to write a book, it was going to be one of original content, but at the same time, I did not want to ignore all the great content available online. I accomplished that goal via the aggressive use of footnotes. In this way, I could introduce a topic, give you a resource for deep technical information, then dive into practical usage - i.e. Patterns and Best Practices - where I felt there was a lack of good content available...</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Tigh Loughhead on my talk about Salesforce Data Migrations and Attribute Driven Design</title>
         <link>https://www.b2bmarketingexpert.com/2019/04/salesforce-data-migrations-with-david-masri-capgemini.html</link>
         <guid isPermaLink="true">https://www.b2bmarketingexpert.com/2019/04/salesforce-data-migrations-with-david-masri-capgemini.html</guid>
         <pubDate>Thu, 04 Apr 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/Meetup_1.jpeg" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/Meetup_1.jpeg"></br>
				<pre-wrap>Last night I visited Third Republic's new NYC headquarters for the first of a series of SFDC-focused events they're kicking off in 2019. There were a few friends there I already knew, but I was fascinated by the first discussion, on the subject of Salesforce data migrations.
David Masri, the Technical Director of Data Strategy and Architecture at Capgemini Invent (Global Salesforce Practice), gave an excellent talk about how we plan to migrate and import data, and then laid out the convoluted process we actually take in real life, using a messy mix of Excel and various other tools to sort of get the job done...
</pre-wrap>
			]]>
		</description>
	<dc:creator>Tigh Loughhead</dc:creator>
      </item>      <item>
         <title>What's that System? (A guide to the confusing world of enterprise systems!)</title>
         <link>https://www.linkedin.com/pulse/whats-system-guide-confusing-world-enterprise-systems-david-masri/</link>
         <guid isPermaLink="true">https://www.linkedin.com/pulse/whats-system-guide-confusing-world-enterprise-systems-david-masri/</guid>
         <pubDate>Tue, 26 Mar 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/FAQ_7.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/FAQ_7.png"></br>
				<pre-wrap>As good Salesforce consultants we tend to live in the world of CRM, and we tend to focus on the big 3 components of CRM: Sales, Marketing and Customer Service. But as you move towards more client-centric roles you are exposed to a much wider ecosystem of business applications. If you are an integration specialist, it's your job to make sure these systems talk to each other.

As a Salesforce Data Architect, rarely does a week go by that I am not asked what one of these systems is. So, I decided to put together this reference guide to explain what the most common Enterprise Systems are and where they fit in an organization...</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Developing Data Migrations and Integrations with Salesforce - Best Practice 35: QA and UAT is for Testing Processes Too (Not Just Code).</title>
         <link>https://www.apress.com/us/blog/all-blog-posts/qa-and-uat-is-for-testing-processes-too--not-just-code-/16539092</link>
         <guid isPermaLink="true">https://www.apress.com/us/blog/all-blog-posts/qa-and-uat-is-for-testing-processes-too--not-just-code-/16539092</guid>
         <pubDate>Mon, 18 Mar 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/BP_35.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/BP_35.png"></br>
				<pre-wrap>When I look at how Salesforce migration specialists (devs) tend to go about migrating data to Salesforce, I often see a process that looks something like this:
1)	Data is delivered 
2)	The dev then does some data analysis and authors some sort of data mapping document
3)	The dev transforms the data for the first object (at the top of the hierarchy)
4)	The dev loads that object to SF
5)	The dev fixes errored rows for that object 
6)	Rinse and repeat for the rest of the objects to be loaded
7)	Once all objects are loaded, they email the client and tell them to &quot;Test Away!&quot;
8)	The client does some testing &amp; logs some defects
9)	The dev fixes the data for the defects as they come in and then asks the client to &quot;retest it&quot;....
</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>FAQ: Why are CRM systems so susceptible to bad data?</title>
         <link>https://gluon.digital/blog/19/faq_Why_are_CRM_systems_so_susceptible_to_bad_data</link>
         <guid isPermaLink="true">https://gluon.digital/blog/19/faq_Why_are_CRM_systems_so_susceptible_to_bad_data</guid>
         <pubDate>Mon, 04 Mar 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/FAQ_6.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/FAQ_6.png"></br>
				<pre-wrap>It's no secret that CRM systems are often plagued with bad data. Recently I asked myself, &quot;Are CRM systems more susceptible to bad data than other systems, and if so, why?&quot; I think the answer is clearly YES. If I'm right (and I am), this means that when it comes to CRM systems you need a much stronger focus and resolve to keep data clean, using fairly traditional approaches (maybe I'll write about that some other time).

I do think it's important to understand why CRM systems are so susceptible to data quality issues. So, without further ado, here are what I believe are at the root of CRM data quality issues:...</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>FAQ: What is the difference between a Data Migration and a Data Integration?</title>
         <link>https://gluon.digital/blog/18/FAQ_What_is_the_difference_between_Data_Migration_and_Data_Integration</link>
         <guid isPermaLink="true">https://gluon.digital/blog/18/FAQ_What_is_the_difference_between_Data_Migration_and_Data_Integration</guid>
         <pubDate>Fri, 22 Feb 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/FAQ_5.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/FAQ_5.png"></br>
				<pre-wrap>It's a surprisingly common question; I get asked it at least once a month: What is the difference between a Data Migration and a Data Integration?

From a strictly definitional perspective, a data migration is when we need to move data from one system (legacy or source) to another (target), where the target system will wholly replace the functionality of the source system. So, after the data is moved over and the new system goes live, there is no need to work in the legacy system any more (at least for the functionality that was migrated). This is a &quot;one-time&quot; data movement.

Data Integrations, from a strictly definitional perspective...</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Developing Data Migrations and Integrations with Salesforce - Best Practice #16: Don't Bury the Bodies; Expose Them.</title>
         <link>https://gluon.digital/blog/17/Best_Practice_16_Dont_Bury_the_Bodies_Expose_Them</link>
         <guid isPermaLink="true">https://gluon.digital/blog/17/Best_Practice_16_Dont_Bury_the_Bodies_Expose_Them</guid>
         <pubDate>Fri, 15 Feb 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/BP_16.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/BP_16.png"></br>
				<pre-wrap>This is a lesson I learned the hard way early on in my career. Although I have no proof that people are doing this, I know it's incredibly common. Consider the following situation and think about how you would handle it. Be honest with yourself (it's not like you have to tell anyone).

You are developing a complex integration to Salesforce and after weeks of hard work, you come across some edge case in your data that was not considered and has the potential to cause upheaval to your timeline. What do you do? Do you fess up, or ignore it and wait for the client to catch it in UAT?</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>The two things you need to get to the top of your field: Warmth and Competence.</title>
         <link>https://www.linkedin.com/pulse/two-things-you-need-get-top-your-field-warmth-competence-david-masri/</link>
         <guid isPermaLink="true">https://www.linkedin.com/pulse/two-things-you-need-get-top-your-field-warmth-competence-david-masri/</guid>
         <pubDate>Wed, 06 Feb 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/TwoThingsCover.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/TwoThingsCover.png"></br>
				<pre-wrap>I know I usually write about Salesforce and Data, so this is somewhat off topic for me, but it is such an important subject that I feel it can help a lot of people. It's also around that time that people forget about their New Year's resolutions and need to be reminded to think about self-betterment. Regardless, I use my blog to expand on topics that I briefly touch on in my book, and I do touch on this in it, so it's fair game...
The Stereotype Content Model

If you have studied or taken an interest in social psychology you may have heard about Gordon Allport's work on social stereotypes which was eventually built upon by later psychologists into the Stereotype Content Model. The Stereotype Content Model is based on the idea that both individual people and social groups are assessed (or stereotyped) primarily on two things, warmth and competence...</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>FAQ: What do I need to know about Salesforce Users &amp; Licensing before coding my integration or migration?</title>
         <link>https://gluon.digital/blog/15/FAQ_What_do_I_need_to_know_about_Salesforce_Users_Licensing_before_coding_my_integration_or_migration</link>
         <guid isPermaLink="true">https://gluon.digital/blog/15/FAQ_What_do_I_need_to_know_about_Salesforce_Users_Licensing_before_coding_my_integration_or_migration</guid>
         <pubDate>Thu, 31 Jan 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/FAQ_4.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/FAQ_4.png"></br>
				<pre-wrap>In this article I outline everything you need to (at least) be aware of when setting up users, prior to coding your data migration or integration with Salesforce.

Let's get started!

The Salesforce User Object supports External Ids; you should use them.

Even though you can create an External Id on the User Object, you can't always use it when Upserting (polymorphic fields are not supported - for example: Activity WhoIDs and custom object owners). But you should still set them for all users that exist in the source system, for two reasons:...</pre-wrap>
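One practical consequence of those unsupported polymorphic fields: for them you end up resolving Salesforce user Ids yourself before loading. A hedged Python sketch of that mapping step (the SourceOwnerKey field name and the default-owner fallback are illustrative, not from the article):

```python
def resolve_owner_ids(records, user_map, default_owner_id):
    """Swap source-system user keys for Salesforce user Ids.

    Users missing from the map (e.g. deactivated or never migrated)
    fall back to a default owner such as the integration user.
    """
    resolved = []
    for rec in records:
        rec = dict(rec)  # don't mutate the caller's data
        source_key = rec.pop("SourceOwnerKey")
        rec["OwnerId"] = user_map.get(source_key, default_owner_id)
        resolved.append(rec)
    return resolved
```

The user_map itself would be built by querying the User object on the External Id field you populated.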
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Introducing The Salesforce Data Blog (A Distributed Blog) OR Choosing a Blogging Platform.</title>
         <link>https://www.linkedin.com/pulse/introducing-salesforce-data-blog-distributed-choosing-david-masri/</link>
         <guid isPermaLink="true">https://www.linkedin.com/pulse/introducing-salesforce-data-blog-distributed-choosing-david-masri/</guid>
         <pubDate>Tue, 22 Jan 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/SalesforceDataBlog.PNG" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/SalesforceDataBlog.PNG"></br>
				<pre-wrap>Not many people know this, but I have been an on-and-off-again blogger for almost 20 years. I have had at least four blogs on various sites, and participated as part of a team of bloggers on a very well-known blog. The reason no one really knows about this is that it has always been under various pseudonyms. This worked well for me because I valued my privacy and was not in any way looking to self-promote; I was just looking for a creative outlet.       

So, about a month or so before my book (Data Migrations and Integrations with Salesforce) was released, I met with Susan McDermott, Apress' cloud software acquisition editor, to discuss marketing (by this time we had a long working relationship, but this was the first time I met her in person). One of the things she mentioned was the value of blogging. So, I said, &quot;Sure, I can do that.&quot;...</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>FAQ: One bad record is causing a batch of records to fail! How can I prevent triggers from impacting my ETL code?</title>
         <link>https://gluon.digital/blog/13/faq_bad_record_causing_batch_of_records_to_fail_prevent_triggers_from_impacting_my_ETL_code</link>
         <guid isPermaLink="true">https://gluon.digital/blog/13/faq_bad_record_causing_batch_of_records_to_fail_prevent_triggers_from_impacting_my_ETL_code</guid>
         <pubDate>Thu, 17 Jan 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/FAQ_3.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/FAQ_3.png"></br>
				<pre-wrap>Salesforce triggers need to be coded in such a way that they can process batches of data. This is referred to as &quot;bulkifying&quot; the trigger. If a Salesforce trigger is not properly bulkified, any time we push a batch with more than one record, it will fail. In addition, even with bulkified triggers, if one record fails, the entire batch may fail, so every record in the batch errors out, each with an obscure error message. This is usually because of some aggregation done within the trigger.

For Example: Suppose we have a (before update) trigger on the Contact object that is properly bulkified. The trigger takes all the contacts in a batch, groups them by account, does some math to calculate a data point, and then does a single update to the Account record. If even one of the contact updates fails (because of an invalid e-mail address or some validation rule), the entire batch fails with generic error messages. This makes sense because the trigger can't calculate the Account level data point - it's all or nothing!

This is a nightmare for the people who monitor the row-level error logs!...</pre-wrap>
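One common ETL-side mitigation for this all-or-nothing behavior (a sketch of the general idea, not necessarily the article's full answer): when a batch fails, re-push its records one at a time so only the genuinely bad rows land in the error log. Assuming a hypothetical load_fn that raises on failure:

```python
def isolate_failures(records, load_fn, batch_size=200):
    """Load in batches; on a batch failure, retry each record alone.

    This separates the one bad record from the rest of a batch that a
    trigger's aggregation caused to fail as a group.
    """
    failures = []
    for start in range(0, len(records), batch_size):
        batch = records[start:start + batch_size]
        try:
            load_fn(batch)
        except RuntimeError:
            for rec in batch:  # one-at-a-time pass pinpoints the real culprit
                try:
                    load_fn([rec])
                except RuntimeError as err:
                    failures.append((rec, str(err)))
    return failures
```

The single-record retries are slow, but they only run for batches that already failed, and the error log they produce names the actual offending record.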
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Developing Data Migrations and Integrations with Salesforce: Thank You!</title>
         <link>https://www.linkedin.com/pulse/developing-data-migrations-integrations-salesforce-thank-david-masri/</link>
         <guid isPermaLink="true">https://www.linkedin.com/pulse/developing-data-migrations-integrations-salesforce-thank-david-masri/</guid>
         <pubDate>Tue, 15 Jan 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/ThankYou.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/ThankYou.png"></br>
				<pre-wrap>It has now been about 3 weeks since the official release of my book &quot;Developing Data Migrations and Integrations with Salesforce: Patterns and Best Practices&quot; (The response from the Salesforce Ohana has been so great, I almost regret not being more involved in the community sooner.) I want to publicly thank everyone who helped make this book possible.

Unfortunately, LinkedIn does not allow for tagging people in articles, so, in lieu of that, I will at-mention them in the comments section below.

The following is the Acknowledgments section as published in the front of the book:
</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>reddit AMA: Best Questions!</title>
         <link>https://www.linkedin.com/pulse/reddit-ama-best-questions-david-masri/</link>
         <guid isPermaLink="true">https://www.linkedin.com/pulse/reddit-ama-best-questions-david-masri/</guid>
         <pubDate>Thu, 10 Jan 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/Reddit_AMA.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/Reddit_AMA.png"></br>
				<pre-wrap>Earlier this week I did an AMA (Ask Me Anything) on reddit's r/salesforce subreddit; this article is a paraphrased recap of the best questions. You can find the full AMA here.

I'm curious about MuleSoft. Do you cover MuleSoft in the book, and what is Salesforce's long-term strategy for integrations?
The book is ETL/middleware agnostic - it's a patterns and best practices book - so although there is some code in the book, code is not the focus (the code pattern is)...</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>How to Plan for any Salesforce Data Migration that Wins Every time</title>
         <link>https://www.salesforceben.com/how-to-plan-for-any-salesforce-data-migration-that-wins-every-time/</link>
         <guid isPermaLink="true">https://www.salesforceben.com/how-to-plan-for-any-salesforce-data-migration-that-wins-every-time/</guid>
         <pubDate>Mon, 07 Jan 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/SFB_title_data-migration.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/SFB_title_data-migration.png"></br>
				<pre-wrap>I spend a great deal of time dwelling on what makes a good Data Migration.

I identified the six attributes of a good data migration, which form the foundation of my book &quot;Developing Data Migrations and Integrations with Salesforce: Patterns and Best Practices&quot;. These characteristics are what I call the &quot;why&quot; behind the best practices that I discuss later in the book.

The principal attribute is &quot;well planned&quot;, for one simple reason:...</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>PUBLISHED: Developing Data Migrations and Integrations with Salesforce!!</title>
         <link>https://www.linkedin.com/pulse/published-developing-data-migrations-integrations-salesforce-masri/</link>
         <guid isPermaLink="true">https://www.linkedin.com/pulse/published-developing-data-migrations-integrations-salesforce-masri/</guid>
         <pubDate>Thu, 03 Jan 2019 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/Me2-small.jpg" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/Me2-small.jpg"></br>
				<pre-wrap>I'm proud to announce that my book &quot;Developing Data Migrations and Integrations with Salesforce: Patterns and Best Practices&quot; has been officially released and is available for sale just about everywhere! (Direct from Apress, BN.com and, of course, Amazon.)

Generally, it's tradition to print the back-cover text with such announcements, but you can view that text at each of the links above, so instead I will include the &quot;How This Book Is Structured&quot; section from the book's introduction....</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>FAQ: I have a field that got migrated incorrectly. It wasn't discovered until after go-live, and users updated some records. How do I fix it?</title>
         <link>https://gluon.digital/blog/8/faq_field_migrated_incorrectly_after_go_live_how_to_fix_it</link>
         <guid isPermaLink="true">https://gluon.digital/blog/8/faq_field_migrated_incorrectly_after_go_live_how_to_fix_it</guid>
         <pubDate>Mon, 31 Dec 2018 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/FAQ_2.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/FAQ_2.png"></br>
				<pre-wrap>When you are migrating your data to Salesforce it's not uncommon to be migrating dozens of objects and hundreds of fields. Regardless of how thorough your testing is, defects may be found weeks, even months after go-live. Your users are going to want you to fix the data, but if you simply Upsert over the bad field, all changes made by users since go-live will be lost (at least for that field).

The key here is to identify the affected records, and then identify which of the affected records had the field modified by users. (If you are tracking field-level changes for the field in question, this becomes a non-issue.)

Here's what you need to do:...</pre-wrap>
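The partitioning step above can be sketched using Salesforce's standard audit fields (LastModifiedDate and LastModifiedById are real system fields; the split logic itself is an illustration, assuming the migration ran under a dedicated migration user):

```python
from datetime import datetime

def partition_for_fix(records, go_live, migration_user_id):
    """Split affected records into safe-to-overwrite vs needs-review.

    A record needs review if anyone other than the migration user
    modified it after go-live, since the bad field may have been
    corrected (or depended upon) by a user already.
    """
    safe, review = [], []
    for rec in records:
        user_touched = (rec["LastModifiedDate"] > go_live
                        and rec["LastModifiedById"] != migration_user_id)
        (review if user_touched else safe).append(rec["Id"])
    return safe, review
```

Note the caveat: LastModifiedDate is record-level, so this over-flags records where users edited some other field; field-level history tracking, as the article says, removes the guesswork entirely.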
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>&quot;inNote - Keep Notes on your LinkedIn Contacts!&quot; or &quot;Building Chrome Extensions for App Integrations&quot;</title>
         <link>https://gluon.digital/blog/7/inNote_Keep_Notes_on_your_LinkedIn_Contacts_or_Building_Chrome_Extensions_for_App_Integrations</link>
         <guid isPermaLink="true">https://gluon.digital/blog/7/inNote_Keep_Notes_on_your_LinkedIn_Contacts_or_Building_Chrome_Extensions_for_App_Integrations</guid>
         <pubDate>Thu, 27 Dec 2018 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/inNote.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/inNote.png"></br>
				<pre-wrap>This article will walk you through my experience coding a simple Chrome extension (inNote) as a learning exercise. It allows you to enter notes against LinkedIn Contacts and Companies. You can download it here for free. 

In my book, when discussing integrations, I categorize integrations in one of two ways: as either a &quot;data integration&quot; or an &quot;Application UI automation integration&quot;, and define them as follows:

Data integrations: When the integration is centered around a data interchange. Data moves back and forth and may or may not be saved to Salesforce, but the data is displayed using the native Salesforce UI.

Application UI automation integrations: When the integration is centered around surfacing another system's UI in Salesforce or automating the application in some way driven off Salesforce actions or data. </pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Developing Data Migrations and Integrations with Salesforce - Best Practice #31: Every Record You Insert or Update Should Have a Job ID.</title>
         <link>https://gluon.digital/blog/6/Best_Practice_31_Every_Record_You_Insert_Should_Have_Job_ID</link>
         <guid isPermaLink="true">https://gluon.digital/blog/6/Best_Practice_31_Every_Record_You_Insert_Should_Have_Job_ID</guid>
         <pubDate>Thu, 20 Dec 2018 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/BP_31.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/BP_31.png"></br>
				<pre-wrap>Most Salesforce integration specialists know to always mark data loaded to Salesforce with an external Id even if they are performing an Insert or Update as opposed to an Upsert. This makes perfectly good sense; an external Id gives you a mechanism to track back specific records in Salesforce to specific records in some other system, plus if you enforce uniqueness on it, it will prevent you from accidentally inserting a duplicate. But not many people think about marking records with a JobId that identifies exactly which data load job created the record...</pre-wrap>
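A minimal sketch of the stamping step in Python (the Load_Job_Id__c custom field name is a made-up example; any filterable text field on the target object works):

```python
import uuid
from datetime import datetime, timezone

def stamp_job_id(records, job_field="Load_Job_Id__c"):
    """Tag every record in a load run with a single job Id.

    One later query filtered on the job field then finds (or backs
    out) everything that specific run created or updated.
    """
    run_stamp = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
    job_id = f"{run_stamp}-{uuid.uuid4().hex[:8]}"
    return job_id, [{**rec, job_field: job_id} for rec in records]
```

The timestamp prefix keeps job Ids sortable by run time, while the random suffix keeps two runs in the same second distinct.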
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Developing Data Migrations and Integrations with Salesforce - Best Practice #19: Limit the Number of Intermediaries (Layers of Abstraction)</title>
         <link>https://gluon.digital/blog/5/Best_Practice_19_Limit_Number_of_Intermediaries_Layers_of_Abstraction</link>
         <guid isPermaLink="true">https://gluon.digital/blog/5/Best_Practice_19_Limit_Number_of_Intermediaries_Layers_of_Abstraction</guid>
         <pubDate>Thu, 13 Dec 2018 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/BP_19.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/BP_19.png"></br>
				<pre-wrap>Anytime you move data from one system to another or from one format to another, you run the risk that something will be lost or modified in the translation. This situation is a classic case of leaky abstractions. If the data is delivered as .csv files and you open them in Excel, Excel drops the leading zeros or may convert text to numbers, adding decimals. Oracle may convert NULL dates to January 1, 1900. Exporting to .csv files may result in files that are not formatted properly (particularly if you have commas, double quotes or line breaks in the data). You may lose text formatting. Your ETL tool may limit large text fields to 4,000 characters. The risk of this happening may be low, but with each layer, the risk adds up.</pre-wrap>
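The Excel leading-zero problem in particular is avoidable by never letting a spreadsheet guess types: parse the .csv keeping every value as text. A small Python sketch of that precaution (stdlib only; the sample column names are illustrative):

```python
import csv
import io

def read_csv_as_text(raw_csv):
    """Parse .csv data with every value kept as a string.

    Unlike opening the file in Excel, this preserves leading zeros
    (zip codes, account numbers) and never coerces text to numbers.
    """
    return list(csv.DictReader(io.StringIO(raw_csv)))
```

The same principle applies in any ETL tool: declare the columns as text up front rather than letting the intermediary infer types.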
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>FAQ: What is the biggest mistake you can make when migrating data to Salesforce?</title>
         <link>https://gluon.digital/blog/4/faq_what_biggest-mistake_you_can_make_when_migrating_data</link>
         <guid isPermaLink="true">https://gluon.digital/blog/4/faq_what_biggest-mistake_you_can_make_when_migrating_data</guid>
         <pubDate>Tue, 11 Dec 2018 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/FAQ_1.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/FAQ_1.png"><br/>
				<pre-wrap>This is a question I get asked a lot, and my answer is always the same: &quot;By far, the biggest mistake you can make when migrating data to Salesforce is thinking of your data migration as a one-time task.&quot;</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Developing Data Migrations and Integrations with Salesforce Best Practice #10: Don't Hard-code Salesforce IDs; They Change with Environments</title>
         <link>https://gluon.digital/blog/2/Best_Practice_10_Dont_Hard_code_SalesforceIDs_They_Change_with_Environments</link>
         <guid isPermaLink="true">https://gluon.digital/blog/2/Best_Practice_10_Dont_Hard_code_SalesforceIDs_They_Change_with_Environments</guid>
         <pubDate>Thu, 06 Dec 2018 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/BP_10.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/BP_10.png"><br/>
				<pre-wrap>When coding data migrations or integrations with Salesforce, it's not uncommon to need to reference specific record Ids in your code...</pre-wrap>
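				One common way to honor this practice, sketched below: look the Id up at runtime by a stable name, such as a RecordType's DeveloperName, which stays the same across sandboxes and production even though the Id changes. The query_fn wrapper and the stubbed result are hypothetical, for illustration only; a real run would pass in a callable backed by your Salesforce client.

```python
def get_record_type_id(query_fn, sobject, developer_name):
    """Resolve a RecordType Id at runtime instead of hard-coding it.

    query_fn is any callable that executes SOQL and returns a list of
    dicts; injecting it keeps this lookup identical in every environment
    and makes the sketch testable without an org.
    """
    soql = ("SELECT Id FROM RecordType "
            f"WHERE SobjectType = '{sobject}' "
            f"AND DeveloperName = '{developer_name}'")
    rows = query_fn(soql)
    if not rows:
        raise ValueError(f"No RecordType '{developer_name}' on {sobject}")
    return rows[0]["Id"]

# Stubbed query for illustration; the Id below is made up
def fake_query(soql):
    return [{"Id": "012000000000001AAA"}]

rt_id = get_record_type_id(fake_query, "Account", "Customer")
```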
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Developing Data Migrations and Integrations with Salesforce - Best Practice #4: Start Early.</title>
         <link>https://gluon.digital/blog/1/Best_Practice_4_Start_Early</link>
         <guid isPermaLink="true">https://gluon.digital/blog/1/Best_Practice_4_Start_Early</guid>
         <pubDate>Tue, 04 Dec 2018 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/BP_4.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/BP_4.png"><br/>
				<pre-wrap>The nature of data migrations is such that they're often not on the critical path, except as a dependency for the start of QA/UAT. Because of this, early in the project, data tasks are often treated as lower priority and not scheduled to start until midway through the project. This is a mistake, for a few reasons...</pre-wrap>
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>      <item>
         <title>Developing Data Migrations and Integrations with Salesforce - Best Practice #24: Fix code not data.</title>
         <link>https://gluon.digital/blog/3/Best_Practice_24_Fix_code_not_data</link>
         <guid isPermaLink="true">https://gluon.digital/blog/3/Best_Practice_24_Fix_code_not_data</guid>
         <pubDate>Tue, 04 Dec 2018 00:00:00 +0000</pubDate>
		 <media:content medium="image" url="http://salesforcedatablog.com/rss/img/BP_24.png" width="450" height="275" />
		<description>
			<![CDATA[
				<img src="http://salesforcedatablog.com/rss/img/BP_24.png"><br/>
				<pre-wrap>When coding your migrations or integrations you'll come across bad data. It's better to alter your transformation code to fix that data rather than fix the data itself...</pre-wrap>
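				A minimal sketch of the idea (the bad values and the mapping are made up for illustration): put the correction in the transformation code, so every rerun, in every environment, applies it the same way while the source data stays untouched.

```python
def transform_state(value):
    """Normalize a free-text state value in the transformation layer.

    The mapping below is illustrative; in practice it grows as you
    discover bad data during test loads. The point is that the fix
    lives in code: rerun the job and dev, UAT, and production all get
    the same correction, with the source system left as-is.
    """
    fixes = {"n.y.": "NY", "new york": "NY", "n.j.": "NJ"}
    cleaned = value.strip().lower()
    return fixes.get(cleaned, value.strip().upper())
```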
			]]>
		</description>
	<dc:creator>David Masri</dc:creator>
      </item>
   </channel>
</rss>