Serving the Salesforce community with content like this is essential to us here at CapStorm. When it comes to helping you maximize your data, we feel a sense of duty to help others solve some of the most common and complex challenges related to their Salesforce data. Join us on LinkedIn, YouTube, Twitter, or CapStorm.com!
Episode 5 of Data Unleashed dives into two considerations your organization should weigh when deciding between a SaaS or self-hosted Salesforce backup and restore solution. The first involves asking yourself whether the solution is accessible and easily verifiable. With the SaaS option, you have less data autonomy when it comes to verifying things such as records and incremental backups. The second consideration is the idea that any restore solution can be used for more than just disaster recovery. Taking a proactive approach to testing your processes will pay off if a disaster ever strikes.
With the self-hosted approach, your data lives on-premises and behind your firewall, where you can easily access it no matter what happens. A self-hosted solution satisfies both considerations, as you can verify and test as needed and push your data to different downstream integrations following a disaster recovery scenario.
Tune in each Tuesday for more episodes of Data Unleashed, and discover all the tips and tricks to help you get more value from your investment in Salesforce. If you are looking for a fast, easy, and highly secure way to protect your Salesforce data & metadata, we would love to hear from you! Reach out to an SFDC data expert or send us a message on LinkedIn!
My name is Drew Niermann, and this is Data Unleashed, the video blog series dedicated to helping you get more out of your investment in Salesforce.
So in one of our previous episodes, I compared the SaaS vendors on the market today, as one approach to Salesforce backup and recovery, with the self-hosted approach. Today, we’re going to talk about two things you’ll probably want to consider if your organization is looking for this type of solution.
So the first idea is that any Salesforce backup solution should be fully accessible and easily verifiable. One of the questions we like to ask Salesforce customers is: how do you know that backup is working if you don’t store the data? How can you validate that all the records are there? How can you know that everything that’s supposed to run is actually running on its incremental schedule, especially if you’re entrusting it to a SaaS vendor where you can’t run a query and validate that it’s actually working?
So philosophy number two is that any restore solution should be useful for more than just disaster recovery. In short, you don’t want to wait until after the disaster happens to actually test your disaster recovery process.
So I would submit to you that there’s a way to fulfill both of these philosophies. Under a self-hosted model, you’ve got a copy of your Salesforce org living behind your firewall, in your own relational database, that your database teams can query, analyze, integrate with, and check on anytime they want. So you can verify and test anytime.
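As a sketch of the kind of verification the self-hosted model enables, the snippet below compares the record count in a local backup table against the count Salesforce reports for the same object. The table name (`account_backup`), the sample rows, and the hard-coded Salesforce count are all illustrative assumptions, and an in-memory SQLite database stands in for whatever relational database your team actually uses.

```python
import sqlite3

# Hypothetical local backup database; SQLite stands in here for your
# actual on-premises relational database (SQL Server, Postgres, etc.).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account_backup (id TEXT PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO account_backup VALUES (?, ?)",
    [("001A", "Acme"), ("001B", "Globex"), ("001C", "Initech")],
)

# Count the records your backup actually holds.
local_count = conn.execute("SELECT COUNT(*) FROM account_backup").fetchone()[0]

# Illustrative value you might get from a Salesforce
# "SELECT COUNT() FROM Account" query; hard-coded for this sketch.
salesforce_count = 3

if local_count == salesforce_count:
    print(f"Backup verified: {local_count} Account records match.")
else:
    print(f"Mismatch: backup has {local_count}, "
          f"Salesforce reports {salesforce_count}.")
```

Because the backup lives in a database you control, a check like this can run on a schedule, per object, without asking a vendor's support team to confirm anything.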
When it comes to the recovery process, a self-hosted solution will often allow you to take data from the backup database that, again, you own behind your firewall, and push it to different downstream integrations or use it to populate sandboxes. The idea is that the same process you would use in a disaster scenario is tested, validated, and verified every single day as you populate sandbox environments or deploy recovered metadata to those various sandboxes.
So if you’ve never thought about this before, we would love to talk with you. Drop me a note on LinkedIn; I’d be happy to open a conversation. Thank you so much for watching. My name is Drew Niermann, and this is Data Unleashed.