Customer Engagement & Dynamics CRM Forum


Best Practices when Updating Sandbox

  • 1.  Best Practices when Updating Sandbox

    Posted Jan 20, 2020 08:32 AM
    We have been updating our Sandbox environment on a fairly regular basis to enable more realistic process testing. To do this we replicate our production environment, but, as you would imagine, with this comes a complete replica of production data.

    What are some of the recommendations this group has to manage the size of the Sandbox?

    So far we have a few bulk delete jobs that truncate some ClickDimensions entities we wouldn't use for testing, and we clear AsyncOperationBase and WorkflowLogBase when we create the environment--which saves a lot of space and ensures the logs reflect only the tests we run in this environment.
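    For anyone setting up a similar job, a bulk delete is driven by a FetchXML query. Here is a minimal sketch (just Python's standard library, used to show the query shape; `asyncoperation` is the standard logical name behind AsyncOperationBase, and statecode 3 means Completed, but verify against your own org):

```python
import xml.etree.ElementTree as ET

def bulk_delete_fetchxml(entity, attribute, operator, value):
    """Build the FetchXML query that selects the records a bulk delete job removes."""
    fetch = ET.Element("fetch")
    ent = ET.SubElement(fetch, "entity", name=entity)
    filt = ET.SubElement(ent, "filter", type="and")
    ET.SubElement(filt, "condition", attribute=attribute,
                  operator=operator, value=str(value))
    return ET.tostring(fetch, encoding="unicode")

# Completed system jobs, i.e. the bulk of AsyncOperationBase in a busy org
query = bulk_delete_fetchxml("asyncoperation", "statecode", "eq", 3)
print(query)
```

    The same helper works for any of the cleanup targets mentioned in this thread by swapping the entity and condition.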

    We haven't yet started clearing our files or logs but will be tackling this next.

    Most testing we would do in the Sandbox would be related to contacts and cases so we would tend to leave those intact for now.

    Thanks in advance!

    Marc Rohde
    IT Director
    Andis Company

  • 2.  RE: Best Practices when Updating Sandbox

    Posted Jan 20, 2020 11:05 AM
    Just a few thoughts:
    • Turn off Auditing and delete the audit history
    • Bulk delete old activities (especially email) - since activities have a parental relationship to Notes for attachments, this will also delete all email attachments.
    • In case you have other record types with attachments, bulk delete any remaining notes with attachments (notes where filesize > 0)
    • Set up a recurring bulk delete of System Jobs older than a few days
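    To make the "notes where filesize > 0" criterion concrete, this is roughly the FetchXML such a bulk delete job would run on (a sketch using Python's standard library purely to show the query; the logical name for Notes is `annotation`):

```python
import xml.etree.ElementTree as ET

# FetchXML for "notes that carry an attachment" (annotation.filesize > 0),
# i.e. the record set the bulk delete job targets.
fetch = ET.Element("fetch")
entity = ET.SubElement(fetch, "entity", name="annotation")
filt = ET.SubElement(entity, "filter", type="and")
ET.SubElement(filt, "condition", attribute="filesize", operator="gt", value="0")

fetchxml = ET.tostring(fetch, encoding="unicode")
print(fetchxml)
```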

    If this answered your question, please click on the arrow button next to Reply Inline and choose 'Make Best Answer.'
    Nelson Johnson, Solution Architect
    BroadPoint, Inc., Bethesda MD

  • 3.  RE: Best Practices when Updating Sandbox

    Posted Jan 21, 2020 01:45 PM
    What I used to do was create a minimal copy - so without any production data - and create, just once, a set of .csv files with sample data that I would import each time. This meant I occasionally had to adjust the sample data to include new fields, but overall I was able to keep the Sandbox size very low.
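    As a rough sketch of that approach (the column names here are hypothetical placeholders - use whatever fields your entities actually need), generating the reusable sample file can look like:

```python
import csv
import io

# Hypothetical sample-contact columns; extend this list as fields are
# added to the entity, as described above.
SAMPLE_CONTACTS = [
    {"firstname": "Test", "lastname": "Contact01", "emailaddress1": "test01@example.com"},
    {"firstname": "Test", "lastname": "Contact02", "emailaddress1": "test02@example.com"},
]

def write_sample_csv(rows):
    """Write the reusable sample data to a CSV string ready for import."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

csv_text = write_sample_csv(SAMPLE_CONTACTS)
print(csv_text)
```

    Because the file is generated rather than exported from production, refreshing the Sandbox never pulls in real customer data.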

    Sissy Bottcher
    San Diego CA

  • 4.  RE: Best Practices when Updating Sandbox

    Posted Jan 27, 2020 05:41 PM
    Hi @Marc Rohde

    We actually follow a similar process to @Sissy Bottcher

    1. We use the minimal copy feature to establish the environment with a solution set matching Production.
    2. We use KingswaySoft and a suite of SSIS packages to then load in a scrambled dataset from production: full data for a select set of core entities, and date filters on other entities like activities, opportunities, cases, etc., as required to execute the QA team's core regression suite. We excluded any transactional data like Campaigns, Lists, etc.; for test cases in those areas the QA teams create their own data.
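    To illustrate the kind of scrambling step 2 describes, here is a minimal sketch (the field choices are assumptions, and in our case KingswaySoft/SSIS does the real work; the key idea is deterministic hashing, so the same production value always maps to the same pseudonym and related records still line up across loads):

```python
import hashlib

def scramble(value, salt="sandbox"):
    """Replace a production value with a stable pseudonym so joins still match."""
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()[:8]
    return f"anon_{digest}"

# Example: scramble a contact's identifying fields before loading to Sandbox
contact = {"fullname": "Jane Doe", "emailaddress1": "jane.doe@contoso.com"}
scrambled = {
    "fullname": scramble(contact["fullname"]),
    # route email to a reserved domain so no real mailbox can be hit from Sandbox
    "emailaddress1": scramble(contact["emailaddress1"]) + "@example.invalid",
}
print(scrambled)
```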

    With this approach we keep our Sandbox footprint below the 20 GB mark.

    One thing we're looking at doing in our next stage is to have one sandbox environment auto-refreshed on a bi-weekly basis to align with sprint deployments, sort of like a staging environment. We could then use this staging environment to do full copies to other sandbox environments whenever QA wants to do testing. This approach would greatly reduce QA downtime while an environment is being refreshed.

    Hope this helps.


    Todd Mercer
    Dynamics CRM Technical Lead
    MD Financial Management
    Ottawa ON

  • 5.  RE: Best Practices when Updating Sandbox

    Posted Jan 27, 2020 07:08 PM
    Yes, being able to automate the test data import is a great benefit.

    Sissy Bottcher
    San Diego CA
