[SalesForce] Ignoring Validation rules when deploying code

Scenario: a production org has sufficient code coverage from its unit tests. Someone adds validation rule(s) which cause those tests to break, so the minimum 75% code coverage is no longer met.
We come along to add new code (we were not involved in the original code). When we go to deploy our code, which itself has sufficient coverage, the deployment fails because the newly added validation rules break the tests on the legacy code.

The options here are:

  1. Review all the legacy code written by a third party and update it
    and its tests so they handle the newly added validation rules – not
    really viable in some cases, since neither the client nor we want
    that expense.
  2. Manually disable the validation rules that break the legacy
    code's unit tests for the duration of the deployment – not ideal,
    but the route I've been forced to take on occasion.
  3. ???

Best Answer

I have seen a solution that uses a Custom Setting with a ValidationRuleEnabled checkbox field.

ALL validation rules in the org have && $Setup.CustomSetting__c.ValidationRuleEnabled__c appended to their error condition formulas.
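For illustration, a rule picks up the extra check roughly as in the sketch below; the ISBLANK(Phone) condition is a made-up example, while the custom setting and field names are the ones used above:

```
/* Hypothetical error condition formula for a rule that requires
   Phone on Account. The rule can only fire while the org-wide
   ValidationRuleEnabled__c checkbox is checked. */
AND(
    ISBLANK(Phone),
    $Setup.CustomSetting__c.ValidationRuleEnabled__c
)
```

When the administrator unchecks the flag, the AND() can never evaluate to true, so the rule is effectively switched off without being edited or deleted.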

When you want to deploy any code, the administrator sets the Custom Setting to FALSE and you deploy the new code; don't forget to re-enable the Custom Setting afterwards!
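The toggle can be flipped through Setup (Custom Settings > Manage) or with a couple of lines of anonymous Apex. A minimal sketch, assuming CustomSetting__c is a hierarchy custom setting (it must be, for $Setup to reference it) and that you are changing its org-wide default:

```
// Anonymous Apex sketch: disable all validation rules at the org level
// before the deployment; run it again with true once the deployment is done.
CustomSetting__c flags = CustomSetting__c.getOrgDefaults();
flags.ValidationRuleEnabled__c = false; // set back to true after deploying
upsert flags;
```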

Again, this is not ideal, as the 'legacy' code should really be updated to accommodate the new validation rules, ideally at the time the validation rules are created (but who checks code coverage after making a small change like a validation rule?).
