[Salesforce] How frequently does Salesforce refresh DNS information?

I have a managed package deployed that is calling web services supplied by another party.

On Tuesday last week all calls to this web service started throwing a:

System.CalloutException: IO Exception: Unable to tunnel through proxy. Proxy returns "HTTP/1.0 503 Service Unavailable"

It turns out the domain registration used by the web service had expired and was renewed after the fact. Part of this process required the DNS entry to be updated, and I believe (though I'm not certain) this changed the IP address the domain resolved to.

I could successfully call the web service using SoapUI, but calls from Salesforce were failing with the message above. I checked with the other party, and they confirmed there are no IP restrictions on the web services.

By Friday evening the web service calls had started working again, so I assume the DNS records had been refreshed. This may have been the result of the support cases that were raised, or it may have been a routine refresh of the DNS records by Salesforce.

In general, how often would Salesforce pick up on DNS changes?


The above makes a lot of assumptions about what was happening, as I didn't have access to the Salesforce support cases or the web service hosting details. I'll clear them up if I can get more concrete information.

Best Answer

The time-to-live (TTL) value on the DNS records is controlled by the authoritative DNS host, not Salesforce. That said, each DNS relay server in the chain between the 'client' and the authoritative server can skew the actual TTL by a few seconds, but it should still be close to the original expiration time.

Most DNS relay servers don't deliberately extend the TTL on the records they retrieve so that they live longer in the cache, but anything is possible. Expiring a record and querying for fresh data is ultimately just a convention that keeps everything working in the ever-changing landscape of the internet.

Generally, DNS records have a TTL of 1 day. If you're doing high-availability and failover work, you can set the TTL as low as 30 seconds, but in the day-to-day life of a DNS record, 24 hours is a fair amount of time for it to live in the caches of DNS resolvers around the world that have previously looked it up.
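To illustrate how that cached lifetime behaves, here's a minimal sketch of my own (it assumes the third-party dnspython library and uses a placeholder domain, neither of which comes from the original answer). It asks the local caching resolver for the same record twice; if the resolver has the record cached, the TTL it reports is the remaining lifetime, so it should tick down between queries.

    import time

    import dns.resolver  # third-party "dnspython" package (assumed available)

    # Placeholder domain; substitute the web service's host name.
    domain = "example.com"

    # First query goes through the system's configured (usually caching) resolver.
    first = dns.resolver.resolve(domain, "A")
    print("TTL now:          ", first.rrset.ttl)

    time.sleep(10)

    # If the resolver cached the record, it now reports the *remaining* TTL,
    # which should be roughly 10 seconds lower than the first answer.
    second = dns.resolver.resolve(domain, "A")
    print("TTL 10 seconds on:", second.rrset.ttl)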

When the domain expired, the registrar could have put a TTL on its own (parking) records that was longer than the TTL on the customer's DNS records, but that's not terribly likely. The reason you were able to hit it with SoapUI is most likely that your ISP's cache hadn't yet been 'polluted' with the bad value, so you received a response with the proper host record(s), whereas the DNS servers at Salesforce were waiting for the TTL to expire on the bad records they had already retrieved.

In any case, the place where the DNS records are defined determines how long a DNS record lives. The host should be able to tell you what the value is, and if you can't find it on their website, you can run an nslookup against their servers/domains and see what the TTL is on the domain's A and CNAME records.

(This won't tell you what the TTL was during the period when the domain registration was expired. Maybe you can find a parked / expired domain and check the TTL on those responses.)
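As a rough sketch of that check done programmatically (again assuming the third-party dnspython library and placeholder names, which are my additions; nslookup or dig from a command line gives the same information), you can point a resolver directly at the host's authoritative nameserver, so the TTL you see is the full configured value rather than whatever time remains in an intermediate cache.

    import dns.resolver  # third-party "dnspython" package (assumed available)

    # Placeholder values; substitute the web service's host name and the IP
    # of one of its authoritative nameservers (listed in the domain's NS records).
    domain = "example.com"
    authoritative_ip = "198.51.100.53"

    # Skip the system resolver configuration and query the authoritative
    # server directly, so the answer isn't coming from a cache.
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = [authoritative_ip]

    answer = resolver.resolve(domain, "A")
    for record in answer:
        print(record.address, "TTL:", answer.rrset.ttl)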