I have a batch job that I tried to start. Its "startup" method checks whether it's already running by querying the AsyncApexJob object, then starts it only if it's not already running AND not scheduled.
After passing those checks, it schedules the job. At that point I was getting an error saying that a job with that name was already scheduled.
This happened consistently, but neither the "scheduled jobs" page in the UI menu nor a query could find the job.
To get it to run, I had to switch the name of the job to something else.
I opened a case but have heard nothing of use.
Here's where I look for scheduled jobs (it finds none) before trying to schedule it:
List<CronTrigger> jobs = [
    SELECT Id, CronJobDetail.Name, State, NextFireTime
    FROM CronTrigger
    WHERE CronJobDetail.Name = 'async_example_schedule2'
];
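For reference, the startup guard described at the top might look roughly like this. The class name ScheduledDispatcher comes from the scheduling snippet, but the method name and the exact status filters are my assumptions:

```apex
// Hypothetical sketch of the startup guard: schedule only when no batch
// instance is active and no CronTrigger is registered under the job name.
public static void startIfIdle() {
    // Assumed class name for the batch; adjust to the real job class.
    List<AsyncApexJob> running = [
        SELECT Id FROM AsyncApexJob
        WHERE ApexClass.Name = 'ScheduledDispatcher'
          AND JobType = 'BatchApex'
          AND Status IN ('Queued', 'Preparing', 'Processing')
    ];
    List<CronTrigger> scheduled = [
        SELECT Id FROM CronTrigger
        WHERE CronJobDetail.Name = 'async_example_schedule2'
    ];
    if (running.isEmpty() && scheduled.isEmpty()) {
        System.schedule('async_example_schedule2',
            GetSchedulerExpression(DateTime.now().addSeconds(3)),
            new ScheduledDispatcher());
    }
}
```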
And here's where I schedule it:
try {
    System.schedule('async_example_schedule2',
        GetSchedulerExpression(DateTime.now().addSeconds(3)),
        new ScheduledDispatcher());
} catch (Exception e) {
    // fail silently
}
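GetSchedulerExpression isn't shown in the post; a typical implementation builds a one-off cron string in the Seconds Minutes Hours Day_of_month Month Day_of_week [Year] format that System.schedule expects. A sketch, assuming that's all the helper does:

```apex
// Assumed implementation of the GetSchedulerExpression helper above:
// builds a one-shot cron expression (sec min hr day month ? year) for dt.
public static String GetSchedulerExpression(DateTime dt) {
    return String.valueOf(dt.second()) + ' ' +
           dt.minute() + ' ' + dt.hour() + ' ' +
           dt.day() + ' ' + dt.month() + ' ? ' + dt.year();
}
```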
Has anyone seen this before? I use this batch as an endless loop, so it's very important that it stays running.
As an update, the workaround is to rename the job in the Apex class. Then it works for a few weeks until the same thing happens again (we're now on async_example_schedule3) 🙂
Best Answer
I have seen similar behavior before. I've also run into two related issues when running self-scheduling jobs that run a lot:
The last time I encountered this was a while ago, so API changes may have affected behavior since then, but the approach that has worked for us since is:
That approach has been working reliably for us for at least a year.
There is another issue you may run into with jobs that run on a one-off (rather than recurring) cron schedule: during major SFDC maintenance outages and upgrades, I've seen in-progress batch jobs get terminated or fail to run, leading to a failure to reschedule. To solve that related issue, I'd recommend the approach that Stephen Willcock presented in answer to this question.
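For completeness, one common resilience pattern for this failure mode (not necessarily the one linked above) is a recurring watchdog: a scheduled class on an hourly cron that re-enqueues the batch whenever no instance is active. Everything here, including the MyBatchJob name, is illustrative:

```apex
// Recurring watchdog sketch: schedule this class itself on a recurring
// cron; each run re-enqueues the batch if no instance is queued or running.
global class BatchWatchdog implements Schedulable {
    global void execute(SchedulableContext sc) {
        Integer active = [
            SELECT COUNT() FROM AsyncApexJob
            WHERE ApexClass.Name = 'MyBatchJob'   // assumed batch class name
              AND JobType = 'BatchApex'
              AND Status IN ('Queued', 'Preparing', 'Processing')
        ];
        if (active == 0) {
            Database.executeBatch(new MyBatchJob());
        }
    }
}
// Register it once, e.g. hourly:
// System.schedule('BatchWatchdog', '0 0 * * * ?', new BatchWatchdog());
```

Because the watchdog itself runs on a recurring cron expression, it survives a missed one-off fire during a maintenance window and simply restarts the batch on its next run.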