If you're going to use SOQL, every field you filter on needs an index: each must be marked as an External ID or have a custom index created for it.
Failing that, you may want to use SOSL, because the search indexer (not to be confused with a database index) includes all of the email fields in the search.
Example:
FIND :email IN EMAIL FIELDS RETURNING Contact(Id, npe01__HomeEmail__c, npe01__AlternateEmail__c, email, npe01__WorkEmail__c, Alternate_Email_2__c, Alternate_Email_3__c)
You can also combine multiple terms in a single search. See the FIND {searchTerm} page in the SOSL reference for details.
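If you're running the search from Apex, a minimal sketch might look like this (the variable name `email` is illustrative; the field list matches the query above):

```apex
// Sketch: run the SOSL search from Apex with a bound search term.
String email = 'someone@example.com';
List<List<SObject>> results = [FIND :email IN EMAIL FIELDS
                               RETURNING Contact(Id, Email, npe01__HomeEmail__c,
                                                 npe01__AlternateEmail__c, npe01__WorkEmail__c,
                                                 Alternate_Email_2__c, Alternate_Email_3__c)];
// The first inner list corresponds to the first RETURNING clause (Contact).
List<Contact> matches = (List<Contact>) results[0];
```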
Your Order object most likely contains more than 200,000 records; that's why the system is complaining. Second, you are running SOQL inside a for loop, which also needs to be avoided.
More Efficient SOQL Queries
For best performance, SOQL queries must be selective, particularly for queries inside triggers. To avoid long execution times, the system can terminate nonselective SOQL queries. Developers receive an error message when a non-selective query in a trigger executes against an object that contains more than 200,000 records. To avoid this error, ensure that the query is selective.
Selective SOQL Query Criteria
A query is selective when one of the query filters is on an indexed field and the query filter reduces the resulting number of rows below a system-defined threshold. The performance of the SOQL query improves when two or more filters used in the WHERE clause meet the mentioned conditions.
The selectivity threshold is 10% of the first million records and less than 5% of the records after the first million records, up to a maximum of 333,333 records. In some circumstances, for example with a query filter that is an indexed standard field, the threshold can be higher. Also, the selectivity threshold is subject to change.
Refer to Working with Very Large SOQL Queries in the Apex Developer Guide.
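As an illustration, a selective query filters on an indexed field. Both field names below are hypothetical; the point is the shape of the filter, assuming `External_Id__c` is marked as an External ID and therefore indexed:

```apex
// Selective: equality filter on an indexed field (hypothetical External_Id__c, an External ID).
List<Order__c> orders = [SELECT Id FROM Order__c WHERE External_Id__c = :someKey];

// Likely non-selective on a 200,000+ record object: unindexed field plus a negative filter,
// which prevents index use even if Status__c were indexed.
// List<Order__c> bad = [SELECT Id FROM Order__c WHERE Status__c != 'Closed'];
```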
The approach will look like this. It's also recommended to create a trigger handler class to hold your processing logic:
trigger StartFlow on Order__c (after update)
{
    if (Trigger.isAfter && Trigger.isUpdate)
    {
        Set<Id> opptyIds = new Set<Id>();
        for (Order__c o : Trigger.new)
        {
            // Put the comparison criteria for records to be filtered here.
            opptyIds.add(o.Opportunity_Name__c);
        }

        // One query, outside the loop, filtered on the collected Ids.
        List<Order__c> ord = [SELECT Id, Account_Name__r.Customer_Profile_Pricebook__c, End_Date__c,
                                     Year_Enddate_1_day__c, Journal_Reader_Code__c, Number_of_Users__c,
                                     Renewal_Number__c, Opportunity_Name__c, Product__r.Id,
                                     Delegate_Admin__c, CurrencyIsoCode
                              FROM Order__c
                              WHERE Opportunity_Name__c IN :opptyIds];

        if (!ord.isEmpty())
        {
            Map<String, Object> myMap = new Map<String, Object>();
            myMap.put('VarCustomerProfile', new String[]{ ord[0].Account_Name__r.Customer_Profile_Pricebook__c });
            myMap.put('VarInvoiceEnddate', new Date[]{ ord[0].End_Date__c });
            myMap.put('VarInvoiceEnddateYear', new Double[]{ ord[0].Year_Enddate_1_day__c });
            myMap.put('VarInvoiceJRC', new String[]{ ord[0].Journal_Reader_Code__c });
            myMap.put('VarInvoiceNumberOfUsers', new Double[]{ ord[0].Number_of_Users__c });
            myMap.put('VarInvoiceRenewalNumber', new Double[]{ ord[0].Renewal_Number__c });
            myMap.put('VarOppId', new Id[]{ ord[0].Opportunity_Name__c });
            myMap.put('VarProductId', new Id[]{ ord[0].Product__r.Id });
            myMap.put('VarDelegateAdmin', new String[]{ ord[0].Delegate_Admin__c });
            myMap.put('VarInvoiceCurIsoCode', new String[]{ ord[0].CurrencyIsoCode });

            Flow.Interview.Create_Renewal_OppLineItems_from_Invoice_record dummyFlow =
                new Flow.Interview.Create_Renewal_OppLineItems_from_Invoice_record(myMap);
            dummyFlow.start(); // The interview never runs unless start() is called.
        }
    }
}
Best Answer
First, a full recycle bin can cause problems, because IsDeleted is not indexed. Consider hard-deleting records as you delete them with the Database.emptyRecycleBin method. I suspect the Queueable is running into similar problems because you're stacking the recycle bin full of data (it's meant to handle only a few thousand records per user).
Second, consider a smarter integration strategy. Deleting the entire database every day just to regenerate is wasteful and unnecessary. I'd recommend a mark-and-sweep type approach. Set a flag on every single record, then update all records using upsert commands (while unmarking in the process), then finally delete every marked record.
If your external system could track modifications and deletions, that'd be even better, as you won't be writing records constantly that don't need to be written.
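A minimal sketch of the mark-and-sweep approach described above, assuming a hypothetical checkbox field `Marked_For_Delete__c` and a hypothetical External ID field `External_Key__c` used as the upsert key (in practice each step belongs in Batch Apex to respect governor limits):

```apex
// 1. Mark every existing record.
List<Account> toMark = [SELECT Id FROM Account WHERE Marked_For_Delete__c = false];
for (Account a : toMark) { a.Marked_For_Delete__c = true; }
update toMark;

// 2. Upsert the current data from the external system, unmarking each row as it arrives.
for (Account a : incomingFromExternalSystem) { a.Marked_For_Delete__c = false; }
upsert incomingFromExternalSystem External_Key__c;

// 3. Anything still marked no longer exists upstream: delete it and hard-delete it
//    so the recycle bin doesn't fill up.
List<Account> stale = [SELECT Id FROM Account WHERE Marked_For_Delete__c = true];
delete stale;
Database.emptyRecycleBin(stale);
```

Records that survive the upsert are never deleted at all, which is the whole point: only genuinely removed rows churn through the recycle bin.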