I would look at using Batch Apex; it can handle large data sets. I have found that the code is generally cleaner to write than triggers (which need different logic for insert vs. update, etc.) when the data model is very complex.
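For example, a minimal batch skeleton might look like this (the class name and query here are hypothetical placeholders):

global class AccountCleanupBatch implements Database.Batchable<sObject> {
    global Database.QueryLocator start(Database.BatchableContext bc) {
        // A QueryLocator can iterate over up to 50 million records.
        return Database.getQueryLocator('SELECT Id, Name FROM Account');
    }
    global void execute(Database.BatchableContext bc, List<sObject> scope) {
        // Each execute() call gets its own fresh set of governor limits
        // and receives one chunk (up to the scope size) of records.
    }
    global void finish(Database.BatchableContext bc) {
        // One-time post-processing, e.g., send a summary email.
    }
}
// Kick it off with a scope size of 200:
// Database.executeBatch(new AccountCleanupBatch(), 200);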
Even if your list size is greater than 200, Salesforce will break your list into chunks of 200 for trigger processing. Each chunk counts against the same shared governor limits, though. If you performed one bulkified SOQL query per chunk in a single trigger, you'd be able to process 100 * 200 = 20,000 records of that object in a single transaction before the 101st query hits the limit. If you are just using DML, you'd have 150 * 200 = 30,000. That only accounts for a situation where a single query or a single DML statement is issued per chunk.
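To illustrate the single-bulkified-query case, here's a hypothetical trigger that issues exactly one query per 200-record chunk:

trigger ContactParentLookup on Contact (before insert, before update) {
    // Collect parent Ids across the whole chunk first...
    Set<Id> accountIds = new Set<Id>();
    for (Contact c : Trigger.new) {
        if (c.AccountId != null) accountIds.add(c.AccountId);
    }
    // ...then issue one query for all of them, not one per record.
    Map<Id, Account> parents = new Map<Id, Account>(
        [SELECT Id, Name FROM Account WHERE Id IN :accountIds]);
    for (Contact c : Trigger.new) {
        // Use parents.get(c.AccountId) here; no further queries needed.
    }
}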
In a complex data model you can easily have many more queries occurring in your triggers, because a single save will trigger saves on other related records (e.g., a roll-up summary triggering a save on the parent). It is worth keeping in mind what happens when you save a record as you design. If your solution has triggers on every object, it can be cumbersome to figure out why something is getting updated on a related record, and the whole thing becomes harder to maintain.
On top of all of that, if your data model really is that complex, you have to consider what will happen when you set up unit tests. If you have triggers on every object with many queries and updates, you can easily approach the SOQL query limit just while setting up the unit test data.
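A common way to keep setup-heavy tests from tripping governor limits is Test.startTest()/Test.stopTest(), which gives the code under test a fresh set of limits separate from your data setup. A minimal sketch (the class and assertion are hypothetical):

@isTest
private class ComplexModelTest {
    @isTest
    static void savingRecordFiresRelatedTriggers() {
        // Data setup may burn many queries via cascading triggers...
        Account a = new Account(Name = 'Test');
        insert a;
        // ...but everything between startTest and stopTest gets
        // its own fresh set of governor limits.
        Test.startTest();
        a.Name = 'Updated';
        update a;
        Test.stopTest();
        System.assertEquals(1, [SELECT COUNT() FROM Account WHERE Name = 'Updated']);
    }
}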
There's a Developer Force blog post with some high-level summaries on working with large data sets and links to other resources.
Salesforce.com calculates the total amount of time it takes to process the page, including querying the database and rendering the page. Using JavaScript remoting will drastically reduce the time spent rendering the page server-side by offloading this responsibility to the client. Furthermore, using static resources instead of inline JavaScript will also reduce the calculated usage time, because data fetched from a cache (such as the CDN) doesn't count against your total server time. Consider designing your page to use the following code:
<apex:page controller="MySiteController">
    <apex:includeScript value="{!$Resource.mySiteJS}"/>
    <div id="content"></div>
</apex:page>
Your controller can be completely written as remote actions. Note that I have zero view state and only one expression to evaluate. Here's some possible code that you might use in your controller:
public with sharing class MySiteController {
    @RemoteAction
    public static SomeData getSomeData(SomeParam param) {
        // Do stuff here, return SomeData.
    }
    // More remote action functions here
}
The client-side rendering is taken care of inside mySiteJS:
(function() {
    function init() {
        // Runs when the page loads. {!$RemoteAction...} merge fields aren't
        // evaluated inside static resources, so call the action by name:
        Visualforce.remoting.Manager.invokeAction(
            'MySiteController.getSomeData',
            { /* fields matching SomeParam */ },
            function(result, event) { /* render result into #content */ },
            {escape: true});
    }
    // Other functions here. We can also use jQuery, etc...
    addEventListener('load', init, true);
}());
Using this design structure, you should be able to get your Visualforce page loads down to about 20ms, and your remoting actions should typically return in well under 1000ms even if you're throwing a ton of data down the wire. And remember, the CDN (serving your JavaScript) is free of charge, so leverage that fact to reduce your total page time. If you're providing localization strings, I would store the languages in separate static resources as well, and have the page load the appropriate language pack based on the user's settings.
When you hit a LimitException, it terminates the entire transaction. No other Apex code you have written will run within the same transaction, because it is over.
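For example, a try/catch won't help; in this sketch (anonymous Apex against a stock org) the catch block never executes:

try {
    for (Integer i = 0; i < 200; i++) {
        // The 101st query throws System.LimitException.
        List<Account> accts = [SELECT Id FROM Account LIMIT 1];
    }
} catch (Exception e) {
    // Never reached: LimitException cannot be caught,
    // and the entire transaction is rolled back.
    System.debug('unreachable');
}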