There are a number of visual and audio settings that are commonly tweaked by competitive players.
Visuals
Many players in UT2004's old competitive community used the mutator UTComp to force a default player model + a brightskin. This is of course only available on servers using the mutator.
On servers not running the mutator, users can force a default model for all opponents via the User.ini file:
[XGame.xPawn]
PlacedCharacterName=Gorge
PlacedFemaleCharacterName=Rylisa
Only certain models can be forced, and you must play as one of the models you force. I believe these are some of the available options:
Males: Gorge, Xan, Malcolm, Brock
Females: Rylisa, Tamika, Sapphire, Enigma, Ophelia
I know many competitive players who would set their texture quality to the very lowest, to reduce visual clutter. I don't believe this was allowed in tournament play, but that would vary with each tournament's rules.
Audio
On servers with UTComp, many players use CPMA-style (damage-dependent pitch) hitsounds.
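As an illustration only, damage-dependent pitch can be sketched as a mapping from damage dealt to a playback-pitch multiplier. The linear curve and the parameter values below are assumptions for the sketch, not UTComp's actual implementation:

```python
def hitsound_pitch(damage, min_pitch=1.0, max_pitch=2.0, max_damage=100):
    """Toy sketch: map damage dealt to a pitch multiplier.

    Damage is clamped to [0, max_damage], then interpolated linearly
    between min_pitch and max_pitch. The real UTComp curve may differ.
    """
    frac = min(max(damage, 0), max_damage) / max_damage
    return min_pitch + frac * (max_pitch - min_pitch)
```

Higher-damage hits play at a higher pitch, so you can judge how hard you are hitting by ear alone.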
Almost all players turn off their music, and some adjust their ambient volume and roll-off settings in UT2004.ini:
[ALAudio.ALAudioSubsystem]
AmbientVolume=0.125000
RollOff=0.400000
As you increase ambient volume, you can hear locational sounds such as lifts and creaky boards more clearly, but you also hear more environmental noise such as thunder, water, etc. So it's a trade-off.
Reducing roll-off is sometimes likened to cheating, because decreasing the value decreases the dissipation of sound over distance; sounds travel farther. Competitive servers often run a 'lock roll-off' mutator that limits the value to 0.4 (the default is 0.5).
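The direction of the effect can be shown with a toy attenuation model. The exponential curve below is an assumption purely for illustration; it is not the attenuation formula the engine actually uses:

```python
import math

def perceived_volume(distance, rolloff):
    """Toy model: volume decays exponentially with distance,
    scaled by the roll-off factor. Illustrative only."""
    return math.exp(-rolloff * distance / 100.0)

# At any given distance, a lower roll-off leaves the sound louder,
# which is why servers lock the value.
```

Under this model, perceived_volume(500, 0.4) is greater than perceived_volume(500, 0.5): with a lower roll-off, a distant footstep or lift is simply easier to hear.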
Other
Check out TweakGuides' UT2004 guide for additional information on configuring your .ini files.
If "it worked up until the driver update," roll back your driver. Make note of any settings that have disappeared or changed.
Skyrim: how to fix input lag and vertical sensitivity for mouse and 360 controller (PC) (YouTube)
Most of these are .ini tweaks. Try lowering your maximum pre-rendered frames to 1 in the Nvidia control panel, and cap the frame rate at 50 FPS.
Alternatively (from GameFAQs user Helfire-1):
Enable adaptive Vsync (half refresh rate)
Enable triple frame buffering
Pre-render 3 frames
In the ini file: turn mouse acceleration off (add bMouseAcceleration=0 under [Controls])
In the standard options:
Distant object detail (medium)
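For reference, the mouse-acceleration tweak above as it would appear in the file. The [Controls] section name comes from the list above; which ini file it belongs in (Skyrim.ini under Documents\My Games\Skyrim is the usual candidate) is my assumption, so check your own setup:

```ini
[Controls]
bMouseAcceleration=0
```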
My settings (like the video's author, it took me 9+ hours to get Skyrim playable, and like you I've forgotten half of what I did):
fMouseCursorSpeed=2.000 (fixes annoyingly slow menu cursor)
fMouseHeadingSensitivity=0.2000
fMouseHeadingXScale=0.2000
fMouseHeadingYScale=0.2000
The above allows you to make a 180-degree turn in a single gesture. Also:
bFXAAEnabled=0
RadialBlurLevel=0
bDoDepthofField=0
bDoHighDynamicRange=0
bUseBlurShader=0
bGamepadEnabled=0
Interestingly, I find no MouseAcceleration setting in my ini (I'm unsure how many problems SkyUI and SKSE have solved for me).
If you have applied any ini tweaks, DO NOT adjust anything from the in-game menus.
Best Answer
Automatic graphics-performance detection implemented by games is often inaccurate, and in the case of Bethesda games (like Skyrim), "almost always inaccurate" would be more appropriate. I don't have any experience with Saints Row 3, but it may suffer from similar problems.
As for your GeForce 550M: it cannot be considered a mid- or high-end card; it belongs to the low-end segment.
You can forget about playing Metro 2033 at maximum settings. Can You Run It? is generally a decent service, but its results are not always 100% accurate.
At 1366x768 these should be more accurate FPS values for your graphics card when running Metro 2033.
Source
TL;DR: There is nothing wrong with your settings; your GeForce 550M simply cannot do more than this. Try lowering the resolution or graphics settings to increase performance.