[RPG] In what paradigm is +1 to hit the same as improving your chance to hit by 5%?

Tags: dungeons-and-dragons, statistics, terminology

People are always saying that +1 to hit is a 5% better chance of hitting. I assume this is because when you roll to hit you roll a d20, and 1 is 5% of 20, but that doesn't really make a whole lot of sense.

If you have a +33 attack bonus vs. an AC 18 creature and then you get another +1, you have increased your chance of hitting the creature from 95% of the numbers on the die to 95% of the numbers on the die, which is a 0% increase. Similarly, if you have a +0 attack bonus vs. an AC 30 creature, an additional +1 provides no increase in hit chance at all.

It's not just in extreme cases that this doesn't add up, though. Let's say that you hit a target on a 17+. If you get a +1, you now hit them on a 16+, taking you from a 20% to a 25% chance: a 25% increase in your chance of hitting, and a 6.25% reduction in your chance of missing.

It seems like the only time a 1-point change in attack bonus corresponds to a 5% relative change in your chance to hit is when you have just barely achieved a 100% chance to hit. In that situation, a 1-point decrease changes your chance of hitting from 100% to 95%, which is, in fact, a 5% decrease. That's an extremely unusual situation, given that it requires some method of not critically missing when rolling a 1, so it seems unlikely that it's the default situation in everyone's heads.
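To check the arithmetic above, here's a minimal Python sketch (the helper name `hit_chance` is just for illustration; it assumes the usual natural-1-always-misses and natural-20-always-hits rules, and uses exact fractions to avoid rounding):

```python
from fractions import Fraction

def hit_chance(needed: int) -> Fraction:
    """Chance to roll `needed` or higher on a d20, assuming a natural 1
    always misses and a natural 20 always hits."""
    successes = max(1, min(19, 21 - needed))  # clamp to the 19 non-automatic faces
    return Fraction(successes, 20)

before, after = hit_chance(17), hit_chance(16)  # the 17+ -> 16+ example above
print(after - before)                   # 1/20 -> +5 percentage points
print((after - before) / before)        # 1/4  -> a 25% relative increase in hitting
print((after - before) / (1 - before))  # 1/16 -> a 6.25% relative drop in missing
```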

Nonetheless, I am regularly told that +1 on a d20 is a 5% increase in the chance of success, especially in the context of attack rolls and AC. What do people mean when they say this, and is what they mean true?

Equivalent example for early edition players:

You are fighting an orc with an AC of 7. You have a THAC0 of 20. This means you have a 40% chance to hit the orc. If you had a +1 bonus to hit (for example, from a magic weapon), you would have a THAC0 of 19 and a 45% chance of hitting the same orc, which would make you 12.5% more likely to hit and 8.33% less likely to miss. How is this 5% better?
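The same check for the THAC0 case, again as a sketch (`thac0_hit_chance` is an illustrative name, and it ignores any natural-1/natural-20 house rules):

```python
from fractions import Fraction

def thac0_hit_chance(thac0: int, ac: int) -> Fraction:
    """Chance to hit: under THAC0 you need to roll (THAC0 - AC) or higher."""
    needed = thac0 - ac
    return Fraction(21 - needed, 20)  # no special handling of natural 1s or 20s

before = thac0_hit_chance(20, 7)  # 2/5  -> 40% against the AC 7 orc
after = thac0_hit_chance(19, 7)   # 9/20 -> 45% with the +1 weapon
print((after - before) / before)        # 1/8  -> 12.5% more likely to hit
print((after - before) / (1 - before))  # 1/12 -> ~8.33% less likely to miss
```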

Best Answer

It's not just in extreme cases that this doesn't add up, though. Let's say that you hit a target on a 17+. If you get a +1, you now hit them on a 16+, taking you from a 20% to a 25% chance: a 25% increase in your chance of hitting, and a 6.25% reduction in your chance of missing.

People are using "increased by a percent" sloppily; or, I guess, if you sigh and admit that language is defined by how people use it rather than by logic, people are "using 'percent increase' in a non-technical sense".

More formally, what people mean when they say "increased by 5%" is increased by 5 percentage points. Percentage points are units used to describe the arithmetic difference between two percentages of the same thing. This resolves the ambiguity we run into otherwise:

[Image: a cartoon news report announcing that Senator Grayton's support, initially polling at 20%, has "plunged by 19%".]

If there are 4 million voters in Senator Grayton's state, his initial polling at 20% put him at 800,000 supporters. If this "plunged by 19%", that would be a decrease of 152,000, down to 648,000. Of course, given the described circumstances, support presumably actually plunged by 19 percentage points, from 20% to 1%, leaving 40,000 oddly dedicated supporters.
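Spelling out both readings as a quick sketch (the numbers are the ones from the example above):

```python
voters = 4_000_000
initial = int(0.20 * voters)         # 800,000 supporters at 20%

relative_drop = int(initial * 0.19)  # "plunged by 19%" of 800,000 -> 152,000 gone
point_drop = int(0.19 * voters)      # "plunged by 19 percentage points" -> 760,000 gone

print(initial - relative_drop)  # 648000 supporters left under the relative reading
print(initial - point_drop)     # 40000 supporters left under the points reading
```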

So, it sounds kind of pedantic, but when we are talking about precise things like game rules, it'd probably be better for everyone to be precise. On the other hand, the looser usage is widespread, and I think it's a losing battle to try to correct it everywhere. Instead, just be aware of the ambiguity; it's usually clear what is actually meant.

Anyway, after all of that, the "+1 = 5 percentage points" statement also assumes that the to-hit chance is somewhere near the middle of the range, not only-hit-on-20 or only-miss-on-1. Especially with 5E, that's a fair assumption for most comparisons of choices, due to the bounded accuracy design principle.

Thinking about +1s on a d20 in terms of +5 percentage points is useful because it's easy to go from there to expected damage (or similar). That is, if a successful attack does 10 points of damage, expected damage with a 50% chance to hit is 5; with a 55% chance, it's 5.5. That change in expected damage (in this case 0.5) is the same whether the initial to-hit chance is 40% or 60%. And the relative change is the same whether the attack does 5 or 50 damage. Of course, advantage, disadvantage, critical hits, and auto-fails complicate this, but in the middle of the range, thinking in terms of the percentage-point change is simple.
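And a sketch of that expected-damage arithmetic (same caveats: no crits, advantage, or auto-misses; exact fractions used so the printed values are precise):

```python
from fractions import Fraction

def expected_damage(hit_chance: Fraction, damage: int) -> Fraction:
    """Average damage per attack: chance to hit times damage on a hit."""
    return hit_chance * damage

# +5 percentage points adds the same absolute expected damage anywhere
# in the middle of the range:
print(expected_damage(Fraction(11, 20), 10) - expected_damage(Fraction(10, 20), 10))  # 1/2
print(expected_damage(Fraction(9, 20), 10) - expected_damage(Fraction(8, 20), 10))    # 1/2

# ...and the same relative change whether the hit does 5 or 50 damage:
print(expected_damage(Fraction(11, 20), 5) / expected_damage(Fraction(10, 20), 5))    # 11/10
print(expected_damage(Fraction(11, 20), 50) / expected_damage(Fraction(10, 20), 50))  # 11/10
```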