Trying to understand and maybe “back-solve” the basic attack DPS calculation. Ultimately I want to theorycraft with a lot of different mechanics included, but first I need to know the really simple basics. Hopefully @ehg_staff can give some feedback on whether I’m on the right track here. I’m premising the initial simple calculation on the following test:
So I made a brand new char (a Primalist, though I may need to try other classes as well), then ran to the first base without killing anything. I bought two Broad Swords and crafted +4 Melee Phys on the second one.
DPS 1 = 21 on char sheet … Calculation = (12 + 0) x 1.15 x ( 1 + 5% x 200%) = 15.18
DPS 2 = 28 on char sheet … Calculation = (12 + 4) x 1.15 x ( 1 + 5% x 200%) = 20.24
So in both cases the calculation needs to be increased by a factor of 1.3834 to get to the value shown on the character sheet (ignoring any potential rounding). I have no idea why they are both out by the same factor, and a very strange / odd factor at that.
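Quick scratch script (nothing official, just checking my own arithmetic) confirming that both sheet values imply exactly the same factor:

```python
# Back-solve the mystery factor from the two Broad Sword readings.
# Model from above: (base + flat) x 1.15 x (1 + crit_chance x crit_multi)
def calc(flat: int) -> float:
    return (12 + flat) * 1.15 * (1 + 0.05 * 2.0)

sheet = {0: 21, 4: 28}  # sheet DPS for each crafted flat-damage value
for flat, shown in sheet.items():
    print(f"+{flat} melee phys: calc = {calc(flat):.2f}, factor = {shown / calc(flat):.4f}")
# Both readings give a factor of ~1.3834 (21/15.18 and 28/20.24 are the same ratio).
```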
That’s not how crit chance and crit damage are normally interpreted. A 200% crit multiplier should mean that when you crit you do 3x damage, i.e. 1 + 200%, because when you don’t crit it is 1x, i.e. 1 + 0%.
So if we have 10 base damage, 1 attack per second, 5% crit chance and a 200% crit multiplier, then over 20 attacks it should be:
19 x 10 = 190
1 x 30 = 30
Total = 220 in 20 attacks so 11 on average
the maths then being >>>>> Basic Attack DPS = 10 x 1.0 x (1 + 5% x 200%) = 11
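The worked example above as a quick Python check (same numbers, same interpretation of the crit multiplier):

```python
# 20 attacks at 10 base damage, 5% crit chance, crits dealing 3x
# (interpreting "200% crit multiplier" as +200% on top of the base hit).
base = 10
non_crit = 19 * base           # 19 normal hits  -> 190
crit = 1 * base * 3            # 1 crit at 3x    -> 30
avg = (non_crit + crit) / 20   # 220 / 20        -> 11.0

# Same result via the closed-form expression:
dps = base * 1.0 * (1 + 0.05 * 2.0)
assert avg == 11.0
assert abs(dps - avg) < 1e-9
```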
I agree it would be very odd if there were an inherent 1.47 attack speed multiplicative with the stated attack speed. I also can’t see how animations per second (i.e. attacks per second, or APS) can or should affect basic DPS calculations. APS and FPS mostly interact to create attack speed breakpoints for “on hit” effects, where you get no real damage benefit until your increased attack speed produces a lower FPA (frames per attack), because FPA can only be a whole number, hence the breakpoint.
I would expect each weapon type to have an inherent min and max damage which we cannot currently see. This would explain the different values seen per hit: each hit would be a random roll between the inherent min and max (you see different white (non-crit) and yellow (crit) values when hitting the dummy or monsters).
If this were the case then the following two examples should work out, but they don’t (assume the inherent min/max average for a Bronze Sceptre is 6):
Bronze Sceptre 1 (plain white item) : (6+10+0) x 0.98 x 1.1 = 17.248 >>>> sheet says 15
Bronze Sceptre 2 (has +3 melee phys crafted) : (6+10+3) x 0.98 x 1.1 = 20.482 >>>> sheet says 20
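These two can be checked quickly: if the hidden min/max idea were right, sheet value divided by calculation should give the same constant for both sceptres, but it doesn’t (scratch Python, with the hypothetical hidden average of 6 assumed above):

```python
# Hidden-base hypothesis check: sheet / calc should be one constant factor
# if the model were right (hypothetical hidden average damage of 6, as above).
calc1 = (6 + 10 + 0) * 0.98 * 1.1   # 17.248, sheet shows 15
calc2 = (6 + 10 + 3) * 0.98 * 1.1   # 20.482, sheet shows 20
ratio1, ratio2 = 15 / calc1, 20 / calc2
print(f"{ratio1:.4f} vs {ratio2:.4f}")  # ~0.8697 vs ~0.9765 -- not the same factor
assert abs(ratio1 - ratio2) > 0.05
```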
So TL;DR: I am very confused about how @EHG are doing their damage calculations. It would be useful if @ehg_staff could comment / give some input please.
I’ve had a “discussion” here about the crit damage multi & it’s x2, not 200% increased…
I meant, it’s odd that the “base” attack speed isn’t 1 per second & the weapon’s implicit modifier plus all the other sources of attack speed are applied to that. Granted a base attack speed of 1 per second is as arbitrary as any other number, but still.
But that’s not how it works either. The only hidden modifier is that the damage has a random modifier of between +25% & -25% applied when you hit. So if you have a damage of 100 after all other modifiers, the damage that the mob actually takes is somewhere between 75 & 125. DoTs don’t get this randomised modifier.
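A tiny sketch of that hit roll. Note the distribution is my assumption: only the ±25% range is stated above, so I’m guessing it’s uniform:

```python
import random

# Sketch of the hit roll described above: final hit damage gets a random
# modifier between -25% and +25%. Uniform distribution is an ASSUMPTION;
# only the range is confirmed. DoTs skip this roll entirely.
def roll_hit(damage: float, rng: random.Random) -> float:
    return damage * rng.uniform(0.75, 1.25)

rng = random.Random(42)
hits = [roll_hit(100, rng) for _ in range(10_000)]
assert all(75 <= h <= 125 for h in hits)   # 100 damage lands between 75 and 125
print(f"min {min(hits):.1f}, max {max(hits):.1f}, mean {sum(hits)/len(hits):.1f}")
```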
APS breakpoints were a thing back when frames had an impact on the animation (as you probably remember from Diablo 1 & 2), so getting more attack speed was pointless if you didn’t then hit the threshold required to remove a frame from the animation. I’m not an animator (or a developer in any way) but I don’t think that’s how animations are done in LE; I assume they’re smoothly “sped up”. If the animations and damage calculations are decoupled from the FPS, then you don’t need APS breakpoints.
Thanks for all the contributions everyone. It has helped me understand the basics.
For those interested, I have done the following basic empirical tests and they all come out fine. Just to note that the LE display truncates rather than rounds (19.9 displays as 19), and the calculation has a 1.5 factor built in. I don’t know why it’s there, but it appears to give the right answer.
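For what it’s worth, here is one way the Broad Sword numbers from the opening post line up: treat the 200% crit multiplier as 2x total (per the discussion above, so the average multiplier is 1 + 5% × 100% extra), apply the unexplained 1.5 factor, and truncate for display. This is just my reconstruction, not a confirmed formula:

```python
import math

# Reconstructed sheet DPS for the opening post's Broad Swords.
# ASSUMPTIONS: crit is 2x total (so +100% extra damage on a crit),
# the 1.5 factor is real but unexplained, and the display truncates.
def sheet_dps(flat: int) -> int:
    avg_hit = (12 + flat) * 1.15 * (1 + 0.05 * 1.0)  # 2x crit => +100% extra
    return math.trunc(avg_hit * 1.5)

assert sheet_dps(0) == 21   # 14.49 * 1.5 = 21.735 -> displays 21
assert sheet_dps(4) == 28   # 19.32 * 1.5 = 28.98  -> displays 28
```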
DoTs should be fairly simple since they have their own base damage (which nothing should add to) & are affected by the relevant affixes (ignite is affected by global damage, fire damage, elemental damage, elemental damage over time, damage over time & the attribute for the skill that proc’d it, as well as ignite duration & I think there are some Mage/Sorcerer passives that give ignite effectiveness or something similar).
Normal hit damage is a bit different since there are (generally speaking) two types of modifier - more & increased. All of the different sources of “more” (and I’m including different nodes on a single skill as different sources) should be applied separately (so a skill that does 100 damage & has 2x nodes that do +20% more would give 100 x 1.2 x 1.2 = 144 damage, not 100 x 1.4 = 140), then all of the “increased” should be added together & applied.
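A sketch of that stacking rule (the “increased” values here are made up purely for illustration):

```python
# "More" modifiers each multiply separately; "increased" modifiers are summed
# first, then applied once at the end.
base = 100.0
more = [0.20, 0.20]           # two separate 20% "more" sources
increased = [0.30, 0.10]      # hypothetical "increased" sources, summed

dmg = base
for m in more:
    dmg *= (1 + m)            # 100 * 1.2 * 1.2 = 144, not 100 * 1.4 = 140
dmg *= (1 + sum(increased))   # 144 * 1.4 = 201.6

assert abs(dmg - 201.6) < 1e-9
```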
What I’m not sure about, however, is whether the various different sources of “increased” (passives, gear, attributes & global effects from other skills) are all added together into a single figure and then applied. What I’ve heard is that they aren’t all added into a single figure & applied together, & I’m not sure which ones are added together…
Edit: Doing the testing on what sources of “increased” are added together might be easiest done with DoTs since they don’t have the random +/- 25% variation.
What’s quite strange is that 1.5 gives the right answer (20.07) for one of the Bronze Sceptres above, but 1.47 gives 19.66, which would show as 19 on the sheet, yet it shows 20. Maybe there are multiple layers of rounding.
When I try the calc with my Werebear / Sovnya I get the correct answer using 1.46666 but an incorrect one with 1.5.
Maybe the 1.47 changes depending on the weapon’s base attack rate?
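One thing worth keeping in mind when back-solving: because the sheet truncates, a single reading only pins the factor to an interval, not an exact value. Scratch check with numbers from this thread (crit treated as 2x; the sceptre’s average-hit expression is my reconstruction from the 20.07 / 19.66 figures above):

```python
# A truncated sheet value `shown` only constrains the factor to the half-open
# interval [shown / avg_hit, (shown + 1) / avg_hit).
cases = {
    "Broad Sword":      ((12 + 0) * 1.15 * 1.05, 21),   # opening-post sword
    "Bronze Sceptre 2": ((10 + 3) * 0.98 * 1.05, 20),   # implied by 20.07 / 19.66
}
for name, (avg_hit, shown) in cases.items():
    lo, hi = shown / avg_hit, (shown + 1) / avg_hit
    print(f"{name}: factor must lie in [{lo:.4f}, {hi:.4f})")
    assert lo <= 1.5 < hi          # 1.5 fits both of these weapons

# 1.46666 falls below the sceptre's interval (~1.4951), so a single global
# 1.46666 can't be right either -- consistent with the factor varying per weapon.
```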