I'm trying to make sense of geth's hashrate and Ethereum's difficulty setting. I'm running a private testnet and have set the difficulty to a fixed value (`big.NewInt(250)`), as described by https://ethereum.stackexchange.com/a/7159/5293.
I start CPU mining in geth with one thread using `miner.start(1)`. Then I measure the hashrate with `eth.hashrate`; it's usually a value between 8 and 12 (say h = 10 H/s on average). I measured the average block time by mining for 15 minutes, and it was around 65 seconds.
Now, I'd like to use the difficulty setting and the hashrate to calculate the expected block time. If I calculated it the way the terms 'difficulty' (d) and 'hashrate' (h) are used in Bitcoin, I'd get
t = 2^32 * d / h,
which is not even close to 65 seconds.
I'm running geth on an Intel(R) Core(TM) i5 CPU at 2.80 GHz. I assume the value h = 10 H/s is rather useless, as it should be much higher (but where does it come from then?). With my processor, I'd assume I cannot get higher than maybe h = 50 kH/s. However, to yield the 65 seconds with the formula above, I'd need a hashrate of
h = 2^32 * d / t ≈ 1.7 * 10^10 H/s,
which is far beyond my maximum hashrate.
Best Answer
That Bitcoin-style formula doesn't apply to Ethereum. The formula you're looking for is simply
t = d / h.
Unlike Bitcoin's difficulty, Ethereum's difficulty is directly the expected number of hashes needed to find a valid block (a header whose ethash result is at most 2^256 / d), so there is no 2^32 factor.
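Plugging in the question's numbers (d = 250, h = 10 H/s; a sketch of the arithmetic, not geth's internal code):

```javascript
// Ethereum's expected block time: t = d / h (no 2^32 factor).
// On average d hashes are needed per block, since a block is valid
// when its ethash result is at most 2^256 / d.
const d = 250;   // fixed difficulty from the question
const h = 10;    // reported hashrate in H/s

const expectedBlockTime = d / h;
console.log(expectedBlockTime);  // 25 (seconds)
```

That is the same order of magnitude as your measured ~65 seconds, whereas the Bitcoin-style formula is off by roughly nine orders of magnitude.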