Please have a look at this project, in which steps to maximize throughput have been applied.
Ok, but isn't there a formula to calculate the throughput?
And how does a connection really work? For example, if the connection interval is 7.5 ms, does that mean it sends packets for 7.5 ms and then sleeps for another 7.5 ms?
7.5 ms is the time between each packet transmission; this means you can send one packet of data every 7.5 ms. So, if you use the extended data length, you could get up to 512 bytes per 7.5 ms, or about 68,266.7 bytes per second, but that assumes the whole 512 bytes are sent in the same packet.
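The arithmetic above can be written out as a quick sketch. This is only a theoretical upper bound under the stated assumption (one maximal payload per connection interval); real links lose capacity to protocol overhead and retransmissions.

```python
def max_throughput_bytes_per_s(payload_bytes: int, conn_interval_ms: float) -> float:
    """Upper-bound throughput: one payload of `payload_bytes`
    delivered every `conn_interval_ms` milliseconds."""
    return payload_bytes / (conn_interval_ms / 1000.0)

# 512 bytes every 7.5 ms, as discussed above:
print(max_throughput_bytes_per_s(512, 7.5))  # → 68266.666...
```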
The best way to measure would be to test with a project that tries to achieve maximum throughput, then divide the number of bytes received by the time it took to receive them. Empirical testing is much more accurate than running calculations based on how the device is supposed to operate.
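A minimal sketch of that empirical approach: count bytes over a wall-clock window and divide by the elapsed time. Here `receive_chunk` is a hypothetical placeholder for whatever your BLE stack's receive call or notification callback delivers; swap in the real API.

```python
import time

def measure_throughput(receive_chunk, duration_s: float = 5.0) -> float:
    """Count bytes returned by `receive_chunk()` (a placeholder for the
    real receive API) over `duration_s` seconds; return bytes/second."""
    total = 0
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        data = receive_chunk()  # assumed to return a bytes-like chunk
        total += len(data)
    return total / duration_s
```

In practice you would drive this from the stack's notification handler rather than polling, but the principle is the same: measured bytes over measured time.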