I want to know the exact formula that generates experience.txt.

I found this generator:

viewtopic.php?t=38560

The attachment is unfortunately gone, but the author left the formula in his post:

```
Experience = (FFFFFFFF - 256*Lvl) * (Lvl/MaxLvl)^4 + 256*Lvl
```

However, when I tried this formula with max level 99, it did not give me the exact experience.txt numbers [0, 500, 1500, 3750, ...]. Has anyone ever managed to find a formula that generates those exact numbers?
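For reference, here is a minimal sketch of the quoted formula in Python. The assumptions are mine: FFFFFFFF is the hexadecimal constant 0xFFFFFFFF, MaxLvl is 99, the result is truncated to an integer, and Lvl = 0 corresponds to the first experience.txt row.

```python
# Sketch of the quoted formula; assumptions: FFFFFFFF = 0xFFFFFFFF,
# MaxLvl = 99, result truncated to an integer, Lvl = 0 for the first row.
MAX_LVL = 99

def experience(lvl, max_lvl=MAX_LVL):
    return int((0xFFFFFFFF - 256 * lvl) * (lvl / max_lvl) ** 4 + 256 * lvl)

# Compare against the first known experience.txt values
known = [0, 500, 1500, 3750]
for lvl, expected in enumerate(known):
    print(lvl, experience(lvl), expected)
```

Truncating like this yields roughly 0, 300, 1227, 4389, so whatever the original generator did, it is not just this expression evaluated naively.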

I know that at some point the 1024 ExpRatio kicks in, but I am interested in finding a formula that can generate those exact numbers for as long as ExpRatio is 1024 (which is up to level 69).

Update 22-04-2024: (writing down my progress so I can come back later)

I asked about this topic on Discord. Necrolis suggested that I should plot the data and regress it:

> you can plot it and regress it
>
> then you either get an exact formula or an equation that's pretty close

https://www.statology.org/exponential-regression-excel/

I later learned that the plotting itself may not be needed and I can do just the regression:

> plotting is normally how you do regressions by hand, so it's just a habit of mine

So far I have also tried asking ChatGPT, and it thought the formula is a*x**4 + b*x**3 + c*x**2 + d*x + e, but I couldn't come up with the a, b, c, d, e values. I am also not sure if this is the correct idea at all; I am just leaving it here in case it gives someone ideas.
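The quartic idea is easy to sanity-check with a least-squares fit. Here is a sketch using np.polyfit on the experience.txt values for the range where ExpRatio is 1024; the reported residual shows how far the best-fitting quartic stays from the real table.

```python
import numpy as np

# Experience.txt values while ExpRatio is 1024 (same values as the
# arrays quoted later in this post)
levels = np.arange(1, 70)
exp_values = np.array([
    500, 1500, 3750, 7875, 14175, 22680, 32886, 44396, 57715, 72144,
    90180, 112725, 140906, 176132, 220165, 275207, 344008, 430010,
    537513, 671891, 839864, 1049830, 1312287, 1640359, 2050449,
    2563061, 3203826, 3902260, 4663553, 5493363, 6397855, 7383752,
    8458379, 9629723, 10906488, 12298162, 13815086, 15468534,
    17270791, 19235252, 21376515, 23710491, 26254525, 29027522,
    32050088, 35344686, 38935798, 42850109, 47116709, 51767302,
    56836449, 62361819, 68384473, 74949165, 82104680, 89904191,
    98405658, 107672256, 117772849, 128782495, 140783010, 153863570,
    168121381, 183662396, 200602101, 219066380, 239192444, 261129853,
    285041630])

# Least-squares fit of a*x**4 + b*x**3 + c*x**2 + d*x + e
coeffs = np.polyfit(levels, exp_values, 4)
fitted = np.polyval(coeffs, levels)
residual = np.max(np.abs(fitted - exp_values))
print("coefficients:", coeffs)
print("max absolute residual:", residual)
```

At least in this direct form the residual stays large, so a single quartic with constant coefficients does not look like the right model on its own.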

Update 23-04-2024:

I was looking into the experience.txt generation algorithm again. This time I looked at how many times the previous requirement you need in order to level up. For example, reaching level 3 takes 3 times the experience of level 2, reaching level 4 takes 2.5 times the experience of level 3, and so on. It seems to be some kind of logarithmic decay? The ratio slowly decays towards 1.09. Any ideas how to make a function that goes through these points no further than a rounding error away? If I could figure out such a function, I could already try to write code that generates an experience.txt that is 99.9 % accurate.
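The ratio sequence is easy to reproduce from the raw values; a short sketch using the first experience.txt entries:

```python
import numpy as np

# Experience needed to reach levels 2, 3, 4, ... (from experience.txt)
exp_values = np.array([500, 1500, 3750, 7875, 14175, 22680, 32886])

# Ratio of each requirement to the previous one
ratios = exp_values[1:] / exp_values[:-1]
print(ratios)  # 3.0, 2.5, 2.1, 1.8, 1.6, 1.45
```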

I also learned that there is a post by Myhrginoc where he reached a similar conclusion about the exponent: viewtopic.php?t=5808

Update 27-04-2024:

I tried to investigate the experience multipliers again. For example, to reach level 3 you need 1500/500 = 3x the experience of level 2, to reach level 4 you need 3750/1500 = 2.5x the experience of level 3, and so on.

So I came up with this code:


```
import numpy as np
from scipy.optimize import curve_fit
import matplotlib.pyplot as plt

# Level index and the experience ratio between consecutive levels
# (x skips 2 and repeats 64 so that len(x) == len(y); checked below)
x = np.array([1, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45,
46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 64])
y = np.array([3, 2.5, 2.1, 1.8, 1.6, 1.45, 1.35, 1.3, 1.25, 1.25, 1.25, 1.25, 1.25, 1.25, 1.25, 1.25, 1.25, 1.25, 1.25, 1.25, 1.25, 1.25, 1.25, 1.25, 1.218, 1.195, 1.178,
1.165, 1.154, 1.146, 1.138, 1.133, 1.128, 1.123, 1.12, 1.117, 1.114, 1.111, 1.109, 1.107, 1.106, 1.104, 1.103, 1.102, 1.101, 1.1, 1.099, 1.098, 1.097, 1.097, 1.096, 1.095, 1.095, 1.095, 1.094, 1.094, 1.093, 1.093, 1.093, 1.093, 1.092, 1.092, 1.092, 1.092])
print(len(x))
print(len(y))

# Take the natural log of both x and y so that a power law y = a * x**b
# becomes the straight line log(y) = log(a) + b * log(x)
x_log = np.log(x)
y_log = np.log(y)

# Linear model for the log-transformed values
def linear_function(x, a, b):
    return a + b * x

# Perform the curve fitting
parameters, _ = curve_fit(linear_function, x_log, y_log)
a, b = parameters

# Back-transform the intercept to recover the power-law coefficient
a = np.exp(a)

# Plot the original data and the fitted power-law curve
x_fit = np.linspace(min(x), max(x), 100)
y_fit = a * x_fit**b
plt.plot(x, y, 'o')
plt.plot(x_fit, y_fit, 'r-')
plt.title('Power-Law Fit')
plt.xlabel('x')
plt.ylabel('y')
plt.show()

# Display the fitted coefficients
print("Coefficients:")
print("a = ", a)
print("b = ", b)
```

Looking at the dots, which represent the actual data, it looks like it is not a single formula but rather different formulas over different level ranges.
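That impression can be checked numerically: over a stretch of mid levels the consecutive ratios sit at 1.25 to within what looks like integer rounding, while the early and late levels follow different trends. A small sketch over that flat stretch (values from experience.txt):

```python
# Experience.txt values for the stretch where the ratio plateaus at ~1.25
values = [57715, 72144, 90180, 112725, 140906, 176132, 220165, 275207]

# Consecutive ratios: exactly 1.25 for some steps, off by a hair for others
ratios = [cur / prev for prev, cur in zip(values, values[1:])]
for r in ratios:
    print(r)
```

Interestingly, the ratios are not all exactly 1.25; they wobble on the order of 1e-5, which suggests the generator works on rounded integer experience values rather than applying an exact 1.25 step.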

Later I also tried to plot the actual experience values using the following code:


```
import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import curve_fit

# Level index and the experience needed per level while ExpRatio is 1024
x = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69])
y = np.array([500, 1500, 3750, 7875, 14175, 22680, 32886, 44396, 57715, 72144, 90180, 112725, 140906, 176132, 220165, 275207, 344008, 430010, 537513, 671891, 839864, 1049830, 1312287, 1640359, 2050449, 2563061, 3203826, 3902260, 4663553, 5493363, 6397855, 7383752, 8458379, 9629723, 10906488, 12298162, 13815086, 15468534, 17270791, 19235252, 21376515, 23710491, 26254525, 29027522, 32050088, 35344686, 38935798, 42850109, 47116709, 51767302, 56836449, 62361819, 68384473, 74949165, 82104680, 89904191, 98405658, 107672256, 117772849, 128782495, 140783010, 153863570, 168121381, 183662396, 200602101, 219066380, 239192444, 261129853, 285041630])

# Candidate models
def power_function(x, a, b):
    return a * x**b

def logarithmic_function(x, a, b):
    return a + b * np.log(x)

def exponential_function(x, a, b, c):
    return a * np.exp(b * x + c)

# Fit each model to the data
popt_power, pcov_power = curve_fit(power_function, x, y)
popt_log, pcov_log = curve_fit(logarithmic_function, x, y)
popt_exp, pcov_exp = curve_fit(exponential_function, x, y, maxfev=10000)

# Plot all three fits against the original data
plt.figure(figsize=(10, 6))
plt.plot(x, power_function(x, *popt_power), label='Power Function')
plt.plot(x, logarithmic_function(x, *popt_log), label='Logarithmic Function')
plt.plot(x, exponential_function(x, *popt_exp), label='Exponential Function')
plt.scatter(x, y, label='Original Data')
plt.legend(loc='upper right')
plt.title('Data Fitting')
plt.xlabel('x')
plt.ylabel('y')
plt.show()
```

So, in the big picture, the power function and the exponential function fit the experience data well. If anyone could help me come up with code that makes the logarithmic function fit as well, that would be nice.
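One observation that may explain the trouble: the experience values grow roughly geometrically, so a logarithmic curve a + b*log(x) grows far too slowly to ever track them. What does behave well is fitting a straight line to log(y) instead, which is an exponential model in disguise. A sketch using the same data as above:

```python
import numpy as np

# Experience needed per level while ExpRatio is 1024 (same data as above)
x = np.arange(1, 70)
y = np.array([
    500, 1500, 3750, 7875, 14175, 22680, 32886, 44396, 57715, 72144,
    90180, 112725, 140906, 176132, 220165, 275207, 344008, 430010,
    537513, 671891, 839864, 1049830, 1312287, 1640359, 2050449,
    2563061, 3203826, 3902260, 4663553, 5493363, 6397855, 7383752,
    8458379, 9629723, 10906488, 12298162, 13815086, 15468534,
    17270791, 19235252, 21376515, 23710491, 26254525, 29027522,
    32050088, 35344686, 38935798, 42850109, 47116709, 51767302,
    56836449, 62361819, 68384473, 74949165, 82104680, 89904191,
    98405658, 107672256, 117772849, 128782495, 140783010, 153863570,
    168121381, 183662396, 200602101, 219066380, 239192444, 261129853,
    285041630])

# Fit log(y) = c0 + c1*x by ordinary least squares. This is the
# exponential model y ~= exp(c0) * exp(c1)**x, and working in log
# space stops the huge late values from dominating the fit.
c1, c0 = np.polyfit(x, np.log(y), 1)
a = np.exp(c0)
r = np.exp(c1)
print(f"y ~= {a:.1f} * {r:.4f}**x")
```

The fitted per-level ratio r lands in the same neighbourhood as the observed ratios (somewhere between 1.09 and 1.25), which is consistent with the earlier ratio analysis.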