ML_CTIFOR value selection


maxime_legrand
Newbie
Posts: 4
Joined: Wed Oct 23, 2024 11:17 am

ML_CTIFOR value selection

#1 Post by maxime_legrand » Tue Feb 11, 2025 12:27 pm

Hello there,

I have a question regarding the ML_CTIFOR value in continuation runs. Should this value be retained only when continuing a previous training run, or is it also acceptable to keep it after changing a parameter?

For example, if I perform an NVT training at 200 K and then start a new training at 250 K for the exact same system, can I keep the ML_CTIFOR value from the 200 K run as the starting value for the 250 K run? Could this be beneficial?

Thanks in advance!


ferenc_karsai
Global Moderator
Posts: 475
Joined: Mon Nov 04, 2019 12:44 pm

Re: ML_CTIFOR value selection

#2 Post by ferenc_karsai » Tue Feb 11, 2025 1:11 pm

The most important thing to look out for is that ML_CTIFOR can still be exceeded, so that sampling of training structures continues to take place.
Usually the maximum Bayesian error, and with it ML_CTIFOR (which is calculated from the last N steps in which sampling was done), increases with increasing temperature. So when going from a lower temperature to a higher one it is absolutely safe to reuse the previous ML_CTIFOR. Your suggestion works.

What could be a real problem for learning is going from a high temperature to a low one. For example, using the ML_CTIFOR value obtained at 250 K could result in no learning steps at 200 K, because ML_CTIFOR would then be so high that all predicted errors stay below it, and hence the on-the-fly algorithm sees no need for sampling.
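For reference, a minimal INCAR sketch for such a 200 K to 250 K training continuation, assuming a VASP version where ML_MODE is available. The ML_CTIFOR number below is only a placeholder: set it to the final threshold reported at the end of your 200 K run, and copy that run's ML_ABN to ML_AB so the existing training data is reused.

Code: Select all

ML_LMLFF  = .TRUE.   ! enable machine-learned force fields
ML_MODE   = train    ! on-the-fly training, continuing from ML_AB if present
ML_CTIFOR = 0.02     ! placeholder: use the final value from the 200 K run
TEBEG     = 250      ! new target temperature; other MD tags as in the 200 K run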


maxime_legrand
Newbie
Posts: 4
Joined: Wed Oct 23, 2024 11:17 am

Re: ML_CTIFOR value selection

#3 Post by maxime_legrand » Wed Feb 12, 2025 10:23 am

Thanks for your answer! :)

