I have tried changing my code, both as you suggest (using a variable in the constant) and by resetting the value to the model directory, or just calling dump(model_directory); still no dice (the directory does not exist).
What should I do? Should I try using an explicit path that includes ./resources?
Thanks for your help. Is there an example around that builds a model in the train phase, saves it, and restores it in infer?
train: skip param with default value: model_directory_path=./model_directory
So you were overwriting the value.
This warning is visible if you enable advanced logs, but it should still be visible when running a local test.
Don’t set any value and let the system provide it for you.
def train(
    ...,
    model_directory_path: str,  # DON'T SET A DEFAULT VALUE!!
):
    # Keep the system-provided path so the rest of your code can reuse it.
    Constants.MODEL_DIRECTORY = model_directory_path
    ...
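To answer your earlier question, here is a minimal sketch of the save-in-train / restore-in-infer pattern. It assumes joblib for serialization and a scikit-learn-style estimator; the Constants class, the model.joblib file name, and the infer signature are illustrative, not the exact platform API.

import os
import joblib
from sklearn.linear_model import LogisticRegression  # placeholder model


class Constants:
    MODEL_DIRECTORY = None  # filled in by train() at runtime


def train(X, y, model_directory_path: str):  # no default value!
    Constants.MODEL_DIRECTORY = model_directory_path
    os.makedirs(model_directory_path, exist_ok=True)

    model = LogisticRegression()
    model.fit(X, y)

    # Persist the fitted model inside the directory provided by the system.
    joblib.dump(model, os.path.join(model_directory_path, "model.joblib"))


def infer(X, model_directory_path: str):
    # Restore the model from the same directory during the infer phase.
    model = joblib.load(os.path.join(model_directory_path, "model.joblib"))
    return model.predict(X)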
On another note, I just saw that you did not upload a resources directory with your submission, so you could have just changed the constant's value to ./resources directly.
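For instance (just a sketch; Constants here stands for your own configuration class, not a platform API):

class Constants:
    # Point straight at the resources folder bundled with the submission.
    MODEL_DIRECTORY = "./resources"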