Can you clarify whether there is a way to decompose Python code into logical modules (e.g.
feature_engineering.py, etc.) stored as a local Python package (with an
__init__.py inside) in the same directory as my submission dir (i.e.
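For illustration, the kind of layout I mean looks roughly like this (file and directory names here are hypothetical examples, not my actual submission):

```
submission/
├── main.ipynb            # the notebook that gets submitted
└── mypackage/            # local package with the shared logic
    ├── __init__.py
    ├── feature_engineering.py
    └── modeling.py
```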
Please note that all of these modules are built on top of whitelisted packages (e.g. pandas, scikit-learn), and my submission passes local tests in a notebook. However, it does not pass internal validation on the
submission page here, as "forbidden" libraries are found (i.e.
So, the question is: is there any way, other than copy-pasting all the code from the modules back into one long, ugly Jupyter notebook, to make a valid submission?
For example, by copying such .py files to a "resources" directory on the server first?
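In case it helps to show what I have in mind: one workaround I could imagine (I don't know whether the platform supports it — the "resources" location here is a stand-in, simulated with a temp dir) is to keep the module source inside the notebook as a string, write it out to a directory at runtime, and import it normally:

```python
import sys
import tempfile
from pathlib import Path

# A temp dir stands in for whatever writable "resources"
# directory the platform might allow (assumption, not confirmed).
resources = Path(tempfile.mkdtemp())

# Toy stand-in for real feature-engineering code, kept as a string
# inside the notebook so the submission stays a single file.
module_source = '''
def add_features(values):
    """Double each value; placeholder for real feature logic."""
    return [v * 2 for v in values]
'''
(resources / "feature_engineering.py").write_text(module_source)

# Make the directory importable, then use the module as usual.
sys.path.insert(0, str(resources))
import feature_engineering

print(feature_engineering.add_features([1, 2, 3]))  # → [2, 4, 6]
```

This keeps the notebook self-contained for the validator while preserving some modular structure, at the cost of embedding the source as strings.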
And a small side question: is there a way to pin exact library versions in requirements.txt so they remain the same? I wonder whether Crunch uses the latest available versions at prediction time, which (technically) could cause incompatibilities in the future (e.g. API changes in pandas), making models trained with older package versions crash during the 2nd stage of a competition.
Or, conversely, which library versions should we use locally so they stay consistent with the server ones?
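What I mean by pinning, if it were supported, is the standard `==` syntax in requirements.txt (the version numbers below are just illustrative, not the ones I actually use):

```
pandas==2.1.4
scikit-learn==1.3.2
numpy==1.26.3
```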
Many thanks in advance for your hints and answers!