Hi, my model requires something like a GPU to run, so how can I submit the results directly instead of selecting paths? Thanks.
I am not sure I understand?
When creating a Run, you can ask for a GPU.
Hi, thanks for your reply. I have my own algorithm to run the code for inference. Do you mean that the crunch system will call my local environment but use a cloud GPU for running?
My further question is about the testing stage. I think we have different spatial samples but a single scRNA-seq dataset. Therefore, should we be allowed to submit only one model, or multiple models for different samples? Thanks.
Yes, crunch.test will use your local environment.
Once you submit, you will be able to choose a GPU runner if you wish.
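For illustration, a minimal sketch of the local test loop, assuming the standard crunch notebook workflow (load_notebook / test follow the crunch-cli tutorials; double-check against the documentation for this challenge):

import crunch

# register this notebook with the crunch tooling
crunch = crunch.load_notebook()

# ... define your train() and infer() functions above ...

# runs train/infer locally, in your own environment (and on your own GPU, if any)
crunch.test()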
You can have multiple models and load them based on the target_name parameter. It will contain the sample name (without the .zarr extension).
Here is the list of all available parameters.
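As a rough sketch, an infer function could dispatch on target_name to pick the model trained for that spatial sample. The other parameter names (spatial_data, model_directory_path) and the file naming scheme here are assumptions for illustration, not the challenge's actual signature; refer to the parameter list linked above.

import os
import pickle

def infer(spatial_data, model_directory_path: str, target_name: str):
    # target_name holds the sample name (without the .zarr extension),
    # so it can be used to select a per-sample model file.
    model_path = os.path.join(model_directory_path, f"model_{target_name}.pkl")

    if not os.path.exists(model_path):
        # fall back to a single shared model if no per-sample model was saved
        model_path = os.path.join(model_directory_path, "model_default.pkl")

    with open(model_path, "rb") as fd:
        model = pickle.load(fd)

    return model.predict(spatial_data)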
Sounds good, thanks a lot.