Description
Hi all, I am new to KServe and ModelMesh Serving and am trying to get familiar with the software. I am trying to combine Kubeflow with ModelMesh Serving using the following pipeline in Kubeflow:
- Preprocess data
- Train model
- Deploy model on ModelMesh
In Kubeflow, the output of each step is a .tgz file containing the model data that ModelMesh requires. The problem is that serving the model does not work when the InferenceService points directly to the .tgz file (roughly as in the sketch below). My guess is that ModelMesh cannot handle that format. However, I already have a full repository of models in S3, all compressed as tar.gz.
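For context, this is a minimal sketch of the kind of InferenceService I am applying; the name, model format, storage key, and path are placeholders, but the storage path points directly at the .tgz archive produced by the pipeline:

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: my-pipeline-model                  # placeholder name
  annotations:
    serving.kserve.io/deploymentMode: ModelMesh
spec:
  predictor:
    model:
      modelFormat:
        name: sklearn                      # assuming an sklearn model for illustration
      storage:
        key: my-s3-storage-config          # key in the storage-config secret (placeholder)
        path: pipelines/run-123/model.tgz  # points directly at the compressed archive
```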
It would be very useful for me if ModelMesh could handle this archive format. I am not sure whether I am missing something and it should actually work. Does anyone know if this is currently possible?
Note: KServe's documentation (https://kserve.github.io/website/0.8/modelserving/storage/uri/uri/#train-and-freeze-the-model_1) suggests that this should be possible.