TensorFlow Serving | multiple models on the same server

There is a way to serve multiple models from one TensorFlow Serving server by providing a model config file.

model_config_list {
  config {
    name: "model1"
    base_path: "/models/model1"
    model_platform: "tensorflow"
  }
  config {
    name: "model2"
    base_path: "/models/model2"
    model_platform: "tensorflow"
  }
}
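
A sketch of launching the server with this config via Docker (the host paths, port mapping, and config filename are assumptions; the flags are standard TensorFlow Serving options):

# Mount the model directories and the config file into the container,
# then point the server at the config file instead of a single model.
docker run -p 8501:8501 \
  -v /path/on/host/models:/models \
  tensorflow/serving \
  --model_config_file=/models/models.config \
  --model_config_file_poll_wait_seconds=60

With --model_config_file_poll_wait_seconds set, the server periodically re-reads the config file, so models can be added or removed by editing the file without restarting the server.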
More detailed information is available in the TensorFlow Serving documentation.
What I would like to know is how convenient this solution is in practice. Is it possible to deploy new model versions? Can the config file be edited at runtime?
