How to host a TensorFlow model online

I’m trying to serve an ML model as a REST API using TensorFlow Serving.

Is there a way to host the model online rather than locally?

Thanks a lot in advance.


I need to host an ML model that maps a string ID during prediction.

The model is a .h5 file.

The program currently runs in a notebook, but I don’t know how to do the hosting for the mobile app I’m developing.

> Solution:

Probably the easiest solution would be to pack your model into a Docker container and host it on any VPS or cloud provider, such as AWS EC2 or DigitalOcean.
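TF Serving cannot load a Keras .h5 file directly; it expects the SavedModel format inside a numbered version directory. A minimal sketch of the export step, assuming TensorFlow 2.x before Keras 3 (where `model.save` to a plain directory writes a SavedModel) and a hypothetical model name `my_model`:

```python
import tensorflow as tf

# Stand-in for the model trained in the notebook (assumption: any Keras
# model previously saved as "model.h5").
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(3,))])
model.save("model.h5")  # the existing HDF5 checkpoint

# Re-export as a SavedModel; TF Serving scans numeric version
# subdirectories, so the export goes under ".../my_model/1".
reloaded = tf.keras.models.load_model("model.h5")
reloaded.save("serving/my_model/1")

# The directory can then be served with the stock Docker image:
#   docker run -p 8501:8501 \
#     --mount type=bind,source=$PWD/serving/my_model,target=/models/my_model \
#     -e MODEL_NAME=my_model tensorflow/serving
```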

There is a Medium article that describes the process of creating the container. It is rather old, but it should still be mostly relevant today.

Serving ML Quickly with TensorFlow Serving and Docker

After you have your container, you can follow any guide that describes how to publish it to the cloud.

For example:

Deploy to AWS, Docker in 10 Minutes! | by Milan McGraw | Geek Culture | Medium
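Once the container is running (locally or on a VPS), predictions go through TF Serving’s REST API. A minimal client sketch using only the standard library, assuming the default REST port 8501 and a hypothetical model name `my_model`:

```python
import json
from urllib import request

def make_predict_request(instances, model_name="my_model",
                         host="localhost", port=8501):
    """Build a POST request for TF Serving's v1 predict endpoint."""
    url = f"http://{host}:{port}/v1/models/{model_name}:predict"
    body = json.dumps({"instances": instances}).encode("utf-8")
    return request.Request(url, data=body,
                           headers={"Content-Type": "application/json"})

req = make_predict_request([[1.0, 2.0, 3.0]])
# With the container up, send it and read the predictions:
#   with request.urlopen(req) as resp:
#       predictions = json.loads(resp.read())["predictions"]
```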

Also, if you do not strictly need TF Serving, you could look into projects specifically designed to ease the deployment process, such as BentoML. There are plenty of guides online that describe how to host such an app on any platform.

BentoML: Build, Ship, Scale AI Applications
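As for the string-ID mapping mentioned in the question: it does not have to live inside the model. One option is to resolve the ID to its feature vector in the client or serving layer before calling the model — a hypothetical sketch with made-up IDs and features:

```python
# Hypothetical client-side lookup (assumption: string IDs map to feature
# vectors kept outside the model, e.g. loaded from a database or file).
ID_TO_FEATURES = {
    "user_42": [1.0, 2.0, 3.0],
    "user_99": [0.5, 0.1, 0.9],
}

def features_for(string_id):
    """Resolve a string ID to the feature vector sent to the model."""
    try:
        return ID_TO_FEATURES[string_id]
    except KeyError:
        raise ValueError(f"unknown id: {string_id}")
```

The resolved vector is what gets sent as an instance in the predict request; the string ID itself never reaches the model.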
