We are going to build a Docker image to expose a machine learning API. The API call takes a text as an argument and predicts the personality of the person who wrote it.
The model has been built using the following dataset:
This is a small dataset, so the results should not be taken too seriously, but it is a starting point for building more complex systems. The trained machine learning model is serialized and included in the Docker image.
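As a rough sketch of how such a service can be wired up (the file names app.py and model.pkl, and the use of Flask with a pickled model, are assumptions, not necessarily how this image is actually built):

```python
# app.py -- minimal sketch of the prediction service (assumed layout)
import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

# Load the serialized model shipped inside the image.
# "model.pkl" is a hypothetical file name; adjust to the real artifact.
with open("model.pkl", "rb") as f:
    model = pickle.load(f)

@app.route("/")
def index():
    # Simple liveness check used to verify the service is up.
    return "ML API is running"

@app.route("/predict")
def predict():
    # The text to classify arrives as the "msg" query parameter.
    msg = request.args.get("msg", "")
    prediction = model.predict([msg])[0]
    return jsonify({"msg": msg, "prediction": str(prediction)})

if __name__ == "__main__":
    # Bind to 0.0.0.0 so the service is reachable from outside the container.
    app.run(host="0.0.0.0", port=5000)
```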
To build the image:

docker build -t ml-api-docker:latest .
The latest built image is available in the following Docker Hub repository:
https://hub.docker.com/r/lbcommer/ml-api-docker/
To run the container:

docker run -d -p 5000:5000 ml-api-docker:latest
To check that the API service is running, open: http://localhost:5000
A text can be sent through the msg query parameter, for example: http://localhost:5000/predict?msg=very%20happy%20experience
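The endpoint can also be exercised from Python; the snippet below is a small client-side check (the use of the requests library and the shape of the response are assumptions, so adapt it to the actual output of the service):

```python
# Quick client-side check of the /predict endpoint (hypothetical response shape).
import requests

response = requests.get(
    "http://localhost:5000/predict",
    params={"msg": "very happy experience"},
)
print(response.status_code)  # Expect 200 if the container is running
print(response.text)         # The predicted personality returned by the API
```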
Pending improvements:

- Add a model to predict the sentiment of the text
- Use WSGI for the production environment
- Create an alternative serverless version (to be deployed on AWS or GCP)