Deployment

FastAPI#

FastAPI wraps the Satellighte library to serve it as a RESTful API.

From the root directory of the repository, run the following:

Install Dependencies For FastAPI#

pip install fastapi==0.74.1
pip install "uvicorn[standard]==0.17.5"
pip install python-multipart

Run AI Service#

python deployment/fastapi/service.py

Build AI Service As Docker Image#

From the root directory of the repository, run the following:

docker build -t satellighte-fastapi deployment/fastapi/

Run AI Service As Docker Container#

If a GPU is available, run:

docker run -d --name satellighte-service --rm -p 8080:8080 --gpus all satellighte-fastapi

If no GPU is available, run:

docker run -d --name satellighte-service --rm -p 8080:8080 satellighte-fastapi

ONNX#

ONNX Runtime inference can lead to faster customer experiences and lower costs.

From the root directory of the repository, run the following:

Install Dependencies For ONNX#

pip install onnx~=1.11.0
pip install onnxruntime~=1.10.0

Convert Model to ONNX#

python deployment/onnx/export.py
# python deployment/onnx/export.py --model_name mobilenetv2_default_eurosat --version 0

ONNX Runtime#

python deployment/onnx/runtime.py
# python deployment/onnx/runtime.py -m satellighte/models/mobilenetv2_default_eurosat/v0/mobilenetv2_default_eurosat.onnx -s src/eurosat_samples/AnnualCrop.jpg

DeepSparse#

Neural Magic’s DeepSparse Engine integrates with popular deep learning libraries, letting you load and deploy sparse models with ONNX.

DeepSparse requires an ONNX model; create one using the steps above first. Then, from the root directory of the repository, run the following:

Install Dependency For DeepSparse#

pip install deepsparse~=1.0.2

DeepSparse Runtime#

python deployment/deepsparse/runtime.py
# python deployment/deepsparse/runtime.py -m satellighte/models/mobilenetv2_default_eurosat/v0/mobilenetv2_default_eurosat.onnx -s src/eurosat_samples/AnnualCrop.jpg

TensorFlow#

TensorFlow is a free and open-source software library for machine learning and artificial intelligence.

From the root directory of the repository, run the following:

Install Dependencies For TensorFlow#

pip install onnx-tf~=1.10.0
pip install tensorflow~=2.9.1
pip install tensorflow-probability~=0.17.0

This step requires the ONNX model; create it using the steps above first. Then, from the root directory of the repository, run the following:

Convert ONNX Model to TensorFlow#

python deployment/tensorflow/export.py
# python deployment/tensorflow/export.py -m satellighte/models/mobilenetv2_default_eurosat/v0/mobilenetv2_default_eurosat.onnx

TensorFlow Runtime#

python deployment/tensorflow/runtime.py
# python deployment/tensorflow/runtime.py -m satellighte/models/mobilenetv2_default_eurosat/v0/mobilenetv2_default_eurosat_tensorflow -s src/eurosat_samples/AnnualCrop.jpg -l "AnnualCrop,PermanentCrop,Forest,HerbaceousVegetation,Highway,Industrial,Pasture,Residential,River,SeaLake"
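The `-l` flag passes the class names as a comma-separated string; pairing those names with the model's outputs amounts to splitting the string and applying a softmax over the logits. A stdlib sketch of that mapping (the function name is illustrative, not from the repository):

```python
import math

def scores_from_logits(logits, label_csv):
    """Pair comma-separated class names with softmax probabilities."""
    labels = label_csv.split(",")
    assert len(labels) == len(logits), "one label per model output"
    # Numerically stable softmax: subtract the max logit before exponentiating.
    peak = max(logits)
    exps = [math.exp(v - peak) for v in logits]
    total = sum(exps)
    return {name: e / total for name, e in zip(labels, exps)}

labels = "AnnualCrop,PermanentCrop,Forest,HerbaceousVegetation,Highway,Industrial,Pasture,Residential,River,SeaLake"
scores = scores_from_logits(
    [2.0, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1], labels
)
best = max(scores, key=scores.get)
print(best)  # AnnualCrop
```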

TensorFlow Lite#

TensorFlow Lite is a mobile library for deploying models on mobile, microcontrollers, and other edge devices.

From the root directory of the repository, run the following:

Install Dependencies For TensorFlow Lite#

pip install onnx-tf~=1.10.0
pip install tensorflow~=2.9.1
pip install tensorflow-probability~=0.17.0

This step requires the TensorFlow model; create it using the steps above first. Then, from the root directory of the repository, run the following:

Convert TensorFlow Model to TensorFlow Lite#

python deployment/tensorflow_lite/export.py
# python deployment/tensorflow_lite/export.py -m satellighte/models/mobilenetv2_default_eurosat/v0/mobilenetv2_default_eurosat_tensorflow
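A conversion script like this presumably goes through `tf.lite.TFLiteConverter.from_saved_model`. A self-contained sketch, using a stand-in `tf.Module` in place of the exported Satellighte SavedModel:

```python
import tensorflow as tf

class Tiny(tf.Module):
    """Stand-in for the SavedModel produced by deployment/tensorflow/export.py."""
    @tf.function(input_signature=[tf.TensorSpec([1, 4], tf.float32)])
    def __call__(self, x):
        return x * 2.0

module = Tiny()
# Export a SavedModel with an explicit serving signature.
tf.saved_model.save(
    module,
    "/tmp/tiny_saved_model",
    signatures=module.__call__.get_concrete_function(),
)

# Convert the SavedModel directory to a flatbuffer .tflite file.
converter = tf.lite.TFLiteConverter.from_saved_model("/tmp/tiny_saved_model")
tflite_bytes = converter.convert()
with open("/tmp/tiny.tflite", "wb") as f:
    f.write(tflite_bytes)
```

The real script would point the converter at the SavedModel directory passed via `-m` and write the `.tflite` file next to it.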

TensorFlow Lite Runtime#

python deployment/tensorflow_lite/runtime.py
# python deployment/tensorflow_lite/runtime.py -m satellighte/models/mobilenetv2_default_eurosat/v0/mobilenetv2_default_eurosat_tensorflow.tflite -s satellighte/src/eurosat_samples/AnnualCrop.jpg -l "AnnualCrop,PermanentCrop,Forest,HerbaceousVegetation,Highway,Industrial,Pasture,Residential,River,SeaLake"