Real-World Edge Computing

Having understood the general outline of the inferencing pipeline, let’s delve into the details of containers, services, and policies as applicable to this example application.
Invariably, a practical deployment of a containerized application expects you to provide external inputs to set up and customize the application. Open Horizon provides this flexibility through a User Input file that works with the service definition of each service. This is a new concept in this book; we will go into the details of User Input later in this section.
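To make the idea concrete, here is a minimal sketch of what an Open Horizon user input file looks like. The organization name (`myorg`), service URL (`infer`), and the `MODEL_URL` variable are illustrative placeholders, not values from this application:

```json
{
  "services": [
    {
      "org": "myorg",
      "url": "infer",
      "versionRange": "[0.0.0,INFINITY)",
      "variables": {
        "MODEL_URL": "https://example.com/models/model.tflite"
      }
    }
  ]
}
```

Each entry under `variables` must match a variable declared in the `userInput` section of the corresponding service definition; Open Horizon injects these values into the service's container as environment variables at deployment time.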
This application has three containerized services that build the inferencing pipeline – infer, http, and mms. Each service is deployed using its own container image. We will introduce each of these three services and describe them in detail later, when they are deployed and used.
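As a rough sketch of how one of these services might be described, the following is an abbreviated Open Horizon service definition. The organization, image name, and version shown are illustrative assumptions, not the actual values used by this application:

```json
{
  "org": "myorg",
  "url": "infer",
  "version": "1.0.0",
  "arch": "amd64",
  "sharable": "singleton",
  "userInput": [
    {
      "name": "MODEL_URL",
      "label": "URL of the model file to download",
      "type": "string",
      "defaultValue": ""
    }
  ],
  "deployment": {
    "services": {
      "infer": {
        "image": "myrepo/infer_amd64:1.0.0"
      }
    }
  }
}
```

The `userInput` array declares the variables the service accepts (which a user input file like the one above can override), and the `deployment` section names the container image that runs the service; the `http` and `mms` services would each have an analogous definition referencing their own images.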
The infer service is the main service of the inferencing pipeline...