# video-analytics-serving
**Repository Path**: certus/video-analytics-serving
## Basic Information
- **Project Name**: video-analytics-serving
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: BSD-3-Clause
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No
## Statistics
- **Stars**: 0
- **Forks**: 0
- **Created**: 2020-07-07
- **Last Updated**: 2020-12-18
## Categories & Tags
**Categories**: Uncategorized
**Tags**: None
# Video Analytics Serving
| [Getting Started](#getting-started) | [Documentation](#further-reading) | [Reference Guides](#further-reading) | [Related Links](#related-links) | [Known Issues](#known-issues) |
Video Analytics Serving is a python package and microservice for
deploying optimized media analytics pipelines. It supports pipelines
defined in
[GStreamer](https://gstreamer.freedesktop.org/documentation/?gi-language=c)*
or [FFmpeg](https://ffmpeg.org/)* and provides APIs to discover, start,
stop, customize and monitor pipeline execution. Video Analytics
Serving is based on [OpenVINO™ Toolkit DL
Streamer](https://github.com/opencv/gst-video-analytics) and [FFmpeg
Video Analytics](https://github.com/VCDP/FFmpeg-patch).
## Features
| Feature | Description |
|---------------------------------------------|------------------|
| **Customizable Media Analytics Containers** | Scripts and dockerfiles to build and run container images with the required dependencies for hardware optimized media analytics pipelines. |
| **No-Code Pipeline Definitions and Templates** | JSON based definition files, a flexible way for developers to define and parameterize pipelines while abstracting the low level details from their users. |
| **Deep Learning Model Integration** | A simple way to package and reference [OpenVINO™](https://software.intel.com/en-us/openvino-toolkit) based models in pipeline definitions. The precision of a model can be auto-selected at runtime based on the chosen inference device. |
| **Video Analytics Serving Python API** | A python module to discover, start, stop, customize and monitor pipelines based on their no-code definitions. |
| **Video Analytics Serving Microservice** | A RESTful microservice providing endpoints and APIs matching the functionality of the python module. |
> **IMPORTANT:** Video Analytics Serving is provided as a _sample_. It
> is not intended to be deployed into production environments without
> modification. Developers deploying Video Analytics Serving should
> review it against their production requirements.
# Getting Started
The sample microservice includes three media analytics pipelines.
| Pipeline | Description |
|---------------------------------------------|---------|
| **object_detection** | Detect and label objects such as bottles and bicycles. |
| **emotion_recognition** | Detect the emotions of a person within a video stream. |
| **audio_detection** | Analyze audio streams for events such as breaking glass or barking dogs. |
## Prerequisites
| Prerequisite | Description |
|---------------------------------------------|------------------|
| **Docker** | Video Analytics Serving requires Docker for its build, development, and runtime environments. Please install the latest [Docker](https://docs.docker.com/install) release for your platform. |
| **bash** | Video Analytics Serving's build and run scripts require bash and have been tested with versions greater than or equal to `GNU bash, version 4.3.48(1)-release (x86_64-pc-linux-gnu)`. Most users should not need to update their version, but if you run into issues please install the latest bash for your platform. Instructions for macOS®* users are available [here](docs/installing_bash_macos.md). |
| **curl** | The samples below use the `curl` command line program to issue standard HTTP requests to the microservice. Please install the latest release for your platform. Note: any other tool or utility that can issue standard HTTP requests can be used in place of `curl`. |
## Building the Microservice
Build the sample microservice with the following command:
```bash
./docker/build.sh
```
The script will automatically include the sample models, pipelines and
required dependencies.
> **Note:** When running this command for the first time, the default
> base image for Video Analytics Serving will take a long time to
> build (likely over an hour). For instructions on how to re-use
> pre-built base images to speed up the build time please see the
> following [documentation](docs/building_video_analytics_serving.md#using-pre-built-media-analytics-base-images).
To verify that the build succeeded, execute the following command:
```bash
docker images video-analytics-serving-gstreamer:latest
```
Expected output:
```bash
REPOSITORY TAG IMAGE ID CREATED SIZE
video-analytics-serving-gstreamer latest f51f2695639f 2 minutes ago 1.39GB
```
## Running the Microservice
Start the sample microservice with the following command:
```bash
./docker/run.sh -v /tmp:/tmp
```
This script issues a standard `docker run` command that launches the
container, starts a Tornado based web service on port 8080, and mounts
the host's `/tmp` folder into the container. Mounting `/tmp` shares
sample results with the host and is optional in actual deployments.
Expected output:
```
{"levelname": "INFO", "asctime": "2020-08-06 12:37:12,139", "message": "=================", "module": "pipeline_manager"}
{"levelname": "INFO", "asctime": "2020-08-06 12:37:12,139", "message": "Loading Pipelines", "module": "pipeline_manager"}
{"levelname": "INFO", "asctime": "2020-08-06 12:37:12,139", "message": "=================", "module": "pipeline_manager"}
(gst-plugin-scanner:14): GStreamer-WARNING **: 12:37:12.476: Failed to load plugin '/root/gst-video-analytics/build/intel64/Release/lib/libvasot.so': libopencv_video.so.4.4: cannot open shared object file: No such file or directory
{"levelname": "INFO", "asctime": "2020-08-06 12:37:13,207", "message": "FFmpeg Pipelines Not Enabled: ffmpeg not installed\n", "module": "pipeline_manager"}
{"levelname": "INFO", "asctime": "2020-08-06 12:37:13,208", "message": "Loading Pipelines from Config Path /home/video-analytics-serving/pipelines", "module": "pipeline_manager"}
{"levelname": "INFO", "asctime": "2020-08-06 12:37:13,223", "message": "Loading Pipeline: audio_detection version: 1 type: GStreamer from /home/video-analytics-serving/pipelines/audio_detection/1/pipeline.json", "module": "pipeline_manager"}
{"levelname": "INFO", "asctime": "2020-08-06 12:37:13,230", "message": "Loading Pipeline: object_detection version: 1 type: GStreamer from /home/video-analytics-serving/pipelines/object_detection/1/pipeline.json", "module": "pipeline_manager"}
{"levelname": "INFO", "asctime": "2020-08-06 12:37:13,240", "message": "Loading Pipeline: emotion_recognition version: 1 type: GStreamer from /home/video-analytics-serving/pipelines/emotion_recognition/1/pipeline.json", "module": "pipeline_manager"}
{"levelname": "INFO", "asctime": "2020-08-06 12:37:13,241", "message": "===========================", "module": "pipeline_manager"}
{"levelname": "INFO", "asctime": "2020-08-06 12:37:13,241", "message": "Completed Loading Pipelines", "module": "pipeline_manager"}
{"levelname": "INFO", "asctime": "2020-08-06 12:37:13,241", "message": "===========================", "module": "pipeline_manager"}
{"levelname": "INFO", "asctime": "2020-08-06 12:37:13,333", "message": "Starting Tornado Server on port: 8080", "module": "__main__"}
```
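Once the server reports that it is listening on port 8080, the loaded pipelines can be discovered over HTTP. Below is a minimal sketch using only Python's standard library; it assumes the service is reachable on `localhost:8080` and that the discovery endpoint is `GET /pipelines` (the same request can be issued with `curl localhost:8080/pipelines`):

```python
import json
import urllib.request


def list_pipelines(host="localhost:8080"):
    """Query the microservice for its loaded pipeline definitions."""
    with urllib.request.urlopen(f"http://{host}/pipelines") as resp:
        return json.loads(resp.read())


# With the container running, each entry describes one loaded
# pipeline, e.g. object_detection version 1:
# for pipeline in list_pipelines():
#     print(pipeline)
```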
## Detecting Objects in a Video
### Example Request:
| Endpoint | Verb | Request | Response |
|---|---|---|---|
| /pipelines/object_detection/1 | POST | JSON (see below) | 200: Pipeline Instance Id |

Request body:
```json
{
  "source": {
    "uri": "https://example.mp4",
    "type": "uri"
  },
  "destination": {
    "type": "file",
    "path": "/tmp/results_objects.txt",
    "format": "json-lines"
  }
}
```
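With the service running, the request above can be issued from the host. Below is a minimal sketch using Python's standard library; it assumes the service listens on `localhost:8080`, and `https://example.mp4` is a placeholder that must be replaced with a reachable media URI before the request will succeed:

```python
import json
import urllib.request

# Request body from the example above.
payload = {
    "source": {"uri": "https://example.mp4", "type": "uri"},
    "destination": {
        "type": "file",
        "path": "/tmp/results_objects.txt",
        "format": "json-lines",
    },
}


def start_pipeline(host="localhost:8080"):
    """POST the request and return the response body (the pipeline instance id)."""
    req = urllib.request.Request(
        f"http://{host}/pipelines/object_detection/1",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8").strip()


# start_pipeline() returns the new instance id, which can then be
# used to monitor or stop the running pipeline.
```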
## Recognizing Emotions in a Video
### Example Request:
| Endpoint | Verb | Request | Response |
|---|---|---|---|
| /pipelines/emotion_recognition/1 | POST | JSON (see below) | 200: Pipeline Instance Id |

Request body:
```json
{
  "source": {
    "uri": "https://example.mp4",
    "type": "uri"
  },
  "destination": {
    "type": "file",
    "path": "/tmp/results_emotions.txt",
    "format": "json-lines"
  }
}
```
## Detecting Audio Events
### Example Request:
| Endpoint | Verb | Request | Response |
|---|---|---|---|
| /pipelines/audio_detection/1 | POST | JSON (see below) | 200: Pipeline Instance Id |

Request body:
```json
{
  "source": {
    "uri": "https://example.wav",
    "type": "uri"
  },
  "destination": {
    "type": "file",
    "path": "/tmp/results_audio_events.txt",
    "format": "json-lines"
  }
}
```
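Each request above writes its results to a file in `json-lines` format, meaning every line of the file is a self-contained JSON document. Below is a minimal sketch of reading such a file; the sample line is illustrative only and does not reproduce the service's exact output schema:

```python
import json
import os
import tempfile

# Illustrative json-lines content; these field names are assumptions
# for demonstration, not the service's documented output schema.
sample = '{"objects": [{"label": "bottle", "confidence": 0.92}]}\n'

path = os.path.join(tempfile.mkdtemp(), "results_objects.txt")
with open(path, "w") as f:
    f.write(sample)

# "json-lines" format: parse one JSON document per line.
with open(path) as f:
    results = [json.loads(line) for line in f if line.strip()]

for frame in results:
    for obj in frame.get("objects", []):
        print(obj["label"], obj["confidence"])  # → bottle 0.92
```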