Package ie

v0.36.1

README

Using the Intel OpenVINO Inference Engine

The Intel OpenVINO Inference Engine is a set of libraries for executing convolutional neural networks.

GoCV support for the Intel OpenVINO Inference Engine is provided by the "gocv.io/x/gocv/openvino/ie" package.

How It Works

Support in GoCV for the Intel OpenVINO Inference Engine requires OpenVINO version 2019 R3 or later.
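
To check which Inference Engine library GoCV is linked against, the Version function from this package (documented below) can be printed. This is a minimal sketch that only assumes a standard main package wrapper:

package main

import (
    "fmt"

    "gocv.io/x/gocv/openvino/ie"
)

func main() {
    // Version reports the Inference Engine library version GoCV was built against.
    fmt.Println("Inference Engine version:", ie.Version())
}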

How to Use

This code loads a Caffe model, then uses the OpenVINO Inference Engine backend to prepare it for execution on either the GPU or an Intel Neural Compute Stick 2:

net := gocv.ReadNet("/path/to/your/model.caffemodel", "/path/to/your/config.proto")
if net.Empty() {
    fmt.Println("Error reading network model")
    return
}
defer net.Close()

// For GPU usage:
net.SetPreferableBackend(gocv.NetBackendType(gocv.NetBackendOpenVINO))
net.SetPreferableTarget(gocv.NetTargetType(gocv.NetTargetFP16))

// For Intel Neural Compute Stick 2 usage:
net.SetPreferableBackend(gocv.NetBackendType(gocv.NetBackendOpenVINO))
net.SetPreferableTarget(gocv.NetTargetType(gocv.NetTargetVPU))
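
Once the backend and target are set, inference follows the usual GoCV DNN flow. The sketch below continues the snippet above; the image path, the 224x224 input size, the scale factor, and the mean values are placeholders that depend on the specific model, and image.Pt comes from the standard library image package:

img := gocv.IMRead("/path/to/your/image.jpg", gocv.IMReadColor)
if img.Empty() {
    fmt.Println("Error reading image")
    return
}
defer img.Close()

// Convert the image into a blob with the input size the model expects.
blob := gocv.BlobFromImage(img, 1.0, image.Pt(224, 224), gocv.NewScalar(0, 0, 0, 0), false, false)
defer blob.Close()

// Run a forward pass on the configured OpenVINO backend and target.
net.SetInput(blob, "")
prob := net.Forward("")
defer prob.Close()

The resulting prob Mat holds the network output; its layout depends on the model being run.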

Documentation

Overview

Package ie is the GoCV wrapper around the Intel OpenVINO toolkit's Inference Engine.

For further details, please see: https://software.intel.com/en-us/openvino-toolkit

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func Version

func Version() string

Version returns the current Inference Engine library version.

Types

This section is empty.
