onnx_cpu

package module
v0.0.0-...-eab1c32
Published: Mar 19, 2024 License: Apache-2.0 Imports: 11 Imported by: 0

README

onnx-cpu modular resource

This module allows you to deploy ONNX models, which you can then use with the Viam mlmodel vision service on Android.

Requirements

Before configuring your ML model, you must create a machine.

Build and run

To use this module, follow these instructions to add a module from the Viam Registry and select the viam-labs:mlmodel:onnx-cpu model from the onnx-cpu module.

Configure your onnx-cpu ML model

Navigate to the Config tab of your machine's page in the Viam app. Click on the Services subtab and click Create service. Select the mlmodel type, then select the viam-labs:mlmodel:onnx-cpu model. Click Add module, then enter a name for your ML model and click Create.

On the new component panel, copy and paste the following attribute template into your ML model’s Attributes box:

{
  "model_path": "/path/to/onnx_file/detector_googlenet.onnx",
  "label_path": "/path/to/labels.txt"
}

[!NOTE] For more information, see Configure a Machine.

Attributes

The following attributes are available for the viam-labs:mlmodel:onnx-cpu ML model service:

Name        Type    Inclusion  Description
model_path  string  Required   The full path to the ONNX file.
label_path  string  Optional   The full path to the file containing the category label names.
Example configuration
    {
      "name": "onnx",
      "type": "mlmodel",
      "model": "viam-labs:mlmodel:onnx-cpu",
      "attributes": {
        "model_path": "/path/to/onnx_file/detector_googlenet.onnx",
        "label_path": "/path/to/labels.txt"
      }
    }
Configure a vision service

The module deploys an ONNX ML model through the ML model service it provides. Viam's vision service then lets you use the deployed detection or classification model.

Navigate to the Config tab of your machine's page in the Viam app. Click on the Services subtab and click Create service. Select the vision type, then select the mlmodel model. Enter a name for your vision service and click Create.

On the new service panel, select the onnx-cpu ML model service you configured.

You will most likely need to rename the input and output tensors coming from the ONNX file to use the tensor names that the vision service requires. To rename the tensors, go to your Raw JSON configuration and add the remap_output_names and remap_input_names fields to the attributes of your vision service config. For more information, see the documentation for remapping tensor names.

Here is an example:

    {
      "type": "vision",
      "model": "mlmodel",
      "attributes": {
        "mlmodel_name": "onnx",
        "remap_output_names": {
          "detection_classes": "category",
          "detection_boxes": "location",
          "detection_scores": "score"
        },
        "remap_input_names": {
          "input_tensor": "image"
        }
      },
      "name": "onnx-vision"
    }
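Conceptually, the remapping applies a rename table to the tensor map the model returns. The helper below is an illustrative sketch of that idea, not the module's actual implementation:

```go
package main

import "fmt"

// remapNames returns a copy of tensors with keys renamed according to
// the remap table; keys without a remap entry are kept as-is. This
// sketches what remap_output_names does for the vision service.
func remapNames(tensors map[string][]float32, remap map[string]string) map[string][]float32 {
	out := make(map[string][]float32, len(tensors))
	for name, t := range tensors {
		if newName, ok := remap[name]; ok {
			name = newName
		}
		out[name] = t
	}
	return out
}

func main() {
	raw := map[string][]float32{
		"detection_classes": {1, 2},
		"detection_scores":  {0.9, 0.4},
	}
	remap := map[string]string{
		"detection_classes": "category",
		"detection_scores":  "score",
	}
	// The model's raw output names are now the names the
	// vision service expects.
	fmt.Println(remapNames(raw, remap)["category"])
}
```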

Save your config before testing your vision service.

Next steps

You have now configured a vision service to use an ONNX ML model. Follow these steps to test your mlmodel vision service.

Local Development

This module is written in Go. To package it into a binary for a module, run the following commands:

go build -o module cmd/module/main.go
chmod a+x module
tar -czf module.tar.gz module third_party/

third_party contains all of the .so files for the ONNX runtime. You can package only the one you need.

For Android, the Makefile will create the tar file.

You can then add the module to app.viam.com locally. The model triplet is viam-labs:mlmodel:onnx-cpu.

License

Copyright 2021-2023 Viam Inc.
Apache 2.0

Documentation

Overview

Package onnx_cpu is an mlmodel service module that can run ONNX files on the CPU using an external runtime environment.

Index

Constants

This section is empty.

Variables

DataTypeMap maps the long ONNX data type labels to the corresponding data types as written in Go.

var Model = resource.ModelNamespace("viam-labs").WithFamily("mlmodel").WithModel("onnx-cpu")

Model is the name of the module.
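A sketch of what a lookup table like DataTypeMap could contain: long ONNX data type labels keyed to the Go type name used for that data. The specific labels and entries below are illustrative assumptions, not the module's actual contents:

```go
package main

import "fmt"

// dataTypeMap is an illustrative stand-in for the package's
// DataTypeMap. The keys and values here are assumptions; consult
// the module source for the real table.
var dataTypeMap = map[string]string{
	"tensor(float)":  "float32",
	"tensor(double)": "float64",
	"tensor(int64)":  "int64",
	"tensor(uint8)":  "uint8",
}

func main() {
	// Look up the Go type used for a given ONNX element type label.
	fmt.Println(dataTypeMap["tensor(float)"])
}
```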

Functions

This section is empty.

Types

type Config

type Config struct {
	ModelPath string `json:"model_path"`
	LabelPath string `json:"label_path"`
}

Config only needs the path to the .onnx file as well as an optional path to the file of labels.

func (*Config) Validate

func (cfg *Config) Validate(path string) ([]string, error)

Validate makes sure that the required model path is not empty.
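A minimal sketch of that validation logic, assuming only the behavior stated above (the error wording and the empty dependency slice are assumptions):

```go
package main

import "fmt"

// Config holds the module's two attributes.
type Config struct {
	ModelPath string `json:"model_path"`
	LabelPath string `json:"label_path"`
}

// Validate checks that the required model path is not empty. It
// returns the config's implicit dependencies (none in this sketch)
// and an error when model_path is missing.
func (cfg *Config) Validate(path string) ([]string, error) {
	if cfg.ModelPath == "" {
		return nil, fmt.Errorf(`expected "model_path" attribute for %s`, path)
	}
	return []string{}, nil
}

func main() {
	good := &Config{ModelPath: "/path/to/model.onnx"}
	if _, err := good.Validate("services.mlmodel.onnx"); err != nil {
		panic(err)
	}
	bad := &Config{}
	if _, err := bad.Validate("services.mlmodel.onnx"); err == nil {
		panic("expected an error for an empty model_path")
	}
	fmt.Println("validation behaves as described")
}
```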

Directories

Path Synopsis
cmd
