perf

package
v0.0.0-...-683b059
Published: Apr 23, 2022 License: BSD-3-Clause Imports: 13 Imported by: 0

Documentation

Overview

Package perf provides utilities to build a JSON file that can be uploaded to the Chrome Performance Dashboard (https://chromeperf.appspot.com/).

Measurements processed by this package are stored in tests/<test-name>/results-chart.json in the Tast results dir. The data is typically read by the Autotest TKO parser. To have metrics uploaded to the dashboard, they must be listed in src/third_party/autotest/files/tko/perf_upload/perf_dashboard_config.json

Chrome Performance Dashboard docs can be found here: https://github.com/catapult-project/catapult/tree/master/dashboard

Usage example:

pv := perf.NewValues()
pv.Set(perf.Metric{
    Name:      "mytest_important_quantity",
    Unit:      "gizmos",
    Direction: perf.BiggerIsBetter,
}, 42)
if err := pv.Save(s.OutDir()); err != nil {
    s.Error("Failed saving perf data: ", err)
}

Remote usage example

Protocol buffer definition:

import "values.proto";
service ExampleService {
    rpc Method (google.protobuf.Empty)
        returns (tast.common.perf.perfpb.Values) {}
}

In order to "import values.proto", add a -I argument pointing at src/chromiumos/tast/common/perf/perfpb/ to the protoc command in your service's gen.go file. See src/chromiumos/tast/services/cros/arc/gen.go for an example.
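For reference, the protoc directive in gen.go might look roughly like the following sketch; the relative paths and output options here are illustrative assumptions, so copy the exact form from the arc/gen.go example referenced above:

//go:generate protoc -I . -I ../../../common/perf/perfpb --go_out=plugins=grpc:. example_service.proto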

Service:

import "chromiumos/tast/common/perf"
import "chromiumos/tast/common/perf/perfpb"
func (s *ExampleService) Method() (*perfpb.Values, error) {
    p := perf.NewValues()
    ... // Do some computation that generates perf values in p.
    return p.Proto(), nil
}

Test:

import "chromiumos/tast/common/perf"
func TestMethod(ctx context.Context, s *testing.State) {
    ... // Set up gRPC, ExampleServiceClient.
    res, err := service.Method(ctx, &empty.Empty{})
    if err != nil {
        s.Fatal("RPC failed: ", err)
    }
    if err := perf.NewValuesFromProto(res).Save(s.OutDir()); err != nil {
        s.Fatal("Failed to save perf results: ", err)
    }
}

Index

Constants

const DefaultVariantName = "summary"

DefaultVariantName is the default variant name treated specially by the dashboard.

Variables

This section is empty.

Functions

This section is empty.

Types

type Clock

type Clock interface {
	Sleep(ctx context.Context, d time.Duration) error
	Now() time.Time
}

Clock implementations are used to wait for a given time duration. In unit tests, fake Clocks should be used to avoid race conditions.
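As an illustration, a unit test could inject a fake Clock along the following lines; FakeClock is a hypothetical type, not part of this package, and the snippet assumes the context and time imports:

// FakeClock is a fake Clock for unit tests.
type FakeClock struct {
    now time.Time
}

// Sleep advances the fake time instead of blocking.
func (c *FakeClock) Sleep(ctx context.Context, d time.Duration) error {
    if err := ctx.Err(); err != nil {
        return err // honor cancellation like a real Clock would
    }
    c.now = c.now.Add(d)
    return nil
}

// Now returns the fake time.
func (c *FakeClock) Now() time.Time { return c.now }

Such a fake would be passed to NewTimeline via the WithClock option.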

type Direction

type Direction int

Direction indicates which direction of change (bigger or smaller) means improvement of a performance metric.

const (
	// SmallerIsBetter means the performance metric is considered improved when it decreases.
	SmallerIsBetter Direction = iota

	// BiggerIsBetter means the performance metric is considered improved when it increases.
	BiggerIsBetter
)

type Format

type Format int

Format describes the output format for perf data.

const (
	// Crosbolt is used for Chrome OS infra dashboards (go/crosbolt).
	Crosbolt Format = iota
	// Chromeperf is used for Chrome OS infra dashboards (go/chromeperf).
	Chromeperf
)

type Metric

type Metric struct {
	// Name is the name of the chart this performance metric appears in.
	Name string

	// Variant is the name of this performance metric in a chart. If this is empty,
	// DefaultVariantName is used. It is treated specially by the dashboard.
	// Charts containing only one performance metric should stick with the default.
	Variant string

	// Unit is a unit name to describe values of this performance metric.
	Unit string

	// Direction indicates which direction of change (bigger or smaller) means improvement
	// of this performance metric.
	Direction Direction

	// Multiple specifies if this performance metric can contain multiple values at a time.
	Multiple bool
}

Metric defines the schema of a performance metric.
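For illustration, two variants sharing one chart could be reported as follows; the metric name, unit, and numbers are made up:

pv := perf.NewValues()
pv.Set(perf.Metric{
    Name:      "page_load_time", // chart shared by both variants
    Variant:   "cold_start",
    Unit:      "ms",
    Direction: perf.SmallerIsBetter,
}, 1520)
pv.Set(perf.Metric{
    Name:      "page_load_time",
    Variant:   "warm_start",
    Unit:      "ms",
    Direction: perf.SmallerIsBetter,
}, 430)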

type NewTimelineOption

type NewTimelineOption func(*NewTimelineOptions)

NewTimelineOption sets an optional parameter of NewTimeline.

func Interval

func Interval(interval time.Duration) NewTimelineOption

Interval sets the interval between two subsequent metric snapshots.

func Prefix

func Prefix(prefix string) NewTimelineOption

Prefix prepends all metric names with a given string.

func WithClock

func WithClock(clock Clock) NewTimelineOption

WithClock sets a Clock implementation.

type NewTimelineOptions

type NewTimelineOptions struct {
	// A prefix that is added to all metric names.
	Prefix string
	// The time duration between two subsequent metric snapshots. Default value is 10 seconds.
	Interval time.Duration
	// A different Clock implementation is used in Timeline unit tests to avoid sleeping in test code.
	Clock Clock
}

NewTimelineOptions holds all optional parameters of NewTimeline.

type Timeline

type Timeline struct {
	// contains filtered or unexported fields
}

Timeline collects performance metrics periodically on a common timeline.

func NewTimeline

func NewTimeline(ctx context.Context, sources []TimelineDatasource, setters ...NewTimelineOption) (*Timeline, error)

NewTimeline creates a Timeline from a slice of TimelineDatasources. Metric names may be prefixed and callers can specify the time interval between two subsequent snapshots. This method calls the Setup method of each data source.

func (*Timeline) Start

func (t *Timeline) Start(ctx context.Context) error

Start starts metric collection on all datasources.

func (*Timeline) StartRecording

func (t *Timeline) StartRecording(ctx context.Context) error

StartRecording starts capturing metrics in a goroutine. The sampling interval is specified as a parameter of NewTimeline. StartRecording must not be called twice unless StopRecording is called in between.

func (*Timeline) StopRecording

func (t *Timeline) StopRecording(ctx context.Context) (*Values, error)

StopRecording stops capturing metrics and returns the captured metrics.
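Putting the Timeline methods together, a typical lifecycle looks roughly like the sketch below; src stands for some TimelineDatasource, and the interval and prefix are arbitrary examples:

tl, err := perf.NewTimeline(ctx, []perf.TimelineDatasource{src},
    perf.Interval(5*time.Second), perf.Prefix("mytest_"))
if err != nil {
    s.Fatal("Failed to create timeline: ", err)
}
if err := tl.Start(ctx); err != nil {
    s.Fatal("Failed to start timeline: ", err)
}
if err := tl.StartRecording(ctx); err != nil {
    s.Fatal("Failed to start recording: ", err)
}
... // Exercise the scenario under measurement.
pv, err := tl.StopRecording(ctx)
if err != nil {
    s.Fatal("Failed to stop recording: ", err)
}
if err := pv.Save(s.OutDir()); err != nil {
    s.Error("Failed saving perf data: ", err)
}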

type TimelineDatasource

type TimelineDatasource interface {
	Setup(ctx context.Context, prefix string) error
	Start(ctx context.Context) error
	Snapshot(ctx context.Context, values *Values) error
	Stop(ctx context.Context, values *Values) error
}

TimelineDatasource is an interface that is implemented to add a source of metrics to a Timeline.
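A minimal sketch of an implementation that samples one counter per snapshot; counterSource and its read callback are hypothetical, not part of this package:

// counterSource reports one value of a single metric per snapshot.
type counterSource struct {
    metric perf.Metric
    read   func(ctx context.Context) (float64, error) // supplied by the caller
}

func (c *counterSource) Setup(ctx context.Context, prefix string) error {
    // Apply the Timeline-wide prefix. The metric is multi-valued because
    // a timeline records one sample per snapshot interval.
    c.metric = perf.Metric{
        Name:      prefix + "my_counter",
        Unit:      "count",
        Direction: perf.SmallerIsBetter,
        Multiple:  true,
    }
    return nil
}

func (c *counterSource) Start(ctx context.Context) error { return nil }

func (c *counterSource) Snapshot(ctx context.Context, values *perf.Values) error {
    v, err := c.read(ctx)
    if err != nil {
        return err
    }
    values.Append(c.metric, v)
    return nil
}

func (c *counterSource) Stop(ctx context.Context, values *perf.Values) error { return nil }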

type Values

type Values struct {
	// contains filtered or unexported fields
}

Values holds performance metric values.

func NewValues

func NewValues() *Values

NewValues returns a new empty Values.

func NewValuesFromProto

func NewValuesFromProto(vs ...*perfpb.Values) *Values

NewValuesFromProto creates a Values from one or more perfpb.Values.

func (*Values) Append

func (p *Values) Append(s Metric, vs ...float64)

Append appends performance metric values. It can be called only for multi-valued performance metrics (Multiple: true).
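For example, samples accumulate across calls on a multi-valued metric; the metric name and numbers are made up:

frameTimes := perf.Metric{
    Name:      "frame_render_time",
    Unit:      "ms",
    Direction: perf.SmallerIsBetter,
    Multiple:  true,
}
pv := perf.NewValues()
pv.Append(frameTimes, 16.7, 16.9)
pv.Append(frameTimes, 33.4) // appended to the same metric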

func (*Values) Merge

func (p *Values) Merge(vs ...*Values)

Merge merges all data points of vs into this Values structure.
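For instance, a test could merge locally collected values with values received over gRPC; res is the response from the remote usage example above:

local := perf.NewValues()
... // Populate local with measurements taken on the host.
local.Merge(perf.NewValuesFromProto(res))
if err := local.Save(s.OutDir()); err != nil {
    s.Error("Failed saving perf data: ", err)
}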

func (*Values) Proto

func (p *Values) Proto() *perfpb.Values

Proto converts this Values to something that can be passed in a gRPC call.

func (*Values) Save

func (p *Values) Save(outDir string) error

Save saves performance metric values as a JSON file named and formatted for crosbolt. outDir should be the output directory path obtained from testing.State.

func (*Values) SaveAs

func (p *Values) SaveAs(ctx context.Context, outDir string, format Format) error

SaveAs saves performance metric values in the format provided to outDir. outDir should be the output directory path obtained from testing.State. format must be either Crosbolt or Chromeperf.
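For example, to write the Chromeperf format instead of the crosbolt default:

if err := pv.SaveAs(ctx, s.OutDir(), perf.Chromeperf); err != nil {
    s.Error("Failed saving perf data: ", err)
}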

func (*Values) Set

func (p *Values) Set(s Metric, vs ...float64)

Set sets the value(s) of a performance metric.

Directories

Path     Synopsis
perfpb   Package perfpb defines helpers for adding perf.Values to gRPC calls.
