Documentation ¶
Overview ¶
Package perf provides utilities to build a JSON file that can be uploaded to the Chrome Performance Dashboard (https://chromeperf.appspot.com/).
Measurements processed by this package are stored in tests/<test-name>/results-chart.json in the Tast results dir. The data is typically read by the Autotest TKO parser. To have metrics uploaded, they must be listed in src/third_party/autotest/files/tko/perf_upload/perf_dashboard_config.json.
Chrome Performance Dashboard docs can be found here: https://github.com/catapult-project/catapult/tree/master/dashboard
Usage example:
pv := perf.NewValues()
pv.Set(perf.Metric{
    Name:      "mytest_important_quantity",
    Unit:      "gizmos",
    Direction: perf.BiggerIsBetter,
}, 42)
if err := pv.Save(s.OutDir()); err != nil {
    s.Error("Failed saving perf data: ", err)
}
Remote usage example ¶
Protocol buffer definition:
import "values.proto"; service ExampleService { rpc Method (google.protobuf.Empty) returns (tast.common.perf.perfpb.Values) {} }
In order to "import values.proto", add a -I argument pointing at src/chromiumos/tast/common/perf/perfpb/ to the protoc command in your service's gen.go file. See src/chromiumos/tast/services/cros/arc/gen.go for an example.
Service:
import "chromiumos/tast/common/perf" import "chromiumos/tast/common/perf/perfpb" func (s *ExampleService) Method() (*perfpb.Values, error) { p := perf.NewValues() ... // Do some computation that generates perf values in p. return p.Proto(), nil }
Test:
import "chromiumos/tast/common/perf" func TestMethod(ctx context.Context, s *testing.State) { ... // Set up gRPC, ExampleServiceClient. res, err := service.Method() if err != nil { s.Fatal("RPC failed: ", err) } if err := perf.NewValuesFromProto(res).Save(s.OutDir()); err != nil { s.Fatal("Failed to save perf results: ", err) } }
Index ¶
Constants ¶
const DefaultVariantName = "summary"
DefaultVariantName is the default variant name treated specially by the dashboard.
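As a sketch of the variant mechanism, the two metrics below land in the same chart as separate series, while a metric with an empty Variant would be recorded under "summary"; the chart and variant names here are hypothetical:

pv := perf.NewValues()
// Two series in one chart ("load_time"), distinguished by Variant.
pv.Set(perf.Metric{
    Name:      "load_time", // hypothetical chart name
    Variant:   "cold_start",
    Unit:      "ms",
    Direction: perf.SmallerIsBetter,
}, 1250)
pv.Set(perf.Metric{
    Name:      "load_time",
    Variant:   "warm_start",
    Unit:      "ms",
    Direction: perf.SmallerIsBetter,
}, 320)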
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Clock ¶
Clock implementations are used to wait for a given time duration. In unit tests, fake Clocks should be used to avoid race conditions.
type Direction ¶
type Direction int
Direction indicates which direction of change (bigger or smaller) means improvement of a performance metric.
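For example (hypothetical metric names; SmallerIsBetter is the package's counterpart to the BiggerIsBetter constant used in the Overview example):

latency := perf.Metric{Name: "action_latency", Unit: "ms", Direction: perf.SmallerIsBetter}    // lower is better
throughput := perf.Metric{Name: "io_throughput", Unit: "MBps", Direction: perf.BiggerIsBetter} // higher is better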
type Metric ¶
type Metric struct {
    // Name is the name of the chart this performance metric appears in.
    Name string

    // Variant is the name of this performance metric in a chart. If this is empty,
    // DefaultVariantName is used. It is treated specially by the dashboard.
    // Charts containing only one performance metric should stick with the default.
    Variant string

    // Unit is a unit name to describe values of this performance metric.
    Unit string

    // Direction indicates which direction of change (bigger or smaller) means improvement
    // of this performance metric.
    Direction Direction

    // Multiple specifies if this performance metric can contain multiple values at a time.
    Multiple bool
}
Metric defines the schema of a performance metric.
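As a sketch, setting Multiple allows several values to accumulate under one metric via Values.Append; the metric name is hypothetical:

pv := perf.NewValues()
frameTime := perf.Metric{
    Name:      "frame_render_time", // hypothetical
    Unit:      "ms",
    Direction: perf.SmallerIsBetter,
    Multiple:  true,
}
// Append one sample per measured frame.
for _, ms := range []float64{16.6, 17.1, 15.9} {
    pv.Append(frameTime, ms)
}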
type NewTimelineOption ¶
type NewTimelineOption func(*NewTimelineOptions)
NewTimelineOption sets an optional parameter of NewTimeline.
func Interval ¶
func Interval(interval time.Duration) NewTimelineOption
Interval sets the interval between two subsequent metric snapshots.
func Prefix ¶
func Prefix(prefix string) NewTimelineOption
Prefix prepends all metric names with a given string.
func WithClock ¶
func WithClock(clock Clock) NewTimelineOption
WithClock sets a Clock implementation.
type NewTimelineOptions ¶
type NewTimelineOptions struct {
    // A prefix that is added to all metric names.
    Prefix string

    // The time duration between two subsequent metric snapshots. Default value is 10 seconds.
    Interval time.Duration

    // A different Clock implementation is used in Timeline unit tests to avoid sleeping in test code.
    Clock Clock
}
NewTimelineOptions holds all optional parameters of NewTimeline.
type Timeline ¶
type Timeline struct {
// contains filtered or unexported fields
}
Timeline collects performance metrics periodically on a common timeline.
func NewTimeline ¶
func NewTimeline(ctx context.Context, sources []TimelineDatasource, setters ...NewTimelineOption) (*Timeline, error)
NewTimeline creates a Timeline from a slice of TimelineDatasources. Metric names may be prefixed and callers can specify the time interval between two subsequent snapshots. This function calls the Setup method of each data source.
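A construction sketch, assuming sources holds TimelineDatasource implementations and treating the "mytest." prefix as a placeholder:

tl, err := perf.NewTimeline(ctx, sources,
    perf.Interval(5*time.Second),
    perf.Prefix("mytest."))
if err != nil {
    s.Fatal("Failed to create timeline: ", err)
}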
func (*Timeline) StartRecording ¶
StartRecording starts capturing metrics in a goroutine. The sampling interval is specified as a parameter of NewTimeline. StartRecording must not be called twice unless StopRecording is called in between.
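A recording-lifecycle sketch; Timeline.Start and Timeline.StopRecording are not listed in this excerpt, so the signatures assumed below are called out in the comment:

// Assumed signatures: Start(ctx) error, StartRecording(ctx) error,
// StopRecording(ctx) (*perf.Values, error).
if err := tl.Start(ctx); err != nil {
    s.Fatal("Failed to start timeline: ", err)
}
if err := tl.StartRecording(ctx); err != nil {
    s.Fatal("Failed to start recording: ", err)
}
... // Exercise the scenario under measurement.
vs, err := tl.StopRecording(ctx)
if err != nil {
    s.Fatal("Failed to stop recording: ", err)
}
if err := vs.Save(s.OutDir()); err != nil {
    s.Error("Failed saving perf data: ", err)
}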
type TimelineDatasource ¶
type TimelineDatasource interface {
    Setup(ctx context.Context, prefix string) error
    Start(ctx context.Context) error
    Snapshot(ctx context.Context, values *Values) error
    Stop(ctx context.Context, values *Values) error
}
TimelineDatasource is an interface that is implemented to add a source of metrics to a Timeline.
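For instance, a minimal datasource sketch that appends one counter value per snapshot could look like this; counterSource and its metric name are hypothetical, and only the four interface methods come from the package:

// counterSource is a hypothetical TimelineDatasource recording a counter.
type counterSource struct {
    metric perf.Metric
    count  float64
}

func (c *counterSource) Setup(ctx context.Context, prefix string) error {
    // Honor the prefix handed down by the Timeline (see the Prefix option).
    c.metric = perf.Metric{
        Name:      prefix + "counter",
        Unit:      "count",
        Direction: perf.BiggerIsBetter,
        Multiple:  true, // one value per snapshot
    }
    return nil
}

func (c *counterSource) Start(ctx context.Context) error { return nil }

func (c *counterSource) Snapshot(ctx context.Context, values *perf.Values) error {
    c.count++
    values.Append(c.metric, c.count)
    return nil
}

func (c *counterSource) Stop(ctx context.Context, values *perf.Values) error { return nil }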
type Values ¶
type Values struct {
// contains filtered or unexported fields
}
Values holds performance metric values.
func NewValuesFromProto ¶
NewValuesFromProto creates a Values from a perfpb.Values.
func (*Values) Append ¶
Append appends performance metric values. It can be called only for multi-valued performance metrics.
func (*Values) Save ¶
Save saves performance metric values as a JSON file named and formatted for crosbolt. outDir should be the output directory path obtained from testing.State.