compressio

package
v0.0.0-...-9c7a659
Published: Apr 24, 2024 License: Apache-2.0, MIT Imports: 11 Imported by: 2

Documentation

Overview

Package compressio provides parallel compression and decompression, as well as optional SHA-256 hashing. It also provides another storage variant (nocompressio) that does not compress data but tracks its integrity.

The stream format is defined as follows.

/------------------------------------------------------\
| chunk size (4-bytes)                                 |
+------------------------------------------------------+
| (optional) hash (32-bytes)                           |
+------------------------------------------------------+
| compressed data size (4-bytes)                       |
+------------------------------------------------------+
| compressed data                                      |
+------------------------------------------------------+
| (optional) hash (32-bytes)                           |
+------------------------------------------------------+
| compressed data size (4-bytes)                       |
+------------------------------------------------------+
| ......                                               |
\------------------------------------------------------/

where each subsequent hash is calculated from the following items, in order:

compressed data
compressed data size
previous hash

so that stream integrity cannot be compromised by swapping or mixing compressed chunks.

Index

Constants

This section is empty.

Variables

View Source
var ErrHashMismatch = errors.New("hash mismatch")

ErrHashMismatch is returned if a computed hash does not match the hash stored in the stream.

Functions

This section is empty.

Types

type Reader

type Reader struct {
	// contains filtered or unexported fields
}

Reader is a compressed reader.

func NewReader

func NewReader(in io.Reader, key []byte) (*Reader, error)

NewReader returns a new compressed reader. If key is non-nil, the data stream is assumed to contain expected hash values, which will be compared against hash values computed from the compressed bytes. See package comments for details.

func (*Reader) Read

func (r *Reader) Read(p []byte) (int, error)

Read implements io.Reader.Read.

type SimpleReader

type SimpleReader struct {
	// contains filtered or unexported fields
}

SimpleReader is a reader for an uncompressed image.

func NewSimpleReader

func NewSimpleReader(in io.Reader, key []byte) (*SimpleReader, error)

NewSimpleReader returns a new (uncompressed) reader. If key is non-nil, the data stream is assumed to contain expected hash values. See package comments for details.

func (*SimpleReader) Read

func (r *SimpleReader) Read(p []byte) (int, error)

Read implements io.Reader.Read.

type SimpleWriter

type SimpleWriter struct {
	// contains filtered or unexported fields
}

SimpleWriter is a writer that does not compress.

func NewSimpleWriter

func NewSimpleWriter(out io.Writer, key []byte) (*SimpleWriter, error)

NewSimpleWriter returns a new non-compressing writer. If key is non-nil, hash values are generated and written out for the stored bytes. See package comments for details.

func (*SimpleWriter) Close

func (w *SimpleWriter) Close() error

Close implements io.Closer.Close.

func (*SimpleWriter) Write

func (w *SimpleWriter) Write(p []byte) (int, error)

Write implements io.Writer.Write.

type Writer

type Writer struct {
	// contains filtered or unexported fields
}

Writer is a compressed writer.

func NewWriter

func NewWriter(out io.Writer, key []byte, chunkSize uint32, level int) (*Writer, error)

NewWriter returns a new compressed writer. If key is non-nil, hash values are generated and written out for compressed bytes. See package comments for details.

The recommended chunkSize is on the order of 1M. Extra memory may be buffered (in the form of read-ahead, or buffered writes), and is limited to O(chunkSize * [1+GOMAXPROCS]).

func (*Writer) Close

func (w *Writer) Close() error

Close implements io.Closer.Close.

func (*Writer) Write

func (w *Writer) Write(p []byte) (int, error)

Write implements io.Writer.Write.
