s3: github.com/lye/s3

package s3

import "github.com/lye/s3"

Package s3 provides a very simple interface for reading and writing blobs on S3 using the multipart REST API (which allows uploads larger than 5GB).


Package Files

doc.go error.go multipart.go s3.go

type S3

type S3 struct {
    // contains filtered or unexported fields
}

S3 provides a wrapper around your S3 credentials. It carries no other internal state and can be copied freely.

func NewS3

func NewS3(bucket, accessId, secret string) *S3

NewS3 allocates a new S3 with the provided credentials.

func (*S3) Get

func (s3 *S3) Get(path string) (io.ReadCloser, http.Header, error)

Get fetches content from S3, returning both a ReadCloser for the data and the HTTP headers returned by S3. You can use the headers to extract the Content-Type that the data was sent with.

func (*S3) Head

func (s3 *S3) Head(path string) (http.Header, error)

Head is similar to Get, but returns only the response headers. The response body is not transferred across the network. This is useful for checking if a file exists remotely, and what headers it was configured with.

func (*S3) Put

func (s3 *S3) Put(r io.Reader, size int64, path string, md5sum []byte, contentType string) error

Put uploads content to S3. The number of bytes that r will yield must be passed as size. md5sum optionally contains the MD5 hash of the content for end-to-end integrity checking; if nil, no checking is done. contentType optionally contains the MIME type to send to S3 as the Content-Type header; when the file is later accessed, S3 echoes this back in its response headers.

If the passed size exceeds 3GB, the multipart API is used; otherwise the single-request API is used. Note that the multipart API uploads in 7MB segments and computes a checksum for each one -- it does NOT use the passed md5sum, so there is no point supplying one when uploading huge files.

func (*S3) StartMultipart

func (s3 *S3) StartMultipart(path string) (*S3Multipart, error)

StartMultipart initiates a multipart upload.

func (*S3) Test

func (s3 *S3) Test() error

Test attempts to write and read back a single, short file from S3. It is intended for validating runtime configuration, so that invalid credentials fail fast at startup.

type S3Error

type S3Error struct {
    Code        int
    ShouldRetry bool
    Body        []byte
}

func (*S3Error) Error

func (err *S3Error) Error() string

type S3Multipart

type S3Multipart struct {
    // contains filtered or unexported fields
}

S3Multipart tracks the state of a multipart upload and provides an interface for streaming data to S3 in chunks. All methods on S3Multipart are serialized by an internal mutex, so concurrent use cannot corrupt the upload state.

func (*S3Multipart) Abort

func (mp *S3Multipart) Abort() error

Abort cancels the upload. If an upload is started but never completed, the uploaded parts continue to count against your AWS storage (and removing them later is difficult), so you should make sure either Abort or Complete is called.

For your convenience, Abort is set as the finalizer for S3Multipart objects as a failsafe, but you shouldn't rely on that.

func (*S3Multipart) AddPart

func (mp *S3Multipart) AddPart(r io.Reader, size int64, md5sum []byte) error

AddPart uploads the contents of r to S3 as the next part. The number of bytes that r will yield must be passed as size (otherwise the request cannot be signed). Optionally, md5sum may contain the MD5 hash of those bytes, which S3 verifies on its end; if md5sum is nil, no end-to-end integrity checking is performed. Per S3's API, size must be at least 5MB (5 * 1024 * 1024 bytes) for every part except the last. This is not enforced locally.

func (*S3Multipart) Complete

func (mp *S3Multipart) Complete(contentType string) error

Complete finalizes the upload, and should be called after all parts have been added.

type S3NewEndpointError

type S3NewEndpointError struct {
    Code     string
    Message  string
    Bucket   string
    Endpoint string
}

Package s3 imports 17 packages. Updated 2019-08-28. This is an inactive package (no imports and no commits in at least two years).