multipartdownloader

package module
Published: Apr 27, 2015 License: MIT Imports: 12 Imported by: 0

README

Multi-part file downloader

Download a file with multiple connections and multiple sources simultaneously.


Installation

# Build the executable
make

# Install as library
go get github.com/alvatar/multipart-downloader

Usage

godl [flags ...] [urls ...]

Flags:
    -n      Number of concurrent connections
    -S      A SHA-256 string to check the downloaded file
    -E      Verify using the ETag as an MD5 checksum
    -t      Timeout for all connections in milliseconds (default 5000)
    -o      Output file
    -v      Verbose output, show progress bars
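
For example, a download from two mirrors using four connections, a 10 second timeout, and a custom output file might be invoked as follows (the URLs and file name are only placeholders):

godl -n 4 -t 10000 -o ubuntu.iso -v http://mirror1.example.com/ubuntu.iso http://mirror2.example.com/ubuntu.iso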

Usage as library

import (
	"log"
	"time"

	md "github.com/alvatar/multipart-downloader"
)

urls := []string{
	"https://raw.githubusercontent.com/alvatar/multipart-downloader/master/test/quijote.txt",
	"https://raw.githubusercontent.com/alvatar/multipart-downloader/master/test/quijote2.txt",
}
nConns := 2
timeout := time.Duration(5000) * time.Millisecond
dldr := md.NewMultiDownloader(urls, nConns, timeout)

// Gather info from all sources
_, err := dldr.GatherInfo()

// Prepare the file to write the downloaded blocks to
// (*output is the destination file name, e.g. taken from a command-line flag)
_, err = dldr.SetupFile(*output)

// Perform the download, reporting per-connection progress through the callback
err = dldr.Download(func(feedback []md.ConnectionProgress) {
	log.Println(feedback)
})

// Verify the downloaded file; error handling is omitted here for brevity,
// check err after each call in real code
err = dldr.CheckSHA256("1e9bb1b16f8810e44d6d5ede7005258518fa976719bc2ed254308e73c357cfcc")
err = dldr.CheckMD5("45bb5fc96bb4c67778d288fba98eee48")

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func SetVerbose

func SetVerbose(verb bool)

Set verbosity for all log actions
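
For instance, verbose logging can be enabled before starting a download (assuming the package is imported as md, as in the README example):

md.SetVerbose(true)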

Types

type Chunk

type Chunk struct {
	Begin int64
	End   int64
}

Chunk boundaries

type ConnectionProgress

type ConnectionProgress struct {
	Id      int
	Begin   int64
	End     int64
	Current int64
}

Progress feedback type
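
As an illustration, a feedback callback could compute a per-connection completion percentage from these fields. The sketch below assumes that Current is the absolute byte offset reached within the [Begin, End] range, that the package is imported as md, and that dldr is the *MultiDownloader from the README example:

progress := func(feedback []md.ConnectionProgress) {
	for _, p := range feedback {
		if total := p.End - p.Begin; total > 0 {
			log.Printf("connection %d: %d%% done", p.Id, 100*(p.Current-p.Begin)/total)
		}
	}
}
err := dldr.Download(progress)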

type MultiDownloader

type MultiDownloader struct {
	ETag string // ETag (if available) of the file
	// contains filtered or unexported fields
}

The file downloader

func NewMultiDownloader

func NewMultiDownloader(urls []string, nConns int, timeout time.Duration) *MultiDownloader

func (*MultiDownloader) CheckMD5

func (dldr *MultiDownloader) CheckMD5(md5sum string) (err error)

Check MD5SUM of downloaded file

func (*MultiDownloader) CheckSHA256

func (dldr *MultiDownloader) CheckSHA256(sha256hash string) (err error)

Check SHA-256 of downloaded file

func (*MultiDownloader) Download

func (dldr *MultiDownloader) Download(feedbackFunc func([]ConnectionProgress)) (err error)

Perform the multipart download

The algorithm handles the download by splitting the file into n blocks. If a connection fails, it first retries with other sources (as different sources may have different connection limits); if it still fails, it waits until another connection has finished. Thus, nConns really means the MAXIMUM number of allowed connections, which is attempted at first and then adjusted downward as needed. The alternative approach of dividing the file into blocks of a fixed size and having spawned threads pull requests from a pool of tasks was discarded to avoid the overhead of performing potentially too many HTTP requests, since each thread would perform many requests instead of the minimum necessary.

In short, the algorithm tries to minimize the number of HTTP requests needed for a successful download.

As a result of this approach, the number of concurrent connections can drop if no source is available to accommodate a request. In any case, setting a reasonable limit is left to the user. Take into consideration that some servers may ban your IP for some amount of time if you flood them with too many requests.
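
As a rough illustration of the splitting step only, a file of a given size could be divided into n contiguous Chunk ranges as sketched below. This is not the package's implementation (GatherInfo computes the chunks internally), and the exact Begin/End boundary conventions are assumptions:

// splitIntoChunks divides size bytes into n contiguous ranges.
// Illustrative only; the package may use different boundary conventions.
func splitIntoChunks(size int64, n int) []md.Chunk {
	chunks := make([]md.Chunk, 0, n)
	chunkSize := size / int64(n)
	begin := int64(0)
	for i := 0; i < n; i++ {
		end := begin + chunkSize
		if i == n-1 {
			end = size // last chunk absorbs any remainder
		}
		chunks = append(chunks, md.Chunk{Begin: begin, End: end})
		begin = end
	}
	return chunks
}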

func (*MultiDownloader) GatherInfo

func (dldr *MultiDownloader) GatherInfo() (chunks []Chunk, err error)

Get the info of the file, using the HTTP HEAD request
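
For example, the returned chunk boundaries can be inspected before downloading (a minimal sketch; dldr is the *MultiDownloader from the README example):

chunks, err := dldr.GatherInfo()
if err != nil {
	log.Fatal(err)
}
for i, c := range chunks {
	log.Printf("chunk %d: bytes %d to %d", i, c.Begin, c.End)
}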

func (*MultiDownloader) SetupFile

func (dldr *MultiDownloader) SetupFile(filename string) (os.FileInfo, error)

Prepare the file used for writing the blocks of data
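
A minimal sketch of preparing the destination file and inspecting the returned os.FileInfo (the file name below is only a placeholder):

fi, err := dldr.SetupFile("output.bin")
if err != nil {
	log.Fatal(err)
}
log.Printf("prepared %s (%d bytes)", fi.Name(), fi.Size())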

