Documentation ¶
Overview ¶
Package remote allows migrations to be run from a remote source, such as S3.
To configure migrations to work with S3, use the migrations package as you normally would, but call `remote.InitS3` first. If you're using spf13/cobra, you might put the call in a PersistentPreRun function:
    rootCmd = &cobra.Command{
        PersistentPreRun: func(cmd *cobra.Command, args []string) {
            remote.InitS3(viper.GetString("s3.region"))
        },
        // ...
Make sure your S3 credentials are defined in ~/.aws/credentials, per the Amazon AWS instructions.
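If you haven't set up credentials before, a minimal ~/.aws/credentials file looks like the following (the key values here are placeholders, not real credentials):

```ini
[default]
aws_access_key_id     = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```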
You may copy the SQL migrations to an S3 bucket manually, or use the S3 push function:
    if err := remote.PushS3(viper.GetString("migrations"), viper.GetString("region"), viper.GetString("bucket")); err != nil {
        fmt.Fprintf(os.Stderr, "Failed to push migrations to S3: %s\n", err)
        os.Exit(1)
    }
The default "db create" command creates migrations in a local directory, with the expectation that you'll use a "db push" command based on the above to push the migrations to S3.
After the InitS3 call, you can run the remote migrations the same way you run standard, disk-based migrations, but pass in the bucket name instead of a migrations directory:
    conn, err := sql.Open("postgres", "postgres://....")
    if err != nil {
        return err
    }

    if err := migrations.Migrate(conn, viper.GetString("bucket"), viper.GetInt("revision")); err != nil {
        fmt.Fprintf(os.Stderr, "Failed to migrate: %s\n", err)
        os.Exit(1)
    }
See the remote/cmd package for examples (or feel free to use them in your own spf13/cobra and spf13/viper applications).
Index ¶
- Variables
- func InitS3(region string) error
- func PushS3(local, region, bucket string) error
- type S3Reader
- func (s3r *S3Reader) CreateDirectory(bucket string) error
- func (s3r *S3Reader) Exists(bucket, migration string) (time.Time, error)
- func (s3r *S3Reader) Files(bucket string) ([]string, error)
- func (s3r *S3Reader) PushS3(local, bucket string) error
- func (s3r *S3Reader) Read(path string) (io.Reader, error)
- func (s3r *S3Reader) WriteMigration(bucket, filename string, migration []byte) error
Constants ¶
This section is empty.
Variables ¶
var (
    // ErrInvalidPath is returned if the migration path is in an unexpected,
    // unparseable format.
    ErrInvalidPath = errors.New("invalid path; expects bucket/migration")

    // ErrNotFound is returned if the object or bucket doesn't exist.
    ErrNotFound = errors.New("not found")
)
Functions ¶
Types ¶
type S3Reader ¶
type S3Reader struct {
// contains filtered or unexported fields
}
S3Reader implements the migrations ReadWrite IO interface to support migrations in an S3 bucket.
func NewS3Reader ¶
NewS3Reader constructs a new S3 IO interface for SQL migrations.
func (*S3Reader) CreateDirectory ¶
CreateDirectory creates an S3 bucket for the migrations if not already present.
func (*S3Reader) Exists ¶
Exists checks if the migration file exists in the bucket. If it does, returns the last-modified timestamp of the migration. If not, returns ErrNotFound.
func (*S3Reader) PushS3 ¶
PushS3 copies the local migration files to the S3 bucket. If a file doesn't exist on S3, or the local timestamp is newer than the S3 timestamp, PushS3 pushes the migration to the bucket.

If the bucket doesn't exist, PushS3 creates it.
Directories ¶
Path | Synopsis
---|---
cmd | Package cmd for remote migrations provides some github.com/spf13/cobra commands that may be shared and used in components using the migrations remote package.