
package importccl

import "github.com/cockroachdb/cockroach/pkg/ccl/importccl"


Package Files

exportcsv.go import_processor.go import_stmt.go import_table_creation.go load.go read_import_avro.go read_import_base.go read_import_csv.go read_import_mysql.go read_import_mysqlout.go read_import_pgcopy.go read_import_pgdump.go read_import_workload.go


Variables

var NoFKs = fkHandler{/* contains filtered or unexported fields */}

NoFKs is used by formats that do not support FKs.

func Load

func Load(
    ctx context.Context,
    db *gosql.DB,
    r io.Reader,
    database, user, externalIODir, uri string,
    ts hlc.Timestamp,
    loadChunkBytes int64,
) (backupccl.BackupManifest, error)

Load converts r into SSTables and backup descriptors.

- database is the name of the database into which the SSTables will eventually be written.
- uri is the storage location (e.g. nodelocal://0/my/dir).
- ts is the time at which the MVCC data will be set.
- loadChunkBytes is the size at which to create a new SSTable (which will translate into a new range during restore); set to 0 to use the zone's default range max / 2.
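
The loadChunkBytes fallback can be sketched as follows. This is a hypothetical helper, not the actual importccl code, and defaultRangeMaxBytes is an assumed stand-in for the zone's configured range max:

```go
package main

import "fmt"

// defaultRangeMaxBytes is a hypothetical stand-in for the zone's
// default range max (the real value comes from the zone config).
const defaultRangeMaxBytes = 64 << 20

// effectiveChunkBytes sketches the rule described above: an explicit
// loadChunkBytes is used as-is, while 0 falls back to the zone's
// default range max / 2.
func effectiveChunkBytes(loadChunkBytes int64) int64 {
	if loadChunkBytes == 0 {
		return defaultRangeMaxBytes / 2
	}
	return loadChunkBytes
}

func main() {
	fmt.Println(effectiveChunkBytes(0))       // default: range max / 2
	fmt.Println(effectiveChunkBytes(1 << 20)) // explicit 1 MiB chunks
}
```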

func MakeSimpleTableDescriptor

func MakeSimpleTableDescriptor(
    ctx context.Context,
    semaCtx *tree.SemaContext,
    st *cluster.Settings,
    create *tree.CreateTable,
    parentID, parentSchemaID, tableID sqlbase.ID,
    fks fkHandler,
    walltime int64,
) (*sqlbase.MutableTableDescriptor, error)

MakeSimpleTableDescriptor creates a MutableTableDescriptor from a CreateTable parse node without the full machinery. Many parts of the syntax are unsupported (see the implementation and TestMakeSimpleTableDescriptorErrors for details), but this is enough for our csv IMPORT and for some unit tests.

Any occurrence of SERIAL in the column definitions is handled using the CockroachDB legacy behavior, i.e. INT NOT NULL DEFAULT unique_rowid().
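
As a rough illustration of that legacy expansion, the rewrite on a textual column definition would look like the following. expandSerial is a hypothetical helper for demonstration only; the real code operates on parse nodes, not strings:

```go
package main

import (
	"fmt"
	"strings"
)

// expandSerial sketches the legacy SERIAL behavior described above:
// a column typed SERIAL is treated as INT NOT NULL DEFAULT
// unique_rowid(). This is a string-level stand-in for what the real
// code does on the AST.
func expandSerial(colDef string) string {
	return strings.Replace(colDef, "SERIAL", "INT NOT NULL DEFAULT unique_rowid()", 1)
}

func main() {
	fmt.Println(expandSerial("id SERIAL"))
}
```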

func TestingGetDescriptorFromDB

func TestingGetDescriptorFromDB(
    ctx context.Context, db *gosql.DB, dbName string,
) (*sqlbase.ImmutableDatabaseDescriptor, error)

TestingGetDescriptorFromDB is a wrapper for getDescriptorFromDB.

func TestingSetParallelImporterReaderBatchSize

func TestingSetParallelImporterReaderBatchSize(s int) func()

TestingSetParallelImporterReaderBatchSize is a testing knob that modifies the csv input reader batch size. It returns a function that resets the value back to the default.
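
The set-and-return-a-reset-closure contract can be sketched like this (the variable name is a placeholder, since the real one is unexported):

```go
package main

import "fmt"

// parallelImporterReaderBatchSize is a placeholder for the unexported
// package-level default that the knob overrides.
var parallelImporterReaderBatchSize = 500

// setBatchSize mimics the knob's contract: override the value and
// return a closure that restores the previous one, so tests can write
// `defer setBatchSize(1)()`.
func setBatchSize(s int) func() {
	prev := parallelImporterReaderBatchSize
	parallelImporterReaderBatchSize = s
	return func() { parallelImporterReaderBatchSize = prev }
}

func main() {
	reset := setBatchSize(1)
	fmt.Println(parallelImporterReaderBatchSize) // overridden: 1
	reset()
	fmt.Println(parallelImporterReaderBatchSize) // restored: 500
}
```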

type WorkloadKVConverter

type WorkloadKVConverter struct {
    // contains filtered or unexported fields
}

WorkloadKVConverter converts workload.BatchedTuples to []roachpb.KeyValues.

func NewWorkloadKVConverter

func NewWorkloadKVConverter(
    fileID int32,
    tableDesc *sqlbase.TableDescriptor,
    rows workload.BatchedTuples,
    batchStart, batchEnd int,
    kvCh chan row.KVBatch,
) *WorkloadKVConverter

NewWorkloadKVConverter returns a WorkloadKVConverter for the given table and range of batches, emitting converted KVs to the given channel.

func (*WorkloadKVConverter) Worker

func (w *WorkloadKVConverter) Worker(ctx context.Context, evalCtx *tree.EvalContext) error

Worker can be called concurrently to create multiple workers that process batches in order. This keeps concurrently running workers on ~adjacent batches at any given moment (as opposed to handing large ranges of batches to each worker, e.g. 0-999 to worker 1, 1000-1999 to worker 2, etc.). This property is relevant when ordered workload batches produce ordered PK data: since the workers feed into a shared kvCh, contiguous blocks of PK data will usually be buffered together and thus batched together in the SST builder, minimizing the amount of overlapping SSTs ingested.

This worker needs its own EvalContext and DatumAlloc.
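
The batch-claiming scheme described above can be sketched as follows. This is a simplified, hypothetical model (processAll and its parameters are illustrative, not the real API): each worker repeatedly claims the next unprocessed batch index from a shared atomic counter, so active workers hold roughly adjacent batches at any moment rather than pre-assigned large disjoint ranges:

```go
package main

import (
	"fmt"
	"sync"
	"sync/atomic"
)

// processAll runs `workers` goroutines that each claim the next batch
// index from a shared atomic counter until the range is exhausted.
// It returns how many times each batch in [batchStart, batchEnd) was
// processed (which should be exactly once).
func processAll(batchStart, batchEnd int64, workers int) []int32 {
	next := batchStart
	processed := make([]int32, batchEnd-batchStart)
	var wg sync.WaitGroup
	for w := 0; w < workers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for {
				i := atomic.AddInt64(&next, 1) - 1 // claim the next batch
				if i >= batchEnd {
					return
				}
				// "Convert" batch i; adjacent indices are in flight
				// across workers at the same time.
				atomic.AddInt32(&processed[i-batchStart], 1)
			}
		}()
	}
	wg.Wait()
	return processed
}

func main() {
	counts := processAll(0, 100, 4)
	for _, c := range counts {
		if c != 1 {
			panic("a batch was skipped or processed twice")
		}
	}
	fmt.Println("all 100 batches processed exactly once")
}
```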

Package importccl imports 81 packages and is imported by 23 packages. Updated 2020-08-05.