blast

command module
v0.11.1
Published: May 8, 2023 License: Apache-2.0 Imports: 5 Imported by: 0

README

Transform, validate and run your data pipelines using SQL and Python.


Blast is a command-line tool for validating and running data transformations written in SQL, similar to dbt. On top of that, Blast can also run Python assets within the same pipeline.

  • ✨ run SQL transformations on BigQuery/Snowflake

  • 🐍 run Python in isolated environments

  • 💅 built-in data quality checks

  • 🚀 Jinja templating to avoid repetition

  • ✅ validate data pipelines end-to-end to catch issues early on via dry-runs against the live environment

  • 📐 table/view materialization

  • ➕ incremental tables

  • 💻 mix different technologies and databases in a single pipeline, e.g. SQL and Python side by side

  • ⚡ blazing-fast pipeline execution: Blast is written in Go and uses concurrency at every opportunity

Blast CLI

Join our community

We are excited to have you as part of our growing community! Connect with fellow users, share your experiences, and contribute to the development of Blast CLI. Here's how you can get involved:

  • Join our Blast Slack workspace to connect with other users, ask questions, and share your experiences on all things data.
  • Contribute to the Blast CLI repository, report issues, or suggest new features by creating a pull request or opening an issue.

We look forward to having you in our community!

Installation

You need to have Go installed first; then you can run the following command:

go install github.com/datablast-analytics/blast@latest

Please make sure $GOPATH/bin, where go install places binaries, is in your PATH so the blast executable can be found.
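
If it is not already there, you can add it in a Unix-like shell as follows (a minimal sketch; adjust the profile file for your shell):

# Add the directory where `go install` places binaries to PATH for the current session
export PATH="$PATH:$(go env GOPATH)/bin"

# Confirm the blast executable is now discoverable
command -v blast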

Getting Started

All you need is a simple pipeline.yml in your Git repo:

name: blast-example
schedule: "daily"
start_date: "2023-03-01"

default_connections:
  google_cloud_platform: "gcp"

Then create a new folder called assets and add your first asset there as assets/blast-test.sql:

-- @blast.name: dataset.blast_test
-- @blast.type: bq.sql
-- @blast.materialization.type: table

SELECT 1 as result

Blast will take this query and create a dataset.blast_test table on BigQuery. You can also use the view materialization type instead of table to create a view.
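
For example, a second asset materialized as a view could be created like this (a sketch; the file and asset names below are only illustrative):

cat > assets/blast-test-view.sql <<'EOF'
-- @blast.name: dataset.blast_test_view
-- @blast.type: bq.sql
-- @blast.materialization.type: view

SELECT 1 as result
EOF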

Snowflake assets: if you'd like to run the asset on Snowflake, simply replace bq.sql with sf.sql and define snowflake as a connection instead of google_cloud_platform.
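
As a sketch, a Snowflake version of the same asset could look like this (the file, schema, and asset names are illustrative; the snowflake connection itself is configured in .blast.yml as shown in the Environments section below):

cat > assets/blast-test-sf.sql <<'EOF'
-- @blast.name: my_schema.blast_test
-- @blast.type: sf.sql
-- @blast.materialization.type: table

SELECT 1 as result
EOF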

Then let's create a Python asset assets/hello.py:

# @blast.name: hello
# @blast.type: python
# @blast.depends: dataset.blast_test

print("Hello, world!")

Once you are done, run the following command to validate your pipeline:

blast validate .

You should get an output that looks like this:

Pipeline: blast-example (.)
  No issues found

✓ Successfully validated 2 tasks across 1 pipeline, all good.

If you have defined your credentials, Blast will automatically detect them and validate all of your queries using dry-run.

Environments

Blast allows you to run your pipelines / assets against different environments, such as development or production. The environments are managed in the .blast.yml file.

The following is an example configuration that defines two environments called default and production:

environments:
  default:
    connections:
      google_cloud_platform:
        - name: "gcp"
          service_account_file: "/path/to/my/key.json"
          project_id: "my-project-dev"
      snowflake:
        - name: "snowflake"
          username: "my-user"
          password: "my-password"
          account: "my-account"
          database: "my-database"
          warehouse: "my-warehouse"
          schema: "my-dev-schema"
  production:
    connections:
      google_cloud_platform:
        - name: "gcp"
          service_account_file: "/path/to/my/prod-key.json"
          project_id: "my-project-prod"
      snowflake:
        - name: "snowflake"
          username: "my-user"
          password: "my-password"
          account: "my-account"
          database: "my-database"
          warehouse: "my-warehouse"
          schema: "my-prod-schema" 

You can simply switch the environment using the --environment flag, e.g.:

blast validate --environment production . 

Running the pipeline

Blast CLI can run the whole pipeline, or any individual task together with its downstreams:

blast run .
Starting the pipeline execution...

[2023-03-16T18:25:14Z] [worker-0] Running: dashboard.blast-test
[2023-03-16T18:25:16Z] [worker-0] Completed: dashboard.blast-test (1.681s)
[2023-03-16T18:25:16Z] [worker-4] Running: hello
[2023-03-16T18:25:16Z] [worker-4] [hello] >> Hello, world!
[2023-03-16T18:25:16Z] [worker-4] Completed: hello (116ms)

Executed 2 tasks in 1.798s

You can also run a single task:

blast run assets/hello.py                            
Starting the pipeline execution...

[2023-03-16T18:25:59Z] [worker-0] Running: hello
[2023-03-16T18:26:00Z] [worker-0] [hello] >> Hello, world!
[2023-03-16T18:26:00Z] [worker-0] Completed: hello (103ms)


Executed 1 tasks in 103ms

You can optionally pass a --downstream flag to run the task with all of its downstreams.
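
For example, a sketch of running the SQL asset above together with everything that depends on it (here, the hello asset), following the flag placement shown in the earlier examples:

# Run the asset and all of its downstream tasks
blast run --downstream assets/blast-test.sql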

Upcoming Features

  • Secrets for Python assets
  • More databases: Postgres, Redshift, MySQL, and more

Disclaimer

Blast is still in its early stages, so please use it with caution. We are working on improving the documentation and adding more features.

If you are interested in a cloud data platform that does all of this and more as a managed service, check out Blast Data Platform.

Documentation


There is no documentation for this package.

Directories

  • pkg/git
