folx

package
v0.7.0

This package is not in the latest version of its module.
Published: Jan 14, 2020 License: Apache-2.0 Imports: 7 Imported by: 0

Documentation

Overview

Entry point for the Folx API, providing a convenience abstraction over the lex, rule and lint packages. Folx is a linter for text (prose) that uses cloud.google.com/natural-language/ to perform the lexical analysis. Folx's primary purpose is to identify issues in text relating to diversity and inclusion, e.g. gender, race, and disability. Customizable rule sets in the rule package define what constitutes an issue. For finer control of the linting process, use the functionality in the child packages.

Index

Examples

Constants

View Source
const DefaultAnalyseTimeout = 2 * time.Minute

Variables

This section is empty.

Functions

func Analyse

func Analyse(source string, rules rule.Book) (*lint.Report, error)

Analyse a source string against a rule book to generate a lint report, using a sensible default timeout (DefaultAnalyseTimeout).

Example
// Under the hood Analyse uses the cloud.google.com/natural-language/ API for lexical analysis
// The GCP client expects to find a local environment variable that points to a valid GCP key file
// I.E. os.Setenv("GOOGLE_APPLICATION_CREDENTIALS", PathToYourGCPKeyFile)

// Analyse takes a string value for linting
source := "This is the source text to analyse using Folx."

// Analyse requires a style guide containing rules to evaluate during the linting process
// In this example we use the default RegexChecker which accepts a regex pattern
// Other types of rules and functions for loading from a YAML file are available in the style package
book := rule.Book{}
_, err := book.NewCategorySet().NewRule(`(?i)\b(folx)\b`)
if err != nil {
	log.Fatalf("failed to create new rule: %v", err)
}

// Analyse() produces a report of the linting process with details on failed rules
// AnalyseWebPage() extracts visible text content from a given URL and returns a report
// AnalyseWebSite() crawls a domain to extract articles, and returns a report for each page
report, err := Analyse(source, book)
if err != nil {
	log.Fatalf("failed to analyse source text %q: %v", source, err)
}

fmt.Printf("Failed rule count: %v", report.FailedRules)
Output:

Failed rule count: 1

func AnalyseWebPage

func AnalyseWebPage(url string, rules rule.Book) (*lint.Report, error)

Analyse the visible text content (article) on a web page against a style guide to generate a lint report.
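A sketch of how AnalyseWebPage might be called, in the same style as the Analyse example above. The URL is illustrative, and as with Analyse, the GCP natural-language client expects GOOGLE_APPLICATION_CREDENTIALS to point at a valid key file. (Not runnable without GCP credentials.)

```go
// Build a rule book exactly as for Analyse
book := rule.Book{}
_, err := book.NewCategorySet().NewRule(`(?i)\b(folx)\b`)
if err != nil {
	log.Fatalf("failed to create new rule: %v", err)
}

// AnalyseWebPage fetches the page, extracts the visible article text,
// and lints it against the rule book; the URL below is illustrative
report, err := AnalyseWebPage("https://example.com/article", book)
if err != nil {
	log.Fatalf("failed to analyse web page: %v", err)
}

fmt.Printf("Failed rule count: %v", report.FailedRules)
```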

func AnalyseWebSite

func AnalyseWebSite(ex extract.ArticleExtractor, sourceURL string, rules rule.Book, onRep OnNewReportCallback) (*lint.ReportSet, error)

Analyse all the pages on a web site that contain an article, against a rule book to generate a lint report. This flavour of Analyse will crawl the domain of the given URL to collect articles for linting. For the crawling rules see the extract package. A report for each article (web page) is returned in a ReportSet.
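A sketch of driving AnalyseWebSite with a progress callback. How to obtain a concrete extract.ArticleExtractor is deliberately left open here (consult the extract package); the URL is illustrative, and GCP credentials are assumed as for Analyse.

```go
book := rule.Book{}
_, err := book.NewCategorySet().NewRule(`(?i)\b(folx)\b`)
if err != nil {
	log.Fatalf("failed to create new rule: %v", err)
}

// The OnNewReportCallback fires once per article as each report is
// generated, so progress can be surfaced during a long crawl
onReport := func(r *lint.Report) {
	fmt.Printf("page linted, failed rule count: %v\n", r.FailedRules)
}

// Obtain an ArticleExtractor from the extract package; see that
// package for the crawling rules and available implementations
var ex extract.ArticleExtractor

reports, err := AnalyseWebSite(ex, "https://example.com", book, onReport)
if err != nil {
	log.Fatalf("failed to analyse web site: %v", err)
}
_ = reports // the ReportSet holds one lint.Report per article found
```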

Types

type OnNewReportCallback

type OnNewReportCallback func(*lint.Report)

OnNewReportCallback is used by AnalyseWebSite to raise events when a new report is generated during a site crawl.

Directories

Path	Synopsis
lex	Package lex contains base types to tokenize text for the linter.
lex/ggl	Package ggl contains an implementation of lex.Lexer using cloud.google.com/natural-language/.
lint	Package lint contains core functionality used by Folx to check a text for compliance to a style guide.
rule	Package rule contains functionality to create rules for evaluation by a linter.
