robotstxt

package module
v0.0.0-...-523570d
Published: Nov 14, 2019 License: MIT Imports: 6 Imported by: 0

README

Robots Parser

A robots.txt parser written in Go, based on the Node.js robots-parser package.

It currently supports:

  • User-agent:
  • Allow:
  • Disallow:
  • Sitemap:
  • Crawl-delay:
  • Host:
  • URL encoded & UTF-8 paths
  • Paths with wildcards (*) and EOL matching ($), illustrated in the sketch after this list
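
For example, a pattern such as Disallow: /*.php$ should block any URL whose path ends in .php while leaving other paths alone. The sketch below is illustrative only: the user agent, URLs, and rules are made up, and the expected results follow the Google specification rather than verified output from this package.

package main

import (
    "log"

    "github.com/samclarke/robotstxt"
)

func main() {
    // Hypothetical rules exercising wildcard (*) and end-of-line ($) matching.
    robots, err := robotstxt.Parse(`
        User-agent: *
        Disallow: /*.php$
    `, "http://www.example.com/robots.txt")
    if err != nil {
        log.Fatalln(err)
    }

    // "/index.php" ends in ".php", so the $-anchored rule should match it.
    allowed, _ := robots.IsAllowed("Sams-Bot/1.0", "http://www.example.com/index.php")
    println(allowed) // expected: false

    // "/index.html" does not match the pattern, so it should be allowed.
    allowed, _ = robots.IsAllowed("Sams-Bot/1.0", "http://www.example.com/index.html")
    println(allowed) // expected: true
}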

Installation

Go get:

go get github.com/samclarke/robotstxt

Usage

import (
    "log"
    "strings"

    "github.com/samclarke/robotstxt"
)

func main() {
    url := "http://www.example.com/robots.txt"
    contents := `
        User-agent: *
        Disallow: /dir/
        Disallow: /test.html
        Allow: /dir/test.html
        Allow: /test.html
        Crawl-delay: 1
        Sitemap: http://example.com/sitemap.xml
        Host: example.com
    `

    // Parse the contents against the URL the file was fetched from.
    robots, err := robotstxt.Parse(contents, url)
    if err != nil {
        log.Fatalln(err.Error())
    }

    allowed, _ := robots.IsAllowed("Sams-Bot/1.0", "http://www.example.com/test.html")
    if !allowed {
        println("Not allowed to crawl: /test.html")
    }

    allowed, _ = robots.IsAllowed("Sams-Bot/1.0", "http://www.example.com/dir/test.html")
    if allowed {
        println("Allowed to crawl: /dir/test.html")
    }

    // 1s
    println("Crawl delay: " + robots.CrawlDelay("Sams-Bot/1.0").String())

    // [http://example.com/sitemap.xml]
    println("Sitemaps: " + strings.Join(robots.Sitemaps(), ","))

    // example.com
    println("Preferred host: " + robots.Host())
}

License

The MIT License (MIT)

Copyright (c) 2017 Sam Clarke

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.

Documentation

Overview

Package robotstxt parses robots.txt files

It aims to follow the Google robots.txt specification; see https://developers.google.com/search/reference/robots_txt for more information.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func NormaliseUserAgent

func NormaliseUserAgent(userAgent string) string

NormaliseUserAgent normalizes a user agent
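
Since NormaliseUserAgent is exported, callers can normalize an agent string before doing their own matching. The snippet below is a usage sketch only; exactly how the string is normalized (for example lower-casing or dropping a "/1.0" version suffix) is not documented here and is an assumption.

package main

import (
    "github.com/samclarke/robotstxt"
)

func main() {
    // Print the normalized form of a raw user-agent string.
    normalized := robotstxt.NormaliseUserAgent("Sams-Bot/1.0")
    println(normalized)
}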

Types

type InvalidHostError

type InvalidHostError struct{}

InvalidHostError is the error returned when a URL tested with IsAllowed is not valid for this robots.txt file

func (InvalidHostError) Error

func (e InvalidHostError) Error() string

type RobotsTxt

type RobotsTxt struct {
	// contains filtered or unexported fields
}

RobotsTxt represents a parsed robots.txt file

func Parse

func Parse(contents string, urlStr string) (robotsTxt *RobotsTxt, err error)

Parse parses the contents of a robots.txt file and returns a RobotsTxt struct that can be used to check whether URLs can be crawled, or to extract crawl delays, sitemaps, or the preferred host name
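
A common pattern is to fetch the robots.txt file over HTTP and pass its body to Parse. The following is a sketch built on the standard library, not part of this package; error handling is kept minimal and the URL is a placeholder.

package main

import (
    "io"
    "log"
    "net/http"

    "github.com/samclarke/robotstxt"
)

func main() {
    url := "http://www.example.com/robots.txt"

    // Download the robots.txt file.
    resp, err := http.Get(url)
    if err != nil {
        log.Fatalln(err)
    }
    defer resp.Body.Close()

    body, err := io.ReadAll(resp.Body)
    if err != nil {
        log.Fatalln(err)
    }

    // Parse the body against the URL it was fetched from, so host checks work.
    robots, err := robotstxt.Parse(string(body), url)
    if err != nil {
        log.Fatalln(err)
    }

    allowed, _ := robots.IsAllowed("Sams-Bot/1.0", "http://www.example.com/")
    println(allowed)
}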

func (*RobotsTxt) AddPathRule

func (r *RobotsTxt) AddPathRule(userAgent string, path string, isAllowed bool) error

AddPathRule adds an allow or disallow rule for the given path and user agent
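
A minimal sketch of adding a rule after parsing. The signature is documented above; the assumption that a rule added this way takes part in later IsAllowed checks follows from the method name rather than explicit documentation.

package main

import (
    "log"

    "github.com/samclarke/robotstxt"
)

func main() {
    robots, err := robotstxt.Parse("User-agent: *\n", "http://www.example.com/robots.txt")
    if err != nil {
        log.Fatalln(err)
    }

    // Disallow /internal/ for a specific user agent without editing the file contents.
    if err := robots.AddPathRule("Sams-Bot/1.0", "/internal/", false); err != nil {
        log.Fatalln(err)
    }

    allowed, _ := robots.IsAllowed("Sams-Bot/1.0", "http://www.example.com/internal/page.html")
    println(allowed) // expected: false
}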

func (*RobotsTxt) CrawlDelay

func (r *RobotsTxt) CrawlDelay(userAgent string) time.Duration

CrawlDelay returns the crawl delay for the specified user agent or 0 if there is none
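
Because CrawlDelay returns a time.Duration, the value can be passed straight to time.Sleep to pace requests. A short sketch follows; the directive value is assumed to be interpreted as seconds, as in the README example.

package main

import (
    "time"

    "github.com/samclarke/robotstxt"
)

func main() {
    robots, err := robotstxt.Parse("User-agent: *\nCrawl-delay: 2\n", "http://www.example.com/robots.txt")
    if err != nil {
        panic(err)
    }

    // Pause between requests to the same host for the advertised delay.
    if delay := robots.CrawlDelay("Sams-Bot/1.0"); delay > 0 {
        time.Sleep(delay)
    }
}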

func (*RobotsTxt) Host

func (r *RobotsTxt) Host() string

Host returns the preferred host from the robots.txt file, if there is one

func (*RobotsTxt) IsAllowed

func (r *RobotsTxt) IsAllowed(userAgent string, urlStr string) (result bool, err error)

IsAllowed checks whether the specified URL may be crawled by the specified user agent according to the robots.txt file
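
IsAllowed also returns an error when the tested URL does not belong to the host the robots.txt file was parsed for. A sketch of handling that case, assuming the error returned is the InvalidHostError described above:

package main

import (
    "errors"
    "log"

    "github.com/samclarke/robotstxt"
)

func main() {
    robots, err := robotstxt.Parse("User-agent: *\nDisallow: /private/\n", "http://www.example.com/robots.txt")
    if err != nil {
        log.Fatalln(err)
    }

    // This URL is on a different host, so IsAllowed is expected to return an error.
    _, err = robots.IsAllowed("Sams-Bot/1.0", "http://other.example.org/page.html")

    var invalidHost robotstxt.InvalidHostError
    if errors.As(err, &invalidHost) {
        log.Println("URL is not covered by this robots.txt file:", err)
    }
}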

func (*RobotsTxt) Sitemaps

func (r *RobotsTxt) Sitemaps() []string

Sitemaps returns the sitemap URLs listed in the robots.txt file, if any
