Web crawler

A small web crawler that follows all the links on a site and then indexes the sites it finds.
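The crawling idea (fetch a page, collect its outgoing links, and repeat for each linked site) can be sketched roughly as below. This is only an illustrative sketch, not the project's actual code: the extractLinks function and the use of golang.org/x/net/html are assumptions made for the example.

```go
package main

import (
	"fmt"
	"net/http"

	"golang.org/x/net/html"
)

// extractLinks fetches a page and returns the href values of its <a> tags.
// Illustrative only; not taken from this repository.
func extractLinks(url string) ([]string, error) {
	resp, err := http.Get(url)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()

	doc, err := html.Parse(resp.Body)
	if err != nil {
		return nil, err
	}

	var links []string
	var visit func(n *html.Node)
	visit = func(n *html.Node) {
		if n.Type == html.ElementNode && n.Data == "a" {
			for _, a := range n.Attr {
				if a.Key == "href" {
					links = append(links, a.Val)
				}
			}
		}
		for c := n.FirstChild; c != nil; c = c.NextSibling {
			visit(c)
		}
	}
	visit(doc)
	return links, nil
}

func main() {
	links, err := extractLinks("https://github.com/nireo")
	if err != nil {
		fmt.Println("fetch failed:", err)
		return
	}
	for _, l := range links {
		fmt.Println(l)
	}
}
```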

Running the code

There are two "modes" in this program: the server, which hosts the results, and the crawler, which indexes new sites. Run the code like any other Go program:

go run main.go

There are two flags: start (the starting URL) and display. They can be provided as follows:

go run main.go -start="https://github.com/nireo" -display

If no starting address is specified, the program hosts the server instead. New indexing results are displayed by default, so passing the display flag turns that output off.
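A rough sketch of how the two flags could select between the two modes is shown below. The runServer and runCrawler functions are hypothetical placeholders, not names from this repository; the flag names follow the example command above.

```go
package main

import "flag"

func main() {
	start := flag.String("start", "", "URL to start crawling from")
	display := flag.Bool("display", false, "toggle printing of new indexing results")
	flag.Parse()

	if *start == "" {
		// No starting address: host the server that serves indexed results.
		runServer()
		return
	}
	// Results are printed by default; passing -display turns printing off.
	runCrawler(*start, !*display)
}

// Hypothetical stubs so the sketch compiles on its own.
func runServer() {}

func runCrawler(startURL string, show bool) {}
```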

Contributing

You can create a pull request if you're interested in contributing to the project. This is highly encouraged!
