twitterscraper

package module
v0.0.0-...-d375bbe
Published: Jun 19, 2020 License: MIT Imports: 10 Imported by: 0

README

Twitter Scraper

Golang implementation of the Python library https://github.com/kennethreitz/twitter-scraper

Twitter's API is annoying to work with and has lots of limitations. Luckily, their frontend (JavaScript) has its own API, which I reverse-engineered. No API rate limits. No tokens needed. No restrictions. Extremely fast.

You can use this library to get the text of any user's Tweets trivially.

Usage

Get user tweets
package main

import (
    "fmt"
    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    for tweet := range twitterscraper.GetTweets("kennethreitz", 25) {
        if tweet.Error != nil {
            panic(tweet.Error)
        }
        fmt.Println(tweet.HTML)
    }
}

It appears you can ask for up to 25 pages of tweets reliably (~486 tweets).
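
If you would rather work with a slice than stream results from the channel, you can drain the channel first. A minimal sketch, reusing GetTweets as above and counting what comes back:

package main

import (
    "fmt"
    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    // Collect all streamed results into a slice before processing them.
    var tweets []*twitterscraper.Tweet
    for result := range twitterscraper.GetTweets("kennethreitz", 25) {
        if result.Error != nil {
            panic(result.Error)
        }
        tweets = append(tweets, &result.Tweet)
    }
    fmt.Printf("collected %d tweets\n", len(tweets))
}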

Search tweets by query with standard operators

Tweets containing “twitter” and “scraper” and “data”, filtering out retweets:

package main

import (
    "fmt"
    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    for tweet := range twitterscraper.SearchTweets("twitter scraper data -filter:retweets", 50) {
        if tweet.Error != nil {
            panic(tweet.Error)
        }
        fmt.Println(tweet.HTML)
    }
}

The search ends once 50 tweets have been collected.

See https://developer.twitter.com/en/docs/tweets/rules-and-filtering/overview/standard-operators for how to build standard queries.
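
For example, the standard from:, since: and until: operators can be combined to pull tweets from one account within a date range. These operators are part of Twitter's standard search syntax rather than anything specific to this library; a minimal sketch:

package main

import (
    "fmt"
    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    // from:, since: and until: are standard Twitter search operators.
    query := "from:kennethreitz since:2020-01-01 until:2020-06-01"
    for tweet := range twitterscraper.SearchTweets(query, 50) {
        if tweet.Error != nil {
            panic(tweet.Error)
        }
        fmt.Println(tweet.PermanentURL)
    }
}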

Get profile
package main

import (
    "fmt"
    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    profile, err := twitterscraper.GetProfile("kennethreitz")
    if err != nil {
        panic(err)
    }
    fmt.Printf("%+v\n", profile)
}
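
Since Profile is a plain struct (see the Types section below), individual fields can be read directly instead of dumping the whole value; a minimal sketch:

package main

import (
    "fmt"
    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    profile, err := twitterscraper.GetProfile("kennethreitz")
    if err != nil {
        panic(err)
    }
    // Print a few selected profile fields.
    fmt.Println(profile.Name, "-", profile.Biography)
    fmt.Printf("%d followers, %d tweets\n", profile.FollowersCount, profile.TweetsCount)
}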

Get trends
package main

import (
    "fmt"
    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    trends, err := twitterscraper.GetTrends()
    if err != nil {
        panic(err)
    }
    fmt.Println(trends)
}
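
GetTrends returns a plain slice of strings, so a trend can be fed straight into SearchTweets. A minimal sketch searching the first trending topic (the 10-tweet limit is arbitrary):

package main

import (
    "fmt"
    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    trends, err := twitterscraper.GetTrends()
    if err != nil {
        panic(err)
    }
    if len(trends) == 0 {
        return
    }
    // Search tweets for the first trending topic.
    for tweet := range twitterscraper.SearchTweets(trends[0], 10) {
        if tweet.Error != nil {
            panic(tweet.Error)
        }
        fmt.Println(tweet.Text)
    }
}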

Installation

go get -u github.com/n0madic/twitter-scraper

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func GetTrends

func GetTrends() ([]string, error)

GetTrends returns a list of trends.

func GetTweets

func GetTweets(user string, pages int) <-chan *Result

GetTweets returns a channel with tweets for a given user.

func RandomAgent

func RandomAgent() string

func SearchTweets

func SearchTweets(query string, maxTweetsNbr int) <-chan *Result

SearchTweets returns a channel with tweets for a given search query.

Types

type Profile

type Profile struct {
	Avatar         string
	Biography      string
	Birthday       string
	FollowersCount int
	FollowingCount int
	Joined         *time.Time
	LikesCount     int
	Location       string
	Name           string
	TweetsCount    int
	URL            string
	Username       string
	Website        string
}

Profile of a Twitter user.

func GetProfile

func GetProfile(username string) (Profile, error)

GetProfile returns the parsed user profile.

type Result

type Result struct {
	Tweet
	Error error
}

Result of scraping.

type Tweet

type Tweet struct {
	Hashtags     []string
	HTML         string
	ID           string
	User         string
	IsPin        bool
	IsRetweet    bool
	Likes        int
	PermanentURL string
	Photos       []string
	Replies      int
	Retweets     int
	Text         string
	TimeParsed   time.Time
	Timestamp    int64
	URLs         []string
	Videos       []Video
	VideosURL    []string
}

Tweet type.
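
Because Result embeds Tweet, all of these fields are available directly on the values streamed from GetTweets and SearchTweets; a minimal sketch printing a few of them:

package main

import (
    "fmt"
    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    for tweet := range twitterscraper.GetTweets("kennethreitz", 1) {
        if tweet.Error != nil {
            panic(tweet.Error)
        }
        // Print selected fields of each tweet.
        fmt.Println(tweet.ID, tweet.TimeParsed, tweet.Likes, tweet.Retweets)
        for _, photo := range tweet.Photos {
            fmt.Println("  photo:", photo)
        }
    }
}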

func FetchSearchTweets

func FetchSearchTweets(query, maxId string) ([]*Tweet, error)

FetchSearchTweets gets tweets for a given search query, via the Twitter frontend API.

func FetchTweets

func FetchTweets(user string, last string) ([]*Tweet, error)

FetchTweets gets tweets for a given user, via the Twitter frontend API.
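
FetchTweets and FetchSearchTweets return one batch at a time, which the channel-based GetTweets and SearchTweets appear to build on. A hedged sketch of manual pagination, assuming the second argument is the ID of the last tweet from the previous batch (empty string for the first page):

package main

import (
    "fmt"
    twitterscraper "github.com/n0madic/twitter-scraper"
)

func main() {
    last := "" // assumption: an empty cursor fetches the first page
    for page := 0; page < 3; page++ {
        tweets, err := twitterscraper.FetchTweets("kennethreitz", last)
        if err != nil {
            panic(err)
        }
        if len(tweets) == 0 {
            break
        }
        for _, tweet := range tweets {
            fmt.Println(tweet.Text)
        }
        // assumption: the last tweet's ID acts as the cursor for the next batch
        last = tweets[len(tweets)-1].ID
    }
}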

type Video

type Video struct {
	ID      string
	Preview string
}

Video type.
