A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker

Status

lines of R code: 598, lines of test code: 1216

Development version

0.5.0 - 2017-11-11 / 22:01:12

Description

Provides functions to download and parse 'robots.txt' files. Ultimately the package makes it easy to check if bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.

License

MIT + file LICENSE

Authors: Peter Meissner [aut, cre], Oliver Keys [ctb], Rich Fitz John [ctb]

Citation

citation("robotstxt")

BibTex for citing

toBibtex(citation("robotstxt"))

Contribution - AKA The-Think-Twice-Be-Nice-Rule

Please note that this project is released with a Contributor Code of Conduct. By participating in this project you agree to abide by its terms:

As contributors and maintainers of this project, we pledge to respect all people who contribute through reporting issues, posting feature requests, updating documentation, submitting pull requests or patches, and other activities.

We are committed to making participation in this project a harassment-free experience for everyone, regardless of level of experience, gender, gender identity and expression, sexual orientation, disability, personal appearance, body size, race, ethnicity, age, or religion.

Examples of unacceptable behavior by participants include the use of sexual language or imagery, derogatory comments or personal attacks, trolling, public or private harassment, insults, or other unprofessional conduct.

Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct. Project maintainers who do not follow the Code of Conduct may be removed from the project team.

Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by opening an issue or contacting one or more of the project maintainers.

This Code of Conduct is adapted from the Contributor Covenant (http://contributor-covenant.org), version 1.0.0, available at http://contributor-covenant.org/version/1/0/0/

Installation

Installation and start - stable version

install.packages("robotstxt")
library(robotstxt)

Installation and start - development version

devtools::install_github("ropenscilabs/robotstxt")
library(robotstxt)
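
To confirm which version ended up installed, base R's packageVersion() can be queried (the output naturally depends on the version installed):

packageVersion("robotstxt")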

Usage

Robotstxt class documentation

?robotstxt
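
Besides fetching a live file by domain, the constructor documented there can also be given robots.txt rules as text; a small sketch, assuming the text argument as described in ?robotstxt (the rules are made up for illustration, and depending on the package version a note about the missing domain may be issued):

library(robotstxt)

# build a robotstxt object from hand-written rules instead of a live domain
rt <- robotstxt(
  text = "User-agent: *\nDisallow: /private/\nAllow: /public/"
)
rt$check(paths = c("/public/index.html", "/private/data"), bot = "*")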

Simple path access right checking ...

library(robotstxt)

paths_allowed(
  paths  = c("/api/rest_v1/?doc", "/w/"), 
  domain = "wikipedia.org", 
  bot    = "*"
)
##  wikipedia.org
## [1]  TRUE FALSE

paths_allowed(
  paths = c(
    "https://wikipedia.org/api/rest_v1/?doc", 
    "https://wikipedia.org/w/"
  )
)
##  wikipedia.org
##  wikipedia.org
## [1]  TRUE FALSE

... or use it this way ...

library(robotstxt)

rtxt <- robotstxt(domain = "wikipedia.org")
rtxt$check(paths = c("/api/rest_v1/?doc", "/w/"), bot = "*")
## /api/rest_v1/?doc               /w/ 
##              TRUE             FALSE
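
The lower-level helpers can also be used directly, e.g. to inspect the raw file and its parsed rules; a brief sketch using get_robotstxt() and parse_robotstxt() from the package (the element names of the parsed result may differ between package versions):

library(robotstxt)

# download the raw robots.txt as text
rt_text <- get_robotstxt(domain = "wikipedia.org")

# parse it into its components (user agents, permissions, comments, ...)
rt_parsed <- parse_robotstxt(rt_text)
rt_parsed$permissions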

More information

Have a look at the vignette at https://cran.r-project.org/package=robotstxt/vignettes/using_robotstxt.html

Monthly Downloads

1,926

Version

0.5.2

License

MIT + file LICENSE

Maintainer

Peter Meissner

Last Published

November 12th, 2017

Functions in robotstxt (0.5.2)

parse_robotstxt: function parsing robots.txt
path_allowed: check if a bot has permissions to access page
rt_get_fields: extracting permissions from robots.txt
rt_get_fields_worker: extracting robotstxt fields
rt_get_useragent: extracting HTTP useragents from robots.txt
rt_get_rtxt: load robots.txt files saved along with the package
is_valid_robotstxt: function that checks if file is valid / parsable robots.txt file
named_list: make automatically named list
print.robotstxt_text: printing robotstxt_text
print.robotstxt: printing robotstxt
sanitize_permission_values: transforming permissions into regular expressions (values)
sanitize_permissions: transforming permissions into regular expressions (whole permission)
paths_allowed: check if a bot has permissions to access page(s)
paths_allowed_worker_robotstxt: paths_allowed_worker for robotstxt flavor
remove_domain: function to remove domain from path
robotstxt: Generate a representation of a robots.txt file
get_robotstxts: function to get multiple robotstxt files
guess_domain: function guessing domain from path
paths_allowed_worker_spiderbar: paths_allowed_worker spiderbar flavor
%>%: re-export magrittr pipe operator
rt_list_rtxt: list robots.txt files saved along with the package
sanitize_path: making paths uniform
get_robotstxt: downloading robots.txt file
get_robotstxt_http_get: get_robotstxt() worker function to execute HTTP request
rt_cache: get_robotstxt() cache
rt_get_comments: extracting comments from robots.txt
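
Several of the helpers above can be combined for offline experiments with the robots.txt files shipped with the package; a small sketch using rt_list_rtxt(), rt_get_rtxt(), and is_valid_robotstxt() (which files are bundled depends on the installed version):

library(robotstxt)

# list the example robots.txt files bundled with the package ...
rt_list_rtxt()

# ... load one of them and check that it parses as a valid robots.txt
txt <- rt_get_rtxt(rt_list_rtxt()[1])
is_valid_robotstxt(txt)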