Commits

Recent commits on branch master:

  • 4e7d97580070c70e9e67e6da15457141342d3b9c: Release version 0.2.0 🎉🎉 (Folyd, 5 years ago)
  • b237be03e784a99c5de39b37c3bcc64e44e7f6b4: Improve docs testings (Folyd, 5 years ago)
  • e1b392fa72a9974ff7fe6d29b58e4fa51e17ab0e: Improve the documentations (Folyd, 5 years ago)
  • d563b6a0f6f7a0ec57536edf30b782ecbc1b55e2: Update README (Folyd, 5 years ago)
  • 25a8d39bfcfe090664eedb201069bc84b262c939: Passed all google testings 🎉🥳🥳 (Folyd, 5 years ago)
  • a4cd2fdd3ba054adc10cbc30a4cb85309c0972a0: Passed test_google_only_line_too_long testing (Folyd, 5 years ago)

README

robotstxt

Crates.io · Docs.rs · Apache 2.0

A native Rust port of Google's robots.txt parser and matcher C++ library.

  • Native Rust port with no third-party crate dependencies
  • Zero unsafe code
  • Preserves all behavior of the original library
  • Consistent API with the original library
  • Passes 100% of Google's original test suite

Installation

Add the crate to your Cargo.toml:

[dependencies]
robotstxt = "0.3.0"

Quick start

The matcher parses a robots.txt body and reports whether a given user agent may fetch a URL:

use robotstxt::DefaultMatcher;

fn main() {
    let mut matcher = DefaultMatcher::default();
    let robots_body = "user-agent: FooBot\n\
                       disallow: /\n";
    assert_eq!(
        false,
        matcher.one_agent_allowed_by_robots(robots_body, "FooBot", "https://foo.com/")
    );
}
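For a slightly fuller picture, the same documented call can illustrate the REP's longest-match precedence between allow and disallow rules. The robots.txt body, agents, URLs, and the allowed helper below are made-up illustrations, and the assertions assume the port matches the C++ semantics (the most specific rule wins; an agent with no matching group is allowed by default):

use robotstxt::DefaultMatcher;

// Hypothetical convenience wrapper; uses a fresh matcher per check
// so nothing is assumed about reusing matcher state across calls.
fn allowed(robots_body: &str, agent: &str, url: &str) -> bool {
    let mut matcher = DefaultMatcher::default();
    matcher.one_agent_allowed_by_robots(robots_body, agent, url)
}

fn main() {
    // Hypothetical rules: FooBot may crawl /public/ but nothing else.
    let robots_body = "user-agent: FooBot\n\
                       allow: /public/\n\
                       disallow: /\n";

    // Longest-match precedence: "allow: /public/" beats "disallow: /".
    assert!(allowed(robots_body, "FooBot", "https://foo.com/public/page"));

    // Everything else falls under "disallow: /".
    assert!(!allowed(robots_body, "FooBot", "https://foo.com/private/page"));

    // BarBot matches no group, so no rule restricts it.
    assert!(allowed(robots_body, "BarBot", "https://foo.com/private/page"));
}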

About

Quoting the README from Google's robots.txt parser and matcher repo:

The Robots Exclusion Protocol (REP) is a standard that enables website owners to control which URLs may be accessed by automated clients (i.e. crawlers) through a simple text file with a specific syntax. It's one of the basic building blocks of the internet as we know it and what allows search engines to operate.

Because the REP was only a de-facto standard for the past 25 years, different implementers implement parsing of robots.txt slightly differently, leading to confusion. This project aims to fix that by releasing the parser that Google uses.

The library is slightly modified (i.e. some internal headers and equivalent symbols) production code used by Googlebot, Google's crawler, to determine which URLs it may access based on rules provided by webmasters in robots.txt files. The library is released open-source to help developers build tools that better reflect Google's robots.txt parsing and matching.

Crate robotstxt aims to be a faithful conversion, from C++ to Rust, of Google's robots.txt parser and matcher.

Testing

The crate is verified against Google's original C++ test suite, driven through CMake:

$ git clone https://github.com/Folyd/robotstxt
Cloning into 'robotstxt'...
$ cd robotstxt/tests 
...
$ mkdir c-build && cd c-build
...
$ cmake ..
...
$ make
...
$ make test
Running tests...
Test project ~/robotstxt/tests/c-build
    Start 1: robots-test
1/1 Test #1: robots-test ......................   Passed    0.33 sec
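
Independently of the C++ harness above, the crate's own Rust unit and documentation tests can presumably be run with the standard Cargo workflow (ordinary Cargo usage, not anything specific to this repository):

$ cargo test
...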

License

The robotstxt parser and matcher Rust library is licensed under the terms of the Apache License, Version 2.0. See LICENSE for more information.