
requests-robotstxt

public
4 stars
4 forks
0 issues

Commits

List of commits on branch master.
Unverified
0c5f6b46983c4312630f055a607596e71688cb93

Merge pull request #5 from AJMansfield/patch-1

ambv committed 7 years ago
Unverified
c5de0d03434aa108141cd6c13470b5923b239e6c

Fixed Travis Configuration

AJMansfield committed 7 years ago
Unverified
73e4369f7dfbfb25826f7d0345ebab37a1aa6201

Fixed PreparedRequests issue and unit tests

AJMansfield committed 7 years ago
Unverified
c61de38c2fe08ef69f611cee5e3c74183f43d3f8

fixed missing extra parameters

AJMansfield committed 7 years ago
Unverified
17e6d767f37d7b584ffcd9da4a017387ba6cb227

Refactor get_rules out of is_allowed.

AJMansfield committed 7 years ago
Unverified
3f66d98d550e08fbfe7058fc1bebfb5b005cea84

publish a todo with the missing parts explained

ambv committed 11 years ago

README

The README file for this repository.

==================
requests-robotstxt
==================

.. image:: https://secure.travis-ci.org/ambv/requests-robotstxt.png
   :target: https://secure.travis-ci.org/ambv/requests-robotstxt

Currently just a proof of concept, this module strives to be an extension to `requests <http://pypi.python.org/pypi/requests>`_ that brings automatic support for ``robots.txt``.

How to use
----------

Simply use ``RobotsAwareSession`` instead of the built-in ``requests.Session``. If the server's ``robots.txt`` does not allow access to a resource, a ``RobotsTxtDisallowed`` exception is raised.
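
For example, a minimal sketch, assuming both names are importable straight from the ``requests_robotstxt`` package (the exact import path is an assumption)::

  # import path assumed; adjust to wherever the classes live in your install
  from requests_robotstxt import RobotsAwareSession, RobotsTxtDisallowed

  session = RobotsAwareSession()
  try:
      # robots.txt for the target host is fetched and consulted automatically
      response = session.get('http://example.com/private/page')
      print(response.status_code)
  except RobotsTxtDisallowed:
      print('Fetching this URL is disallowed by robots.txt')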

How do I run the tests?
-----------------------

The easiest way would be to extract the source tarball and run::

  $ python test/test_robotstxt.py

Change Log
----------

0.1.0
~~~~~

* initial published version

Authors
-------

Glued together by `Łukasz Langa <mailto:lukasz@langa.pl>`_.