Releases: serpwings/pyrobotstxt

support for reading remote robots.txt files

18 Mar 00:23

Support for reading remote robots.txt files as discussed in #2
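Reading a remote robots.txt can be sketched with Python's standard library; this is an illustrative example, not pyrobotstxt's own API, whose method names may differ:

```python
from urllib.robotparser import RobotFileParser

# Hedged sketch: fetch and query a remote robots.txt using the standard
# library's RobotFileParser. pyrobotstxt's reader interface may differ.
def read_remote_robots(url: str) -> RobotFileParser:
    """Download and parse a robots.txt from a remote URL."""
    parser = RobotFileParser()
    parser.set_url(url)
    parser.read()  # fetches the file over HTTP(S) and parses it
    return parser

# Offline demonstration of the same parser, fed from a string:
rules = RobotFileParser()
rules.parse("User-agent: *\nDisallow: /private/\nCrawl-delay: 5".splitlines())
print(rules.can_fetch("*", "https://example.com/private/page"))  # False
print(rules.crawl_delay("*"))  # 5
```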

updated urls and licenses

09 Dec 00:36

Updated URLs pointing to https://serpwings.com and updated the licensing information.

Version with Crawl_delay Support

04 Dec 13:49

List of Updates

  • Updated Documentation
  • Support for Crawl_delay: See #1 for more details
  • Working Example. See #3 for more details
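
What Crawl_delay support adds to a generated robots.txt can be illustrated with a small helper; the function name and parameters here are hypothetical, not pyrobotstxt's actual API:

```python
# Hypothetical helper showing where a Crawl-delay directive lands in the
# generated output; pyrobotstxt's real method names may differ.
def render_robots(user_agent, disallow, crawl_delay=None):
    lines = [f"User-agent: {user_agent}"]
    lines += [f"Disallow: {path}" for path in disallow]
    if crawl_delay is not None:
        # Crawl-delay asks compliant crawlers to wait N seconds between requests.
        lines.append(f"Crawl-delay: {crawl_delay}")
    return "\n".join(lines) + "\n"

text = render_robots("*", ["/tmp/"], crawl_delay=10)
print(text)
```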

initial release for PyPI

04 Dec 02:21
23c10cf

Similar to v0.0.1, but now publicly available on PyPI.

Initial Release

04 Dec 02:06

Basic robots.txt file creation functionality.

You can do the following:

  • Add Header/footer Comments
  • Add Creation Date
  • Add and Remove Allowed Items/Pages/Routes
  • Add and Remove Disallowed Items/Pages/Routes
  • Add/Remove Sitemaps
  • Include Fancy ASCII Images
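
The listed features can be sketched as a tiny builder class; all names below are hypothetical illustrations of the feature list, not pyrobotstxt's actual API:

```python
from datetime import date

# Illustrative sketch of the release's feature list (header/footer comments,
# creation date, allow/disallow entries, sitemaps) as a minimal builder.
# Class and method names are hypothetical, not pyrobotstxt's real API.
class RobotsTxtBuilder:
    def __init__(self, user_agent="*"):
        self.user_agent = user_agent
        self.header, self.footer = [], []
        self.allowed, self.disallowed, self.sitemaps = [], [], []

    def add_header_comment(self, text):
        self.header.append(f"# {text}")

    def add_footer_comment(self, text):
        self.footer.append(f"# {text}")

    def add_creation_date(self):
        self.header.append(f"# Created: {date.today().isoformat()}")

    def allow(self, path): self.allowed.append(path)
    def remove_allow(self, path): self.allowed.remove(path)
    def disallow(self, path): self.disallowed.append(path)
    def remove_disallow(self, path): self.disallowed.remove(path)
    def add_sitemap(self, url): self.sitemaps.append(url)
    def remove_sitemap(self, url): self.sitemaps.remove(url)

    def render(self):
        lines = list(self.header)
        lines.append(f"User-agent: {self.user_agent}")
        lines += [f"Allow: {p}" for p in self.allowed]
        lines += [f"Disallow: {p}" for p in self.disallowed]
        lines += [f"Sitemap: {u}" for u in self.sitemaps]
        lines += self.footer
        return "\n".join(lines) + "\n"

robots = RobotsTxtBuilder()
robots.add_header_comment("Example site")
robots.add_creation_date()
robots.disallow("/admin/")
robots.add_sitemap("https://example.com/sitemap.xml")
print(robots.render())
```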