Source: parsero
Section: utils
Priority: optional
Maintainer: Kali Developers <devel@kali.org>
Uploaders: Devon Kearns <dookie@kali.org>,
           Sophie Brun <sophie@offensive-security.com>,
Build-Depends:
 debhelper-compat (= 13),
 dh-python,
 python3-all,
 python3-setuptools,
Standards-Version: 4.6.2
Homepage: https://github.com/behindthefirewalls/Parsero
Vcs-Git: https://gitlab.com/kalilinux/packages/parsero.git
Vcs-Browser: https://gitlab.com/kalilinux/packages/parsero

Package: parsero
Architecture: all
Depends: ${misc:Depends}, python3, python3-urllib3, python3-bs4
Description: Robots.txt audit tool
 Parsero is a free script written in Python which reads the robots.txt file
 of a web server and examines the Disallow entries. Disallow entries tell
 search engines which directories or files hosted on the web server must not
 be indexed. For example, "Disallow: /portal/login" means that the content
 at www.example.com/portal/login may not be indexed by crawlers such as
 Google, Bing, or Yahoo. This is how an administrator can keep sensitive or
 private information out of the search engines.
