In a default ProcessWire installation you do not need a robots.txt at all: it exposes nothing to crawlers that is not already public.
Create a new template named robots, set its URLs > "Should page URLs end with a slash" option to No, and its Files > Content-Type to text/ ...
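A minimal template file for this approach might look like the following sketch. The directives and the sitemap line are illustrative only, not taken from the thread; `$config->urls->httpRoot` is the ProcessWire property for the site's base URL.

```php
<?php namespace ProcessWire;
// Sketch of site/templates/robots.php, assuming the template settings above
// (trailing slash off, Content-Type set to a text type).
// The rules below are placeholder examples, not recommendations.
echo "User-agent: *\n";
echo "Disallow: /site/assets/cache/\n";
echo "Sitemap: {$config->urls->httpRoot}sitemap.xml\n";
```

Then create a page named robots.txt in the site root using this template, and ProcessWire will serve it like any other page.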
However, this client has no business and no clients in China, and he wants me to block this crawler. I tried robots.txt, but this crawler does ...
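For reference, a robots.txt block for a single crawler looks like this; "Baiduspider" is only a placeholder user-agent, since the thread does not name the bot, and robots.txt is purely advisory:

```
# robots.txt -- advisory only; the crawler must choose to honor it
User-agent: Baiduspider
Disallow: /
```

If the crawler ignores robots.txt, the usual fallback is to block its user-agent string or IP ranges at the server level, e.g. in nginx or .htaccess.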
I have a site that has other ccTLDs, such as .co.za, .co.il, etc., and I am trying to create a robots.txt for each of the different domains by ...
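One way to serve a different robots.txt per domain from a single ProcessWire install is to branch on the requested host inside one robots template. A hedged sketch: `$config->httpHost` is the ProcessWire property for the current hostname, but the domains and rules below are invented for illustration.

```php
<?php namespace ProcessWire;
// Sketch: one template serving per-domain robots.txt content.
// Host names and Disallow rules are illustrative only.
$host = $config->httpHost;
if (strpos($host, '.co.za') !== false) {
    echo "User-agent: *\nDisallow: /za-only/\n";
} elseif (strpos($host, '.co.il') !== false) {
    echo "User-agent: *\nDisallow: /il-only/\n";
} else {
    echo "User-agent: *\nDisallow:\n"; // allow everything by default
}
```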
... the processwire package has Nginx directives for ProcessWire (see "Nginx directives for Processwire", General Support, ProcessWire Support Forums), including a location block matching txt files in the web root: `location ~ ^/(COPYRIGHT| ...`
February 3, 2017 in General Support. ... a robots.txt entry like `Disallow: /about/` ... or will this ... robots.txt is a good start, but it may not stop ...
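As a concrete example of the pattern mentioned in that thread (the path is illustrative):

```
User-agent: *
Disallow: /about/
```

Note that robots.txt only asks compliant crawlers not to fetch those URLs; a page can still show up in search results if it is linked from elsewhere, which is why this kind of rule is usually paired with a meta robots noindex tag.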
... processwire/ in the footer, as doing so publishes to anyone what ... September 17, 2012 in General Support. ... robots.txt. The meta noindex, ...
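The meta tag referred to above goes in the document head; a minimal example:

```html
<!-- Asks compliant crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```

Unlike robots.txt, this keeps the page out of the index even when other sites link to it, provided the crawler is allowed to fetch the page and see the tag.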
/...etc       <-- any other files you would put in the web root of a project (robots.txt, etc.)
/project_b    <-- same folder/file structure as above