Question Robots.txt
- jheiler
- Topic Author
- Junior Member
In the documentation I found the following:
"In webtrees 2.0, if you enable pretty URLs, it will be generated automatically."
Is it possible to manually extend this file, for example to disallow certain robots?
Kind regards
- fisharebest
- Administrator
To disable the generated file, simply create your own /robots.txt
Greg Roach - greg@subaqua.co.uk - @fisharebest@phpc.social - fisharebest.webtrees.net
- bertkoor
- Platinum Member
- Greetings from Utrecht, Holland
At first glance, it looks like you would be better off using the generated one.
What are your requirements exactly?
stamboom.BertKoor.nl runs on webtrees v1.7.13
- jheiler
- Topic Author
- Junior Member
"User-agent: XXX
Disallow: /"
I am aware that this only works if the bot honours the Robots Exclusion Protocol.
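As an aside, you can see how a compliant crawler would interpret such rules with Python's standard-library robots.txt parser. This is only an illustration: "BadBot" is a made-up user-agent name, not one from this thread.

```python
# Sketch: how a rule-respecting crawler interprets "Disallow: /" for one
# named user-agent, using Python's stdlib urllib.robotparser.
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: BadBot",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# A compliant bot matching "BadBot" may fetch nothing at all...
print(parser.can_fetch("BadBot", "https://example.com/tree/demo"))     # False
# ...while agents with no matching rule group are allowed by default.
print(parser.can_fetch("Googlebot", "https://example.com/tree/demo"))  # True
```

A bot that ignores the protocol never even reads these rules, which is exactly the limitation mentioned above.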
- fisharebest
- Administrator
2) obtain the default by visiting your-site/robots.txt
3) copy/paste the text into your own robots.txt
4) edit the file to add your own rule.
-- or --
Create a module which replaces the template file - resources/views/robots.txt.phtml
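The copy-and-edit steps above could be scripted roughly as follows. This is only a sketch: the URL and the extra rule are placeholder examples, not values from this thread.

```python
# Sketch of the copy-and-edit workflow: fetch the robots.txt generated by
# the running site, append your own rules, and save the result as a
# static robots.txt file that overrides the generated one.
from urllib.request import urlopen


def fetch_generated(url: str) -> str:
    """Download the generated robots.txt from the running site."""
    with urlopen(url) as response:
        return response.read().decode("utf-8")


def append_rules(generated: str, extra: str) -> str:
    """Add custom rules after the generated ones, separated by a blank line."""
    return generated.rstrip("\n") + "\n\n" + extra.strip() + "\n"

# Usage (hypothetical site and rule):
#   text = append_rules(fetch_generated("https://example.com/robots.txt"),
#                       "User-agent: BadBot\nDisallow: /")
#   open("robots.txt", "w", encoding="utf-8").write(text)
```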
- jheiler
- Topic Author
- Junior Member
I have a question about the default rules in webtrees. Many of my pages - even the home page! - are marked "<meta name="robots" content="noindex">". Such pages will be ignored even by well-behaved bots like Google or Bing; in my opinion this is not an appropriate default.
What is the rationale behind this?
- fisharebest
- Administrator
Not true?
Visit dev.webtrees.net/demo-dev/tree/demo
View the source code...
- jheiler
- Topic Author
- Junior Member
I have another question.
My site is located at www.heiler-ahnen.de/webtrees , and the automatically generated robots.txt file can be found there. But Google apparently looks for this file at www.heiler-ahnen.de and does not find it. Do I have to put a second robots.txt file there?
Kind regards
- fisharebest
- Administrator
You must take the robots.txt file created by webtrees
www.heiler-ahnen.de/webtrees/robots.txt
and then copy it to www.heiler-ahnen.de/robots.txt
(Tip - the robots.txt generated by webtrees contains these instructions!)
If you have other applications, then you must merge the robots.txt
with those for the rest of your site.
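That merge could be sketched like this. The file paths are hypothetical examples (assuming each application ships its own robots.txt), not paths from this thread.

```python
# Sketch: combine the robots.txt files of several applications hosted
# under one domain into the single root robots.txt that crawlers read.
def combine(*bodies: str) -> str:
    """Join robots.txt bodies, with one blank line between rule groups."""
    return "\n\n".join(b.strip() for b in bodies if b.strip()) + "\n"

# Usage (hypothetical per-application files):
#   root = combine(open("webtrees/robots.txt").read(),
#                  open("blog/robots.txt").read())
#   open("robots.txt", "w").write(root)
```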
- Supermarkert
- New Member
Any ideas?
Steve
I know just enough to be dangerous.
- fisharebest
- Administrator
- Supermarkert
- New Member
Thanks for the response. I guess I'm good now.