Discussion Forums
Robots.txt for each virtual host - doesn't work
Norbert Bede, modified 14 years ago.
Robots.txt for each virtual host - doesn't work
Junior Member Posts: 38 Join Date: 19/04/09 Recent Posts
Good day, community,
We have a Liferay deployment with a multitenant setup hosting multiple domains. We now want to handle our SEO policies correctly and need to set up robots.txt the right way.
We did the setup the following way:
wiki post
MY PROBLEM
The problem relates to an organization whose public area has a defined domain, e.g. www.abc.com:
when I type http://www.domain.com/robots.txt into the browser, the rewrite rule redirects to http://www.domain.com/robots_abc.com.txt, and the page body says the page doesn't exist.
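For reference, the per-host rewrite described above presumably looks something like the following sketch. This is an assumption: mod_rewrite is assumed, the host and file names are taken from the example in this post, and the actual rule in the wiki post may differ.

```apache
# Sketch only: rewrite /robots.txt to a per-host file.
# Host name and target file name are examples, not the real config.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.abc\.com$ [NC]
RewriteRule ^/?robots\.txt$ /robots_abc.com.txt [L]
```

The "page doesn't exist" body suggests the rewrite itself fires, but nothing is actually serving /robots_abc.com.txt at the target location.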
Any similar experience?
norbert
Lisa Simpson, modified 14 years ago.
RE: Robots.txt for each virtual host - doesn't work
Liferay Legend Posts: 2034 Join Date: 5/03/09 Recent Posts
We did it by putting each virtual host on a custom theme and just shoving it into the portal_normal.vm. Ugly, I know, but workable.
Norbert Bede, modified 14 years ago.
RE: Robots.txt for each virtual host - doesn't work
Junior Member Posts: 38 Join Date: 19/04/09 Recent Posts
Hi Lisa,
I don't understand your solution exactly. Could you please describe it in more detail?
norbert
Lisa Simpson:
We did it by putting each virtual host on a custom theme and just shoving it into the portal_normal.vm. Ugly, I know, but workable.
Lisa Simpson, modified 14 years ago.
RE: Robots.txt for each virtual host - doesn't work
Liferay Legend Posts: 2034 Join Date: 5/03/09 Recent Posts
You can hard-link the robots.txt by shoving the link into the head of the portal_normal.vm, just like you would in a flat HTML page.
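The thread never shows the exact markup, so the following is a guess at what this would look like in a theme's portal_normal.vm head section. Note that there is no standard way to "link" a robots.txt from HTML (crawlers request /robots.txt at the site root directly); the closest standard in-page mechanism is the robots meta tag, so the suggestion may have meant something like:

```html
<!-- Sketch: inside the custom theme's portal_normal.vm <head>.
     The meta tag below is the standard in-page robots directive;
     the <link> line is a guess at the "hard link" idea, with an
     example per-host file name. -->
<meta name="robots" content="index,follow" />
<link rel="alternate" type="text/plain" href="/robots_abc.com.txt" />
```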
Miguel Pau, modified 13 years ago.
RE: Robots.txt for each virtual host - doesn't work
Regular Member Posts: 172 Join Date: 27/04/05 Recent Posts
Hi Norbert,
Did you finally solve the problem with robots.txt?
I wrote the wiki post based on our experience, and it works!
regards!
Matthias Fenz, modified 13 years ago.
RE: Robots.txt for each virtual host - doesn't work
Junior Member Posts: 79 Join Date: 24/03/09 Recent Posts
Lisa Simpson:
You can hard-link the robots.txt by shoving the link into the head of the portal_normal.vm, just like you would in a flat HTML page.
Hi Lisa, can you tell us what this hard-linking would look like?
Thanks in advance...
Lisa Simpson, modified 13 years ago.
RE: Robots.txt for each virtual host - doesn't work
Liferay Legend Posts: 2034 Join Date: 5/03/09 Recent Posts
Do you know how to link to one from a static HTML file??? It's exactly the same.
Miguel Pau, modified 13 years ago.
RE: Robots.txt for each virtual host - doesn't work
Regular Member Posts: 172 Join Date: 27/04/05 Recent Posts
Hi Lisa,
Are you using a <link rel="..." tag in <head> to link a robots.txt file????
David Truong, modified 13 years ago.
RE: Robots.txt for each virtual host - doesn't work
Expert Posts: 322 Join Date: 24/03/05 Recent Posts
Hey guys,
I checked in a feature that will let you set a robots.txt for each virtual host.
Not sure if it will get backported, but it's in the latest trunk:
http://issues.liferay.com/browse/LPS-13198
Miguel Pau, modified 13 years ago.
RE: Robots.txt for each virtual host - doesn't work
Regular Member Posts: 172 Join Date: 27/04/05 Recent Posts
We're praying for it!!
It would be great news if it covered independent sitemaps too!!
Mauro Mariuzzo, modified 13 years ago.
RE: Robots.txt for each virtual host - doesn't work
Regular Member Posts: 142 Join Date: 23/07/07 Recent Posts
I've looked at the trunk code.
I'm building a Tomcat dist to try it out and check. But:
- in a similar development done for a customer on LR 5.1, I added "sitemap.xml" and "robots.txt" to the virtual.hosts.ignore.paths property
- to help crawlers, the user has to manually add the "sitemap" directive into the text area. Maybe this could be done automatically.
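The property change described above would look roughly like this in portal-ext.properties. This is a sketch from memory of the Liferay 5.x portal.properties; check the default value of the property in your version before overriding it, since an override replaces the built-in list rather than extending it:

```properties
# portal-ext.properties (sketch)
# Paths excluded from virtual-host forwarding, so each host's
# web server (or a servlet) can answer them directly.
virtual.hosts.ignore.paths=/robots.txt,/sitemap.xml
```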
Oleg True Pick, modified 13 years ago.
RE: Robots.txt for each virtual host - doesn't work
Junior Member Posts: 80 Join Date: 8/09/08 Recent Posts
Lisa Simpson:
Do you know how to link to one from a static HTML file??? It's exactly the same.
Lisa, I beg you to share your example, for clarity... do you use portal_normal.vm?
SOLVED
I decided to keep it simple... I created a page via Manage Pages with the name robots.txt, assigned the URL, and linked it to the file robots_site-name.com.txt placed in the root directory.
Tarkan Corak, modified 13 years ago.
RE: Robots.txt for each virtual host - doesn't work
Regular Member Posts: 141 Join Date: 7/10/08 Recent Posts
Hi,
Below you'll find my solution:
1. Set the document root in your Apache configuration for each virtual host. Create one separate subdirectory for each one:
DocumentRoot /document-root/of-my-instance
and put your robots.txt in there.
2. Use JkUnMount to make sure Apache will pick it up from the document root instead of forwarding the request:
JkUnMount /robots.txt
Do this for all of your virtual hosts and restart Apache.
Tarkan
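Putting the two steps above together, a single virtual host might look like the following sketch. The worker name (ajp13), host name, and paths are assumptions to adjust for your setup; note that JkUnMount normally takes the worker name as its second argument:

```apache
<VirtualHost *:80>
    ServerName www.abc.com
    # Step 1: per-host document root holding this host's robots.txt
    DocumentRoot /document-root/of-my-instance
    # Forward everything to Liferay via mod_jk...
    JkMount /* ajp13
    # Step 2: ...except robots.txt, which Apache serves from DocumentRoot
    JkUnMount /robots.txt ajp13
</VirtualHost>
```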