kflanagan28
New Member
Hey,
First post so go easy ...
I am wondering whether anyone has tried to deal with the issue on large CMS-driven websites where lots of duplicate URLs point to the same content.
I know there is no duplicate content penalty as such in this scenario (as Google recently posted about), but from reading posts by Aaron Wall there are definitely issues around page dilution.
I am wondering if it is worth creating an automated sitemap that is regenerated daily. I assume this could be done by querying the database. I was also thinking of adding a couple of fields to the database to track things like page popularity and page age, so only the most popular and freshest content stays in the sitemap.
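To make that concrete, something along these lines is roughly what I had in mind. It is just a rough sketch, and the database file, table and column names (pages, url, views, last_updated) are only placeholders for whatever the CMS actually uses:

import sqlite3
from datetime import datetime, timedelta

# Rough sketch only - assumes a "pages" table with url, views and last_updated columns
conn = sqlite3.connect("cms.db")

# Keep pages that are either reasonably popular or updated in the last 90 days
cutoff = (datetime.now() - timedelta(days=90)).strftime("%Y-%m-%d")
rows = conn.execute(
    "SELECT url, last_updated FROM pages "
    "WHERE views > 100 OR last_updated > ? "
    "ORDER BY views DESC",
    (cutoff,),
).fetchall()

# Write out a standard XML sitemap with only those pages
with open("sitemap.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for url, last_updated in rows:
        f.write("  <url>\n")
        f.write("    <loc>%s</loc>\n" % url)
        f.write("    <lastmod>%s</lastmod>\n" % last_updated)
        f.write("  </url>\n")
    f.write("</urlset>\n")
conn.close()

The idea would be to run that from a daily cron job so the sitemap always reflects the current database.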
I could achieve the same thing by using robots.txt and excluding certain dynamic pages.
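For the robots.txt route I was picturing something like this, where the patterns are just examples of the sort of dynamic URLs I would exclude on my own setup (I believe Googlebot honours the * wildcard even though it is not part of the original robots.txt standard):

User-agent: *
Disallow: /*?sessionid=
Disallow: /*&sort=
Disallow: /print/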
Really I am just looking for some feedback, and I am interested to hear whether I am making any sense!