

For what it's worth, DC Homepage is now google PageRank 6


worstje:
Robots should automatically make sense of 301 permanent redirects and hook them up. Google definitely should. If it does not, it is the most stupid bot ever, and I doubt Google has such stupid programmers. :)

mahesh2k:
Do one thing, remove all the disallow entries for now. Make it look like this:

--- Quote ---
Sitemap: https://www.donationcoder.com/forum/forumsitemap.php
User-agent: *
Disallow:
--- End quote ---

I'll post what to include in disallow in a few minutes.


mouser:
i dont want to upload a blank robots.txt -- bad things happen when you do that on the forum -- bots start trying to index pages and pages of search results.

mahesh2k:
Eh? Then how do you want the forum to get indexed without bots? :tellme: The current robots.txt is restricting Google's bot (or any other bot) from indexing the forum. You can choose admin/login/logout/reply/notify/mark to disallow.
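A robots.txt along those lines might look like the sketch below -- the exact `action=` names are assumptions based on standard SMF URL parameters, and plain prefix matching means a rule only catches URLs where the action parameter comes first in the query string:

```
Sitemap: https://www.donationcoder.com/forum/forumsitemap.php

User-agent: *
# block private/action pages while leaving topic pages crawlable
Disallow: /forum/index.php?action=admin
Disallow: /forum/index.php?action=login
Disallow: /forum/index.php?action=logout
Disallow: /forum/index.php?action=post
Disallow: /forum/index.php?action=notify
Disallow: /forum/index.php?action=markread
```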

mouser:
sorry if i was inelegant.. what i was saying is that a blank disallow makes the robot walk into the SEARCH results pages, indexing incredible amounts of duplicate content by indexing specific searches.  just imagine it trying to index 10,000 pages of search results for the term "the".

the disallows are important in blocking the search bot from treating every page of SEARCH RESULTS as an independent content page on the site.

the disallows are important not just for blocking areas that are private (administration areas) but also links that lead to duplicate content that is better indexed on other canonical pages of the forum.

the goal is to have google index the topic pages (of the form https://www.donationcoder.com/forum/index.php?topic=13531).


The same can be said for blocking the search indexing bot from trying to double-index every page -- indexing both its normal form and the identical content delivered by the
https://www.donationcoder.com/forum/index.php?action=printpage;topic=13531.0 links that exist for every page.
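Both of those duplicate-content entry points could be covered by rules like the following sketch -- note the `*` wildcard is a Googlebot extension to robots.txt, not part of the original standard, so other bots may ignore the second rule:

```
User-agent: *
# keep search-result pages out of the index
Disallow: /forum/index.php?action=search
# keep printer-friendly duplicates out of the index,
# regardless of parameter order (Googlebot wildcard syntax)
Disallow: /*action=printpage
```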


The insane part is that we go to the trouble of creating a full sitemap for google.. then they have a way for webmasters to tell them about the sitemap -- a configuration page where they will test it and report on it -- and then they completely ignore it and have no way of letting you find out why they are disregarding it.  brilliant.  it would be comical if it wasn't so god d*amn insanely stupid and frustrating.
