
Https:// being indexed

watercrazed
Groupie

Joined: 31-December-2005
Location: United States
Status: Offline
Points: 0
    Posted: 07-June-2006 at 5:49pm
I've been doing some research on why Google shows so many pages indexed for my site (far more than I actually have). One thing I just found is that for some searches an https:// version of a page shows up ahead of the http:// version. The https:// versions should not be indexed at all: there are no links to my secure checkout pages except through the login page, and those pages are disallowed in robots.txt as well. Yet my static HTML pages and other non-cart pages are showing up as https://. The only thing I can figure is that once visitors enter the secure area and then leave for the main cart, they remain on the https:// version of the site.

Any suggestions are welcome.
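
For anyone else running into this: the usual server-side fix is to 301-redirect https requests for non-secure pages back to http, so the duplicate https:// URLs drop out of the index over time. Below is a minimal sketch, assuming Classic ASP (which ProductCart uses) and an include placed at the top of every non-secure page; example.com is a placeholder and query strings are omitted for brevity.

<%
' Sketch: bounce SSL requests for non-secure pages back to http with a 301.
' On IIS the HTTPS server variable is "ON" when the request came in over SSL.
If UCase(Request.ServerVariables("HTTPS")) = "ON" Then
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", "http://www.example.com" & Request.ServerVariables("SCRIPT_NAME")
    Response.End
End If
%>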
geoff
Newbie

Joined: 15-January-2006
Status: Offline
Points: 2
Posted: 12-September-2006 at 6:58am
John,
I see the same thing. I have added rel="nofollow" to all links that trigger https. I am no expert, but it appears that once Google has indexed a page, and the page is still valid, it is not readily dropped.
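
For reference, a nofollow link looks like this in the markup (the checkout URL here is illustrative):

<a href="https://www.example.com/pc/checkout.asp" rel="nofollow">Secure checkout</a>

Worth noting that nofollow only discourages the crawler from following the link; it does not remove a page that is already in the index.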

While researching this I came across two other good ideas. The first is to make every link an absolute address, e.g. http://www.myhealthmyworld.com/coop/pc/mainindex.asp, rather than a relative address. The second is to implement your login as a subdomain, e.g. http://www.secure.myhealthmyworld.com, since that lets you serve a separate robots.txt that disallows all robots within the subdomain.
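
For the subdomain approach, the robots.txt served at the root of the secure subdomain would just be a blanket disallow (standard robots.txt syntax, nothing ProductCart-specific):

User-agent: *
Disallow: /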

Even with a valid set of exclusions in robots.txt, I still see Google indexing certain excluded pages, though only infrequently.
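
That matches how robots.txt works: it blocks crawling, not indexing, so a disallowed URL that other pages link to can still show up in results. A stronger signal is a meta robots tag in the <head> of the secure pages themselves (standard HTML, not ProductCart-specific):

<meta name="robots" content="noindex, nofollow">

The catch is that Google must be allowed to crawl the page to see the tag, so a page carrying it should not also be blocked in robots.txt.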

Hope this helps someone.