Handling Secure Server Problems for SEO
You need to keep SEO in mind when handling secure server problems. You may have pages on your site where users provide sensitive data, such as a credit card number or other account information. The standard solution for protecting sensitive information is to put those web pages on a secure server.
Technically, this means that the web page is served over a secure port on the server, where all data is encrypted via SSL/TLS (converted into a form that cannot be understood without the secret decryption keys). You can tell when you're looking at a web page on a secure server because http:// changes to https:// in the URL address.
In 2014, Google announced that page security was a ranking factor and that pages hosted on a secure server got a minor ranking boost. So, especially for pages that handle sensitive data, an https:// URL is highly recommended.
Secure servers can cause duplicate content problems if a site has both a secure and nonsecure version of a web page and hasn’t told the search engines which of the two is the preferred, or canonical, version. Two versions of the same page end up competing against each other for search engine rankings, and the search engines pick which one to show in search results.
Here are some SEO-minded best practices for handling secure servers:
Don’t make duplicates: Many times, people simply duplicate their entire website to create an https:// version. This is a very bad practice because it creates instant duplicate content. Never create two versions of your site or of any page on your site. Even if you exclude your secure pages from being indexed, sooner or later people link to them, and the search engines find the secure versions through those links.
If you have cases of duplication caused by http:// and https://, indicate a canonical version: Any time you have pages with similar or duplicate content, you can tell the search engine which page you prefer to show up in a search result and which page to give all the link equity to by using a canonical tag.
Secure the pages that need to be secure: If a page doesn’t receive sensitive account-type information from users, it doesn’t need to be secured. However, if it isn’t cost-prohibitive to do so, you may choose to secure many pages across your site for the marginal ranking benefit.
Spiders shouldn’t be blocked from crawling secure pages if those pages are important for rankings: Search engines do index secure pages, if they can get to them. Banks usually have secure pages indexed because they often serve their entire site over https://.
Because of the nature of their business, it makes sense that banks want to give their users the utmost level of confidence by securing their whole site. It’s also a good user experience for a page to show up when a user searches for it; for example, a user looking for her online banking login page should be able to find it.
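As a concrete sketch of the canonical-tag advice above, here is a small Python helper that, given either the http:// or https:// URL of a page, produces the https:// canonical URL and the matching link tag for the page's head section. The function names and the example domain are hypothetical; this is one way to generate the tag, not the only one.

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_https_url(url):
    """Return the https:// form of a URL, dropping any #fragment."""
    parts = urlsplit(url)
    # Force the https scheme so the http:// and https:// duplicates
    # both declare the same single canonical version.
    return urlunsplit(("https", parts.netloc, parts.path, parts.query, ""))

def canonical_tag(url):
    """Build the <link rel="canonical"> tag to place in the page's <head>."""
    return '<link rel="canonical" href="%s" />' % canonical_https_url(url)

print(canonical_tag("http://www.example.com/checkout"))
# <link rel="canonical" href="https://www.example.com/checkout" />
```

Placing this tag on both versions of the page tells the search engines to treat the https:// URL as the one to rank and to consolidate link equity there.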
If your website has secure pages that violate these best practices, here’s how to fix them:
Identify which pages on your site need to be secure.
Always secure the pages on which users need to enter account information.
Make sure that your secure pages are not duplicated.
Your secure pages should have only an https:// version. Don’t offer a non-secured duplicate version. If you do have a duplicate-page situation, include a canonical tag that tells search engines which page is the best one to use. All links to and from secure pages should be full-path (absolute) links, meaning they begin with http:// or https://. Using relative links to secure pages is just asking for trouble.
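To see why relative links invite trouble, note that a browser or crawler resolves a relative link against whichever version of the page it happens to be on. This hypothetical snippet uses the Python standard library's urljoin to show the same relative link yielding both an http:// and an https:// URL (the domain is a placeholder):

```python
from urllib.parse import urljoin

# The same relative link inherits the scheme of whichever copy of
# the page it appears on, so a crawler on the http:// copy keeps
# discovering http:// URLs, and vice versa.
for page in ("http://www.example.com/account/login",
             "https://www.example.com/account/login"):
    print(urljoin(page, "/account/settings"))
# http://www.example.com/account/settings
# https://www.example.com/account/settings
```

Full-path links avoid this entirely because the scheme is spelled out in the link itself.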
Clean up duplicate pages by using 301 Redirects.
If you currently have secure pages that don’t need to be secured, redirect them to the http:// version by using a 301 (permanent) Redirect. That way, any links going to the secure pages are automatically redirected to the right pages. The same goes in reverse for non-secure pages that should be secured: 301 Redirect them to the https:// version.
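The redirect step above can be sketched in application code. This minimal WSGI app is a hypothetical example (in practice the 301 usually lives in the Apache or nginx configuration rather than in application code); it answers any http:// request with a permanent redirect to its https:// twin:

```python
def force_https(environ, start_response):
    """Minimal WSGI app: answer http:// requests with a 301 to https://."""
    if environ.get("wsgi.url_scheme") == "http":
        # A 301 is permanent, so links pointing at the old URL
        # automatically pass visitors (and link equity) to the new one.
        target = "https://" + environ["HTTP_HOST"] + environ.get("PATH_INFO", "/")
        start_response("301 Moved Permanently", [("Location", target)])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"secure page"]
```

Redirecting in the other direction, from https:// pages that don’t need to be secure down to http://, is the same pattern with the schemes swapped.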