What is an effective URL for SEO?

For SEO, web page URLs should use character strings that crawlers can easily recognize and register in the search index.

In addition, associating related URLs with link tags and notifying crawlers of URLs through sitemaps also improve crawlability, which is one of the internal (on-page) SEO measures.

This article discusses the relationship between SEO and URLs, their impact, and how to configure them.

1. Make the URL an English word

2. Hyphenate two or more words

3. Write canonical tags in index.html

4. When the PC and smartphone versions have different URLs, specify each with a link tag

5. Support HTTPS

6. Create a sitemap

7. Block URLs that contain filtering parameters with robots.txt

8. Shorten URLs by removing unnecessary parameters

1. Make the URL an English word

Use English words that everyone can understand in the URL.

Google interprets the characters in a URL as English words.

Therefore, if a URL contains a word whose meaning is unclear, the page may not appear in search results as expected.

2. Hyphenate two or more words

When part of a URL consists of two or more words, place a hyphen between the words.

In English, consecutive words are separated by spaces.

As explained in the previous section, URLs should be written in English, but URLs cannot contain spaces due to their specification.

Google recognizes the text before and after each hyphen in a URL as a separate word.

Note that in a URL without delimiters, such as /goingonapicnic.html, Google cannot recognize individual words like "going" or "picnic," so the URL may be indexed incorrectly. Underscores are also not recognized as delimiters.

When concatenating multiple words in a URL, use hyphens to separate them.

Example:

    ○ https://example.com/going-on-a-picnic
    × https://example.com/going_on_a_picnic
    × https://example.com/goingonapicnic
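The rule above can be sketched as a small helper that converts a page title into a hyphenated URL slug (a hypothetical `slugify` helper for illustration, not part of any particular CMS):

```python
import re

def slugify(title: str) -> str:
    """Convert a page title into a lowercase, hyphen-separated URL slug."""
    slug = title.lower()
    # Replace runs of spaces and underscores with a single hyphen.
    slug = re.sub(r"[\s_]+", "-", slug)
    # Drop any character that is not a letter, digit, or hyphen.
    slug = re.sub(r"[^a-z0-9-]", "", slug)
    # Collapse repeated hyphens and trim leading/trailing ones.
    return re.sub(r"-+", "-", slug).strip("-")

print(slugify("Going on a Picnic"))  # going-on-a-picnic
```

Most CMSs and static site generators provide an equivalent function out of the box.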

3. Write canonical tags in index.html

Add a canonical tag if the same page is accessible at both / and /index.html.

Google registers each URL in its search index, so even if the top page can be reached at two URLs, / and /index.html, Google treats them as two separate pages.

Pages are evaluated on a per-URL basis.

If the same page has multiple URLs, the value of external links and other signals is split between them, making it harder to improve search rankings.

The canonical tag tells search engines that two URLs point to the same page. This is called "URL canonicalization."

index.html is a special file: the same page can be displayed without index.html, as in https://example.com/, or with it, as in https://example.com/index.html. Since the two URLs point to the same page, URL canonicalization is needed.

To canonicalize the URLs, select the most frequently used one and specify it with the canonical tag.

Usually, the shorter URL is set in the canonical tag.

Example: https://example.com/ and https://example.com/index.html are the same page, and https://example.com/ is chosen as the canonical URL

    <link rel="canonical" href="https://example.com/">

4. When the PC and smartphone versions have different URLs, specify each with a link tag

Using the same URL for PC and mobile is recommended, but if the PC and mobile versions have separate URLs, specify each other's URL with a link tag.

When separate URLs for the PC and mobile versions exist, Google registers both in its index.

If nothing is done, search results may show both the PC and mobile versions of a URL, or a mobile URL may appear when searching on a PC, which is inconvenient for the user.

Describe alternate and canonical tags

On the PC version of the page, specify the URL of the mobile version in an alternate tag; on the mobile version of the page, specify the URL of the PC version in a canonical tag.

Example: If the PC version is https://example.com/ and the mobile version is https://example.com/sp/

■Description for PC version
    <link rel="alternate" media="only screen and (max-width: 640px)" href="https://example.com/sp/">

■Description for mobile version
    <link rel="canonical" href="https://example.com/">

In addition, when the PC version of a page is accessed from a smartphone, it should redirect to the smartphone version, and when the smartphone version is accessed from a PC browser, it should redirect to the PC version.
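One way to implement such redirects is on the web server. The following is a hypothetical nginx sketch; the User-Agent pattern and the /sp/ path are illustrative, and real mobile detection usually needs a more thorough pattern:

```nginx
# Sketch: send smartphone visitors of the PC top page to the mobile version.
location = / {
    if ($http_user_agent ~* "(iPhone|Android.*Mobile)") {
        return 302 https://example.com/sp/;
    }
}

# Sketch: send desktop visitors of the mobile page back to the PC version.
location = /sp/ {
    if ($http_user_agent !~* "(iPhone|Android.*Mobile)") {
        return 302 https://example.com/;
    }
}
```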

5. Support HTTPS

Supporting HTTPS (SSL/TLS) on a website affects SEO, as Google has stated that it favors HTTPS-enabled sites in its rankings.

HTTPS as a ranking signal

"For these reasons, over the past few months we've been running tests taking into account whether sites use secure, encrypted connections as a signal in our search ranking algorithms. We've seen positive results, so we're starting to use HTTPS as a ranking signal. For now it's only a very lightweight signal—affecting fewer than 1% of global queries, and carrying less weight than other signals such as high-quality content —while we give webmasters time to switch to HTTPS. But over time, we may decide to strengthen it, because we'd like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web."
HTTPS as a ranking signal - Google Search Central Blog

Encrypting communications prevents tampering along the network path, making your site more secure than one without HTTPS. Without HTTPS support, your site will be treated unfavorably in SEO, and browsers will label it as "not secure."

Most sites already support HTTPS, and the server certificates required for it are becoming less expensive.

It would not be an exaggeration to say that supporting HTTPS is "the starting line of SEO" rather than "an SEO improvement."

HTTPS support with Let's Encrypt

To support HTTPS, obtain a server certificate and install it on your server.

Let's Encrypt is a certificate authority (an organization that issues server certificates) funded by Mozilla and Cisco Systems, and it allows you to obtain and renew server certificates for free.

This site also uses Let's Encrypt.
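With the certbot client, obtaining and installing a Let's Encrypt certificate typically looks like the following sketch (assuming an nginx server and the domain example.com; adapt the domain and plugin to your setup):

```shell
# Obtain a certificate for example.com and let certbot configure nginx to use it.
sudo certbot --nginx -d example.com -d www.example.com

# Let's Encrypt certificates expire after 90 days; verify that renewal works.
sudo certbot renew --dry-run
```

Many hosting providers and CDNs can also provision Let's Encrypt certificates automatically.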

6. Create a sitemap

To make it easier for crawlers to visit pages and thus improve SEO, create an XML sitemap listing the site's URLs.

How Site Crawlers Work

Crawlers, like humans, follow links within a site to discover new pages.

On large sites, crawlers may visit some pages less frequently, and it may take longer for those pages to be reflected in search results.

An XML sitemap improves SEO because crawlers can use the file as a reference to navigate the site efficiently.

Generating an XML Sitemap

An XML sitemap can be created with a tool and uploaded to the server.

Register the sitemap in Search Console so that Google can recognize it.
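A minimal sitemap follows the sitemaps.org protocol; for example (the URLs and date below are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/going-on-a-picnic</loc>
  </url>
</urlset>
```

Upload the file (commonly as /sitemap.xml) and submit its URL in Search Console.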

7. Block URLs that contain filtering parameters with robots.txt

Block URLs containing refinement parameters with robots.txt.

This speeds up crawling of the entire site by excluding less important URLs, such as those generated when complex criteria are specified in a narrowed-down search.

For example, filtered URLs contain many parameters (e.g., URLs narrowed down by category, price, and color), so the number of URLs on the entire site becomes significant, and crawling all pages takes time.

As a result, changes to the site, such as adding new products, changing prices, or adding promotions, take longer to be reflected in search results.

Therefore, URLs that narrow results down with complex criteria and are rarely reached from Google searches are removed from the crawling targets.

To reduce the number of URLs to be crawled, configure robots.txt.

Example: a robots.txt that excludes from crawling any URL containing the parameter "min-price," which indicates the lowest price

    User-Agent: *
    Allow: /
    Disallow: /*?min-price=
    Disallow: /*?*&min-price=
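In robots.txt rules, `*` matches any sequence of characters. The toy matcher below (an illustration only, not Google's actual matching code) shows which URL paths the rules above would block:

```python
import re

def robots_pattern_blocks(pattern: str, path: str) -> bool:
    """Toy check: does a robots.txt Disallow pattern match the given URL path?

    '*' matches any character sequence; matching is anchored at the start of
    the path. Illustration only, not Google's actual implementation.
    """
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    return re.match(regex, path) is not None

print(robots_pattern_blocks("/*?min-price=", "/items?min-price=100"))  # True
print(robots_pattern_blocks("/*?min-price=", "/items?color=red"))      # False
```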

8. Shorten URLs by removing unnecessary parameters

Shorten URLs as much as possible by removing unnecessary parameters.

Removing URL parameters is also part of SEO because it reduces the number of URLs crawlers have to process and improves indexing efficiency.

Typical unnecessary parameters include those used for session information or access analysis.

For example:

    ○ https://example.com/product/abc/
    × https://example.com/product/abc/?lp=googleads1&click=btn2&sessionid=1234567890

Google also recommends keeping URLs simple.

"Overly complex URLs, especially those containing multiple parameters, can cause problems for crawlers by creating unnecessarily high numbers of URLs that point to identical or similar content on your site. As a result, Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all the content on your site."
Keep a simple URL structure - Search Console Help
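Stripping such tracking and session parameters can be sketched with Python's standard urllib.parse; the parameter names to remove below are illustrative:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical set of tracking/session parameters to drop.
TRACKING_PARAMS = {"lp", "click", "sessionid"}

def strip_tracking_params(url: str) -> str:
    """Remove known tracking parameters from a URL's query string."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_tracking_params(
    "https://example.com/product/abc/?lp=googleads1&click=btn2&sessionid=1234567890"
))  # https://example.com/product/abc/
```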

Summary

This article has explained how URLs affect SEO and how to configure the related settings.

Note that changing a URL that has already been published is error-prone.

Before changing a URL, check whether it currently contributes to rankings or crawler traffic.

If you do change a URL, be sure to set up a 301 redirect on the server side so that the reputation of the old URL carries over to the new one.
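On nginx, for example, such a permanent redirect can be written as follows (a sketch; the paths are illustrative):

```nginx
# Permanently redirect the old URL to the new one with a 301.
location = /old-page/ {
    return 301 https://example.com/new-page/;
}
```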

To learn more about SEO, check out the post below.