Information is the key resource of the 21st century, and business keeps moving into the virtual world, a shift that the global pandemic of 2020 only accelerated. How do you configure everything so that your project is clearly visible in search engines (SEO optimization)? Search algorithms change regularly, but content should stay unique and focused on giving users a complete answer. That is how it looks from the visitor’s point of view. And how does it look technically when your site works with BlazingCDN?
First of all, remember that your website exists under four URL variants: with http and with https, and with and without www in front of the domain. Technically these are four different addresses, so for the robots crawling your page they should either return genuinely different content or be redirected to a single canonical address in order to keep the content unique. The right solution here is a permanent 301 redirect, so that every request ends up at the one address under which your site should appear.
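As an illustration only, here is a minimal sketch of such a redirect, assuming an Apache origin with mod_rewrite enabled and https://www.domain.com as the canonical address (the domain is a placeholder, not part of any BlazingCDN setting):
- # Hypothetical .htaccess rules: send plain-http and non-www requests
- # to the canonical https://www.domain.com with a permanent 301 redirect.
- RewriteEngine On
- RewriteCond %{HTTPS} off [OR]
- RewriteCond %{HTTP_HOST} !^www\. [NC]
- RewriteRule ^(.*)$ https://www.domain.com/$1 [R=301,L]
On nginx or another web server, the equivalent is a server-level permanent redirect to the canonical host.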
To illustrate, let’s assume you have set up the BlazingCDN service with your www site as the origin domain and static.domain.com as the service domain. This makes your HTML pages available under both www.domain.com and static.domain.com. Unfortunately, that is not good for SEO, because it creates duplicate content.
To prevent this from happening, block search engine crawlers on the secondary addresses. After you enable the Block crawlers option, a new robots.txt file automatically appears at the BlazingCDN URL of your choice and stops all search engine robots from indexing the content served from your CDN. The file is available at the following address and contains these rules:
- http(s)://static.domain.com/robots.txt
- User-agent: *
- Disallow: /
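Once the option is enabled, you can check that the file is really served from the CDN hostname, for example with curl (using the example hostname from above):
- curl -s https://static.domain.com/robots.txt
The response should contain exactly the two directives shown above.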
If you want selected resources (e.g., graphics) to remain visible to selected robots, create a dedicated file for this purpose. The example below is written for Google’s image robot. Remember to update the configuration on the origin server afterwards; a sketch of one way to do that follows the example. On the Services / Settings page you can find your Service ID – it has the format NUMBER.r.blazingcdn.net.
- Create /DocumentRoot/robots-cdn.txt with the following content:
- User-agent: *
- Disallow: /
- User-agent: Googlebot-Image
- Allow: /
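How the CDN picks up this file depends on your setup, so treat the following only as a sketch. It assumes an Apache origin with mod_rewrite and assumes that requests arriving through the CDN carry the service hostname (static.domain.com from the example above) in the Host header; if your service forwards a different host, such as the NUMBER.r.blazingcdn.net Service ID, adjust the condition accordingly.
- # Hypothetical .htaccess rules on the origin: when a request arrives for the
- # CDN hostname, answer /robots.txt with the contents of /robots-cdn.txt.
- RewriteEngine On
- RewriteCond %{HTTP_HOST} ^static\.domain\.com$ [NC]
- RewriteRule ^robots\.txt$ /robots-cdn.txt [L]
Requests for robots.txt made directly to www.domain.com are unaffected, so your main site keeps its own robots.txt.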
See also:
Using HTTP/2 with BlazingCDN
Setting a Cache in BlazingCDN
Restricting access to BlazingCDN origin server
Origin Protocol in BlazingCDN
Origin Port in BlazingCDN
Manage content in the BlazingCDN cache
How to accelerate a website with BlazingCDN
Cookies in BlazingCDN
BlazingCDN Origin protection
BlazingCDN with SSL
BlazingCDN URL Signing
BlazingCDN Static service
BlazingCDN Service Domains
BlazingCDN recommended Gzip compression
BlazingCDN Password
BlazingCDN Origin Domains Setup
BlazingCDN Hotlinking Policy
BlazingCDN Flash files – cross-domain cases
BlazingCDN and IP Access Policy
BlazingCDN and Cloudflare
BlazingCDN and Amazon S3
BlazingCDN Access Policy
BlazingCDN server limits