From Crawl to Conversion: Mastering Technical SEO for Optimal Performance
In this section we walk through the key steps involved in optimizing a website’s technical elements so that it can be crawled, indexed, and ranked as effectively as possible in search engine results.
1. Introduction
SEO is often mistakenly considered to be restricted to content and offsite promotion, while technical SEO is forgotten or even unknown. Indeed, a Gumtree search for “SEO Writing Perth” brings up job ads looking for SEO writers. But the distinction between rankings that are a by-product of skilled content and rankings that result from specific SEO work is a critical one. Imagine a sales assistant in a men’s shoe store who is overjoyed at selling sixty pairs of women’s sandals in a week. The assistant has done well, but the success is fortuitous: the sandal sales are a by-product of his working environment. If instead the assistant were to study the sandals, work out why women are buying them, and devise a plan to increase sandal sales, the success would be the result of a specialized effort and the sandals could be considered a conversion goal; he could then go on to promote sandals to men buying for their partners and to sell more of the popular men’s footwear as well. The same distinction separates content from technical SEO, and it leads us to the dual concepts of a crawl and a conversion.
It’s been said that the best place to hide a dead body is the second page of Google. This dark humor reflects a harsh truth. The exact numbers vary by study: Chitika found that the first result of a Google search enjoys a whopping 33% of traffic and the second around 18%, trailing off to roughly 1.5% for the tenth result on the first page, while Slingshot SEO measured 18% of organic clicks for the first result, 10% for the second, and only 7% for the third. Either way, a site that slumps from the first page to the second can lose on the order of 90% of its traffic. With this in mind, SEO work that positively affects organic search is paramount, and the prime objective of technical SEO is to move a second-page graveyard site somewhere a little closer to the front porch.
1.1. Importance of Technical SEO
Technical SEO (search engine optimization) has a critical impact on your site’s ability to rank in the search engines and ultimately receive organic traffic. The purpose of this guide is to share insights and strategies that GangWish has used to improve technical SEO, producing a lift in organic search traffic and, in turn, more revenue. The guide is broken up into logical steps and best practices for improving the crawl efficiency and conversion optimization of a website. It does not cover on-page tactics such as meta tags and keyword optimization, which can also have an impact on SEO; those are valuable and important steps, but before you undertake that effort, make sure your site is crawlable and indexable and that it has enough authority to rank.
This guide is targeted at intermediate SEOs, marketers, and webmasters that are familiar with the concept of on-page SEO and have some understanding of offsite promotion. You do not need to be an engineer to benefit from this guide, but a basic understanding of HTML and more complex web technologies will be beneficial. At the very least, non-technical readers should understand the concepts related to how search engines crawl and index their sites.
1.2. Overview of Crawl and Conversion
To discover and evaluate a website, search engines send out what is called a “spider” to crawl the content on the site and gather information. This information is stored and used to determine which pages are most relevant to specific keyword searches, as well as the site’s overall relevancy. The more frequently a site is crawled, the sooner changes to the site will be picked up by the search engine, which can be critical for new content or updates. Websites whose pages sit many clicks deep are at a severe disadvantage, because the spider has difficulty reaching pages that only a few links lead to from the home page; this is a common problem for sites with unfriendly navigation schemes, dynamic content, or simply bad site architecture. Conversely, the more accessible your site is, the easier it is for the spider to crawl it. This can be facilitated by creating a sitemap in XML, TXT, ROR, or HTML form. A sitemap provides spiders with a map of all the URLs on the site, together with information about when each URL was last updated and how often the page changes. An HTML sitemap also benefits users by helping them navigate the site.
2. Crawl Optimization
The initial step in optimizing for search engines takes place when website pages are first discovered and cached, and it is particularly important that crawl budget is spent only on valuable pages. At its most basic level, a search engine operates by discovering and cataloguing URLs, often starting from a set of ‘seed’ URLs and then following the links found on discovered pages. As SEOs, we can influence this process by ensuring that the most important pages are linked to from a range of other pages on the website, minimizing their ‘depth’ and maximizing the PageRank that flows to them. Where important pages are linked to mainly from other important pages, it may be worth flattening the site’s architecture to consolidate the PageRank and strength of these pages. A useful way to visualize a rough ‘graph’ of how a search engine may crawl a website is Forecheck’s ‘Visual SEO Link’ tool, a Java application that can be downloaded from their website.

Regular visits by search engine spiders are also important, so that a reasonably fresh (and cached) copy of each web page is kept in the index. Econsultancy found that 71% of marketers understood this and aimed for a good cached-page to live-content ratio in order to maintain rankings. This is particularly important for publishing websites with frequent new content and changes; in that case it may be best to noindex/nocache certain stable pages so that more crawl time is allocated to updating the more important ones. One example is noindexing HTML sitemaps, which are often implemented purely for search engines and offer little benefit to users.
2.1. Website Structure and Navigation
Having established the importance of crawling, the next step is to optimize it by guiding the search engine crawlers to the important content and making that content easy to interpret. To do that, we need to concentrate on website structure and navigation.
The most important and most effective way to guide the spiders is to provide a clear path right from the home page. A good website structure is imperative in any SEO campaign as this can make or break your search engine ranking.
By providing a clear path for the spiders, you are effectively making a clear path to the important content of your website. When search engine spiders can easily find your content, they are more likely to index more of your site. If the content is buried deep within the site and the spiders have to follow numerous links to reach it, it is far less likely to be indexed. An indexed page is a page that can potentially be displayed in the search engine results, so it is imperative to ensure that as many pages as possible are indexed.
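As a rough illustration (the page names here are hypothetical), a main navigation that links the key category pages directly from the home page keeps important content within one or two clicks:

    <nav>
      <ul>
        <li><a href="/">Home</a></li>
        <li><a href="/mens-sandals/">Men's Sandals</a></li>
        <li><a href="/leather-shoes/">Leather Shoes</a></li>
        <li><a href="/sale/">Sale</a></li>
        <li><a href="/contact/">Contact</a></li>
      </ul>
    </nav>

Each category page should in turn link to its product pages, so that no important page sits more than a few clicks from the home page.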
2.2. URL Structure and Canonicalization
URLs, just like the names of everything on the web, are vitally important. They give the user a clear idea of what to expect on the page, and can also give clues to search engines on the content of the page so they can determine its relevance to a user’s search. URLs are often an overlooked necessity. Frequently you will see dynamic URLs (generally ones containing a ‘?’ character), as opposed to static URLs (those that do not contain a ‘?’ character). This is not to say that dynamic URLs will not index, as Google has made progress in this area and Yahoo and MSN are both on record as having no issue indexing dynamic URLs. However, static URLs are easier to work with and are more easily interpreted by both crawlers and users. Static URLs also can contain keywords relevant to the page’s content, and on a CMS like WordPress, can be auto-generated using the title of the post. This is great for SEO.
Should you have variations of a certain page on your site, it is important to use the canonical tag to tell search engines which page is the original and should be indexed. If for some reason you have to move content, you should issue a 301 redirect from the old URL to the new URL. This signifies to engines that the page has moved permanently. They will index the new URL and associate all the link equity from the old URL to the new URL. If the move is not permanent, you should use a 302 redirect.
• Canonicalization: the process of selecting the best URL when there are several possible choices; it most often refers to home pages.
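As a minimal sketch (the URL is a placeholder), the canonical tag is placed in the head of each duplicate or parameterized version of a page, pointing at the version you want indexed:

    <head>
      <!-- Tells search engines which version of this page to index -->
      <link rel="canonical" href="http://www.example.com/mens-sandals/">
    </head>

A 301 redirect, by contrast, is configured on the server and physically moves the visitor as well as the link equity.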
2.3. XML Sitemaps and Robots.txt
When search engines crawl a website, there may be pages buried so deep within the site’s architecture that they are missed entirely. This is extremely common on larger sites and can contribute to poor indexing and rankings. It is important to make sure that search engines can find and crawl all pages that you would like indexed, which means having a search-engine-friendly site structure with clear, easy-to-navigate internal links, backed up by an XML sitemap and a sensible robots.txt file. If your site relies heavily on drop-down menus or forms, it is likely that search engines will have problems accessing certain pages. A good way to check whether spiders can access your pages is to use a free tool such as Xenu’s Link Sleuth, which acts as a crawler and simulates how a search engine spider navigates your site. Any pages that cannot be found from the page you enter as a starting point will not be indexed, so you can use this information to improve your site’s accessibility.
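For reference, a minimal XML sitemap and robots.txt look something like the following (all URLs and paths are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2015-06-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.example.com/mens-sandals/</loc>
        <changefreq>monthly</changefreq>
      </url>
    </urlset>

A matching robots.txt in the site root points crawlers at the sitemap and keeps them out of areas you do not want crawled:

    User-agent: *
    Disallow: /admin/
    Sitemap: http://www.example.com/sitemap.xml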
3. On-Page Optimization
Image optimization in SEO is the process in which you make your images search engine friendly. Search engine spiders can only read text, so by providing and assigning alt tags to an image, you are providing the search engines with a bit of text that they can use to index the image. This will make the image more likely to come up in image search results for keywords. Alt tags are also used to increase web accessibility and are displayed within the browser when an image cannot load.
A header tag is an HTML tag used to define headings and subheadings within your content. Search engines treat content wrapped in heading tags as more important than ordinary body text, and the H1 tag carries the most weight. Keyword placement is the practice of strategically placing a keyword in certain areas of a page in order to gain more relevance for that keyword on that page. Having relevant keywords throughout your content, strategically placed, is a major factor in search engine rankings.
Title tags and meta descriptions are the first things a search engine uses to determine what a URL is about and to build that URL’s listing in the search results. Meta descriptions play a smaller role in getting a page ranked: a compelling meta description often earns a higher click-through rate, but it is not a factor in determining a page’s relevance to a search term. The title tag, however, is the most important on-page factor for search engine rankings, and it often determines a page’s click-through rate. It is vital that every page on your website has a unique title tag that describes its content. The title tag should be constructed from the keywords and keyword phrases your target audience actually uses, prioritized according to the importance of each page.
3.1. Title Tags and Meta Descriptions
The optimal format for title tags is a somewhat controversial subject. For a while it was theorized that a 79-character limit was in place, but the widespread agreement is that titles should be kept under 70 characters to avoid being cut off in the search results. Keeping to this limit ensures the title displays in full, which benefits both usability and SEO. The second step is the use of keywords within title tags; this carries some risk, because it is easy to over-optimize, and softer, benefit-driven wording around the primary keyword tends to earn a better click-through rate than a bare string of keywords. The third step is mimicking good ad copy, and the fourth is adding a question or otherwise provoking curiosity. Together, these four steps help in crafting strong title tags.
While often overlooked, meta descriptions play an important role in on-page SEO. These short descriptions are a window into what your content will include. Keep the meta description under 155 characters, as anything longer will be cut off in the search engine results page with an ellipsis.
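Putting those limits together, a typical page head might look like this (the store name and keywords are, of course, placeholders):

    <head>
      <!-- Under ~70 characters so it displays in full -->
      <title>Men's Leather Sandals | Example Shoe Store Perth</title>
      <!-- Under ~155 characters so it is not truncated with an ellipsis -->
      <meta name="description" content="Browse our range of men's leather sandals. Free delivery across Perth and easy 30-day returns.">
    </head>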
The title tag and meta description function as calls-to-action within the search results, which makes them essential on-page SEO elements. With the development of rich snippets and the growing emphasis on schema.org markup, it is more important than ever to have eye-catching title tags that convince searchers to click through.
3.2. Header Tags and Keyword Placement
Header tags are used to label the headings and subheadings within your website. They are a major aspect of SEO and can greatly contribute to the relevancy of the keywords in the copy of your site. The most important header tag is the H1 tag, which will usually be the title of your post. It signifies what the page is about, and there should only be one H1 tag per page. Although the H1 tag is crucial, you should not neglect the effective use of the H2-H6 tags. These also help the search engine to understand the structure and content of your page. They contribute to keyword effectiveness, but must not be overused: if there are too many header tags on a page, it can look like spam to a search engine and result in a lower ranking. The key to effective use of header tags is using them to break up content and to signpost what comes next. Don’t be tempted to put headers on non-descriptive paragraphs.

Remember that keywords play a vital role in header tags and will also contribute to your chances of higher rankings. The H1 and H2 tags should focus on the keywords most relevant to what the page is about and the terms you expect your page to be found under. This contributes significantly towards search engine algorithms and semantics. It is also relevant to on-page keyword placement, which will be discussed in section 3.3.
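A minimal sketch of a sensible heading hierarchy, assuming a hypothetical product category page, might look like this:

    <h1>Men's Sandals</h1>          <!-- one H1 per page, describing the page topic -->
    <h2>Leather Sandals</h2>        <!-- section headings break up the content -->
    <p>...</p>
    <h2>Sports Sandals</h2>
    <h3>Hiking Sandals</h3>         <!-- deeper subheadings only where they add structure -->
    <p>...</p>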
3.3. Image Optimization and Alt Tags
An improvement in the SERP rank of your images can lead to significant traffic improvements and serve as low hanging fruit for further optimization of the page and more competitive keywords.
While an image sitemap is not a necessary part of the image code, many CMS platforms, including WordPress, have SEO plugins that can generate one. Combined with the reports in the search engines’ webmaster tools, this gives you detailed feedback about the images on your site and their performance in image search, which can be used to make incremental improvements to alt tags and image file names for better CTR and traffic.
Often the default title for an image is not descriptive or does not contain the relevant keywords for the site. A call to your developer and a bit of testing can fix this. Image optimization and alt tags are a simple and often overlooked part of SEO. However, the benefits of doing it are significant.
Keep in mind, though, that it is important not to overdo it: keywords crammed into image text read as forced and unreadable, which will deter customers from your page. The title attribute on an image is also worth setting, as it can bring new visitors to your site from Google Images and is displayed as a small tooltip box when the user hovers the mouse over the image. The title text is also what is typically grabbed as the default description when a user bookmarks the page or saves the image to a Pinterest board.

Alt tags describe the image and are displayed in its place when the image cannot load (the hover tooltip comes from the title attribute). It is important to place keywords in the alt tag, provided they are relevant to the image, because keywords in alt text contribute to the keyword relevance of the page the image sits on and of any page the image links to. This improves the probability of those pages ranking well for the keyword once indexed.
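A simple sketch of an optimized image tag (the file name and text are placeholders) would be:

    <!-- Descriptive file name, alt text for indexing and accessibility, title for the hover tooltip -->
    <img src="/images/brown-leather-mens-sandal.jpg"
         alt="Brown leather men's sandal, side view"
         title="Brown leather men's sandal">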
When it comes to website content, seeing is believing. Images play a crucial role in convincing customers of the value of your product. By optimizing these images properly, you can increase both traffic and conversions for your content and site.
4. Site Speed and Performance
Minifying CSS and JavaScript is a direct way to improve site performance. This is a way of compressing the files by completely removing any code that is not required at that time. The compression of the files and the removal of superfluous code reduces the file size and helps the download time. There are various online tools that can be used to minify CSS and JavaScript files. Use the CSS Compressor tool to compress CSS files. This will significantly reduce file size and also organize the CSS files, setting them out in a much cleaner way. A cleaner CSS file means that it’s easier to read, access, and change when required. A well-presented CSS file can save a lot of time and hassle when maintaining a website. While there are many different tools available for this, a popular tool to minify JavaScript files is the Google Closure Compiler. This offers a more advanced service, yet still easy to use. It has various options on how it can compress the JavaScript files and also has the ability to detect any errors in the code. This could be very useful in the case that there is any old or redundant code within the JavaScript files.
Site speed and performance play a significant role in today’s SEO. Crawl bots can now render a page much like a browser, meaning they see a page roughly as a person would; but if the page is painfully slow, it affects what the bot is able to see and eats into the crawl budget for that website. If that happens, parts of the site may not be crawled or indexed, and it may go unnoticed by the website owner. An extremely slow site also causes timeouts when users try to access certain resources, which leads to a high bounce rate and little time spent on site. Recommended tools to test site speed are Google’s PageSpeed Insights and Yahoo’s YSlow; both test your site’s performance and offer suggestions on how it can be improved. This is a straightforward yet powerful way to improve the performance of a website.
4.1. Minifying CSS and JavaScript
Minifying entails the elimination of anything that doesn’t improve the operation of the web page. It can give the effect of obfuscated code, because it renames variables and deletes comments, white-space characters, and block delimiters. It goes further than simple compression, and minified pages will generally also be gzipped. Compare, for example, jQuery minified with a standard minifier and the same code run through the Closure Compiler: the major difference is in the variable names, with the more aggressive minification saving additional bytes and further reducing the volume of data transferred. The Closure Compiler’s rewriting has more side effects and potentially dangerous operations, whereas standard minification is generally safe; that extra level of complexity can behave differently in different browsers.
The Closure Compiler’s advanced optimizations mode has been known to break code on IE7. While standard minification does not normally change functionality, it is not guaranteed to be free of side effects on complex code from plugins or on code that does not follow best practices. After minifying, it is essential that the code is thoroughly tested across a variety of pages, and that the original files are backed up so the change can be reverted.
When minifying data, it is important to strike a balance between how good the performance gains will be for your page and how hard it will be to manage your code. The major caveat on using a minified version is that when a problem arises, there will be no line numbers for a start, and the variable names will be incomprehensible. Most of the time you are looking at the file with a tool such as Firebug and the issue is a missing semicolon or curly brace, and this requires a firm understanding of the uncompressed code to fix. Minifying is safer after the development stage.
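To make the idea concrete, here is a small hypothetical stylesheet before and after minification; the minified version is what you would actually serve:

    /* Before minification */
    .product-title {
        color: #333333;   /* dark grey heading */
        margin-top: 10px;
        margin-bottom: 10px;
    }

    /* After minification: comments, whitespace and redundant characters removed */
    .product-title{color:#333;margin-top:10px;margin-bottom:10px}

Keep the readable source files in version control and generate the minified copies as part of deployment, so debugging is always done against the original code.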
4.2. Compression and Caching Techniques
Compression is the reduction of file sizes, and different file types may be reduced by different methods. For multimedia content, compression and conversion to preferred file types (GIF/JPG for images, MP3 for audio, Flash video for video) is advised, as well as the removal of any unnecessary media. Compression of HTML, CSS, and JavaScript can be achieved fairly easily using a number of free and commercial tools. It is important to keep copies of the original files and to test the affected web pages once the changes have been made, as improper compression can cause page layout errors.
When a web page is accessed, content must be downloaded from the web server and stored on the user’s hard drive. Web browsers will store copies of web resources locally for a variable amount of time, based on the user’s settings and the server instructions received with the resource. This is a form of caching, and stored resources that have not expired will not need to be downloaded from the server a second time, reducing page load times. Cacheable resources include HTML, CSS, JavaScript, and media files, and it is important to ensure that resources are given a caching directive and that resources are updated by changing the file name when necessary to prevent the user from accessing old versions of updated files.
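As one possible sketch, assuming an Apache server with mod_deflate and mod_expires enabled, the following .htaccess rules switch on compression and give static resources a caching directive:

    <IfModule mod_deflate.c>
      # Compress text-based resources before sending them to the browser
      AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>

    <IfModule mod_expires.c>
      # Tell browsers how long they may cache each resource type
      ExpiresActive On
      ExpiresByType text/css "access plus 1 month"
      ExpiresByType application/javascript "access plus 1 month"
      ExpiresByType image/jpeg "access plus 1 year"
    </IfModule>

If a cached file changes, rename it (for example style.v2.css) so users are not served the stale copy.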
4.3. Mobile Optimization and Responsive Design
Lastly, mobile optimization and responsive design are among the most important techniques in the current era. Mobile usage keeps growing, and webmasters have to optimize their sites for mobile users. Under Google’s mobile-first indexing, the mobile version of a site is now indexed first, so mobile-optimized sites will be indexed and ranked better in Google search. There are several ways to serve mobile users, such as a separate mobile site with redirects, dynamic serving, or responsive design, but the recommended technique is responsive web design: it is search engine friendly and less prone to the configuration mistakes that separate mobile URLs and dynamic serving invite.
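A minimal responsive setup, assuming a simple two-column layout, consists of a viewport meta tag plus a media query that adapts the layout to small screens:

    <meta name="viewport" content="width=device-width, initial-scale=1">

    <style>
      .sidebar { float: left; width: 30%; }
      .main-content { float: left; width: 70%; }

      /* On narrow screens, stack the columns instead of floating them */
      @media (max-width: 600px) {
        .sidebar, .main-content { float: none; width: 100%; }
      }
    </style>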
Caching, unlike compression, works by generating static HTML files of the site’s dynamic content and saving them on the web server. The web server can then deliver the static HTML file and avoid resource-intensive backend processing; because the file is saved in the cache directory, dynamic content is served quickly.

Compression and caching are among the most useful techniques for increasing the speed of a website. There are a few types of compression in use; on Apache servers, mod_deflate is the most common, as the gzip output it produces is supported by virtually all browsers. The web server must be configured to enable compression, and exactly how depends on the server and hosting.

Moving from the design of the website to speed and performance, the first and most useful step is minifying CSS and JavaScript: the process of compressing these files by removing unnecessary characters and formatting. There are plenty of free and paid tools available for minifying CSS and JS, and Google’s PageSpeed Insights tool is very useful for checking the techniques mentioned above.
5. Technical SEO Auditing
The following aspects of technical SEO evaluation should be undertaken throughout the duration of the project and after any changes are made to the website. A top-level analysis can be extremely useful in identifying glaring issues following re-launches or changes, but it is equally valuable for assessing the current standing of a website.
Technical SEO auditing can be repeated every so often to identify and fix issues as they arise. Unfortunately, many of the checks and tasks detailed below are not repeatable in quite the same simple, time-efficient way as when they are first performed. In the majority of cases, however, when an SEO checks crawl data or URL information it is for a specific purpose, and on that basis it can be approached in an organized and efficient manner.
5.1. Crawling and Indexing Analysis
The objective of this process is to investigate the extent to which a search engine has crawled a website. A clean, crawlable website makes efficient use of PageRank. PageRank can be wasted on a site with a complex, parameter-heavy URL structure such as [Link]; a search engine may struggle to index such a page because the request can time out while the content is being generated. Peeling back the layers to find a static URL, or modifying the parameter settings in the search engine’s webmaster tools, may solve this.
Another common cause of indexing problems is poorly designed navigation that relies heavily on Flash or JavaScript. A search engine’s crawler is not as capable as a user with a web browser, and any pages that cannot be reached by following plain text links cannot be indexed. Crawler access can also be controlled deliberately using the Robots Exclusion Standard, implemented either as a text file in the root directory or through meta tags.
A robots.txt file allows you to specify which areas of a site are restricted or allowed, and to give specific instructions to named user agents. It can be used to disallow all spiders from restricted areas of a website, for example to prevent session IDs from being indexed, or to block all search engines from a site so that it does not go live before completion. This method is less secure than meta tags placed in the page code, which carry instructions specific to individual pages. Some settings in webmaster tools will override these directives, so it is best to check each page for its instructions.
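For illustration, the two forms of the Robots Exclusion Standard look like this (paths are placeholders). First, a robots.txt block keeping all spiders out of areas that generate session IDs:

    User-agent: *
    Disallow: /checkout/
    Disallow: /search-results/

Second, a per-page instruction placed in the head of an individual page, telling engines not to index it but still to follow its links:

    <meta name="robots" content="noindex, follow">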
5.2. Duplicate Content and Canonicalization Checks
One of the biggest issues in SEO that is often overlooked is duplicate content. The problem is not only that search engines have to work out which piece of content to index; it can also be hard to identify which version of a URL the link metrics should be consolidated toward. Often multiple URLs point to a single piece of content, which dilutes the potential of the page because its authority is spread across several URLs.

Run a simple test to see how your URLs are indexed in Google by searching for “site:www.yourdomain.com/yourpage.htm”. Search engines can also be unsure which version of a URL should be indexed: enter “http://www.yourdomain.com/yourpage.htm” and then “http://yourdomain.com/yourpage.htm”; both should return the same result. If they do not, you may have a canonicalization problem.

Canonicalization is the process of picking the best URL when there are several choices, and it usually refers to home pages. For example, if your home page can be pulled up by [Link] or [Link], you may have a canonicalization issue. This matters for SEO because incoming links to the various versions of the URL can be split between them, giving an overall lower ranking for the page.
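If the www and non-www versions of the site both resolve, one common fix (shown here as a sketch for an Apache server with mod_rewrite enabled, using a placeholder domain) is a site-wide 301 redirect to the preferred host name:

    RewriteEngine On
    # Redirect the non-www host to the www version with a permanent (301) redirect
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

Combined with a rel="canonical" tag on each page, this consolidates the link equity onto a single version of every URL.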
5.3. Broken Links and Redirect Management
Broken links and 404 errors are a sign of an unhealthy site. Search engines will spider your site, and broken links can negatively affect how it is indexed. Users who come across broken links are also a source of lost traffic and a bad user experience.

An efficient way to locate broken links is to use Google Webmaster Tools, an online resource that is free to anyone with a website and provides reports and data on your site’s visibility in Google.

Once you have located the broken links and 404 errors, you will need to redirect them using 301 redirects. A 301 redirect is the most efficient and search-engine-friendly method of webpage redirection. This server-side redirect passes roughly 90-99% of the link juice (ranking power) to the redirected page, whereas it is common to see a loss of 10-15% of link juice with a 302 redirect.

Using 301 redirects will also send users and search engines to a page you have specifically chosen, rather than a dead end. This provides a better user experience and passes the removed content’s ranking power to its replacement page. The same method should be used when redirecting an old domain to a new domain.
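As a simple example (the paths are placeholders), redirecting a removed page to its closest replacement on an Apache server takes a single .htaccess line:

    # Permanently redirect the retired URL to its replacement page
    Redirect 301 /old-sandals-range.html http://www.example.com/mens-sandals/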
6. Schema Markup and Structured Data
Schema markup is a form of structured data built on a shared semantic vocabulary. Google recommends using schema.org to mark up your content: it is a very useful catalogue of schemas for various types of structured data, and using the appropriate types to mark up content gives it a far better definition. As mentioned in Google’s introduction to structured data, any content can be marked up, but it performs best in search results for content about a person or a company, most likely because of existing Knowledge Graph capabilities.

The benefits of schema markup are enhanced snippets in search results, which improve your page’s visibility and increase CTR. It also enables rich cards and rich snippets. Rich cards are a newer format from Google aimed at mobile users, showing only the most relevant parts of your data, and they are a good way to drive traffic to your site from mobile. Rich snippets are an improved UI for search result snippets, intended to improve mobile and voice search and provide a more consistent user experience. Because they display more data, rich snippets make your listing much more attractive and make it easier for the user to quickly absorb information; over time, this should have a positive effect on traffic and on-page time for your site.

An extremely useful tool, the Rich Results Test, can be used to see what enhancements can be made to content with structured data. This lays the foundation for optimizing CTR by providing explicit additional data that can be shown in a variety of Google-defined search listings. Tests can be run with a sample URL to reveal errors and pages needing improvement. All of these benefits help to increase the brand image and web presence of a company.
6.1. Benefits of Schema Markup
It is important to establish that schema markup is a semantic vocabulary of microdata: in effect, a shared ontology for describing the content of a page. It is considered one of the most powerful, least utilized forms of SEO today. The vocabulary is the result of collaboration between Google, Bing, Yahoo, and Yandex to create a high-quality, common method of search engine optimization. An extra benefit of this communal vocabulary is that, because the markup comes from a collaborative source, it is likely to have a positive effect on results across all of the search engines behind it.
Furthermore, when used correctly, schema markup enhances the snippet shown below the page title. Snippets give users information on the results page so they can decide whether a page is worth clicking on, and markup can enhance a regular snippet with additional details such as review and pricing information. This kind of enhanced listing is known as a rich snippet, and a higher quality snippet produced by schema markup will lead to more user traffic from search engines to the marked-up site.
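To illustrate the review-and-pricing case, here is a sketch of product markup in JSON-LD form (the product name, rating, and price are invented for the example):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Brown Leather Men's Sandal",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.5",
        "reviewCount": "27"
      },
      "offers": {
        "@type": "Offer",
        "price": "59.95",
        "priceCurrency": "AUD"
      }
    }
    </script>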
6.2. Types of Structured Data
A number of website owners, however, are confused about how to use structured data to improve their website’s visibility. They are unsure whether using structured data will cause their site to show rich snippets, or whether it will change their current search result at all. According to my latest research, around 30% of keyword searches today display rich snippets, but surprisingly only 0.3% of website URLs have structured data implemented.

Structured data comes in various types, each identified by an itemtype URL. It is important for website owners to understand these different types and how to implement them. The main types are explained below:

Breadcrumbs: Breadcrumbs are a list of links that show where a user is on a website. They are especially useful when a user lands on an internal page from a search result or external link, because the user can easily identify the page’s position and choose the next link from there. Breadcrumbs can be represented as rich snippets in search results, which helps users see where a page sits in the site before they visit and encourages them to explore the site from that starting point.

Articles: The article type covers published content that does not have a more specific schema type, such as news posts, forum threads, opinions, reviews, and interviews. By implementing article-type structured data, website owners allow their articles to be represented as rich snippets in search results, making it easier for users to identify article posts, which can otherwise look similar to ordinary landing-page content.

Rich Snippets: Rich snippets are the result of structured data markup added to your site’s HTML to improve the way Google represents your page in search results. By showing rich snippets, webmasters make it easier for users to understand what a page is about before clicking on it, which can significantly improve the click-through rate (CTR). Any change that improves how a page is displayed in search results increases the website’s visibility and the likelihood of attracting more visitors, and those visitors tend to be well targeted because they already know what the page is about and are actively seeking the information it provides. An example of a rich snippet for article-type content includes the title, author name, post date, and ratings.
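One way to express that article markup is with JSON-LD, one of the formats schema.org supports; a minimal sketch (all values are placeholders) looks like this:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "From Crawl to Conversion: Mastering Technical SEO",
      "author": { "@type": "Person", "name": "Jane Doe" },
      "datePublished": "2016-03-01",
      "image": "http://www.example.com/images/article-cover.jpg"
    }
    </script>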
6.3. Implementing Schema Markup on Websites
When implementing schema markup on a website, start at schema.org to locate the appropriate structured data type for each page. The first task is to identify the most important elements on the page. For a company that provides services across many areas, the content could be marked up with the generic “Organization” type or, better, with the more specific “LocalBusiness” type. Schema.org lists the predefined types that are available, along with detailed information on the supported elements and attributes, so the company can choose exactly which data to mark up; in this case, the details of the services offered. Marking up each piece of data produces microdata embedded in the page’s HTML, which can then be added to any web page. Once that is done, the markup can be checked with the Structured Data Testing Tool, which detects whether structured data is present on the page and reports any errors in it.
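While the passage above describes inline microdata, the same information can also be expressed as a JSON-LD block, which Google also accepts. Continuing the local-business example, a sketch (every value here is a placeholder for the company’s real details) might be:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Services Pty Ltd",
      "url": "http://www.example.com/",
      "telephone": "+61 8 0000 0000",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Perth",
        "addressRegion": "WA",
        "postalCode": "6000",
        "addressCountry": "AU"
      }
    }
    </script>

Whichever format you choose, the Structured Data Testing Tool reads both microdata and JSON-LD.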
The last check is to run the URL through the rich snippet testing tool and see what will appear in the search results. By following these steps, your website can appear more attractively in search results, giving visitors more detailed information about the company and increasing CTR thanks to the clear information shown in the rich snippets. Keep in mind, though, that using structured data is not a guarantee that rich snippets will appear, and it takes time for structured data to be crawled by robots and show up in search results: about a week at the fastest, and sometimes several months. Keep the structured data updated whenever the page changes, and your website should become more visible in the SERPs and attract more visitor traffic.