Welcome back. We're now going to cover site accessibility and indexing. Getting site accessibility and indexing right is foundational to good technical SEO. Rebecca has already covered robots.txt files, sitemaps, and error codes, so I will just mention those briefly. What I also want to cover is how to secure your website and some additional ways to think about site accessibility and performance improvements. You'll learn two major things in this lesson: one, knowledge of the common site accessibility features and errors, and two, how best to make these improvements with the teams you work with. Rebecca previously talked about robots.txt files, XML sitemaps, error codes, and redirects, so I'll touch lightly on those while presenting my perspective. I would also point you to xml-sitemaps.com, which is a great resource for XML sitemap creation.

Familiarity with status codes is foundational to understanding site accessibility. Then, when you see status codes while using tools or plugins, you'll know what they mean. The 100s, or 1xx, are informational codes; these are rarely visible. In the 200s, the common one you'll see for success is 200 OK, which means the URL can be accessed and rendered successfully. The 300s relate to redirects. One you'll commonly see is a 301, which means the page has moved permanently, as opposed to a 302, which is a temporary redirect. The 400 codes relate to client errors, like a 404 Not Found or a 403 Forbidden. Generally, you won't see too many of the other 400-level codes, but they are helpful to know about. Lastly are the 500 codes for server errors. Quite often you'll see a 500 Internal Server Error, a 504 Gateway Timeout, or a 505 HTTP Version Not Supported. So understanding status codes is an important indicator of site accessibility.

Let's talk about how you can identify 404 errors. Cleaning them up is not a foolproof way to improve search rankings; in fact, fixing 404s mainly improves the site's user experience and crawlability. So if you're going to prioritize 404 errors, understand what that impact actually is. Google Search Console, Screaming Frog, and similar tools will point out broken links. The ideal scenario with 404 errors, especially out of Search Console, is to use the crawl error section to identify them, note where they are linked from, and then set up a 301 redirect in your .htaccess file to send visitors to the next-best page or a parent page. It's preferable to identify the root cause of 404 creation, for example, the end of life of that page. Perhaps somebody in the organization needs to be informed or educated that a redirect to a more relevant page is better than just letting 404s hang out there. Also note that it's not critical to fix every 404; prioritize those that have more inbound links or that users arrive at more frequently.

Another way to improve site accessibility is to secure your website. Moving to HTTPS will typically benefit the site. If search rankings are a critical factor, I recommend you test HTTP versus HTTPS. This is not foolproof; moving to HTTPS does not guarantee better rankings. So evaluate closely whether search rankings are a significant reason for moving to a secured site, and consider how that might be tested. But if you're in e-commerce, payment processors likely are setting up HTTPS automatically.
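To make this concrete, below is a small sketch of how you might spot-check status codes and HTTP-to-HTTPS redirects yourself. This is only an illustration, not a tool from the course: it assumes Python with the third-party requests library installed, and the example.com URLs are hypothetical placeholders. In practice, Google Search Console or Screaming Frog will surface most of this for you.

    # Sketch: spot-check status codes and HTTP-to-HTTPS redirects.
    # Assumes the "requests" library is installed; URLs below are placeholders.
    import requests

    urls_to_check = [
        "http://www.example.com/",          # should 301 to the HTTPS version
        "http://www.example.com/old-page",  # might 404 and need a redirect
    ]

    for url in urls_to_check:
        # allow_redirects=False shows the first status code (e.g. 301/302)
        # rather than only the final destination's code.
        response = requests.get(url, allow_redirects=False, timeout=10)
        location = response.headers.get("Location", "")
        print(f"{url} -> {response.status_code} {location}")

        if response.status_code in (301, 302) and location.startswith("https://"):
            print("  Redirects to HTTPS as expected.")
        elif response.status_code == 404:
            print("  Broken URL: consider a 301 redirect to the next-best page.")

Running something like this against a handful of URLs quickly shows which ones return a 200, which ones 301 to HTTPS, and which ones are broken and worth redirecting.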
Some best practices for using HTTPS are using robust security certificates, redirecting your users and search engines to the HTTPS pages with server-side 301 redirects, and using relative URLs for resources that reside on the same secure domain. There is little to no correlation between HTTPS and rankings as a signal, but it is a best practice and helps communicate to your users that your site is secure.

Lastly, let's focus on XML sitemaps. I strongly recommend that you put XML sitemaps in place. They improve crawl prioritization, crawl accuracy, and crawl speed. Theoretically, a search bot will crawl every page on your site sooner or later, but by having your URLs in sitemaps, you ensure that they are crawled in priority order, that the list of URLs being crawled stays current, and that they are correctly segmented; you can also use smaller sitemaps to give search bots a clear indication of where the priorities are for your website.
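To close, here is a small sketch of what generating an XML sitemap can look like in code. This is purely illustrative: it uses Python's standard xml.etree library, and the URLs and priority values are hypothetical placeholders. In practice, xml-sitemaps.com, your CMS, or a plugin will usually generate this file for you.

    # Sketch: build a minimal sitemap.xml from a list of URLs.
    # URLs and priority values below are hypothetical examples.
    import xml.etree.ElementTree as ET

    pages = [
        ("https://www.example.com/", "1.0"),
        ("https://www.example.com/products/", "0.8"),
        ("https://www.example.com/blog/latest-post", "0.5"),
    ]

    # The xmlns value is the standard sitemap protocol namespace.
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, priority in pages:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = loc
        ET.SubElement(url_el, "priority").text = priority

    # Writes sitemap.xml to the current directory.
    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)

Once a file like this is generated, you would place it at the site root and submit it in Google Search Console so the search bots know where to find it.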