Welcome back. This module is on the site audit process. A site audit is one of the most common first steps in evaluating a website. It can give you a very strong understanding of potential errors on that site, but only by drawing on all of the prior concepts we've covered. In this lesson, I'll cover the step-by-step procedure for doing a site audit and how you can implement it on your own websites.

A site audit is a necessary and typical first step as you begin work with a new website or domain. The process I'll share with you may take 30 to 60 minutes, or longer the first time. It's meant to be step-by-step, and it will give you an excellent analysis and understanding of a website. If you can then take some of this information and put it into a write-up format that you can deliver to a client or a team, that's the simplest way for them to take action and understand what you recommend for their site.

The first step is to get a crawling tool like Screaming Frog, Beam Us Up, or DeepCrawl running. It might require a temporary trial license in order to have access. Let that crawl begin on the site as you kick off your analysis; we'll come back to it later.

You then want to look at the homepage and maybe three or four of the primary pages, perhaps those related to the top products or services. View the page source and look at elements like the title, meta description, and H1 tags. What you're looking for here is consistency: that the messaging and the content are handled in a consistent way, and that there are no obvious gaps in how the content is laid out. Look at the body copy for some of those key pages as well and do a visual assessment that you may come back to later.

Next, I recommend you take a look at their Google Search Console environment. This assumes that you can get access to it, of course.
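That manual spot check of the title, meta description, and H1 elements can also be scripted when you have more than a handful of pages to review. Here's a minimal sketch using only Python's standard library; the sample page source is hypothetical, purely for illustration, and a real audit would feed in the HTML your crawler fetched.

```python
from html.parser import HTMLParser

class OnPageExtractor(HTMLParser):
    """Collects the <title>, meta description, and <h1> text from page source."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self.h1 = ""
        self._capturing = None  # tag whose text we are currently collecting

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1"):
            self._capturing = tag
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == self._capturing:
            self._capturing = None

    def handle_data(self, data):
        if self._capturing == "title":
            self.title += data
        elif self._capturing == "h1":
            self.h1 += data

# Hypothetical sample page source, standing in for a real crawled page.
html = """<html><head><title>Acme Widgets | Home</title>
<meta name="description" content="Buy quality widgets online.">
</head><body><h1>Acme Widgets</h1></body></html>"""

parser = OnPageExtractor()
parser.feed(html)
print(parser.title)        # -> Acme Widgets | Home
print(parser.description)  # -> Buy quality widgets online.
print(parser.h1)           # -> Acme Widgets
```

Run across your key pages, the three extracted fields make the consistency check easy: you can scan them side by side for missing, duplicated, or off-message elements instead of viewing source page by page.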
Specifically, what you're looking to identify here are any crawl errors, especially significant errors where they've received a notification or message from Google directly, any big gaps, and how they've implemented meta descriptions and title elements. If anything is missing or duplicated, you can make a note of it. You can use the crawl data as well: drastic shifts, spikes, or drops in the trends in the crawl stats and crawl errors reports are another easy way to identify patterns.

What you're really trying to do with the site audit is to provide a baseline, a platform upon which changes can be made, and an analysis and overview that can yield action items ordered by priority. For this reason, Google Analytics is another great place to look. The Channels or Landing Pages reports will give you more data about various spikes and trends, for example the number of sessions, how many users, and the bounce rate. Again, you're simply gathering additional data at this point, which you'll use to make recommendations and provide some key strategic insights. Everything up to this point is simply gathering data that you can use in a report out to a client.

I'd also suggest you use Ahrefs.com. They have some good data on anchor clouds, so you can look at the diversity of the anchor text, especially related to broken backlinks: how many 404 errors are there that might need to be updated with redirects? You may want to look at referring domains: is there a significant trend from particular domains? Do you notice issues that might be considered black hat, or mismatches between how you would expect a website like this to be referred to and what is actually happening? What you're trying to assess here are typically the off-site elements that would impact ranking, and some of the metrics and Site Explorer data out of Ahrefs can communicate that to you.
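The broken-backlink triage is easy to script once you have the data. Here's a minimal sketch, assuming you've exported (URL, status code) pairs from a tool such as Ahrefs or your crawler; the example rows below are hypothetical.

```python
def triage(pages):
    """Split (url, status_code) pairs into audit buckets."""
    needs_redirect, server_errors, ok = [], [], []
    for url, status in pages:
        if status == 404:
            needs_redirect.append(url)   # dead page that may still have backlinks
        elif status >= 500:
            server_errors.append(url)    # flag for the dev team
        else:
            ok.append(url)
    return needs_redirect, server_errors, ok

# Hypothetical export rows for illustration.
rows = [
    ("https://example.com/old-product", 404),
    ("https://example.com/", 200),
    ("https://example.com/api", 500),
]
needs_redirect, server_errors, ok = triage(rows)
print(needs_redirect)  # candidates for 301 redirects
```

Anything in the first bucket is a candidate for a 301 redirect, so the backlinks pointing at those dead URLs can keep passing value; the server-error bucket goes straight into the write-up as a development action item.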
I would also recommend you take a look at the Structured Data Testing Tool. This is provided by Google, free to use, and any website URL can be checked here. All you need to do is input a URL or the source code, and it will validate whether there is structured data. Now, depending on the sophistication or level of focus on structured data on the site, you may turn up zeros quite often, but this is another element that will let you know how advanced a client or team's site is. If you do see errors in structured data testing, it's an easy way for you to provide input on what kind of change might be required.

I'd also recommend you take a look at the Barracuda Penguin Tool that I've shared previously, because it will help you assess trends in the website's performance relative to various Google updates. An overlay from the Penguin Tool can be a good indicator of where things have gone wrong in the past and help you assess what caused that. Finally, I would take a look at page speed data, whether you use Google's PageSpeed Insights or some of the other tools I've previously shared, to get a deep understanding of the website's performance.

Once you have all of this in place, including data and some context based on your observations, the challenge becomes how to package it into something that makes sense for stakeholders. You typically want to put together two things. One is an executive summary, maybe no more than five or ten summary points, that says, "Here are some of the findings from the analysis I've done based on these tools." You might want to include screenshots and some visuals. Then there's typically a longer document as well, something in Excel, Word, or perhaps PowerPoint, that indicates, "Here's the detail backing up my summary recommendations." This is typically a straight export of the data you've had access to, and it provides your clients a deeper level of analysis that you can then work through with them.
This becomes the source for a future project plan and the initiatives that you can begin to work on with those teams.
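As a closing example, the structured data check mentioned earlier can also be scripted if you want to spot-check many pages before running them through Google's tool. This minimal sketch, using only Python's standard library, pulls JSON-LD blocks out of page source; the sample markup is hypothetical, and a real check would also validate the blocks against the schema.org types the site should be using.

```python
import json
from html.parser import HTMLParser

class JsonLdFinder(HTMLParser):
    """Collects JSON-LD blocks (<script type="application/ld+json">) from page source."""
    def __init__(self):
        super().__init__()
        self.blocks = []
        self._capturing = False

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self._capturing = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._capturing = False

    def handle_data(self, data):
        if self._capturing and data.strip():
            try:
                self.blocks.append(json.loads(data))
            except json.JSONDecodeError:
                self.blocks.append({"error": "invalid JSON-LD"})  # flag for the write-up

# Hypothetical sample markup for illustration.
html = """<html><head><script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Organization", "name": "Acme"}
</script></head><body></body></html>"""

finder = JsonLdFinder()
finder.feed(html)
print(len(finder.blocks))         # number of structured data blocks found
print(finder.blocks[0]["@type"])  # -> Organization
```

A page that turns up zero blocks, or a block flagged as invalid JSON-LD, is exactly the kind of quick, concrete recommendation that fits well in the executive summary.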